
Foundations of Machine Learning

Course: Postgraduate
Semester: Electives
Subject Code: MA618

Syllabus

Machine learning basics: capacity, overfitting and underfitting, hyperparameters and validation sets, bias and variance; PAC model; Rademacher complexity; growth function; VC-dimension; fundamental concepts of artificial neural networks; single-layer perceptron classifier; multi-layer feed-forward networks; single-layer feedback networks; associative memories; introductory concepts of reinforcement learning, Markov decision process.
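As an illustrative sketch of the single-layer perceptron classifier listed above, the following Python snippet applies the perceptron learning rule to a toy, linearly separable dataset; the data, learning rate, and epoch count are assumptions made for illustration and are not part of the syllabus.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian clusters with labels in {-1, +1} (assumed for illustration).
X = np.vstack([rng.normal(loc=-2.0, size=(50, 2)),
               rng.normal(loc=+2.0, size=(50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])

w = np.zeros(2)   # weight vector
b = 0.0           # bias term
eta = 0.1         # learning rate (assumed value)

for epoch in range(20):
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:      # point is misclassified
            w += eta * yi * xi          # perceptron update rule
            b += eta * yi
            mistakes += 1
    if mistakes == 0:                   # converged on separable data
        break

accuracy = np.mean(np.sign(X @ w + b) == y)
print(f"training accuracy: {accuracy:.2f}")

On linearly separable data such as this, the update loop is guaranteed to converge, which is the setting in which the perceptron is usually introduced alongside VC-dimension and generalization bounds.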

Text Books

Same as References


References

1. Mohri, M., Rostamizadeh, A., and Talwalkar, A., Foundations of Machine Learning, The MIT Press (2012).

2. Jordan, M. I. and Mitchell, T. M., Machine Learning: Trends, perspectives, and prospects, Science, Vol. 349, Issue 6245, pp. 255-260 (2015).

3. Shawe-Taylor, J. and Cristianini, N., Kernel Methods for Pattern Analysis, Cambridge Univ. Press (2004).

4. Haykin, S., Neural Networks: A Comprehensive Foundation, 2nd ed., Prentice Hall (1998).

5. Hassoun, M. H., Fundamentals of Artificial Neural Networks, PHI Learning (2010).

6. Ripley, B. D., Pattern Recognition and Neural Networks, Cambridge Univ. Press (2008).

7. Sutton, R. S. and Barto, A. G., Reinforcement Learning: An Introduction, The MIT Press (2017).

Course Outcomes (COs):
CO1: Understand fundamental concepts in machine learning, including neural networks, ensemble learning, overfitting, underfitting, bias-variance trade-off, and reinforcement learning.

CO2: Apply machine learning techniques to practical problems.

CO3: Evaluate and interpret the performance of machine learning models, emphasizing techniques for assessing generalization capabilities and managing the bias-variance trade-off.
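A minimal sketch of the kind of evaluation CO3 refers to, assuming scikit-learn is available: polynomial models of increasing degree are fit to noisy data, and a held-out validation set exposes underfitting (high bias) at low degree and overfitting (high variance) at high degree. The dataset, degrees, and split are assumptions made for illustration.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=200).reshape(-1, 1)          # assumed toy inputs
y = np.sin(x).ravel() + rng.normal(scale=0.3, size=200)  # noisy targets

x_train, x_val, y_train, y_val = train_test_split(x, y, test_size=0.3, random_state=0)

for degree in (1, 3, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(x_train, y_train)
    train_err = mean_squared_error(y_train, model.predict(x_train))
    val_err = mean_squared_error(y_val, model.predict(x_val))
    print(f"degree={degree:2d}  train MSE={train_err:.3f}  val MSE={val_err:.3f}")

Comparing training and validation error across model capacities is the standard way to diagnose where a model sits on the bias-variance trade-off before selecting hyperparameters.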