Course: Postgraduate
Semester: Sem. I
Subject Code: MA618
Subject Title: Foundations of Machine Learning
Syllabus
Machine learning basics: capacity, overfitting and underfitting, hyperparameters and validation sets, bias and variance; PAC model; Rademacher complexity; growth function; VC-dimension; fundamental concepts of artificial neural networks; single-layer perceptron classifier; multi-layer feed-forward networks; single-layer feedback networks; associative memories; introductory concepts of reinforcement learning, Markov decision processes.
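For illustration only, the following Python sketch shows the learning rule behind the "single-layer perceptron classifier" topic listed above; the toy data, learning rate, and epoch count are hypothetical choices made for the example, not part of the syllabus.

import numpy as np

def train_perceptron(X, y, eta=1.0, epochs=100):
    # Perceptron rule: on a misclassified point, update w <- w + eta * y_i * x_i.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append a constant bias feature
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        mistakes = 0
        for xi, yi in zip(Xb, y):
            if yi * np.dot(w, xi) <= 0:   # misclassified (labels are -1/+1)
                w += eta * yi * xi
                mistakes += 1
        if mistakes == 0:                 # converged on linearly separable data
            break
    return w

# Hypothetical toy data: the AND function with labels in {-1, +1}.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([-1, -1, -1, 1])
w = train_perceptron(X, y)
print("weights (incl. bias):", w)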
Text Books
Same as References
References
- Mohri, M., Rostamizadeh, A., and Talwalkar, A., Foundations of Machine Learning, The MIT Press (2012).
- Jordan, M. I. and Mitchell, T. M., Machine Learning: Trends, Perspectives, and Prospects, Science, Vol. 349, Issue 6245, pp. 255-260 (2015).
- Shawe-Taylor, J. and Cristianini, N., Kernel Methods for Pattern Analysis, Cambridge Univ. Press (2004).
- Haykin, S., Neural Networks: A Comprehensive Foundation, 2nd ed., Prentice Hall (1998).
- Hassoun, M. H., Fundamentals of Artificial Neural Networks, PHI Learning (2010).
- Ripley, B. D., Pattern Recognition and Neural Networks, Cambridge Univ. Press (2008).
- Sutton, R. S. and Barto, A. G., Reinforcement Learning: An Introduction, The MIT Press (2017).
Course Outcomes (COs):
CO1: Ensure students grasp fundamental concepts in machine learning, including neural networks, ensemble learning, overfitting, underfitting, the bias-variance trade-off, and reinforcement learning.
CO2: Enable students to apply machine learning techniques practically.
CO3: Equip students with the ability to evaluate and interpret the performance of machine learning models, emphasizing techniques for assessing generalization capabilities and managing the bias-variance trade-off.
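As a hedged illustration of the model-evaluation skills described in CO3, the sketch below fits polynomials of increasing degree to synthetic data and compares training and validation error; the synthetic data and the choice of polynomial models are assumptions made for this example, not material prescribed by the course.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 60)
y = np.sin(3.0 * x) + 0.1 * rng.standard_normal(60)   # noisy synthetic target

x_tr, y_tr = x[:40], y[:40]   # training set
x_va, y_va = x[40:], y[40:]   # held-out validation set

for degree in (1, 3, 9):      # polynomial degree acts as a capacity hyperparameter
    coeffs = np.polyfit(x_tr, y_tr, degree)            # least-squares fit on training data
    mse_tr = np.mean((np.polyval(coeffs, x_tr) - y_tr) ** 2)
    mse_va = np.mean((np.polyval(coeffs, x_va) - y_va) ** 2)
    # High error on both sets suggests underfitting (high bias); a low training error
    # with a much larger validation error suggests overfitting (high variance).
    print(f"degree={degree}  train MSE={mse_tr:.4f}  validation MSE={mse_va:.4f}")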