Optimization Methods for Machine Learning

Course: Postgraduate
Semester: Electives
Subject Code: AVD874

Syllabus

Introduction (ML applications) - Topics in linear systems (linear regression) - Basics of gradient descent and its variants (logistic regression) - A detailed understanding of projected gradient (white-box adversarial attacks) and proximal gradient (LASSO) - Details of conditional gradient (recommender systems) - The subgradient approach (SVM) - Mirror descent and metric gradient methods - Acceleration (total variation denoising) - Smoothing (robust SVM) - Optimal transport for machine learning - Alternating (VAE) - Minimax (adversarial training) - Averaging (GANs) - Splitting (federated learning) - Extragradient (max-entropy) - Stochastic gradient (Boltzmann machines) - Variance reduction (boosting) - Derivative-free methods (black-box adversarial attacks)
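
To illustrate the flavor of the course's opening topics, here is a minimal sketch (not part of the official syllabus) of gradient descent applied to a linear-regression least-squares objective, pairing the "Basics of Gradient Descent" topic with its listed application. All data and parameter choices are hypothetical.

```python
import numpy as np

# Hypothetical synthetic regression problem.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                   # design matrix
w_true = np.array([1.0, -2.0, 0.5])             # ground-truth weights
y = X @ w_true + 0.01 * rng.normal(size=100)    # noisy targets

# Gradient descent on f(w) = (1/2n) ||Xw - y||^2.
w = np.zeros(3)      # initial iterate
step = 0.01          # fixed step size (assumed small enough for convergence)
for _ in range(2000):
    grad = X.T @ (X @ w - y) / len(y)           # gradient of f at w
    w -= step * grad

print(np.round(w, 2))                            # close to w_true
```

With a step size below the reciprocal of the largest eigenvalue of the Hessian, the iterates converge linearly to the least-squares solution; variants covered later in the syllabus (projected, proximal, stochastic, accelerated) modify this basic update.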

Text Books

Same as References

References

1. First-order and Stochastic Optimization Methods for Machine Learning, Guanghui Lan, Springer, 2020.

2. Algorithms for Optimization, Mykel J. Kochenderfer and Tim A. Wheeler, The MIT Press, 2019.

3. First-Order Methods in Optimization, Amir Beck, SIAM, 2017.