
Advanced Optimization

Course: Postgraduate
Semester: Electives
Subject Code: MA872
Subject Title: Advanced Optimization

Syllabus

Unconstrained Optimization: line search methods: Wolfe conditions, Goldstein conditions, sufficient decrease and backtracking, Newton's method and quasi-Newton methods; trust region methods: the Cauchy point, algorithm based on the Cauchy point, improving on the Cauchy point, the dogleg method, two-dimensional subspace minimization; nonlinear conjugate gradient methods: the Fletcher-Reeves method.
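For illustration only, a minimal sketch of a backtracking line search enforcing the sufficient-decrease (Armijo) condition covered in this unit; the function names, parameter values, and the Rosenbrock test problem are assumptions added here, not part of the syllabus.

    import numpy as np

    def backtracking_line_search(f, grad_f, x, p, alpha0=1.0, rho=0.5, c=1e-4):
        """Shrink the step alpha until the sufficient-decrease (Armijo) condition
        f(x + alpha*p) <= f(x) + c*alpha*grad_f(x)^T p holds."""
        alpha = alpha0
        fx, gx = f(x), grad_f(x)
        while f(x + alpha * p) > fx + c * alpha * gx.dot(p):
            alpha *= rho  # backtrack: reduce the trial step length
        return alpha

    # Example: one steepest-descent step on the Rosenbrock function
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad_f = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                                 200 * (x[1] - x[0]**2)])
    x = np.array([-1.2, 1.0])
    p = -grad_f(x)                       # descent direction
    x_new = x + backtracking_line_search(f, grad_f, x, p) * p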

Constrained Optimization: penalty methods, the quadratic penalty method, convergence, nonsmooth penalty functions, the L1 penalty method, the augmented Lagrangian method; quadratic programming, the Schur complement, null-space method, active-set method for convex QP; sequential quadratic programming, convex programming.
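A minimal sketch of the quadratic penalty method from this unit, under the assumption that each inner unconstrained subproblem is solved with a quasi-Newton (BFGS) routine; the helper name, penalty schedule, and example problem are illustrative choices, not prescribed by the syllabus.

    import numpy as np
    from scipy.optimize import minimize

    def quadratic_penalty(f, cons, x0, mu0=1.0, growth=10.0, iters=6):
        """Quadratic penalty method: minimize f(x) + (mu/2) * sum_i c_i(x)^2
        for an increasing sequence of penalty parameters mu."""
        x, mu = np.asarray(x0, dtype=float), mu0
        for _ in range(iters):
            Q = lambda x: f(x) + 0.5 * mu * sum(c(x)**2 for c in cons)
            x = minimize(Q, x, method="BFGS").x  # inner unconstrained solve
            mu *= growth                          # tighten the penalty
        return x

    # Example: minimize x1 + x2 subject to x1^2 + x2^2 - 2 = 0
    # (the constrained minimizer is near (-1, -1))
    f = lambda x: x[0] + x[1]
    cons = [lambda x: x[0]**2 + x[1]**2 - 2.0]
    x_star = quadratic_penalty(f, cons, x0=[0.5, 0.5])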

Text Books

Same as References


References

  1. Boyd, S. and Vandenberghe, L., Convex Optimization, Cambridge University Press (2004).
  2. Nocedal, J. and Wright, S. J., Numerical Optimization, Springer (2006).

Course Outcomes (COs):
CO1: Impart knowledge of the advanced theory of optimization.

CO2: Familiarize students with advanced algorithms for solving optimization problems.

CO3: Write code for optimization problems using advanced algorithms.