
Optimal Control Systems

Course: Postgraduate
Semester: Electives
Subject Code: AVC621
Subject Title: Optimal Control Systems

Syllabus

Basic mathematical concepts: Finite dimensional optimization, Infinite dimensional optimization, Conditions for optimality, Performance measures for optimal control problems. Dynamic programming: The optimal control law, The principle of optimality, Dynamic programming concept, Recurrence relation, Computational procedure, The Hamilton-Jacobi-Bellman equation.
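As a concrete illustration of the dynamic-programming recurrence J_k(x) = min_u [ g(x,u) + J_{k+1}(f(x,u)) ], here is a minimal sketch on a made-up finite-state problem (the three states, quadratic costs, and horizon N = 3 are illustrative choices, not part of the course material):

```python
# Backward dynamic-programming recurrence on a toy three-state problem:
#   J_k(x) = min_u [ g(x,u) + J_{k+1}(f(x,u)) ]
# States, inputs, dynamics and costs below are illustrative, not from the syllabus.

N = 3                       # horizon
states = [0, 1, 2]
inputs = [-1, 0, 1]

def f(x, u):                # dynamics: move within the state grid, saturated at the ends
    return min(max(x + u, 0), 2)

def g(x, u):                # stage cost: distance from state 1 plus control effort
    return (x - 1) ** 2 + u ** 2

J = {x: (x - 1) ** 2 for x in states}   # terminal cost J_N
policy = []                              # policy[k][x] = optimal input at stage k
for k in reversed(range(N)):
    Jk, muk = {}, {}
    for x in states:
        best_u = min(inputs, key=lambda u: g(x, u) + J[f(x, u)])
        muk[x] = best_u
        Jk[x] = g(x, best_u) + J[f(x, best_u)]
    J, policy = Jk, [muk] + policy       # principle of optimality: build J_k from J_{k+1}

print(J[0], policy[0][0])   # → 2 1: cost-to-go from state 0 and the first optimal input
```

The backward sweep computes the optimal cost-to-go for every state at every stage, so the resulting `policy` is an optimal feedback law, exactly as in the recurrence-relation topic above.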

Calculus of variations: Examples of variational problems, Basic calculus of variations problem, Weak and strong extrema, Variable end point problems, Hamiltonian formalism and mechanics: Hamilton’s canonical equations.
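The basic calculus-of-variations problem listed above can be stated compactly; as a small worked illustration (the arc-length example is a standard one, not taken from the course text):

```latex
\[
  J(y)=\int_a^b L\big(x,\,y(x),\,y'(x)\big)\,dx,\qquad y(a)=A,\quad y(b)=B .
\]
A weak extremum must satisfy the Euler--Lagrange equation
\[
  \frac{\partial L}{\partial y}-\frac{d}{dx}\,\frac{\partial L}{\partial y'}=0 .
\]
Example (arc length, $L=\sqrt{1+(y')^2}$): since $\partial L/\partial y=0$,
\[
  \frac{d}{dx}\!\left(\frac{y'}{\sqrt{1+(y')^2}}\right)=0
  \;\Longrightarrow\; y'=\text{const},
\]
so the extremal is the straight line joining the two fixed end points.
```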

From Calculus of variations to Optimal control: Necessary conditions for strong extrema, Calculus of variations versus optimal control, Optimal control problem formulation and assumptions, Variational approach to the fixed time, free end point problem.

Pontryagin's Minimum Principle: Statement of the minimum principle for basic fixed end point and variable end point control problems, Proof of the minimum principle, Properties of the Hamiltonian, Time optimal control problems.
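The conditions of the minimum principle can be summarized as follows; the double-integrator example at the end is the standard time-optimal illustration (not quoted from the course text):

```latex
With Hamiltonian $H(x,u,p)=L(x,u)+p^{\top}f(x,u)$, an optimal pair
$(x^{*},u^{*})$ admits a costate $p$ such that
\[
  \dot p=-\frac{\partial H}{\partial x}\Big|_{x^{*},u^{*}},\qquad
  H\big(x^{*}(t),u^{*}(t),p(t)\big)=\min_{u\in U}H\big(x^{*}(t),u,p(t)\big),
\]
and for free-final-time problems $H\equiv 0$ along the optimal trajectory.
Example (time-optimal double integrator $\dot x_{1}=x_{2}$, $\dot x_{2}=u$,
$|u|\le 1$, $L=1$): $H=1+p_{1}x_{2}+p_{2}u$, so $u^{*}=-\operatorname{sgn}p_{2}$;
since $\dot p_{1}=0$ and $\dot p_{2}=-p_{1}$, $p_{2}$ is affine in $t$ and the
optimal control is bang-bang with at most one switch.
```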

The Linear Quadratic Regulator: Finite horizon LQR problem- Candidate optimal feedback law, Riccati differential equation (RDE), Global existence of solution for the RDE. Infinite horizon LQR problem- Existence and properties of the limit, solution, closed loop stability. Examples: Minimum energy control of a DC motor, Active suspension with optimal linear state feedback, Frequency shaped LQ Control.
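A minimal sketch of the infinite-horizon LQR design, assuming NumPy and SciPy are available; the double-integrator plant and unit weights are illustrative stand-ins, not the DC-motor example from the syllabus:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Illustrative plant: double integrator x1' = x2, x2' = u
# (the syllabus examples would use their own A, B and weights).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)            # state weighting
R = np.array([[1.0]])    # control weighting

# Stabilizing solution of the algebraic Riccati equation
#   A'P + PA - P B R^{-1} B' P + Q = 0
P = solve_continuous_are(A, B, Q, R)

# Optimal state-feedback gain, u = -Kx with K = R^{-1} B' P
K = np.linalg.solve(R, B.T @ P)

# Closed-loop stability: eigenvalues of A - BK lie in the open left half-plane
eigs = np.linalg.eigvals(A - B @ K)
print(K, np.all(eigs.real < 0))
```

For this plant the design has the known closed form K = [1, √3], which makes it a convenient sanity check for the Riccati solver.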

LQR using output feedback: Output feedback LQR design equations, Closed loop stability, Solution of design equations, example.

Linear Quadratic tracking control: Tracking a reference input with compensators of known structure, Tracking by regulator redesign, Command generator tracker, Explicit model following design.

Linear-Quadratic-Gaussian controller (LQG) and Kalman-Bucy Filter: LQG control equations, Estimator in feedback loop, Steady state filter gain, Constraints and minimizing control, State estimation using Kalman-Bucy Filter, Constraints and optimal control.
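The steady-state Kalman-Bucy gain is the dual of the LQR gain: it comes from the filter algebraic Riccati equation. A minimal sketch, assuming SciPy is available; the system matrices and noise intensities below are made-up illustrative values:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Illustrative observable system with process noise intensity W and
# measurement noise intensity V (all values are made up for the sketch).
A = np.array([[0.0, 1.0],
              [0.0, -1.0]])
C = np.array([[1.0, 0.0]])
W = np.eye(2)            # process noise covariance (intensity)
V = np.array([[0.5]])    # measurement noise covariance

# Steady-state error covariance P solves the filter (dual) ARE:
#   A P + P A' - P C' V^{-1} C P + W = 0,
# obtained by calling the CARE solver on the dual data (A', C').
P = solve_continuous_are(A.T, C.T, W, V)

# Kalman-Bucy gain; the estimator is  xhat' = A xhat + B u + L (y - C xhat)
L = P @ C.T @ np.linalg.inv(V)

# The estimation-error dynamics A - LC are stable
print(np.all(np.linalg.eigvals(A - L @ C).real < 0))
```

Combining this gain with the LQR feedback from the previous topic gives the LQG controller, with the estimator sitting in the feedback loop as listed above.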

Text Books

Same as References


References

1. D. E. Kirk, Optimal Control Theory: An Introduction, Dover Publications, New York, 2004.

2. Alok Sinha, Linear Systems: Optimal and Robust Control, CRC Press, 2007.

3. Daniel Liberzon, Calculus of Variations and Optimal Control Theory, Princeton University Press, 2012.

4. Frank L. Lewis, Applied Optimal Control & Estimation: Digital Design and Implementation, Prentice Hall / Texas Instruments Digital Signal Processing Series, 1992.

5. Jason L. Speyer, David H. Jacobson, Primer on Optimal Control Theory, SIAM, 2010.

6. Joseph Z. Ben-Asher, Optimal Control Theory with Aerospace Applications, American Institute of Aeronautics and Astronautics, 2010.

7. MIT course notes on Principles of Optimal Control, 2008.

8. Brian D. O. Anderson, John B. Moore, Optimal Control: Linear Quadratic Methods, Dover, 2007.

9. Brian D. O. Anderson, John B. Moore, Optimal Filtering, Dover, 2005.

10. Frank L. Lewis, Optimal Estimation: With an Introduction to Stochastic Control Theory, Wiley-Interscience, 1986.