Course: Postgraduate
Semester: Electives
Subject Code: AVC621
Subject Title: Optimal Control Systems

Syllabus

Basic mathematical concepts: Finite-dimensional optimization, Infinite-dimensional optimization, Conditions for optimality, Performance measures for optimal control problems. Dynamic programming: The optimal control law, The principle of optimality, Dynamic programming concept, Recurrence relation, Computational procedure, The Hamilton-Jacobi-Bellman equations.
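As a quick illustration of the dynamic programming unit (added here for orientation; the notation f, g, h, J* follows Kirk rather than anything prescribed by the syllabus), for a system \dot{x} = f(x,u,t) with running cost g and terminal cost h, the Hamilton-Jacobi-Bellman equation for the optimal cost-to-go J*(x,t) reads

-\frac{\partial J^*}{\partial t}(x,t) = \min_{u}\left[ g(x,u,t) + \left(\frac{\partial J^*}{\partial x}(x,t)\right)^{\top} f(x,u,t) \right],
\qquad J^*(x,t_f) = h\big(x(t_f),t_f\big),

and the minimizing u defines the optimal control law u^*(x,t).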

Calculus of variations: Examples of variational problems, Basic calculus of variations problem, Weak and strong extrema, Variable end point problems, Hamiltonian formalism and mechanics: Hamilton’s canonical equations.
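For reference alongside this unit (an added sketch; the symbols L, H, p are standard choices, not fixed by the syllabus), the first-order necessary condition for an extremum of J(x) = \int_{t_0}^{t_f} L(t,x,\dot{x})\,dt is the Euler-Lagrange equation, and with the momentum p = \partial L/\partial \dot{x} and Hamiltonian H(t,x,p) = p\,\dot{x} - L it takes the equivalent form of Hamilton's canonical equations:

\frac{d}{dt}\frac{\partial L}{\partial \dot{x}} = \frac{\partial L}{\partial x},
\qquad
\dot{x} = \frac{\partial H}{\partial p}, \quad \dot{p} = -\frac{\partial H}{\partial x}.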

From Calculus of variations to Optimal control: Necessary conditions for strong extrema, Calculus of variations versus optimal control, Optimal control problem formulation and assumptions, Variational approach to the fixed-time, free end point problem.
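As a pointer to what the variational approach yields for the fixed-time, free end point problem (an illustrative summary in Kirk's notation, not syllabus text): with the Hamiltonian H(x,u,p,t) = g(x,u,t) + p^{\top} f(x,u,t), the first-order necessary conditions are

\dot{x}^* = \frac{\partial H}{\partial p}, \qquad
\dot{p}^* = -\frac{\partial H}{\partial x}, \qquad
\frac{\partial H}{\partial u} = 0, \qquad
p^*(t_f) = \frac{\partial h}{\partial x}\big(x^*(t_f)\big),

the last relation being the transversality (boundary) condition contributed by the free end point.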

Pontryagin's Minimum Principle: Statement of the minimum principle for basic fixed end point and variable end point control problems, Proof of the minimum principle, Properties of the Hamiltonian, Time-optimal control problems.
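In the same notation (again an added summary, not official syllabus text), the minimum principle replaces the stationarity condition \partial H/\partial u = 0 above with the stronger pointwise minimization

H\big(x^*(t), u^*(t), p^*(t), t\big) \le H\big(x^*(t), u, p^*(t), t\big) \quad \text{for all admissible } u,

which remains valid when the control is constrained (for example, bounded). For time-invariant problems the Hamiltonian is constant along the optimal trajectory, and equals zero when the final time is free; for time-optimal control of linear systems with bounded inputs this typically leads to bang-bang controls.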

The Linear Quadratic Regulator: Finite horizon LQR problem – Candidate optimal feedback law, Riccati differential equation (RDE), Global existence of solution for the RDE. Infinite horizon LQR problem – Existence and properties of the limit, Solution, Closed-loop stability. Examples: Minimum energy control of a DC motor, Active suspension with optimal linear state feedback, Frequency-shaped LQ control.
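For concreteness (an added sketch; the plant, weights, and solver below are illustrative choices, not the syllabus examples): the finite horizon LQR feedback is u^*(t) = -R^{-1}B^{\top}P(t)\,x(t), where P(t) solves the Riccati differential equation -\dot{P} = A^{\top}P + PA - PBR^{-1}B^{\top}P + Q with P(t_f) = Q_f. In the infinite horizon case P becomes the constant solution of the corresponding algebraic Riccati equation, which can be computed numerically, for example with SciPy:

# Minimal infinite-horizon LQR sketch for a hypothetical double-integrator plant.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])          # x_dot = A x + B u
B = np.array([[0.0],
              [1.0]])
Q = np.diag([1.0, 0.1])             # state weighting (illustrative values)
R = np.array([[0.01]])              # control weighting (illustrative value)

# Solve A'P + PA - P B R^{-1} B' P + Q = 0, then form the gain K = R^{-1} B' P.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

print("feedback gain K =", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))  # should lie in the left half plane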

LQR using output feedback: Output feedback LQR design equations, Closed-loop stability, Solution of design equations, Example.
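A compact statement of the problem these design equations address (added for orientation, following the spirit of Lewis's treatment rather than reproducing his coupled equations): with only the output y = Cx available, the controller is restricted to static output feedback u = -Ky, so the closed-loop matrix is A_c = A - BKC, and the gain K is chosen to minimize the usual quadratic cost \int_0^{\infty} (x^{\top}Qx + u^{\top}Ru)\,dt, averaged over initial conditions, subject to A_c being stable; the design equations are the first-order conditions of this constrained minimization.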

Linear Quadratic tracking control: Tracking a reference input with compensators of known structure, Tracking by regulator redesign, Command generator tracker, Explicit model following design.

Linear-Quadratic-Gaussian controller (LQG) and Kalman-Bucy filter: LQG control equations, estimator in the feedback loop, steady-state filter gain, constraints and minimizing control, state estimation using the Kalman-Bucy filter, constraints and optimal control.
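As an added reference sketch (standard notation, not syllabus text), the Kalman-Bucy filter that supplies the state estimate in the LQG loop is

\dot{\hat{x}} = A\hat{x} + Bu + L\,(y - C\hat{x}), \qquad L = P C^{\top} V^{-1},
\qquad \dot{P} = AP + PA^{\top} + W - P C^{\top} V^{-1} C P,

where W and V are the process and measurement noise intensities. The steady-state filter gain mentioned above comes from setting \dot{P} = 0 (an algebraic Riccati equation), and by the separation principle the LQG controller applies the LQR gain to the estimate, u = -K\hat{x}.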

Text Books / References

1. D. E. Kirk, Optimal Control Theory: An Introduction, Dover Publications, New York, 2004.

2. Alok Sinha, Linear Systems: Optimal and Robust Control, CRC Press, 2007.

3. Daniel Liberzon, Calculus of Variations and Optimal Control Theory, Princeton University Press, 2012.

4. Frank L. Lewis, Applied Optimal Control & Estimation: Digital Design and Implementation, Prentice Hall, Texas Instruments Digital Signal Processing Series, 1992.

5. Jason L. Speyer and David H. Jacobson, Primer on Optimal Control Theory, SIAM, 2010.

6. Joseph Z. Ben-Asher, Optimal Control Theory with Aerospace Applications, American Institute of Aeronautics and Astronautics, 2010.

7. MIT course notes on Principles of Optimal Control, 2008.

8. Brian D. O. Anderson and John B. Moore, Optimal Control: Linear Quadratic Methods, Dover, 2007.

9. Brian D. O. Anderson and John B. Moore, Optimal Filtering, Dover, 2005.

10. Frank L. Lewis, Optimal Estimation: With an Introduction to Stochastic Control Theory, Wiley-Interscience, 1986.