Optimal Control (OC)



Learning Objectives

  • to interpret the difference between feasible and optimal control strategies
  • to understand and reason about the underlying mathematical theory of optimal control
  • to design optimal controllers for linear and nonlinear systems
  • to assess the possibilities and limitations of optimal control in design and application


Course Content

  • Principle of optimality

  • Dynamic programming

  • Hamilton-Jacobi-Bellman equation

  • Pontryagin’s minimum principle

  • Boundary value problems

  • H2-norm of generalized plants

  • Linear quadratic control

  • Kalman filter
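
The topics above culminate in linear quadratic control. As a small illustration (not part of the official lecture material), the sketch below computes an infinite-horizon discrete-time LQR gain by iterating the Riccati difference equation until it converges; the double-integrator model and all numerical values are assumptions chosen for demonstration only.

```python
import numpy as np

def lqr_gain(A, B, Q, R, iters=500, tol=1e-10):
    """Iterate the discrete-time Riccati difference equation
    P <- Q + A'P(A - BK),  K = (R + B'PB)^{-1} B'PA,
    until P converges; return the stationary gain K and cost matrix P."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P_next = Q + A.T @ P @ (A - B @ K)
        if np.max(np.abs(P_next - P)) < tol:
            P = P_next
            break
        P = P_next
    return K, P

# Hypothetical example: a double integrator discretized with step h = 0.1.
h = 0.1
A = np.array([[1.0, h], [0.0, 1.0]])
B = np.array([[0.5 * h**2], [h]])
Q = np.eye(2)          # state weighting
R = np.array([[1.0]])  # input weighting
K, P = lqr_gain(A, B, Q, R)

# The closed-loop matrix A - BK should have all eigenvalues inside the unit circle.
print(np.all(np.abs(np.linalg.eigvals(A - B @ K)) < 1.0))  # expected: True
```

The same stationary solution can be obtained directly from `scipy.linalg.solve_discrete_are`; the iteration is shown here only because it mirrors the dynamic-programming recursion listed among the course topics.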


Literature

  • Lecture Material
  • D. E. Kirk: Optimal Control Theory: An Introduction, Dover, 1998
  • B. D. O. Anderson, J. B. Moore: Optimal Control: Linear Quadratic Methods, Dover, 2007

Recommended Prerequisites

  • Content of the courses Fundamentals of Controls, Linear and Nonlinear Control Systems


3L + 1T, 4 Credits
(L: lecture hours per week, T: tutorial hours per week)
The course is offered in the summer semester; examinations are offered in both the winter and summer semesters (in English only)

Course Number

- to be inserted -

Assignment to Course Programs

Master of Electrical Engineering, Master of Mechatronics; open as an elective course within other Master's programs

Further information, course content, and teaching material: