Stockholm University

Dynamic Systems and Optimal Control Theory

A dynamical system is a system in which a function describes the time dependence of a point in a geometric space; examples include population growth, a swinging pendulum, and a particle whose state varies over time.
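As a minimal illustration of such a system, the sketch below simulates logistic population growth by stepping the differential equation dx/dt = r·x·(1 − x/K) forward with explicit Euler steps; the parameters r, K and dt are illustrative choices, not taken from the course material.

```python
def simulate_logistic(x0, r=1.0, K=100.0, dt=0.01, steps=1000):
    """Simulate logistic growth dx/dt = r*x*(1 - x/K) with Euler steps."""
    x = x0
    trajectory = [x]
    for _ in range(steps):
        # one explicit Euler step of the logistic equation
        x = x + dt * r * x * (1.0 - x / K)
        trajectory.append(x)
    return trajectory

traj = simulate_logistic(x0=5.0)
# the population grows from x0 and approaches the carrying capacity K
```

Starting from a small population, the state tends toward the carrying capacity K, a simple example of the stability behaviour studied in the course.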

The course covers linear systems of differential equations, stability theory, basic concepts in control theory, Pontryagin's maximum principle, and dynamic programming (in particular linear quadratic optimal control). The content of the course can be applied to modelling in a number of fields, for example physics, machine learning, artificial intelligence, and economics.
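To give a flavour of linear quadratic optimal control, here is a hedged sketch of the finite-horizon discrete-time LQR problem for a scalar system x_{k+1} = a·x_k + b·u_k with stage cost q·x² + r·u², solved by the backward Riccati recursion; the system and cost parameters are hypothetical, chosen only for illustration.

```python
def lqr_gains(a, b, q, r, horizon):
    """Backward Riccati recursion for scalar discrete-time LQR.

    Returns the optimal feedback gains, ordered from the first
    to the last time step, for u_k = -gains[k] * x_k.
    """
    p = q  # terminal cost weight
    gains = []
    for _ in range(horizon):
        k = (b * p * a) / (r + b * p * b)  # optimal feedback gain
        p = q + a * p * a - a * p * b * k  # Riccati update
        gains.append(k)
    gains.reverse()  # recursion runs backward in time
    return gains

gains = lqr_gains(a=1.1, b=1.0, q=1.0, r=1.0, horizon=50)
# with a > 1 the open-loop system is unstable; the feedback
# u_k = -gains[k] * x_k drives the state toward zero
```

The same recursion extends to vector-valued states with matrices in place of the scalars, which is the form treated in the course.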

Recommended knowledge before starting the course

Besides the courses required for eligibility, we recommend that you also take a course in ordinary differential equations, for instance our course Mathematics III - Ordinary Differential Equations (MM5026).