Stockholm University

Dynamic Systems and Optimal Control Theory

A dynamical system is a system in which a function describes the time dependence of a point in a geometrical space. Examples include population growth, a swinging pendulum, and a particle whose state varies over time.
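
As a minimal illustration (not part of the official course text), the swinging pendulum can be written as a first-order system x' = f(x) and simulated numerically. The Python sketch below uses assumed, illustrative parameter values for gravity and pendulum length.

import numpy as np
from scipy.integrate import solve_ivp

g, length = 9.81, 1.0  # illustrative values: gravity and pendulum length

def pendulum(t, x):
    # x[0] is the angle, x[1] the angular velocity; this is x' = f(x)
    return [x[1], -(g / length) * np.sin(x[0])]

# Integrate for 10 seconds from an initial angle of 0.5 rad at rest
solution = solve_ivp(pendulum, t_span=(0.0, 10.0), y0=[0.5, 0.0])
print(solution.y[:, -1])  # the state of the system at the final time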

The course covers linear systems of differential equations, stability theory, basic concepts in control theory, Pontryagin's maximum principle, and dynamic programming (in particular linear quadratic optimal control). The contents of the course can be applied to modelling in a number of fields, for example physics, machine learning, artificial intelligence, and economics.
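
As a hedged sketch of how these topics connect (again, not taken from the course materials), the following Python example solves a finite-horizon linear quadratic optimal control problem by dynamic programming: a backward Riccati recursion produces a linear state-feedback law u = -Kx for an assumed discrete-time linear system with illustrative matrices.

import numpy as np

# Assumed discrete-time linear system x_{k+1} = A x_k + B u_k (illustrative values)
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
Q = np.eye(2)          # state cost weight
R = np.array([[1.0]])  # control cost weight
N = 50                 # horizon length

# Dynamic programming: backward Riccati recursion for the cost-to-go matrices
P = Q.copy()
gains = []
for _ in range(N):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)  # optimal feedback gain
    P = Q + A.T @ P @ (A - B @ K)                      # updated cost-to-go
    gains.append(K)
gains.reverse()  # gains[k] is the gain to apply at time step k

# Closed-loop simulation from an initial state: u_k = -gains[k] x_k
x = np.array([[1.0], [0.0]])
for K in gains:
    x = A @ x + B @ (-K @ x)
print(x.ravel())  # the state is driven toward the origin over the horizon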

Recommended knowledge before starting the course

In addition to the courses required for eligibility, we recommend that you have also taken a course in ordinary differential equations, for instance our course Mathematics III - Ordinary Differential Equations (MM5026).

  • Course structure

    The course consists of one element.

    Teaching format

    Teaching consists of lectures and seminars.

    Assessment

    Assessment takes place through written examination and hand-in problems.

    Examiner

    A list of examiners can be found on the Exam information page.

  • Schedule

    The schedule will be available no later than one month before the start of the course. We do not recommend printing it out, as changes can occur. At the start of the course, your department will advise you on where to find your schedule.
  • Course literature

    Note that the course literature can be changed up to two months before the start of the course.

    Sontag: Mathematical Control Theory. Deterministic Finite Dimensional Systems. Springer.

    List of course literature, Department of Mathematics

  • Course reports

  • More information

    Course web

    We do not use Athena; you can find our course web pages on kurser.math.su.se.

  • Contact