Dynamic Systems and Optimal Control Theory
7.5 credits
A dynamical system is a system in which a function describes the time dependence of a point in a geometrical space, for example population growth, a swinging pendulum, or a particle whose state varies over time.
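As a concrete illustration of the pendulum example, the sketch below integrates the pendulum equation θ'' = -(g/L) sin θ with a simple explicit Euler scheme. This is a minimal illustration, not course material; the function names and step sizes are chosen here for the example.

```python
import math

def pendulum_step(theta, omega, dt, g=9.81, length=1.0):
    """One explicit-Euler step of the pendulum state (theta, omega).

    The dynamical system is: theta' = omega, omega' = -(g/L) * sin(theta).
    """
    return theta + dt * omega, omega - dt * (g / length) * math.sin(theta)

def simulate(theta0, omega0, dt=1e-3, steps=5000):
    """Evolve the state forward in time; the trajectory of (theta, omega)
    is a curve in the two-dimensional state space."""
    theta, omega = theta0, omega0
    for _ in range(steps):
        theta, omega = pendulum_step(theta, omega, dt)
    return theta, omega
```

For small initial angles the motion stays close to simple harmonic oscillation; a more accurate integrator (e.g. Runge-Kutta) would be used in practice, since explicit Euler slowly injects energy.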
The course covers linear systems of differential equations, stability theory, basic concepts in control theory, Pontryagin's maximum principle, and dynamic programming (in particular linear-quadratic optimal control). The contents of the course can be applied to modelling in a number of fields, for example physics, machine learning, artificial intelligence, and economics.
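To hint at how dynamic programming yields linear-quadratic optimal control, the following sketch computes finite-horizon LQR feedback gains for a scalar discrete-time system via the backward Riccati recursion. It is a minimal illustrative sketch, not part of the course; the system x_{k+1} = a x_k + b u_k and the cost weights q, r, q_f are assumptions made for the example.

```python
def lqr_gains(a, b, q, r, qf, horizon):
    """Finite-horizon discrete-time LQR for the scalar system
        x_{k+1} = a*x_k + b*u_k
    with stage cost q*x_k**2 + r*u_k**2 and terminal cost qf*x_N**2.

    Dynamic programming gives the backward Riccati recursion
        P_k = q + a^2 P_{k+1} - (a P_{k+1} b)^2 / (r + b^2 P_{k+1}),
    and the optimal control is linear state feedback u_k = -K_k x_k with
        K_k = b P_{k+1} a / (r + b^2 P_{k+1}).
    Returns the gains [K_0, ..., K_{N-1}].
    """
    p = qf
    gains = []
    for _ in range(horizon):
        k = (b * p * a) / (r + b * p * b)
        p = q + a * p * a - (a * p * b) ** 2 / (r + b * p * b)
        gains.append(k)
    gains.reverse()  # recursion runs backward in time
    return gains
```

For a long horizon the gains converge to a stationary value, and the closed-loop factor a - b*K is then stable even when the open-loop system (|a| > 1) is not.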
Recommended knowledge before starting the course
Besides the courses that are required in order to be eligible for the course, we recommend that you have also taken a course in ordinary differential equations, for instance our course Mathematics III - Ordinary Differential Equations (MM5026).
The course consists of one element.
Teaching consists of lectures and seminars.
Assessment takes place through a written examination and hand-in problems.
A list of examiners can be found on
Schedule
The schedule will be available no later than one month before the start of the course. We do not recommend print-outs, as changes can occur. At the start of the course, your department will advise you where you can find your schedule during the course.
Note that the course literature can be changed up to two months before the start of the course.
Sontag: Mathematical Control Theory. Deterministic Finite Dimensional Systems. Springer.