Statistical Information Theory
Information theory is at the intersection of mathematics, statistics, computer science and several other fields, with applications in many areas.
The course aims to introduce the fundamental concepts of information theory, the relationships between them, and their contemporary applications in statistics, machine learning, time series analysis, dynamical systems, physics, and other areas. Topics covered in the course include the basic concepts of information theory, entropy rates of stochastic processes, differential entropy, information flow and causal detection, multivariate dependence, and multi-information.
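As a small illustration of the most basic of these quantities, the sketch below computes the Shannon entropy of a discrete distribution and the mutual information of a joint distribution. It is a minimal example in Python with NumPy, chosen here for illustration only; the function names and the choice of tools are assumptions, not part of the course material.

# Illustrative sketch only: Shannon entropy and mutual information for
# discrete distributions, using NumPy (an assumption; not course material).
import numpy as np

def entropy(p):
    # H(X) = -sum_x p(x) log2 p(x), in bits; zero-probability terms contribute 0.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(pxy):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), computed from a joint probability table.
    pxy = np.asarray(pxy, dtype=float)
    return entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0)) - entropy(pxy.ravel())

print(entropy([0.5, 0.5]))                               # fair coin: 1.0 bit
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # independent variables: 0.0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # fully dependent variables: 1.0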
Course structure
The course consists of one element.
Teaching format
Instruction is given in the form of lectures, exercise sessions and computer exercises.
Assessment
The course is assessed through a take-home exam.
Examiner
A list of examiners can be found on
Schedule
The schedule will be available no later than one month before the start of the course. We do not recommend print-outs, as changes may occur. At the start of the course, your department will advise you on where to find your schedule during the course.
Course literature
Note that the course literature can be changed up to two months before the start of the course.
Cover and Thomas: Elements of Information Theory (2nd ed.). Wiley.
Haykin: Neural Networks and Learning Machines (3rd ed.). Pearson Education Inc.
Course reports
More information
Course web
We do not use Athena; you can find our course webpages on kurser.math.su.se.
Contact