Course syllabus for Information theory, advanced level

Course syllabus adopted 2024-02-02 by Head of Programme (or corresponding).

Overview

  • Swedish name: Informationsteori
  • Code: SSY210
  • Credits: 7.5 credits
  • Owner: MPICT
  • Education cycle: Second-cycle
  • Main field of study: Electrical Engineering
  • Department: ELECTRICAL ENGINEERING
  • Grading: UG - Pass, Fail

Course round 1

  • Teaching language: English
  • Application code: 13116
  • Open for exchange students: Yes

Credit distribution

0108 Oral examination, 7.5 credits (Grading: UG)

In programmes

Examiner

Eligibility

General entry requirements for Master's level (second cycle)
Applicants enrolled in a programme at Chalmers where the course is included in the study programme are exempted from fulfilling the requirements above.

Specific entry requirements

English 6 (or equivalent proficiency demonstrated by other approved means)
Applicants enrolled in a programme at Chalmers where the course is included in the study programme are exempted from fulfilling the requirements above.

Course specific prerequisites

A solid foundation in probability and calculus. The course is taught at Ph.D. level, which means that it is mathematically more advanced and runs at a higher pace than most Master's courses.

Aim

This course offers an introduction to information theory and its application to digital communication, statistics, and machine learning.

One important feature of the information-theory approach is its ability to provide fundamental results, i.e., results that demonstrate the optimality of certain procedures.

Results of this flavor are useful for many reasons: for example, they let us assess whether a target error probability in the transmission of information can be achieved, determine how many data samples must be collected to distinguish between two or more statistical hypotheses, or how many examples are needed to train a machine-learning algorithm.

Learning outcomes (after completion of the course the student should be able to)

  • Define entropy, relative entropy, and mutual information and explain their operational meaning (a brief sketch of these definitions follows this list)
  • Describe and prove Shannon’s source coding and channel coding theorems
  • Compute the capacity of discrete communication channels
  • Describe the fundamental performance metrics in binary hypothesis testing, their trade-off, their asymptotic behavior, and the structure of the optimal test
  • Explain how relative entropy can help characterize the generalization error in statistical learning
  • Apply Fano’s inequality to demonstrate impossibility results in group testing, graphical model selection, and sparse linear regression
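
As a rough orientation, the quantities named in the first and last outcomes above can be sketched as follows (standard discrete-alphabet definitions written in LaTeX notation; the logarithm base and the exact conventions followed in the course may differ):

  H(X) = -\sum_{x} p(x) \log p(x)
  D(P \| Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}
  I(X; Y) = H(X) - H(X \mid Y) = D(P_{XY} \| P_X P_Y)
  H(X \mid Y) \le h_b(P_e) + P_e \log(|\mathcal{X}| - 1)   (Fano's inequality, where h_b is the binary entropy function and P_e = \Pr[\hat{X}(Y) \ne X])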

Content

  • Shannon’s information metrics: entropy, relative entropy (a.k.a. Kullback-Leibler divergence), mutual information
  • Asymptotic equipartition property and typicality
  • Data compression and the source coding theorem
  • Data transmission and the channel coding theorem (a small numerical capacity example follows this list)
  • Binary hypothesis testing, the Neyman-Pearson lemma, Stein’s lemma
  • Generalization error in statistical learning theory and probably-approximately correct (PAC) Bayesian bounds
  • Minimax bounds in statistical estimation and Fano’s method
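
To make concrete what computing the capacity of a discrete channel involves, the following minimal Python sketch (illustrative only; NumPy and the variable names are assumptions, not prescribed course material) evaluates the capacity of a binary symmetric channel via the closed form C = 1 - h_b(p) and cross-checks it by numerically maximizing the mutual information over input distributions:

import numpy as np

def binary_entropy(p):
    # Binary entropy h_b(p) in bits, with the convention h_b(0) = h_b(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def mutual_information(px0, crossover):
    # I(X;Y) in bits for a binary symmetric channel with P(X = 0) = px0.
    px = np.array([px0, 1.0 - px0])
    W = np.array([[1.0 - crossover, crossover],      # transition probabilities, W[x, y] = P(Y = y | X = x)
                  [crossover, 1.0 - crossover]])
    pxy = px[:, None] * W            # joint distribution P(x, y)
    py = pxy.sum(axis=0)             # output marginal P(y)
    mask = pxy > 0
    # I(X;Y) = sum_{x,y} P(x,y) log2( P(x,y) / (P(x) P(y)) )
    return float((pxy[mask] * np.log2(pxy[mask] / (px[:, None] * py)[mask])).sum())

crossover = 0.1
closed_form = 1.0 - binary_entropy(crossover)
numerical = max(mutual_information(q, crossover) for q in np.linspace(0.001, 0.999, 999))
print(closed_form, numerical)        # both approximately 0.531 bits per channel use

The two values agree, with the maximum attained at the uniform input distribution, as expected from the symmetry of the channel.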

Organisation

Approximately 15 lectures and 7 exercise sessions

Literature

The course is partly based on the following references:

Examination including compulsory elements

Mandatory weekly assignments and an oral exam (pass or fail), 7.5 credits

The course examiner may assess individual students in ways other than those stated above if there are special reasons for doing so, for example if a student has a decision from Chalmers on educational support due to disability.