Course syllabus adopted 2021-04-29 by Head of Programme (or corresponding).
Overview
- Swedish name: Informationsteori
- Code: SSY210
- Credits: 7.5 credits
- Owner: MPCOM
- Education cycle: Second-cycle
- Main field of study: Electrical Engineering
- Department: Electrical Engineering
- Grading: UG - Pass, Fail
Course round 1
The course round is cancelled. This course round is planned to be given every other year. For further questions, please contact the director of studies.
- Teaching language: English
- Application code: 13112
- Open for exchange students: Yes
Credit distribution
| Module | Sp1 | Sp2 | Sp3 | Sp4 | Summer | Not Sp | Examination dates |
|---|---|---|---|---|---|---|---|
| 0108 Oral examination, 7.5 c (Grading: UG) | 7.5 c | | | | | | |
In programmes
Examiner
- Giuseppe Durisi
- Full Professor, Communication, Antennas and Optical Networks, Electrical Engineering
Eligibility
General entry requirements for Master's level (second cycle).
Applicants enrolled in a programme at Chalmers where the course is included in the study programme are exempted from fulfilling the requirements above.
Specific entry requirements
English 6 (or by other approved means with the equivalent proficiency level).
Applicants enrolled in a programme at Chalmers where the course is included in the study programme are exempted from fulfilling the requirements above.
Course specific prerequisites
A solid foundation in probability and calculus. The course difficulty is at the Ph.D. level, which means that it is mathematically more advanced and runs at a higher pace than most Master's courses.
Aim
This course offers an introduction to information theory and its application to digital communication, statistics, and machine learning.
One important feature of the information-theory approach is its ability to provide fundamental results, i.e., results that demonstrate the optimality of certain procedures.
Obtaining results of this flavor is useful for many reasons: for example, we can assess whether achieving a target error probability in the transmission of information is feasible; we can determine how many data samples need to be collected to distinguish between two or more statistical hypotheses, or how many examples are needed to train a machine learning algorithm.
Learning outcomes (after completion of the course the student should be able to)
- Define entropy, relative entropy, and mutual information and explain their operational meaning
- Describe and demonstrate Shannon's source coding and channel coding theorems
- Compute the capacity of discrete communication channels
- Describe the fundamental performance metrics in binary hypothesis testing, their trade-off, their asymptotic behavior, and the structure of the optimal test
- Explain how relative entropy can help characterize the generalization error in statistical learning
- Apply Fano's inequality to demonstrate impossibility results in group testing, graphical model selection, and sparse linear regression
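As a small illustration of the outcome on channel capacity (not part of the official course material), the sketch below evaluates the well-known closed form for the capacity of a binary symmetric channel, C = 1 - H_b(p), where H_b is the binary entropy function and p is the crossover probability; the function names are ours:

```python
import math


def binary_entropy(p: float) -> float:
    """Binary entropy H_b(p) in bits; H_b(0) = H_b(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)


def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p,
    in bits per channel use: C = 1 - H_b(p)."""
    return 1.0 - binary_entropy(p)


print(bsc_capacity(0.0))  # noiseless channel: 1.0 bit per use
print(bsc_capacity(0.5))  # pure noise: 0.0 bits per use
```

Note that capacity is symmetric around p = 0.5: a channel that flips almost every bit is as useful as one that flips almost none, since the output can simply be inverted.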
Content
- Shannon's information metrics: entropy, relative entropy (a.k.a. Kullback-Leibler divergence), mutual information
- Asymptotic equipartition property and typicality
- Data compression and the source coding theorem
- Data transmission and the channel coding theorem
- Binary hypothesis testing, the Neyman-Pearson lemma, and Stein's lemma
- Generalization error in statistical learning theory and probably-approximately correct (PAC) Bayesian bounds
- Minimax bounds in statistical estimation and Fano's method
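To give a concrete flavor of the first topic above (an informal sketch, not course material; the function names are ours), the three metrics can be computed directly from finite probability mass functions, with mutual information obtained as the relative entropy between the joint pmf and the product of its marginals:

```python
import math


def entropy(p):
    """Shannon entropy H(p) in bits of a pmf given as a list of probabilities."""
    return -sum(x * math.log2(x) for x in p if x > 0)


def kl_divergence(p, q):
    """Relative entropy D(p||q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)


def mutual_information(joint):
    """I(X;Y) = D(P_XY || P_X * P_Y) for a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    flat = [joint[i][j] for i in range(len(px)) for j in range(len(py))]
    product = [px[i] * py[j] for i in range(len(px)) for j in range(len(py))]
    return kl_divergence(flat, product)


# Independent X and Y carry zero mutual information:
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
# Fully dependent (Y = X) uniform bits carry H(X) = 1 bit:
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # 1.0
```

The identity I(X;Y) = D(P_XY || P_X P_Y) used here is one of the standard equivalent definitions of mutual information and underlies its operational interpretations covered in the course.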
Organisation
Approximately 15 lectures and 7 exercise sessions.
Literature
The course is partly based on the following references:
- S. M. Moser, Information Theory (Lecture Notes), 6th ed., ETH Zurich, Switzerland and National Chiao Tung University, Taiwan, Oct. 2019. [Online]. Available: https://moser-isi.ethz.ch/scripts.html
- Y. Polyanskiy and Y. Wu, Lecture notes on information theory, May 2019. [Online]. Available: http://people.lids.mit.edu/yp/homepage/papers.html
- J. Duchi, Information theory and statistics (lecture notes). Stanford, CA: Stanford University, 2019. [Online]. Available: http://web.stanford.edu/class/stats311/
- A. El Gamal and Y.-H. Kim, Network information theory. Cambridge, U.K.: Cambridge Univ. Press, 2011.
Examination including compulsory elements
Mandatory weekly assignments and an oral exam (pass or fail), 7.5 hp.
The course examiner may assess individual students in other ways than what is stated above if there are special reasons for doing so, for example if a student has a decision from Chalmers on educational support due to disability.
The course syllabus contains changes
- Changes to course rounds:
- 2021-04-28: [Course round 1] Changed to cancelled by UOL