Course syllabus adopted 2021-02-26 by Head of Programme (or corresponding).
Overview
- Swedish name: Statistik och maskininlärning i högre dimensioner
- Code: EEN100
- Credits: 7.5 credits
- Owner: MPCOM
- Education cycle: Second cycle
- Main field of study: Electrical Engineering, Software Engineering, Mathematics
- Department: Electrical Engineering
- Grading: UG - Pass, Fail
Course round 1
- Teaching language: English
- Application code: 13117
- Maximum participants: 40
- Block schedule
- Open for exchange students: Yes
Credit distribution
| Module | Sp1 | Sp2 | Sp3 | Sp4 | Summer | Not Sp | Examination dates |
|---|---|---|---|---|---|---|---|
| 0120 Oral examination, 6 c, Grading: UG | 6 c | | | | | | |
| 0220 Project, 1.5 c, Grading: UG | 1.5 c | | | | | | |
In programmes
Examiner
- Giuseppe Durisi
- Full Professor, Communication, Antennas and Optical Networks, Electrical Engineering
Eligibility
General entry requirements for Master's level (second cycle). Applicants enrolled in a programme at Chalmers where the course is included in the study programme are exempted from fulfilling the requirements above.
Specific entry requirements
English 6 (or other approved means with an equivalent proficiency level). Applicants enrolled in a programme at Chalmers where the course is included in the study programme are exempted from fulfilling the requirements above.
Course specific prerequisites
A solid foundation in probability and calculus.
Aim
The explosion in the volume of data collected in all scientific disciplines and in industry requires students interested in statistical analyses and in machine-learning and signal-processing algorithms to acquire more sophisticated probability tools than those taught in basic probability courses.
This course provides an introduction to the area of high-dimensional statistics, which deals with large-scale problems where both the number of parameters and the sample size are large.
The course covers fundamental tools for the analysis of random vectors, random matrices, and random projections, such as tail bounds and concentration inequalities.
It further provides concrete applications of such tools in the context of generalization-error analyses in statistical learning theory, sparse linear models, and matrix models with rank constraints.
Learning outcomes (after completion of the course the student should be able to)
- State basic tail and concentration bounds for sums of independent random variables
- Apply these bounds to provide guarantees on how accurately one can (a brief numerical illustration follows this list):
  - estimate a covariance matrix from data
  - recover a sparse vector from noisy linear projections
  - estimate a low-rank matrix from few of its entries
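As a purely illustrative sketch, not part of the official course material, the snippet below shows what applying a bound to obtain a guarantee can look like in practice: it compares the empirical tail probability of the sample mean of bounded i.i.d. variables with the corresponding Hoeffding bound. The parameter choices (sample size, number of Monte Carlo trials, deviation t) are arbitrary assumptions made for the illustration.

```python
# Illustrative sketch (not part of the official syllabus): empirically compare
# the tail of the sample mean of bounded i.i.d. variables with Hoeffding's bound.
import numpy as np

rng = np.random.default_rng(0)
n, trials, t = 100, 100_000, 0.1           # sample size, Monte Carlo runs, deviation (illustrative values)

# X_i ~ Bernoulli(0.5), bounded in [0, 1]
samples = rng.integers(0, 2, size=(trials, n))
deviations = samples.mean(axis=1) - 0.5    # sample mean minus its expectation

empirical = np.mean(deviations >= t)       # estimated P(mean - E[mean] >= t)
hoeffding = np.exp(-2 * n * t**2)          # Hoeffding upper bound for [0,1]-valued X_i

print(f"empirical tail probability ~ {empirical:.4f}")
print(f"Hoeffding bound            = {hoeffding:.4f}")
```

The empirical tail probability should fall below the Hoeffding bound, as the theory guarantees; the bound is valid for any bounded distribution, not only the Bernoulli case used here.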
Content
- Fundamental probability tools
  - Preliminaries on random variables: classical inequalities and limit theorems
  - Concentration of sums of independent random variables: Hoeffding, Chernoff, and Bernstein bounds; sub-Gaussian and sub-exponential distributions (Hoeffding's inequality is stated after this list as an example)
  - Random vectors and random matrices in high dimensions
  - Concentration without independence
  - Uniform laws of large numbers: Rademacher complexity and VC dimension
- Applications in machine learning, statistics, and signal processing
  - Covariance matrix estimation
  - Recovery of sparse signals
  - Principal component analysis
  - Low-rank matrix recovery
  - Sample complexity in statistical learning theory
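For orientation only (this statement is not part of the official syllabus text), one of the concentration bounds listed above, Hoeffding's inequality, can be written as follows for independent random variables X_1, ..., X_n with each X_i bounded in [a_i, b_i] almost surely:

```latex
% Hoeffding's inequality (standard form) for independent X_i bounded in [a_i, b_i]
\mathbb{P}\!\left( \sum_{i=1}^{n} \bigl( X_i - \mathbb{E}[X_i] \bigr) \ge t \right)
  \le \exp\!\left( -\frac{2 t^{2}}{\sum_{i=1}^{n} (b_i - a_i)^{2}} \right),
  \qquad t > 0.
```

The Chernoff and Bernstein bounds listed above have the same flavour: they trade different assumptions on the summands (moment-generating function, variance) for sharper control of the tail.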
Organisation
Approximately 16 lectures, 8 exercise sessions, and one project where the tools learnt in the course will be used to solve a real-world machine-learning task.
Literature
The course will be partly based on the following references:
- R. Vershynin, High-dimensional probability: An introduction with applications in data science. Cambridge, U.K.: Cambridge Univ. Press, 2019. Available online.
- M. J. Wainwright, High-dimensional statistics: A nonasymptotic viewpoint. Cambridge, U.K.: Cambridge Univ. Press, 2019.
Examination including compulsory elements
Oral exam and project (pass or fail).
The course examiner may assess individual students in other ways than what is stated above if there are special reasons for doing so, for example if a student has a decision from Chalmers on educational support due to disability.