High Dimensional Probability and Concentration of Measure
General Information
Lecture: Monday 10:00-12:00 and Thursday 10:00-12:00
Lecturer: Jun.-Prof. Dr. Anna Gusakova
Assistance:
KommVV: This course in the course overview; the tutorial in the course overview
Content:
In this course we study the probabilistic properties of objects such as random vectors, random matrices and random linear subspaces as the dimension of the ambient space grows large. Such objects typically appear in data science as models for big data samples, and a number of algorithms require a good understanding of their properties. One of the core topics of this course is concentration inequalities, which describe in a quantitative way how far a given random object can deviate from a particular value, or, alternatively, describe for a given measure a subset on which nearly all of its mass is concentrated. For example, we will consider the following questions: Where is the mass of a high-dimensional ball concentrated? What is the "typical" length of a high-dimensional Gaussian random vector? What is the shape of a random projection of a high-dimensional convex set? (A short numerical illustration of the second question is sketched at the end of this page.) In order to answer these questions we will use a combination of geometric, analytic and probabilistic arguments. Prior knowledge of Probability Theory is required.
Learnweb:
Participants are kindly asked to register for the course in the Learnweb in order to access the course material.
Exercise classes: The exercise sheets will be uploaded to the Learnweb. The exercise classes take place on Tuesdays at 10:15 am in room SRZ 204 (Orléans-Ring 12), starting on October 15th.
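
Illustration: the typical length of a Gaussian vector

As a quick, informal illustration of the concentration phenomenon mentioned in the course description (not part of the official course material), the following minimal Python sketch samples standard Gaussian vectors in growing dimension n and compares their Euclidean norm to sqrt(n); the dimensions and the sample size used here are arbitrary choices for the demonstration.

    import numpy as np

    rng = np.random.default_rng(0)

    # For a standard Gaussian vector X in R^n one has E||X||^2 = n, and ||X||
    # concentrates around sqrt(n): the mean grows like sqrt(n) while the
    # fluctuations stay of constant order (roughly 1/sqrt(2)).
    for n in [10, 100, 1000, 10000]:
        samples = rng.standard_normal((5000, n))   # 5000 independent vectors in R^n
        norms = np.linalg.norm(samples, axis=1)    # their Euclidean lengths
        print(f"n={n:>6}  mean ||X|| = {norms.mean():8.2f}"
              f"  sqrt(n) = {n**0.5:8.2f}  std = {norms.std():.3f}")

Running this, the mean of ||X|| tracks sqrt(n) closely while the standard deviation stays near 0.7 for every n, which is the quantitative sense in which the length of a high-dimensional Gaussian vector is "typically" sqrt(n).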