Samples of exercise sheets and their solutions can be found on my personal website.
High-dimensional Probability Theory (exercises, Summer Semesters 2020, 2021 and 2022)
The course studies the non-asymptotic theory of random objects in high-dimensional spaces (random vectors and matrices, random projections, etc.) that are useful in data science applications (machine learning, dimensionality reduction, compressive sensing, etc.). It closely follows the presentation of R. Vershynin's book "High-dimensional Probability" and covers topics such as concentration inequalities, decoupling and symmetrisation techniques, the Johnson–Lindenstrauss lemma, and chaining and comparison techniques for stochastic processes.
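As a small illustration of one of the topics listed above (not taken from the course materials), the Johnson–Lindenstrauss lemma guarantees that n points in a high-dimensional space can be projected into roughly O(log n / eps^2) dimensions while distorting all pairwise distances by at most a factor of 1 ± eps. A minimal numpy sketch of such a random Gaussian projection, with illustrative dimensions chosen here for demonstration:

```python
import numpy as np

# Sketch of the Johnson-Lindenstrauss lemma: project a point cloud in R^d
# down to R^k via a random Gaussian matrix scaled by 1/sqrt(k); with high
# probability all pairwise distances are nearly preserved.
rng = np.random.default_rng(0)
n, d, k = 50, 1000, 300           # 50 points in R^1000, projected to R^300
X = rng.standard_normal((n, d))   # the original point cloud
P = rng.standard_normal((d, k)) / np.sqrt(k)  # random projection matrix
Y = X @ P                         # projected point cloud

def pairwise_dists(Z):
    # Euclidean distances between all distinct pairs of rows of Z
    diff = Z[:, None, :] - Z[None, :, :]
    return np.linalg.norm(diff, axis=-1)[np.triu_indices(len(Z), k=1)]

# Ratio of projected to original distance for every pair of points;
# these concentrate around 1 as k grows.
ratios = pairwise_dists(Y) / pairwise_dists(X)
print(ratios.min(), ratios.max())
```

The concentration of the ratios around 1 is driven by sub-Gaussian concentration of the projected norms, one of the central themes of the course.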
Mathematical Foundations of Machine Learning (exercises, Winter Semesters 2019/2020 and 2020/2021)
The aim of this class is to build the mathematical foundations needed to understand the most common and classical techniques used in the ever-growing field of machine learning, and to obtain quantitative guarantees for learning algorithms.
Master and/or seminar projects
I welcome supervision enquiries from motivated and serious students interested in a Master's or seminar project in the broad fields of stochastic analysis, machine learning or high-dimensional probability. I can suggest ideas, but I am equally happy to listen to your own suggestions!