Statistical Machine Learning, 9 ECTS
The lecture will be completely online.
- Content: The focus of this lecture is on algorithmic and theoretical aspects of statistical machine learning. We will cover many of the standard algorithms, learn about the general principles for building good machine learning algorithms, and analyze their theoretical and statistical properties. The following topics will be covered:
- Bayesian decision theory, loss functions
- Simple supervised machine learning, for example linear methods, SVMs, kernel methods, boosting, and decision trees
- Evaluation and comparison of machine learning algorithms, model selection, cross validation, permutation tests
- Semi-supervised learning
- Fairness, robustness and uncertainty quantification in classification
- Unsupervised learning problems, dimensionality reduction e.g. PCA, clustering: spectral clustering, hierarchical clustering
- Introduction to statistical learning theory: no free lunch theorem, generalization bounds, VC dimension, universal consistency
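To give a flavour of the model-selection topic above, here is a minimal k-fold cross-validation sketch in Python (NumPy only). The 1-nearest-neighbour classifier and the synthetic two-cluster data are illustrative placeholders, not part of the course material:

```python
import numpy as np

def k_fold_cv(X, y, k, fit_predict):
    """Estimate accuracy by k-fold cross validation.
    fit_predict(X_train, y_train, X_test) must return predicted labels."""
    n = len(y)
    folds = np.array_split(np.random.permutation(n), k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        preds = fit_predict(X[train], y[train], X[test])
        accs.append(np.mean(preds == y[test]))
    return float(np.mean(accs))

def nearest_neighbour(X_tr, y_tr, X_te):
    # 1-NN: predict the label of the closest training point (Euclidean distance)
    d = ((X_te[:, None, :] - X_tr[None, :, :]) ** 2).sum(-1)
    return y_tr[d.argmin(axis=1)]

# Two well-separated Gaussian clusters as toy data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) + np.repeat([[0.0, 0.0], [3.0, 3.0]], 100, axis=0)
y = np.repeat([0, 1], 100)
acc = k_fold_cv(X, y, k=5, fit_predict=nearest_neighbour)
print(round(acc, 2))
```

The same pattern works with any classifier that can be wrapped in a `fit_predict` function.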
- Requirements: The lecture requires good mathematical skills, roughly at the level of the lecture "Mathematics for Machine Learning" or the bachelor-level courses Mathematics I-III; in particular, multivariate calculus and linear algebra are needed.
- Material: The lecture material will be on Moodle, and we use Slack for communication.
Convex and Non-convex Optimization, 9 ECTS
The lecture will be completely online. The lectures will be given asynchronously, and there will be a separate online question round.
- Content: Convex optimization problems arise naturally in many application areas, such as signal processing, machine learning, image processing, communications and networking, and finance. The course gives an introduction to convex analysis and the theory of convex optimization, including duality theory; algorithms for solving convex and non-convex optimization problems, such as interior point methods; the basic methods of general nonlinear unconstrained minimization; and recent first-order methods in non-smooth convex optimization. We also discuss large-scale techniques such as stochastic gradient descent and coordinate descent. Finally, we show how to model optimization problems and, if time allows, also applications of (convex) optimization in deep learning.
- Requirements: The lecture requires good mathematical skills, roughly at the level of the lecture "Mathematics for Machine Learning"; in particular, multivariate calculus and linear algebra are needed. Prior knowledge of optimization is not required.
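As a small illustration of the first-order methods mentioned above, here is a plain gradient-descent sketch in Python on a convex quadratic. The step size, stopping rule, and test problem are assumptions chosen for the example, not prescriptions from the course:

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=10_000):
    """Gradient descent with a fixed step size: the simplest first-order method."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop when the gradient is (nearly) zero
            break
        x = x - lr * g
    return x

# Minimize the convex quadratic f(x) = 1/2 x^T A x - b^T x,
# whose gradient is A x - b, so the unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = gradient_descent(lambda x: A @ x - b, x0=np.zeros(2))
print(np.allclose(x_star, np.linalg.solve(A, b), atol=1e-5))
```

For this strongly convex problem, any fixed step size below 2/L (with L the largest eigenvalue of A) guarantees convergence; the course covers such step-size conditions and rates in detail.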
Mathematics for Machine Learning
- Lectures: Mon and Thu, 14-16 c.t., lecture hall MvL6; Exercise: Tue, 8-10 c.t.
- See Campus for more information
- Linear Algebra
- Multivariate Calculus
- Probability and Statistics
- Phenomena in high dimensions
- Approximation Theory and Functional Analysis
- The lecture material can be found here