Bachelor of Science (B.Sc.) & Master of Science (M.Sc.):

Available Thesis Topics

Both Bachelor and Master theses are extensive pieces of work that require a full-time commitment over several months. They are thus also deeply personal decisions by each student. This page lists a few topics we are currently seeking to address in the research group. If you find one of them appealing, please contact the person listed for each project. If you have an idea of your own, one that you are passionate about and which fits into the remit of the Chair for the Methods of Machine Learning, please feel invited to pitch it. To do so, first contact Philipp Hennig to make an appointment for a first meeting.

BackPACK

We want to extend the functionality of our backpropagation library BackPACK [1], which will be presented at ICLR 2020.

It is a high-quality software library that uses automatic differentiation to compute additional, novel numerical quantities aimed at improving the training of deep neural networks. Applicants should be interested in automatic differentiation and experienced in PyTorch and Python.

Students will learn about the operations of deep neural networks and their autodifferentiation internals, as well as the quantities extracted by BackPACK; the projects thus offer an opportunity to gain expert knowledge in the algorithmic side of deep learning. Both are challenging projects which require familiarity with the manipulation of tensors (indices!) and multivariate calculus (automatic differentiation). A significant amount of time will be spent on software engineering, as the results will be fully integrated into BackPACK and hopefully released in a future version. Results will be presented in the form of runtime benchmarks similar to those in the original work [1]. Students are encouraged to investigate further applications.
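For orientation, here is a minimal usage sketch of BackPACK's interface, following the examples in [1] (exact names may differ across versions):

    import torch
    from backpack import backpack, extend
    from backpack.extensions import BatchGrad

    # Toy model and loss, extended so BackPACK can hook into the backward pass.
    model = extend(torch.nn.Linear(10, 2))
    lossfunc = extend(torch.nn.CrossEntropyLoss())

    X, y = torch.randn(8, 10), torch.randint(0, 2, (8,))
    loss = lossfunc(model(X), y)

    # During this backward pass, BackPACK additionally computes the individual
    # gradient of every sample in the mini-batch.
    with backpack(BatchGrad()):
        loss.backward()

    for param in model.parameters():
        print(param.grad.shape, param.grad_batch.shape)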

[1] F. Dangel, F. Kunstner & P. Hennig: BackPACK: Packing more into Backprop (2020)

[already taken] More layers for BackPACK (M.Sc. Thesis)

Supervisor: Felix Dangel


At the current stage, the operations supported by BackPACK are mainly used in image classification tasks with convolutional neural networks. In PyTorch, these layers have equivalent implementations for different dimensions (e.g. 1-d/2-d/3-d convolution). The student will formulate and implement their generalization to arbitrary dimensionality (e.g. n-d convolution).
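To illustrate this redundancy, a minimal sketch of PyTorch's dimension-specific convolution modules, which share one interface (all shapes are arbitrary illustrative choices):

    import torch

    # The same convolution, implemented separately per dimension in PyTorch.
    x1 = torch.randn(8, 3, 32)          # (batch, channels, length)
    x2 = torch.randn(8, 3, 32, 32)      # (batch, channels, height, width)
    x3 = torch.randn(8, 3, 16, 32, 32)  # (batch, channels, depth, height, width)

    conv1 = torch.nn.Conv1d(3, 6, kernel_size=5)
    conv2 = torch.nn.Conv2d(3, 6, kernel_size=5)
    conv3 = torch.nn.Conv3d(3, 6, kernel_size=5)

    print(conv1(x1).shape, conv2(x2).shape, conv3(x3).shape)

An n-d formulation in BackPACK would treat these (and higher dimensions) uniformly.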

BackPACK for Recurrent Neural Networks (M.Sc. Thesis)

Supervisor: Felix Dangel 

Backpropagation in recurrent neural networks (RNNs) is more complicated than in convolutional neural networks, as a layer processes a temporal sequence of inputs. This introduces a recurrent (cyclic) structure in the network, which is unrolled over time during backpropagation. The student will formulate backpropagation through time for popular RNN architectures (Elman RNN, LSTM, ...) and, step by step, cover all quantities that can be computed with BackPACK.
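As a starting point, a minimal sketch of backpropagation through time with PyTorch's built-in Elman RNN (all sizes are arbitrary illustrative choices):

    import torch

    seq_len, batch, input_size, hidden_size = 10, 4, 3, 5
    rnn = torch.nn.RNN(input_size, hidden_size)

    x = torch.randn(seq_len, batch, input_size)
    out, h_n = rnn(x)  # out holds the hidden state at every time step

    # The backward pass unrolls the recurrence: the gradient of the recurrent
    # weight accumulates contributions from all time steps.
    loss = out.sum()
    loss.backward()
    print(rnn.weight_hh_l0.grad.shape)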

Speeding up BackPACK with custom PyTorch C++/CUDA operations (M.Sc. Thesis)

Supervisor: Felix Dangel 

Our approach in BackPACK was to re-use existing PyTorch operations and to work primarily in Python. This works, but the lack of low-level control forces workarounds: some operations, e.g. for convolutions, could be sped up considerably by custom low-level implementations in C++ and CUDA.
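To illustrate the kind of Python-level workaround involved, the following sketch expresses a 2-d convolution as a matrix multiplication via unfold (im2col); a custom kernel could avoid materializing the unfolded input (shapes are illustrative assumptions):

    import torch
    import torch.nn.functional as F

    x = torch.randn(8, 3, 32, 32)
    weight = torch.randn(6, 3, 5, 5)

    # im2col: extract all 5x5 patches, then convolve via one matrix product.
    unfolded = F.unfold(x, kernel_size=5)  # (8, 3*5*5, 28*28)
    out = (weight.view(6, -1) @ unfolded).view(8, 6, 28, 28)

    # Agrees with the built-in convolution up to floating-point error.
    assert torch.allclose(out, F.conv2d(x, weight), atol=1e-4)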


Probabilistic Linear Solvers

Linear systems Ax = b are the bedrock of virtually all numerical computation. Machine learning poses specific challenges for the solution of such systems due to their scale, characteristic structure, and stochasticity: datasets are often so large that data subsampling approaches must be employed, inducing noise on A. In fact, often only noise-corrupted matrix-vector products are available. Typical examples are large-scale empirical risk minimization problems. Classic linear solvers such as conjugate gradients (CG) typically fail to solve such systems accurately, since they assume matrix-vector products that are exact to machine precision.

Probabilistic linear solvers [1] aim to address these challenges by treating the solution of the linear system itself as an inference task. This allows the incorporation of prior (generative) knowledge about the system, e.g. about its eigenspectrum, and enables the solution of noisy systems.
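A minimal sketch of how such a solver might be called, assuming probnum's problinsolve interface (names and return values may differ across versions):

    import numpy as np
    from probnum.linalg import problinsolve

    rng = np.random.default_rng(42)
    n = 25
    L = rng.standard_normal((n, n))
    A = L @ L.T + n * np.eye(n)  # symmetric positive definite system matrix
    b = rng.standard_normal(n)

    # Besides the solution estimate, the solver returns posterior beliefs over
    # the system matrix and its inverse, plus convergence information.
    x, A_belief, Ainv_belief, info = problinsolve(A, b)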

[1] Hennig, P., Probabilistic Interpretation of Linear Solvers, SIAM Journal on Optimization, 2015, 25, 234-260

Benchmarking Linear Solvers for Machine Learning (M.Sc. Thesis)

Supervisor: Jonathan Wenger 

In order to compare linear solvers with respect to the specific linear systems arising in machine learning, a benchmark suite of linear problems needs to be established. Such a suite should include, among others, noisy systems (stochastic quadratic programs, empirical risk minimization), large-scale systems (n > 100000), structured systems (sparsity, block diagonality), and systems with generative prior information (kernel Gram matrices). The student's task will be to establish such a benchmark based on current research topics in ML and to evaluate existing linear solvers on it.
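As one hypothetical entry in such a suite, a kernel Gram matrix system as it arises in Gaussian process regression (the helper rbf_gram and all sizes are illustrative assumptions):

    import numpy as np

    def rbf_gram(X, lengthscale=1.0):
        # Pairwise squared distances, then the RBF (Gaussian) kernel.
        sqdists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        return np.exp(-0.5 * sqdists / lengthscale**2)

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(200, 2))
    K = rbf_gram(X) + 1e-6 * np.eye(200)  # jitter for numerical stability
    b = rng.standard_normal(200)

    x = np.linalg.solve(K, b)  # direct solve as a reference baseline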

[already taken] Efficient Computation of Matrix-variate Distributions (B.Sc. Thesis)

Supervisor: Jonathan Wenger 

Matrix-variate distributions are a key component of matrix-based probabilistic linear solvers, which perform inference on the system matrix or its inverse. To make use of the posterior distribution over the inverse produced by the solver, efficient techniques for matrix-variate distributions need to be established. The student will derive and implement efficient sampling and evaluation of the probability density function of matrix-variate normal distributions with different covariance structures. The implementation will be in Python as part of the probnum package.
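A minimal sketch of the standard sampling construction for a matrix-variate normal MN(M, U, V), exploiting the Kronecker structure of its covariance (the helper name is illustrative):

    import numpy as np

    def sample_matrix_normal(M, U, V, rng):
        # X = M + A Z B with A A^T = U and B^T B = V is distributed as
        # MN(M, U, V); this avoids forming the Kronecker covariance explicitly.
        A = np.linalg.cholesky(U)         # row covariance factor
        B = np.linalg.cholesky(V).T       # column covariance factor
        Z = rng.standard_normal(M.shape)  # iid standard normal entries
        return M + A @ Z @ B

    rng = np.random.default_rng(1)
    X = sample_matrix_normal(np.zeros((3, 4)), np.eye(3), 2.0 * np.eye(4), rng)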


Bayesian quadrature

Bayesian quadrature (BQ) treats numerical integration as an inference problem by constructing posterior measures over integrals given observations, i.e. evaluations of the integrand. Besides providing sound uncertainty estimates, the probabilistic approach permits the inclusion of prior knowledge about properties of the function to be integrated, and leverages active learning schemes for node selection as well as transfer learning schemes, e.g. when multiple similar integrals have to be estimated jointly.
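For intuition, a minimal BQ sketch: a Gaussian process prior with RBF kernel, integrating a toy function against the uniform measure on [0, 1] (lengthscale, nodes and integrand are arbitrary choices):

    import numpy as np
    from scipy.special import erf

    ell = 0.3  # kernel lengthscale (assumed)

    def k(a, b):  # RBF kernel matrix
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

    def kernel_mean(x):  # z_i = integral of k(t, x_i) for t in [0, 1]
        c = ell * np.sqrt(np.pi / 2)
        return c * (erf((1 - x) / (np.sqrt(2) * ell)) + erf(x / (np.sqrt(2) * ell)))

    f = lambda t: np.sin(3 * t)             # toy integrand
    x = np.linspace(0, 1, 7)                # evaluation nodes
    K = k(x, x) + 1e-10 * np.eye(x.size)    # jitter for stability

    w = np.linalg.solve(K, kernel_mean(x))  # BQ weights
    print(w @ f(x), (1 - np.cos(3)) / 3)    # posterior mean vs. true integral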

[already taken] Bayesian quadrature on Riemannian manifolds (M.Sc. Thesis)

Supervisors: Alexandra Gessner, Georgios Arvanitidis (MPI-IS)

Riemannian manifolds provide a principled way to model the nonlinear geometric structure assumed to be inherent in data. The goal of the project is to use BQ to normalize probability distributions on such manifolds. In the iterative process of learning the distribution's parameters, the normalization integral has to be evaluated repeatedly. The setup is ideal for BQ: the integrand has known structure that can be encoded in the prior, and function evaluations are expensive since they require computing geodesics (i.e. solving differential equations). The iterative nature of the integral updates further suggests a transfer learning approach that recycles information collected in previous iterations.


Probabilistic Solutions to ODEs

Ordinary differential equations (ODEs) are central to mathematical models of physical phenomena. For example, the spread of a disease in a population can be predicted by approximating the solution of an ODE. Classical numerical analysis has developed a rich body of methods for this task.

By taking a probabilistic perspective, it is possible to derive algorithms that return a probability distribution describing the ODE solution. The variance of this posterior distribution not only carries information about the numerical accuracy of the approximation, but can also be leveraged inside a larger chain of computation; this has proven useful, for instance, in parameter inference problems involving ODEs.
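A minimal sketch of calling a filtering-based probabilistic solver, assuming probnum's probsolve_ivp interface (the exact signature may differ across versions):

    import numpy as np
    from probnum.diffeq import probsolve_ivp

    # Logistic growth as a toy epidemic-style model: dy/dt = 3 y (1 - y).
    f = lambda t, y: 3.0 * y * (1.0 - y)
    y0 = np.array([0.1])

    # The returned object represents a posterior distribution over the ODE
    # solution; its spread reflects the numerical approximation error.
    sol = probsolve_ivp(f, t0=0.0, tmax=4.0, y0=y0)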

Benchmarking Probabilistic ODE Solvers (M.Sc./B.Sc. Thesis)

Supervisor: Nicholas Krämer

In this project we want to compare different lines of thought regarding probabilistic ODE solvers. There are mainly two approaches: (i) computing samples from a stochastically perturbed "classical" method [1] and (ii) applying Bayesian filtering and smoothing to the ODE [2]. The student's tasks will be to (i) implement sampling-based methods in Python as part of the probnum package (filtering-based methods already exist) and (ii) develop a set of benchmark problems on which computational speed and differences in posterior uncertainty estimates are evaluated.
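A minimal sketch of approach (i), following the perturbation idea of [1]: each step of a classical integrator (here, Euler) is perturbed with noise scaled to its local truncation error, so that an ensemble of runs yields samples of the numerical solution (step size and noise scale are illustrative):

    import numpy as np

    def perturbed_euler_samples(f, y0, t0, tmax, h, n_samples, sigma=0.1, seed=0):
        rng = np.random.default_rng(seed)
        ts = np.arange(t0, tmax + h, h)
        samples = np.empty((n_samples, ts.size, np.size(y0)))
        for s in range(n_samples):
            y = np.atleast_1d(np.array(y0, dtype=float))
            samples[s, 0] = y
            for i in range(1, ts.size):
                y = y + h * f(ts[i - 1], y)  # classical Euler step (order p = 1)
                # Perturbation with standard deviation ~ h^(p + 1/2), cf. [1].
                y = y + sigma * h**1.5 * rng.standard_normal(y.shape)
                samples[s, i] = y
        return ts, samples

    # Logistic growth toy problem; the sample spread reflects numerical error.
    ts, ys = perturbed_euler_samples(lambda t, y: 3 * y * (1 - y), 0.1, 0.0, 4.0, 0.05, 20)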

[1] Patrick R Conrad, Mark Girolami, Simo Särkkä, Andrew Stuart and Konstantinos Zygalakis. Statistical analysis of differential equations: introducing probability measures on numerical solutions. Statistics and Computing, 2017

[2] Filip Tronarp, Hans Kersting, Simo Särkkä and Philipp Hennig. Probabilistic solutions to ordinary differential equations: a new perspective. Statistics and Computing, 2019.

 

Probabilistic Computation of Geodesics (M.Sc. Thesis)

Supervisor: Nicholas Krämer 

Geodesics, the generalization of straight lines to manifolds, are of high importance in computational geometry, and consequently in modern data analysis. They can usually not be computed analytically; instead, one numerically solves so-called boundary value problems: second-order ordinary differential equations with an initial and a terminal condition. In this project, the student will investigate the potential of probabilistic ODE solvers for this task and quantify the gains in terms of (i) computational speed and (ii) quantification of (numerical) uncertainty.

The student's task will initially be to reproduce previous studies on this topic [1] and then to assess how much the most recent advances in the numerical solution of boundary value problems [2, 3] can contribute to an improvement. If time permits, we will investigate the effect of these insights on challenging tasks in computational geometry and machine learning.
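As a classical, non-probabilistic baseline, a minimal sketch of a geodesic boundary value problem on the unit sphere, solved with scipy's collocation-based BVP solver (coordinates, endpoints and initial guess are illustrative choices):

    import numpy as np
    from scipy.integrate import solve_bvp

    # Geodesic equations on the unit sphere in (theta, phi) coordinates,
    # with state y = (theta, phi, theta', phi').
    def ode(t, y):
        th, ph, dth, dph = y
        return np.vstack([
            dth,
            dph,
            np.sin(th) * np.cos(th) * dph**2,              # theta equation
            -2.0 * (np.cos(th) / np.sin(th)) * dth * dph,  # phi equation
        ])

    def bc(ya, yb):
        # Fixed start and end points on the sphere (arbitrary choices).
        return np.array([ya[0] - 1.0, ya[1] - 0.0, yb[0] - 1.5, yb[1] - 1.0])

    t = np.linspace(0, 1, 20)
    y_init = np.ones((4, t.size))  # crude initial guess away from the poles
    sol = solve_bvp(ode, bc, t, y_init)
    print(sol.status, sol.message)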

[1] Georgios Arvanitidis, Søren Hauberg, Philipp Hennig and Michael Schober. Fast and robust shortest paths on manifolds learned from data. AISTATS 2019.

[2] David John, Vincent Heuveline and Michael Schober. GOODE: a Gaussian off-the-shelf ordinary differential equation solver. ICML 2019.

[3] Filip Tronarp, Simo Särkkä and Philipp Hennig. Bayesian ODE solvers: The Maximum A Posteriori Estimate. arXiv:2004.00623.