Bachelor of Science (B.Sc.) & Master of Science (M.Sc.): Available Thesis Topics

Both Bachelor and Master theses are extensive pieces of work that require a full-time commitment over several months. They are thus also deeply personal decisions for each student. This page lists a few topics we are currently seeking to address in the research group. If you find one of them appealing, please contact the person listed for each project. If you have an idea of your own, one that you are passionate about and which fits into the remit of the Chair for the Methods of Machine Learning, please feel invited to pitch it. To do so, first contact Philipp Hennig to make an appointment for a first meeting.

We want to extend the functionality of our backpropagation library BackPACK [1], which will be presented at ICLR 2020.

BackPACK is a high-quality software library that uses automatic differentiation to compute additional, novel numerical quantities aimed at improving the training of deep neural networks. Applicants should be interested in automatic differentiation and experienced in PyTorch and Python. Students will learn about the operations of deep neural networks and their automatic-differentiation internals, as well as the quantities extracted by BackPACK. These projects thus offer an opportunity to gain expert knowledge of the algorithmic side of deep learning.

Both are challenging projects that require familiarity with the manipulation of tensors (indices!) and with multivariate calculus (automatic differentiation). A significant amount of time will be spent on software engineering, as the work will be fully integrated into BackPACK and hopefully released in a future version. Results will be presented in the form of runtime benchmarks similar to those in the original work [1]. Students are encouraged to investigate further applications.
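To give a flavor of what BackPACK computes, here is a minimal sketch of its usage: a model and loss are wrapped with extend, and extra quantities are requested during the backward pass. The toy network and data are placeholders for illustration only.

    import torch
    from backpack import backpack, extend
    from backpack.extensions import BatchGrad, Variance

    # Toy classification model and data (placeholders for illustration).
    model = extend(torch.nn.Sequential(
        torch.nn.Linear(10, 5),
        torch.nn.ReLU(),
        torch.nn.Linear(5, 3),
    ))
    lossfunc = extend(torch.nn.CrossEntropyLoss())

    X, y = torch.randn(8, 10), torch.randint(0, 3, (8,))
    loss = lossfunc(model(X), y)

    # Inside the context, BackPACK computes extra quantities during backward().
    with backpack(BatchGrad(), Variance()):
        loss.backward()

    for param in model.parameters():
        print(param.grad.shape)        # standard gradient, as in plain PyTorch
        print(param.grad_batch.shape)  # individual gradient for each sample
        print(param.variance.shape)    # gradient variance over the mini-batch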

[1] F. Dangel, F. Kunstner & P. Hennig: BackPACK: Packing more into Backprop (2020)

More layers for BackPACK (M.Sc. Thesis)

Supervisor: Felix Dangel


At the current stage, the operations supported by BackPACK are mainly those used in image classification tasks with convolutional neural networks. In PyTorch, these layers have equivalent implementations for different dimensionalities (e.g. 1d/2d/3d convolution). The student will formulate and implement their generalization to arbitrary dimensionality (e.g. n-d convolution), as sketched below.
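The sketch below (purely illustrative, not BackPACK code; the helper make_conv is hypothetical) shows the per-dimension split in PyTorch's convolution classes that a dimension-agnostic derivative implementation would have to absorb:

    import torch

    # PyTorch ships one convolution class per dimensionality; the project aims
    # to treat these uniformly. This dispatch table is a hypothetical example.
    CONV_BY_DIM = {1: torch.nn.Conv1d, 2: torch.nn.Conv2d, 3: torch.nn.Conv3d}

    def make_conv(n_dim, in_channels, out_channels, kernel_size):
        """Create an n-dimensional convolution layer for n_dim in {1, 2, 3}."""
        return CONV_BY_DIM[n_dim](in_channels, out_channels, kernel_size)

    # Example: a 3d convolution applied to a (batch, channels, D, H, W) input.
    conv = make_conv(3, in_channels=2, out_channels=4, kernel_size=3)
    out = conv(torch.randn(8, 2, 16, 16, 16))
    print(out.shape)  # torch.Size([8, 4, 14, 14, 14])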

BackPACK for Recurrent Neural Networks (M.Sc. Thesis)

Supervisor: Felix Dangel 

Backpropagation in recurrent neural networks (RNNs) is more complicated than in convolutional neural networks, as a layer processes a temporal sequence of inputs. This introduces a cyclic structure into the computation graph. The student will formulate backpropagation through time for popular RNN architectures (Elman RNN, LSTM, ...) and, step by step, cover all quantities that can be computed with BackPACK.
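As a rough illustration of the problem (a minimal sketch, not BackPACK code; all sizes are arbitrary), unrolling an Elman cell over time shows that the same weights are reused at every step, so their gradients accumulate contributions from the whole sequence:

    import torch

    # Elman RNN cell, unrolled over T time steps. The same weights are reused
    # at every step, so backward() accumulates gradients across the sequence
    # ("backpropagation through time").
    T, batch, d_in, d_h = 5, 3, 4, 6
    W_ih = torch.randn(d_h, d_in, requires_grad=True)
    W_hh = torch.randn(d_h, d_h, requires_grad=True)
    b = torch.zeros(d_h, requires_grad=True)

    x = torch.randn(T, batch, d_in)
    h = torch.zeros(batch, d_h)
    for t in range(T):
        h = torch.tanh(x[t] @ W_ih.T + h @ W_hh.T + b)

    loss = h.pow(2).sum()
    loss.backward()  # gradients flow backwards through all T steps
    print(W_hh.grad.shape)  # torch.Size([6, 6])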