Maximus Mutschler
Background
Since March 2018
Research assistant at the Department of Cognitive Systems, University of Tübingen
2015 - 2018
Master of Science in Computer Science at the University of Tübingen
2011 - 2014
Bachelor of Science in Medical Computer Science at the University of Heidelberg and the University of Heilbronn
Research Interests
- Machine Learning
- Deep Neural Networks
- Optimization Methods
Teaching
- Assignments class: Introduction to Neural Networks (Summer Semester 2018)
- Assignments class: Introduction to Technical Computer Science (Winter Semester 2018)
- Assignments class: Deep Neural Networks (Summer Semester 2019)
- Assignments class: Deep Neural Networks (Winter Semester 2019)
- Supervision of a Software Engineering Practical Course on Deep Style Transfer (Summer Semester 2020)
- Seminar: Optimization and Architecture Search for Deep Neural Networks (Winter Semester 2020)
- Supervision of a Software Engineering Practical Course on building a web application for Deep Style Transfer (Summer Semester 2021)
- Assignments class: Deep Learning (Winter Semester 2021)
Project Work
Administrator of the Training Center for Machine Learning cluster (TCML cluster), comprising 40 nodes and 160 GPUs
Supervised Theses
2019 | Bachelor thesis | Analysis of the Generalization Capability of a New Deep Neural Network Optimizer
2020 | Bachelor thesis | Sensitivity Analysis of DNN Optimizers on Different Weight Initializations
2020 | Bachelor thesis | Forcing Deep Neural Networks to Explore the Loss Landscape
2020 | Master thesis | Adaptive Approximation of Low-Dimensional Manifolds between Local Minima in the Loss Landscapes of Neural Networks
Theses
Optimizing a motor cortex model by evolution of connectivity patterns
Master's thesis, University of Tübingen, October 2017
Software Plagiarism Detection Based on Java Bytecode
Bachelor's thesis, University of Heidelberg and University of Heilbronn, June 2014
Publications
[1] Maximus Mutschler and Andreas Zell. “Parabolic Approximation Line Search for DNNs”. Accepted at NeurIPS 2020.
[2] Maximus Mutschler and Andreas Zell. “A straightforward line search approach on the expected empirical loss for stochastic deep learning problems” (2020). (discontinued work)
[3] Maximus Mutschler and Andreas Zell. “Empirically explaining SGD from a line search perspective” (2021). Accepted at ICANN 2021.
[4] Maximus Mutschler, Kevin Laube and Andreas Zell. “Using a one dimensional parabolic model of the full-batch loss to estimate learning rates during training” (2021). Accepted at the NeurIPS 2021 Optimization Workshop.
Reviews
- Reviewed three submissions for ICANN 2021