Many of our publications include source code releases. This page lists more extensive, purpose-built software packages.
Written by Frank Schneider and Lukas Balles, DeepOBS provides a benchmark suite for deep learning optimization methods. It aims to help researchers streamline the development of new optimizers by offering a collection of test problems (a dataset paired with an architecture) of varying complexity, baselines for a variety of optimizers, and even an automated pipeline for generating comparison plots for papers.
Install with pip install deepobs.
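To illustrate the idea of a benchmark suite, here is a toy pure-Python sketch (not the DeepOBS API; all names are hypothetical): every optimizer is run on every test problem and the final losses are tabulated, which is the comparison DeepOBS automates at scale.

```python
def make_quadratic(center):
    """Toy 1-D test problem: loss and gradient of (x - center)^2."""
    loss = lambda x: (x - center) ** 2
    grad = lambda x: 2.0 * (x - center)
    return loss, grad

def gradient_descent(grad, x0, lr, steps):
    """Plain gradient descent, standing in for an optimizer under test."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def run_benchmark(problems, optimizers, x0=0.0, steps=100):
    """Run every optimizer on every problem; collect the final losses."""
    results = {}
    for pname, (loss, grad) in problems.items():
        for oname, opt in optimizers.items():
            results[(pname, oname)] = loss(opt(grad, x0, steps))
    return results

problems = {"quad_3": make_quadratic(3.0), "quad_-1": make_quadratic(-1.0)}
optimizers = {
    "gd_lr0.1": lambda g, x0, n: gradient_descent(g, x0, 0.1, n),
    "gd_lr0.01": lambda g, x0, n: gradient_descent(g, x0, 0.01, n),
}
results = run_benchmark(problems, optimizers)
```

In DeepOBS the test problems are real datasets and architectures, the baselines are pre-computed, and the results feed directly into publication-ready plots.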
Originally written by Philipp Hennig and Christian Schuler, and later expanded by Edgar Klenske, Robert Eisele and others, Entropy Search is a framework for global optimization and experimental design. It is built to collect information about the location of the extremum, rather than greedily collecting low function values.
The software package was originally released in 2012 and, although repeatedly updated afterward, is no longer maintained. The open-source code has, however, found its way into third-party packages such as emukit.
ProbNum implements probabilistic numerical methods that solve problems from linear algebra, optimization, quadrature, and differential equations using probabilistic inference. This approach captures the uncertainty arising from finite computational resources and stochastic inputs. The library is primarily developed and maintained by Jonathan Wenger, Nicholas Krämer and Nathanael Bosch.
Install with pip install probnum.
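The core idea, quantifying the uncertainty left by a finite computation, can be illustrated with a minimal stdlib-only sketch (this is not the ProbNum API): a Monte Carlo quadrature rule that returns an error bar alongside its estimate, so the caller sees how much uncertainty the finite sample budget leaves behind.

```python
import math
import random

def mc_integrate(f, a, b, n, seed=0):
    """Estimate the integral of f over [a, b] from n random samples.

    Returns (estimate, standard_error): the error bar quantifies the
    uncertainty due to stopping after only n function evaluations.
    """
    rng = random.Random(seed)
    samples = [f(a + (b - a) * rng.random()) for _ in range(n)]
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    width = b - a
    return width * mean, width * math.sqrt(var / n)

# Integral of sin(x) over [0, pi]; the exact value is 2.
est, err = mc_integrate(math.sin, 0.0, math.pi, 10_000)
```

ProbNum generalizes this pattern: its solvers return full posterior distributions over the solution of linear systems, integrals, or ODEs, rather than a single point estimate.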
laplace is a flexible, PyTorch-based package for Laplace approximations of neural networks. It offers efficient, easy-to-use approximations of the posterior, the posterior predictive, and the marginal likelihood. The package can thus be used for improved predictive uncertainty estimates, model selection, and continual learning. In contrast to other approximate Bayesian methods in deep learning, it does not require retraining the neural network.
Install with pip install laplace-torch.
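The underlying principle can be sketched in one dimension with stdlib Python only (this is not the laplace-torch API): fit a Gaussian to a posterior by finding the mode and inverting the negative curvature of the log-posterior there. In the package, the mode is the trained network's weights and the curvature is a (structured) Hessian approximation, which is why no retraining is needed.

```python
def laplace_approx(log_post, x0=0.0, lr=0.1, steps=200, h=1e-4):
    """Fit a Gaussian N(mode, var) to exp(log_post) in one dimension.

    Finds the mode by gradient ascent (finite-difference gradient), then
    sets the variance to the negative inverse curvature at the mode.
    """
    x = x0
    for _ in range(steps):
        g = (log_post(x + h) - log_post(x - h)) / (2.0 * h)
        x += lr * g
    curv = (log_post(x + h) - 2.0 * log_post(x) + log_post(x - h)) / h**2
    return x, -1.0 / curv

# Log of an unnormalized Gaussian with mean 1.5 and variance 0.25:
# here the Laplace approximation recovers the posterior exactly.
mode, var = laplace_approx(lambda x: -((x - 1.5) ** 2) / (2.0 * 0.25))
```

For a non-Gaussian log-posterior the same two quantities yield the best local Gaussian fit around the mode, which is what makes the approximation cheap relative to sampling-based alternatives.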