Cognitive Modeling

Prof. Dr. Sebastian Otte

Cognitive Modeling
Wilhelm Schickard Institute
University of Tübingen

Room C 420
Sand 14
72076 Tübingen, Germany

Phone: +49 7071 29 70481
Fax: +49 7071 29 5719
Email: sebastian.otte[at]uni-tuebingen.de
LinkedIn: https://www.linkedin.com/in/sebastian-otte/


Link to my new profile at the University of Lübeck: https://www.rob.uni-luebeck.de/institut/mitarbeiter/otte-sebastian


Bio

  • Since September 2023: Professor at the Institute for Robotics and Cognitive Systems at the University of Lübeck
  • September 2022 - February 2023: Research stay (as a Humboldt fellow), Centrum Wiskunde & Informatica (CWI), Amsterdam, Netherlands
  • April 2020 - September 2020: Substitute professor, Distributed Intelligence, University of Tübingen
  • Since July 2016: Postdoctoral researcher, Cognitive Modeling group, University of Tübingen
  • January 2013 - June 2016: Doctoral student, Cognitive Systems group (supervised by Prof. Dr. Andreas Zell), University of Tübingen
  • October 2009 - May 2012: Master of Computer Science, University of Applied Sciences Wiesbaden
  • March 2007 - August 2009: Bachelor of Computer Science, University of Applied Sciences Wiesbaden

Research interests

  • Recurrent Neural Networks
  • Spiking Neural Networks
  • Reservoir Computing
  • Efficient Learning
  • Bio-inspired Computing

Teaching

  • WT19/20: Advanced Neural Networks
  • WT19/20: Seminar Recent Advances in Recurrent Neural Networks
  • WT18/19: Advanced Neural Networks
  • ST18: Seminar Current Advances in Deep and Recurrent Neural Networks
  • ST18: Computer Science II
  • WT17/18: Seminar Dynamic Neural Networks
  • WT17/18: Lab Course Artificial Neural Networks
  • ST17: Computer Science II
  • ST17: Lab Course Artificial Neural Networks
  • ST17: Introduction to Neural Networks
  • WT16/17: Advanced Neural Networks
  • WT16/17: Seminar Grounded Cognition
  • ST16: Introduction to Neural Networks
  • WT15/16: Artificial Intelligence
  • ST15: Advanced Neural Networks
  • ST15: Introduction to Neural Networks
  • WT14/15: Artificial Intelligence
  • ST14: Introduction to Neural Networks
  • ST14: Proseminar Machine Learning
  • WT13/14: Artificial Intelligence
  • ST13: Introduction to Neural Networks
  • ST13: Software Lab

ST: summer term, WT: winter term


Supervised theses and student projects

2023

  • Sebastian Kairat, Evaluating Resonate-and-Fire Neurons with Harmonic Oscillators (lab project, work in progress)
  • Saya Higuchi, Resonate-and-Fire Neurons in Recurrent Spiking Neural Networks (master thesis, work in progress)
  • Coşku Can Horuz, Physical Domain Parametrization and Reconstruction with Finite Volume Neural Networks (master thesis)
  • Clemens Mollik, Hardware Acceleration of Convolutional Neural Networks: A Template-based Design Exploration (master thesis, coreviewer)
  • Johannes Schubert, The Optimism Bias in Rational Agents (master thesis, coreviewer)
  • Elia Al Geith, A Quantitative Evaluation of Graph Embedding-based Surrogate Models for TinyML-Neural Architecture Search (master thesis, coreviewer)
  • Karl Vetter, Spiking Neural Networks for Event-Based Ball Detection (master thesis, coreviewer)
  • Geoffrey Kasenbacher, Towards a Biologically-Realistic Model of Speech Recognition with Spiking Neural Networks (master thesis, in cooperation with Mercedes-Benz AG)
  • Leon Scharwaechter, Representation Learning of Multivariate Time Series using Adversarial Training (lab project)
  • Saya Higuchi, Recurrent Spiking Neural Network with Adaptive Leaky Integrate-and-Fire Neurons (lab project)
  • Robert Deibel, Investigating Resonator-Gated Recurrent Neural Networks (master thesis)

2022

  • Luca Idler, Graph Convolutional Networks for Sequential Crash Simulation (bachelor thesis)
  • Tobias Bubeck, Learning Cellular Automata with Spatiotemporal Recurrent Neural Networks (bachelor thesis)
  • Frauke Andersen, Inverse Model Learning based on Adaptive Policy Sampling (bachelor thesis)
  • Sebastian Volker, Benchmarking Neuromorphic Machine Learning (master thesis)
  • Coşku Can Horuz, Inferring Boundary Conditions in Finite Volume Neural Networks (research project)
  • Michael Hoyer, Efficient Training of LSTMs for Supervised and Reinforcement Learning Tasks (master thesis)
  • Adrian Stock, Modeling Weather Station Data with Graph Neural Networks (master thesis, coreviewer)

2021

  • Guillermo Martin Sanchez, Biological plausibility of supervised learning algorithms in spiking neural networks (essay rotation)
  • Anna-Lena von Behren, Time Series Learning with Swarm-based Reservoir Computing (bachelor thesis)
  • Fedor Scholz, Planning under Uncertainty in Cognitive Maps (bachelor thesis, coreviewer)
  • Tobias Hald, Potentials of Spiking Neural Networks for Ultra-Low-Power Speech Processing on Digital Hardware Accelerators (master thesis, coreviewer)
  • Linda Ulmer, Time-Constrained Active Tuning in Recurrent Neural Networks (bachelor thesis)
  • Paul Schmidt-Barbo, Using Discrimination in Semantic Embeddings to Inform Articulatory Speech Synthesis (master thesis, coreviewer)
  • Tobias Hald, Voice Activity Detection with Spiking Neural Networks (master thesis, coreviewer)
  • Jannik Steinmetz, Development of a Hardware Accelerator for SincNet Based on UltraTrail (master thesis, coreviewer)
  • Alexander Schießl, Gradient-based Signal Decomposition and Prediction with an Ensemble of Pretrained RNN Modules (bachelor thesis)
  • Felix Pfeiffer, Investigating Spiking Neural Network Models for End-to-End Supervised Learning (bachelor thesis)
  • Franziska Kaltenberger, Binding and Perspective Taking Problem with Active Tuning (bachelor thesis)
  • Michael Hoyer, Control of pneumatically driven manipulators using recurrent neural networks (research project)
  • Driton Guxhofi, Analysis for Safe and Robust Optimization of Power-Constrained Milling Processes (master thesis)
  • Julius Wührer, Extraction of Codes Representing Handwriting Styles in a Compositional RNN Architecture (bachelor thesis)
  • Stefan Krafcsik, Beat Matching with Dynamic Time Warping and Neural Networks (master thesis)
  • Melvin Ciurletti, Active Tuning in Spiking Neural Networks (master thesis)
  • Zixin Yi, Audio Filtering with Active Tuning (master thesis)

2020

  • Simon Garhofer, Noise Reduction in Voice Signals using Forward SNN Models (master thesis)
  • Joshua Marben, Surprise-Regularized Stochastic Optimization (master thesis)
  • Jana Lang, Predicting Ball Catching Attempts in Healthy and Pathological Subjects With Recurrent Neural Networks (master thesis, coreviewer)
  • Tobias Fritz, Comparison of Neural Networks for Pose Recognition (bachelor thesis, coreviewer)
  • Vinhdo Doan, Balancing Sensorimotor Correlation within Recurrent Forward Models (bachelor thesis)
  • Dingling Yao, Exploring Capabilities of Eligibility Trace-based Learning (bachelor thesis)

2019

  • Manuel Traub, Biologically Inspired Action Inference with Recurrent Spiking Forward Models (master thesis)
  • Melvin Ciurletti, Multiple Time Scales in Multi-Dimensional RNNs (research project)
  • Martina Feierabend, Phoneme Classification with Spiking Neural Networks (research project)
  • Johannes Hölscher, Binarizing the gating functions of LSTM networks (bachelor thesis)
  • Danilo Brajovic, Recurrent Neural Networks for Letter Trajectory Generation (research project)
  • Lydia Federmann, Inference with Temporal Gradients (essay rotation)
  • Lea Hofmaier, Generating Locomotion Patterns within Recurrent Sensorimotor Forward Models (master thesis)
  • Andreas Sauter, Learning Long-Term Dependencies with Simple RNNs (bachelor thesis, work in progress)
  • Marco Kinkel, Inferring Interactive Behavior using Recurrent Forward Models (bachelor thesis)
  • Sebastian Penhouët, Locally Embedded Autoencoders (master thesis)
  • Jakob Stoll, Sensor-aware Action Inference with Recurrent Neural Forward Models (bachelor thesis)
  • Jonas Einig, Learning Rare Classes (master thesis, in cooperation with Daimler AG)
  • Erika Thierer, Learning Non-Maximum Suppression (master thesis, in cooperation with Daimler AG)
  • Marius Hobbhahn, Temporal Gradient-based Module Identification (research project)
  • Manuel Traub, Feasibility of Training Recurrent Spiking Neural Networks for Speech Recognition using Spike-Timing-Dependent Plasticity (research project)

2018

  • Jonas Gregor Wiese, Discriminative Learning from Recurrent Generators (bachelor thesis)
  • Marius Hobbhahn, Inverse classification using generative models (bachelor thesis)
  • Sebastian Penhouët, Online Inference of Hyperparameters for Optimization Processes (research project)
  • Matthias Karlbauer, Investigating Noise Suppression for Deep Neural Car Detectors (master thesis, in cooperation with Daimler AG)
  • Florian Martin, Inferring Generating Trajectories from Images of Handwritten Letters with Recurrent Neural Networks (bachelor thesis)
  • Lea Hofmaier, Adding Obstacle-Awareness to a Many-Joint Robot Arm Controlled by a Recurrent Neural Network (lab project)
  • Mitja Nikolaus, Building Compact Generative RNNs For Handwritten Letters (lab project)
  • Patricia Rubisch, A Novel Approach to EEG Data Analysis using Neuro-Evolved Echo State Networks (bachelor thesis)
  • Markus Geike, Supervised Learning in Spiking Neural Networks with Error Feedback Loops (master thesis)
  • Kevin Laube, Reinforcement learning with Differentiable Neural Computers (master thesis)
  • Jonathan Schmidt, Modeling of Spiking Behavior with Gated Recurrent Neural Networks (bachelor thesis)
  • Nils Bultjer, Learning Spectral Representations with Generative Adversarial Networks using Sounds from Instruments and Urban Environment (master thesis)

2017

  • Steffen Schnürer, Event Segmentation Using a Recurrent Neural Network with LSTMs (bachelor thesis)
  • Danilo Brajovic, Exploring the informational content of max pooling positions in deep neural networks (bachelor thesis)
  • Albert Langensiepen, Systematized Generation of Training Scenarios for Audio Signal Separation with Recurrent Neural Networks Based on Ableton (bachelor thesis)
  • Emanuel Gerber, High-Level Action Inference with a Hierarchical RNN Approach (bachelor thesis)
  • Michael Graf, Modeling the Interaction of the Dorsal and Ventral Pathways with Restricted Boltzmann Machines (bachelor thesis)
  • Theresa Schmitt, Active Inference with Recurrent Neural Networks (bachelor thesis)
  • Laurenz Grätz, Investigating Pulsing Long Short-Term Memories for BPTT-Based Sequence Modeling (bachelor thesis)

2016

  • Erika Thierer, Object Detection with Deep Convolutional Neural Networks on RGB-Images (bachelor thesis)
  • Sindy Löwe, Semantic Segmentation of RGB-Images with Deep Convolutional Neural Networks (bachelor thesis)
  • Johannes Reisser, Pedestrian Segmentation with Deep Neural Networks (bachelor thesis)
  • Paul Stöckle, Audio Signal Separation with Recurrent Neural Networks (bachelor thesis)
  • Lorand Madai-Tahy, Investigating Deep Neural Networks for RGB-D Based Object Recognition (bachelor thesis)

2015

  • Michael Schramm, Breathing Motion Prediction with Recurrent Neural Networks (bachelor thesis)
  • Marcel Binz, Pattern Recognition in Electroencephalography Signals with Recurrent Neural Networks (bachelor thesis)
  • Tobias Scherer, Terrain Classification based on Acceleration Sensor Data using Recurrent Neural Networks (bachelor thesis)

2014

  • Philipp Leutz, Ligand-Based Virtual High-Throughput Screening with Artificial Neural Networks (bachelor thesis)
  • Oliver Obenland, Investigating Population-Based Optimization Algorithms as Alternative Training Methods for Deep Neural Networks (bachelor thesis)
  • Nils Bultjer, GP-GPU Implementation of an Experimental Suite for Deep Neural Networks (bachelor thesis)
  • Michaela Richter, Control of a Micro Aerial Vehicle with a Recurrent Neural Network (bachelor thesis)

Publications

2023

  • J. Lang, M. Giese, W. Ilg, and S. Otte, “Generating sparse counterfactual explanations for multivariate time series,” in International Conference on Artificial Neural Networks (ICANN), 2023, accepted for publication. [ arXiv ]
  • S. Oladyshkin, T. Praditia, I. Kroeker, F. Mohammadi, W. Nowak, and S. Otte, “The deep arbitrary polynomial chaos neural network or how deep artificial neural networks could benefit from data-driven homogeneous chaos theory,” Neural Networks, vol. 166, pp. 85–104, 2023.
  • C. C. Horuz, M. Karlbauer, T. Praditia, M. V. Butz, S. Oladyshkin, W. Nowak, and S. Otte, “Physical domain reconstruction with finite volume neural networks,” Applied Artificial Intelligence, vol. 37, no. 1, p. 2204261, 2023.
  • M. Traub, S. Otte, T. Menge, M. Karlbauer, J. Thümmel, and M. V. Butz, “Learning what and where: Disentangling location and identity tracking without supervision,” in The Eleventh International Conference on Learning Representations (ICLR), 2023. [ arXiv ]

2022

  • V. Wulfmeyer, J. M. V. Pineda, S. Otte, M. Karlbauer, M. V. Butz, T. R. Lee, M. Buban, and V. Rajtschan, “Estimation of the Surface Fluxes for Heat and Momentum in Unstable Conditions with Machine Learning and Similarity Approaches for the LAFE Data Set,” Boundary-Layer Meteorology, Nov. 2022.
  • T. Praditia, M. Karlbauer, S. Otte, S. Oladyshkin, M. V. Butz, and W. Nowak, “Learning groundwater contaminant diffusion-sorption processes with a finite volume neural network,” Water Resources Research, p. e2022WR033149, 2022, Editor's highlight.
  • G. Martín-Sánchez, S. Bohté, and S. Otte, “A taxonomy of recurrent learning rules,” in International Conference on Artificial Neural Networks (ICANN). Springer Nature Switzerland, 2022, pp. 478–490. [ arXiv ]
  • M. Hoyer, S. Eivazi, and S. Otte, “Efficient LSTM training with eligibility traces,” in International Conference on Artificial Neural Networks (ICANN). Springer Nature Switzerland, 2022, pp. 334–346. [ arXiv ]
  • C. C. Horuz, M. Karlbauer, T. Praditia, M. V. Butz, S. Oladyshkin, W. Nowak, and S. Otte, “Inferring boundary conditions in finite volume neural networks,” in International Conference on Artificial Neural Networks (ICANN). Springer Nature Switzerland, 2022, pp. 538–549.
  • F. Kaltenberger, S. Otte, and M. V. Butz, “Binding dancers into attractors,” in 12th IEEE International Conference on Development and Learning (ICDL), 2022. [ arXiv ]
  • S. Fabi, S. Otte, F. Scholz, J. Wührer, M. Karlbauer, and M. V. Butz, “Extending the omniglot challenge: Imitating handwriting styles on a new sequential dataset,” IEEE Transactions on Cognitive and Developmental Systems, 2022.
  • F. Scholz, C. Gumbsch, S. Otte, and M. V. Butz, “Inference of affordances and active motor control in simulated agents,” Frontiers in Neurorobotics, vol. 16, Aug. 2022. [ arXiv ]
  • F. Scholz, C. Gumbsch, S. Otte, and M. V. Butz, “Inference of affordances and active motor control in simulated agents,” in 15th Biannual Conference of the German Society for Cognitive Science (KogWis), 2022, extended abstract.
  • M. Karlbauer, T. Praditia, S. Otte, S. Oladyshkin, W. Nowak, and M. V. Butz, “Composing partial differential equations with physics-aware neural networks,” in Proceedings of the 39th International Conference on Machine Learning, ser. Proceedings of Machine Learning Research, vol. 162. PMLR, Jul 2022, pp. 10773–10801. [ arXiv ]
  • J. Lang, M. Giese, W. Ilg, and S. Otte, “Generating sparse counterfactual explanations for multivariate time series,” 2022. [ arXiv ]
  • P. Schmidt-Barbo, S. Otte, M. V. Butz, R. H. Baayen, and K. Sering, “Using semantic embeddings for initiating and planning articulatory speech synthesis,” in Studientexte zur Sprachkommunikation: Elektronische Sprachsignalverarbeitung 2022, 2022, pp. 32–42.

2021

  • M. Traub, R. Legenstein, and S. Otte, “Many-joint robot arm control with recurrent spiking neural networks,” in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Sep. 2021, pp. 4895–4902. [ arXiv ]
  • M. Traub, M. V. Butz, R. Legenstein, and S. Otte, “Dynamic action inference with recurrent spiking neural networks,” in International Conference on Artificial Neural Networks (ICANN), no. 12895. Springer International Publishing, Sep. 2021, pp. 233–244.
  • M. Ciurletti, M. Traub, M. Karlbauer, M. V. Butz, and S. Otte, “Signal denoising with recurrent spiking neural networks and active tuning,” in International Conference on Artificial Neural Networks (ICANN), no. 12895. Springer International Publishing, Sep. 2021, pp. 220–232.
  • J. Lang, M. A. Giese, M. Synofzik, W. Ilg, and S. Otte, “Early recognition of ball catching success in clinical trials with rnn-based predictive classification,” in International Conference on Artificial Neural Networks (ICANN), no. 12894. Springer International Publishing, Sep. 2021, pp. 444–456. [ arXiv ]
  • S. Fabi, S. Otte, and M. V. Butz, “Fostering compositionality in latent, generative encodings to solve the omniglot challenge,” in International Conference on Artificial Neural Networks (ICANN), no. 12892. Springer International Publishing, Sep. 2021, pp. 525–536.
  • M. Karlbauer, T. Menge, S. Otte, H. P. Lensch, T. Scholten, V. Wulfmeyer, and M. V. Butz, “Latent state inference in a spatiotemporal generative model,” in International Conference on Artificial Neural Networks (ICANN), no. 12894. Springer International Publishing, Sep. 2021, pp. 384–395. [ arXiv ]
  • M. Sadeghi, F. Schrodt, S. Otte, and M. V. Butz, “Binding and perspective taking as inference in a generative neural network model,” in International Conference on Artificial Neural Networks (ICANN), no. 12893. Springer International Publishing, Sep. 2021, pp. 3–14. [ arXiv ]
  • S. Mahdi, F. Schrodt, S. Otte, and M. V. Butz, “Gestalt perception of biological motion: A generative artificial neural network model,” in 11th IEEE International Conference on Development and Learning (ICDL), no. 271, Aug. 2021.
  • D. Koryakin, S. Otte, and M. V. Butz, “Inference of time series components by online co-evolution,” Genetic Programming and Evolvable Machines, Jul. 2021.
  • K. Lingelbach, Y. Lingelbach, S. Otte, M. Bui, T. Künzell, and M. Peissner, “Demand forecasting using ensemble learning for effective scheduling of logistic orders,” in 12th International Conference on on Applied Human Factors and Ergonomics (AHFE 2021), Jul. 2021, pp. 313–321.
  • D. Humaidan, S. Otte, C. Gumbsch, C. Wu, and M. V. Butz, “Latent event-predictive encodings through counterfactual regularization,” in Annual Meeting of the Cognitive Science Society (CogSci), 2021. [ arXiv ]
  • S. Fabi, S. Otte, and M. V. Butz, “Compositionality as learning bias in generative rnns solves the omniglot challenge,” in International Conference on Learning Representations (ICLR) – Workshop Learning to Learn, 2021.
  • T. Praditia, M. Karlbauer, S. Otte, S. Oladyshkin, M. V. Butz, and W. Nowak, “Finite volume neural network: Modeling subsurface contaminant transport,” in International Conference on Learning Representations (ICLR) – Workshop Deep Learning for Simulation, 2021. [ arXiv ]

2020

  • K. Sering, P. Schmidt-Barbo, S. Otte, M. V. Butz, and R. H. Baayen, “Recurrent gradient-based motor inference for speech resynthesis with a vocal tract simulator,” in 12th International Seminar on Speech Production, 2020.
  • S. Otte, M. Karlbauer, and M. V. Butz, “Active tuning,” 2020. [ arXiv ]
  • M. Traub, M. V. Butz, R. H. Baayen, and S. Otte, “Learning precise spike timings with eligibility traces,” in Artificial Neural Networks and Machine Learning – ICANN 2020, ser. Lecture Notes in Computer Science, vol. 12397. Springer International Publishing, Sep. 2020, pp. 659–669. [ arXiv ]
  • D. Humaidan, S. Otte, and M. V. Butz, “Fostering event compression using gated surprise,” in Artificial Neural Networks and Machine Learning – ICANN 2020, ser. Lecture Notes in Computer Science, vol. 12396. Springer International Publishing, Sep. 2020, pp. 155–167. [ arXiv ]
  • M. Karlbauer, S. Otte, H. P. Lensch, T. Scholten, V. Wulfmeyer, and M. V. Butz, “Inferring, predicting, and denoising causal wave dynamics,” in Artificial Neural Networks and Machine Learning – ICANN 2020, ser. Lecture Notes in Computer Science, vol. 12396. Springer International Publishing, Sep. 2020, pp. 566–577. [ arXiv ]
  • S. Fabi, S. Otte, J. G. Wiese, and M. V. Butz, “Investigating efficient learning and compositionality in generative lstm networks,” in Artificial Neural Networks and Machine Learning – ICANN 2020, ser. Lecture Notes in Computer Science, vol. 12396, Sep. 2020, pp. 143–154. [ arXiv ]
  • M. Hobbhahn, M. V. Butz, S. Fabi, and S. Otte, “Sequence classification using ensembles of recurrent generative expert modules,” in 28th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), Bruges, Belgium, Oct. 2020, pp. 333–338.
  • M. Karlbauer, S. Otte, H. P. Lensch, T. Scholten, V. Wulfmeyer, and M. V. Butz, “A distributed neural network architecture for robust non-linear spatio-temporal prediction,” in 28th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), Bruges, Belgium, Oct. 2020, pp. 303–308. [ arXiv ]

2019

  • S. Otte, P. Rubisch, and M. V. Butz, “Gradient-Based Learning of Compositional Dynamics with Modular RNNs”, in Artificial Neural Networks and Machine Learning – ICANN 2019, 2019, pp. 484–496. Best paper award.
  • S. Otte, J. Stoll, and M. V. Butz, “Incorporating Adaptive RNN-based Action Inference and Sensory Perception”, in Artificial Neural Networks and Machine Learning – ICANN 2019, 2019, pp. 543–555.
  • M. V. Butz, T. Menge, D. Humaidan, and S. Otte, “Inferring Event-Predictive Goal-Directed Object Manipulations in REPRISE”, in Artificial Neural Networks and Machine Learning – ICANN 2019, 2019, pp. 639–653.
  • M. V. Butz, D. Bilkey, D. Humaidan, A. Knott, and S. Otte, “Learning, planning, and control in a monolithic neural event inference architecture”, Neural Networks, May 2019. [ arXiv ]

2018

  • S. Otte, L. Hofmaier, and M. V. Butz, “Integrative Collision Avoidance Within RNN-Driven Many-Joint Robot Arms”, in Artificial Neural Networks and Machine Learning – ICANN 2018, 2018, pp. 748–758.
  • P. Kuhlmann, P. Sanzenbacher, and S. Otte, “Online Carry Mode Detection for Mobile Devices with Compact RNNs”, in Artificial Neural Networks and Machine Learning – ICANN 2018, 2018, pp. 232–241.
  • M. V. Butz, D. Bilkey, A. Knott, and S. Otte, “REPRISE: A Retrospective and Prospective Inference Scheme”, Proceedings of the 40th Annual Meeting of the Cognitive Science Society, pp. 1427–1432, Jul. 2018.
  • A. Zwiener, S. Otte, R. Hanten, and A. Zell, “Configuration Depending Crosstalk Torque Calibration for Robotic Manipulators with Deep Neural Regression Models”, in 15th International Conference on Intelligent Autonomous Systems (IAS), Baden-Baden, Germany, 2018, pp. 361–373.
  • R. Hanten, P. Kuhlmann, S. Otte, and A. Zell, “Robust Real-Time 3D Person Detection for Indoor and Outdoor Applications”, in 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 2018, pp. 2000–2006.

2017

  • S. Otte and M. V. Butz, “Differentiable Oscillators in Recurrent Neural Networks for Gradient-based Sequence Modeling”, in Artificial Neural Networks and Machine Learning – ICANN 2016, 2017, pp. 745–746.
  • S. Otte, T. Schmitt, and M. V. Butz, “Anticipatory Active Inference from Learned Recurrent Neural Forward Models”, in 39th Annual Meeting of the Cognitive Science Society (CogSci), London, United Kingdom, 2017, p. 3803.
  • C. Gumbsch, S. Otte, and M. V. Butz, “A Computational Model for the Dynamical Learning of Event Taxonomies”, in 39th Annual Meeting of the Cognitive Science Society (CogSci), London, United Kingdom, 2017, pp. 452–457.

2016

  • S. Otte, A. Zwiener, R. Hanten, and A. Zell, “Inverse Recurrent Models – An Application Scenario for Many-Joint Robot Arm Control”, in Artificial Neural Networks and Machine Learning – ICANN 2016, 2016, pp. 149–157.
  • A. Dörr, S. Otte, and A. Zell, “Investigating Recurrent Neural Networks for Feature-Less Computational Drug Design”, in Artificial Neural Networks and Machine Learning – ICANN 2016, 2016, pp. 140–148.
  • S. Otte, M. V. Butz, D. Koryakin, F. Becker, M. Liwicki, and A. Zell, “Optimizing recurrent reservoirs with neuro-evolution”, Neurocomputing, vol. 192, pp. 128–138, Jun. 2016.
  • L. Madai-Tahy, S. Otte, R. Hanten, and A. Zell, “Revisiting Deep Convolutional Neural Networks for RGB-D Based Object Recognition”, in Artificial Neural Networks and Machine Learning – ICANN 2016, 2016, pp. 29–37.
  • S. Otte, C. Weiss, T. Scherer, and A. Zell, “Recurrent Neural Networks for Fast and Robust Vibration-based Ground Classification on Mobile Robots”, in IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 2016, pp. 5603–5608.
  • R. Hanten, S. Buck, S. Otte, and A. Zell, “Vector-AMCL: Vector based Adaptive Monte Carlo Localization for Indoor Maps”, in 14th International Conference on Intelligent Autonomous Systems (IAS-14), Shanghai, China, 2016.

2015

  • S. Otte, M. Liwicki, and A. Zell, “An Analysis of Dynamic Cortex Memory Networks”, in International Joint Conference on Neural Networks (IJCNN), Killarney, Ireland, 2015, pp. 3338–3345.
  • A. Patel et al., “Confronting the challenge of ‘virtual’ prostate biopsy”, in 8th International Symposium on Focal Therapy and Imaging in Prostate and Kidney Cancer, 2015.
  • S. Otte, F. Becker, M. V. Butz, M. Liwicki, and A. Zell, “Learning Recurrent Dynamics using Differential Evolution”, in European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), Bruges, Belgium, 2015, pp. 65–70.
  • M. Binz, S. Otte, and A. Zell, “On the Applicability of Recurrent Neural Networks for Pattern Recognition in Electroencephalography Signals”, in Machine Learning Reports 03/2015, Workshop New Challenges in Neural Computation, 2015, pp. 85–92.
  • S. Otte, S. Laible, R. Hanten, M. Liwicki, and A. Zell, “Robust Visual Terrain Classification with Recurrent Neural Networks”, in European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), Bruges, Belgium, 2015, pp. 451–456.

2014

  • S. Otte, U. Schwanecke, and A. Zell, “ANTSAC: A Generic RANSAC Variant Using Principles of Ant Colony Algorithms”, in 22nd International Conference on Pattern Recognition (ICPR), 2014, pp. 3558–3563.
  • S. Otte, M. Liwicki, and A. Zell, “Dynamic Cortex Memory: Enhancing Recurrent Neural Networks for Gradient-Based Sequence Learning”, in Artificial Neural Networks and Machine Learning – ICANN 2014. Springer International Publishing, 2014, pp. 1–8.
  • S. Otte, M. Liwicki, and D. Krechel, “Investigating Long Short-Term Memory Networks for various Pattern Recognition Problems”, in Machine Learning and Data Mining in Pattern Recognition, P. Perner, Ed. Springer International Publishing, 2014, pp. 484–497.
  • C. Otte et al., “Investigating Recurrent Neural Networks for OCT A-Scan based Tissue Analysis”, Methods of Information in Medicine, vol. 53, no. 4, pp. 245–249, 2014.
  • L. Wittig, C. Otte, S. Otte, G. Hüttmann, D. Drömann, and A. Schlaefer, “Tissue analysis of solitary pulmonary nodules using OCT A-Scan imaging needle probe”, European Respiratory Journal, vol. 44, no. Suppl 58, p. P4979, Sep. 2014.

2013

  • C. Otte, S. Otte, L. Wittig, G. Hüttmann, D. Drömann, and A. Schlaefer, “Identifizierung von Tumorgewebe in der Lunge mittels optischer Kohärenztomographie”, Lübeck, 2013.
  • S. Otte, D. Krechel, and M. Liwicki, “JANNLab Neural Network Framework for Java”, in Poster Proceedings of the International Conference on Machine Learning and Data Mining (MLDM), New York, USA, 2013, pp. 39–46.
  • S. Otte et al., “OCT A-Scan based lung tumor tissue classification with Bidirectional Long Short Term Memory networks”, in IEEE International Workshop on Machine Learning for Signal Processing (MLSP), 2013.

2012

  • S. Otte, M. Liwicki, D. Krechel, and A. Dengel, “Local Feature based Online Mode Detection with Recurrent Neural Networks”, in ICFHR 2012: International Conference on Frontiers in Handwriting Recognition, 2012.

2011

  • S. Otte, U. Schwanecke, and P. Barth, “Mobile 3D Vision - 3D Scene Reconstruction with a Trifocal View on a Mobile Device”, in 06. Multimediakongress, Wismar, Germany, 2011.