Cognitive Modeling

Dr. rer. nat. Johannes Lohmann

E-Mail johannes.lohmann[at]uni-tuebingen.de
Address Cognitive Modeling, Department of Computer Science
University of Tübingen
Sand 14
72076 Tübingen
Germany
Room C423
Phone +49 7071 29-75464

Publications

Journals

Lohmann, J., Belardinelli, A., & Butz, M. V. (2019). Hands Ahead in Mind and Motion: Active Inference in Peripersonal Hand Space. Vision, 3(2), 15. doi:10.3390/vision3020015

Lohmann, J., Schroeder, P. A., Nuerk, H. C., Plewnia, C., & Butz, M. V. (2018). How Deep Is Your SNARC? Interactions Between Numerical Magnitude, Response Hands, and Reachability in Peripersonal Space. Frontiers in Psychology, 9(622). doi:10.3389/fpsyg.2018.00622

Belardinelli, A., Lohmann, J., Farnè, A., & Butz, M. V. (2018). Mental space maps into the future. Cognition, 176, 65-73.

Lohmann, J., & Butz, M. V. (2017). Lost in space: multisensory conflict yields adaptation in spatial representations across frames of reference. Cognitive Processing. doi:10.1007/s10339-017-0798-5

Lohmann, J., Rolke, B., & Butz, M. V. (2017). In touch with mental rotation: Interactions between mental and tactile rotations and motor responses. Experimental Brain Research. doi:10.1007/s00221-016-4861-8

Schroeder, P. A., Lohmann, J., Butz, M. V., & Plewnia, C. (2015). Behavioral bias for food reflected in hand movements: A preliminary study with healthy subjects. Cyberpsychology, Behavior, and Social Networking, 19(2), 120-126. doi:10.1089/cyber.2015.0311

Lohmann, J., Herbort, O., & Butz, M. V. (2013). Modeling the temporal dynamics of visual working memory. Cognitive Systems Research, 24, 80-86. doi:10.1016/j.cogsys.2012.12.009

Koryakin, D., Lohmann, J., & Butz, M. V. (2012). Balanced echo state networks. Neural Networks, 36, 35-45. doi:10.1016/j.neunet.2012.08.008
 

Conferences and Workshops

Lohmann, J., Weigert, P., & Butz, M. V. (2019). Grasping Uncertainty: A Free Energy Approach To Anticipatory Behavior Control. European Conference for Cognitive Science, EuroCogSci 2019.

Lohmann, J., & Butz, M. V. (2019). Unflinching Predictions: Anticipatory Crossmodal Interactions are Unaffected by the Current Hand Posture. The Annual Meeting of the Cognitive Science Society, CogSci 2019.

Lohmann, J., Janczyk, M., & Butz, M. V. (2018). Common Codes in Virtual Actions. Proceedings of the 14th Biannual Conference of the German Cognitive Science Society (KogWis 2018).

Lohmann, J., Belardinelli, A., & Butz, M. V. (2018). Are you Sure How to Move? Expected Uncertainty Modulates Anticipatory Crossmodal Interactions. The Annual Meeting of the Cognitive Science Society, CogSci 2018.

Lohmann, J., & Butz, M. V. (2016). Multisensory Conflict yields Adaptation in Peripersonal and Extrapersonal Space. Proceedings of the 13th Biannual Conference of the German Cognitive Science Society (KogWis 2016). Received the KogWis 2016 Brain Products Best Paper Award.

Lohmann, J., Kurz, J., Meilinger, T., & Butz, M. V. (2016). Embodied social spaces: Implicit racial bias modulates spatial perspective taking. 58th Conference of Experimental Psychologists, TeaP 2016.

Lohmann, J., Rolke, B., & Butz, M. V. (2016). Touching the embodied mind: Selective interference between mental rotation and tactile stimulation. 58th Conference of Experimental Psychologists, TeaP 2016.

Lohmann, J., & Butz, M. V. (2015). In touch with VWM: Selective interference between tactile stimulation and visual representations of rotations. 57th Conference of Experimental Psychologists, TeaP 2015.

Lohmann, J., & Butz, M. V. (2014). Memory disclosed by motion: predicting visual working memory performance from movement patterns. Proceedings of the 12th Biannual Conference of the German Cognitive Science Society (KogWis 2014).

Lohmann, J., & Butz, M. V. (2013). Modeling Continuous Representations in Visual Working Memory. The Annual Meeting of the Cognitive Science Society, CogSci 2013, pp. 2926-2931.

Lohmann, J. (2013). TVA as the foundation of an integrative model of visual working memory. 55th Conference of Experimental Psychologists, TeaP 2013.

Lohmann, J., Herbort, O., & Butz, M. V. (2012). Modeling the temporal dynamics of visual working memory. International Conference on Cognitive Modeling (ICCM 2012). www.iccm2012.com/proceedings/papers/0044/index.html

Lohmann, J. (2012). Modeling Change Detection with TVA+DM. 2nd International TVA meeting, ITVA 2012.

Gütschow, J., Lohmann, J., Koryakin, D., & Butz, M. V. (2012). Learning Motor Primitives with Echo State Networks. In B. Hammer, & T. Villmann, Workshop New Challenges in Neural Computation 2012, University of Bielefeld, Dept. of Technology CITEC, pp. 20-34.

Lohmann, J. & Butz, M. V. (2011). Learning a neural multimodal body schema: Linking vision with proprioception. In B. Hammer, & T. Villmann, Workshop New Challenges in Neural Computation 2011, University of Bielefeld, Dept. of Technology CITEC, pp. 53-57.

Lohmann, J., Herbort, O., Wagener, A., & Kiesel, A. (2009). Anticipations of time spans: New data from the foreperiod paradigm and the adaptation of a computational model. In G. Pezzulo, M. V. Butz, O. Sigaud, G. Baldassarre (Eds.), Anticipatory Behavior in Adaptive Learning Systems: From Psychological Theories to Artificial Cognitive Systems (pp. 170-187). Berlin, Heidelberg: Springer.
 

Supplementaries

In Touch with Mental Rotation: Interactions between Mental and Tactile Rotations and Motor Responses

This archive contains the data of both experiments reported in the manuscript "In Touch with Mental Rotation: Interactions between Mental and Tactile Rotations and Motor Responses", together with the R scripts used for the analysis.

Lost in Space: Multisensory Conflict yields Adaptation in Spatial Representations across Frames of Reference

This archive contains the data of the experiment reported in the manuscript "Lost in space: multisensory conflict yields adaptation in spatial representations across frames of reference", as well as R scripts that can be used for the analysis. Furthermore, the archive contains HTML5 slides with additional visualizations of the dependent measures, videos illustrating the task, and a quick overview of the data.

Augmenting Virtual Object Interactions with Tactile Feedback

This archives contains the blueprint for a tactile stimulation device that can be used to augment virtual reality setups with vibrotactile feedback. The device consists of an Arduino controller and five shaftless vibration motors. The circuit is based on a project described at LearningaboutElectronics. The archive contains the controller code and two Unity® projects, showing how to use the device in a VR setup. For the second example, a Leap motion sensor is required. Click here to download the archive, or click here to view the overview (also included in the archive).