This seminar focuses on recent research on binding, attentional processing, and the development of latent states in artificial neural networks. Generative neural networks in particular often rely on internal, hidden states that guide prediction generation. Transformer networks have been highly successful in language processing and other tasks. We will examine the groundbreaking papers in this area as well as related work that addresses binding, attention, and latent neural state inference.
Participants will need to present one paper and submit a short summary of the paper and the discussion that followed. Moreover, attendance at all other talks is required.
|Lecturers||Prof. Dr. Martin Butz & Dr. Sebastian Otte|
|Time||Tue, 12 c.t. - 14|
|Ilias||Link to ILIAS Seminar Class Room (If you have not done so yet, please also register in Alma!)|