|Instructors||Kevin Laube, Maximus Mutschler|
|Credits||3 LP (new PO), 4 LP (old PO)|
Deep learning is a subfield of machine learning that has achieved state-of-the-art results in many areas of artificial intelligence, including computer vision and robotics, and has advanced rapidly in recent years. This seminar takes a closer look at how network weights and network topologies are optimized.
It takes the form of a paper-reading seminar on the concept of "learning to learn". A collection of papers from selected journals and conferences is provided for the students to choose from. In each meeting, one topic is presented by a student.
Students are graded based on:
- their presentation (20 to 25 minutes),
- a short report (8 to 12 pages) that they write on the subject,
- two short reviews (1 to 2 pages each) of other students' reports, and
- their participation in the post-presentation discussions. Attendance is therefore required to pass the course.
The date of the first meeting can be found in the table above. In this session, each student chooses one topic; the presentations start two weeks later, with one presentation per meeting. Participation in the preliminary meeting is required. If you are unable to attend this session, please send an email to firstname.lastname@example.org.
Important note: The available topics require a basic foundation in modern deep neural networks.
Important note: Since the course is limited to a maximum of 12 participants, please register in ILIAS as soon as possible if you are interested in taking the seminar.
This is an MSc seminar. Interested BSc students are welcome as well, provided they meet the requirements.
This seminar requires basic knowledge of how modern deep neural networks are trained (e.g. gradient descent, dropout) and of common architectures (e.g. ResNet, MobileNetV2). Having attended the Deep Neural Network (Deep Learning) course is helpful but not required.
Furthermore, a solid background in mathematics (linear algebra, statistics) is helpful.