Module RO4100-KP08

Robot Learning (RobLe)


Duration

2 semesters

Offering cycle

Annually; can be started in either the summer or the winter semester

Credit points

8

Degree programme, subject area, and semester:

  • Master Robotics and Autonomous Systems 2019, compulsory, compulsory teaching modules, 1st and 2nd semester

Courses:

  • CS4295-V: Deep Learning (lecture, 2 SWS)
  • CS4295-Ü: Deep Learning (exercise, 1 SWS)
  • CS4575-V: Sequence Learning (lecture, 2 SWS)
  • CS4575-Ü: Sequence Learning (exercise, 1 SWS)

Workload:

  • 120 hours independent project work
  • 120 hours self-study
  • 60 hours in-class exercises
  • 60 hours in-class study

Course content:

  • Foundations and Deep Learning Basics (Learning Paradigms, Classification and Regression, Underfitting and Overfitting)
  • Shallow Neural Networks (Basic Neuron Model, Multilayer Perceptrons, Backpropagation, Computational Graphs, Universal Approximation Theorem, No-Free-Lunch Theorems, Inductive Biases)
  • Optimization (Stochastic Gradient Descent, Momentum Variants, Adaptive Optimizers)
  • Convolutional Neural Networks (1D Convolution, 2D Convolution, 3D Convolution, ReLUs and Variants, Down- and Upsampling Techniques, Transposed Convolution)
  • Regularization (Early Stopping, L1 and L2 Regularization, Label Smoothing, Dropout Strategies, Batch Normalization)
  • Very Deep Networks (Highway Networks, Residual Blocks, ResNet Variants, DenseNets)
  • Dimensionality Reduction (PCA, t-SNE, UMAP, Autoencoder)
  • Generative Neural Networks (Variational Autoencoder, Generative Adversarial Networks, Diffusion Models)
  • Graph Neural Networks (Graph Convolutional Networks, Graph Attention Networks)
  • Fooling Deep Neural Networks (Adversarial Attacks, White Box and Black Box Attacks, One-Pixel Attacks)
  • Physics-Aware Deep Learning (Physical Knowledge as Inductive Bias, PINN, PhyDNet, Neural ODE, FINN)
  • Introduction to Sequence Learning (Formalisms, Metrics, Recapitulation of Relevant Machine Learning Techniques)
  • Recurrent Neural Networks (Simple RNN Models, Backpropagation Through Time)
  • Gated Recurrent Networks (Vanishing Gradient Problem in RNNs, Long Short-Term Memories, Gated Recurrent Units, Stacked RNNs)
  • Important Techniques for RNNs (Teacher Forcing, Scheduled Sampling, h-Detach)
  • Bidirectional RNNs and related concepts
  • Hierarchical RNNs and Learning on Multiple Time Scales
  • Online Learning and Learning without BPTT (Real-Time Recurrent Learning, e-Prop, Forward Propagation Through Time)
  • Reservoir Computing (Echo State Networks, Deep ESNs)
  • Spiking Neural Networks (Spiking Neuron Models, Learning in SNNs, Neuromorphic Computing, Recurrent SNNs)
  • Temporal Convolution Networks (Causal Convolution, Temporal Dilation, TCN-ResNets)
  • Introduction to Transformers (Sequence-to-Sequence Learning, Basics on Attention, Self-Attention and the Query-Key-Value Principle, Large Language Models)
  • State Space Models (Structured State Space Sequence Models, Mamba)
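
As an illustration of one of the topics above, the query-key-value principle behind self-attention can be sketched as a single-head, unbatched attention step. This is a minimal NumPy sketch with illustrative variable names, not material from the course itself:

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention for one sequence (no batching)."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v       # project tokens to queries, keys, values
    scores = q @ k.T / np.sqrt(k.shape[-1])   # pairwise similarities, scaled by sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                        # each output token is a weighted mix of values

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 8))               # toy sequence: 5 tokens, embedding dim 8
w_q, w_k, w_v = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)                              # (5, 8): one output vector per token
```

Real transformer layers add multiple heads, learned projections per head, and masking; the sketch keeps only the core attention computation.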

Qualification goals/competencies:

  • Students gain a fundamental understanding of deep learning basics such as backpropagation, computational graphs, and automatic differentiation
  • Students understand the implications of inductive biases
  • Students gain a comprehensive understanding of the most relevant deep learning approaches
  • Students learn to analyze the challenges in deep learning tasks and to identify well-suited approaches to solve them
  • Students understand the pros and cons of various deep learning models
  • Students know how to analyze models and results, improve model parameters, and interpret model predictions and their relevance
  • Students gain a comprehensive understanding of the most relevant sequence learning approaches
  • Students learn to analyze the challenges in sequence learning tasks and to identify well-suited approaches to solve them
  • Students understand the pros and cons of various sequence learning models
  • Students can implement common and custom sequence learning models for time series analysis, classification, and forecasting
  • Students know how to analyze models and results, improve model parameters, and interpret model predictions and their relevance
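
The kind of model implementation targeted above can be sketched, for instance, as the forward pass of a simple (Elman) recurrent network over a toy time series. This is an illustrative NumPy sketch, not code from the course:

```python
import numpy as np

def rnn_forward(xs, w_xh, w_hh, w_hy):
    """Forward pass of a simple (Elman) RNN producing one prediction per time step."""
    h = np.zeros(w_hh.shape[0])           # initial hidden state
    ys = []
    for x in xs:                          # iterate over time steps
        h = np.tanh(w_xh @ x + w_hh @ h)  # recurrent state update
        ys.append(w_hy @ h)               # per-step readout
    return np.array(ys), h

rng = np.random.default_rng(1)
w_xh = rng.standard_normal((16, 1)) * 0.1   # input-to-hidden weights
w_hh = rng.standard_normal((16, 16)) * 0.1  # hidden-to-hidden (recurrent) weights
w_hy = rng.standard_normal((1, 16)) * 0.1   # hidden-to-output weights
xs = np.sin(np.linspace(0, 2 * np.pi, 20)).reshape(-1, 1)  # toy input time series
ys, h_last = rnn_forward(xs, w_xh, w_hh, w_hy)
print(ys.shape)  # (20, 1): one scalar prediction per time step
```

Training such a model would add a loss over the predictions and gradients via backpropagation through time, both covered in the course content above.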

Award of credit points and grading by:

  • Written or oral examination at the lecturer's discretion

Module coordinator:

  • Prof. Dr. Sebastian Otte

Lecturers:

Literature:

  • Goodfellow, I., Bengio, Y., & Courville, A.: Deep Learning. MIT Press (2016), ISBN 978-0262035613
  • Nakajima, K., & Fischer, I.: Reservoir Computing: Theory, Physical Implementations, and Applications. Springer Nature Singapore (2021), ISBN 978-9811316869
  • Sun, R., & Giles, C.: Sequence Learning: Paradigms, Algorithms, and Applications. Springer Berlin Heidelberg (2001), ISBN 978-3540415978
  • Bishop, C. M.: Pattern Recognition and Machine Learning. Springer (2006), ISBN 978-0387310732
  • Sutton, R., & Barto, A.: Reinforcement Learning: An Introduction. The MIT Press (2018), ISBN 978-0262039246
  • François-Lavet, V., Henderson, P., Islam, R., Bellemare, M., & Pineau, J.: An Introduction to Deep Reinforcement Learning. Now Publishers Inc (2018), ISBN 978-1680835380
  • Recent publications on related topics

Language:

  • Offered in English only

Remarks:

Admission requirements for taking the module:
- None

Admission requirements for participation in module examination(s):
- Successful completion of exercise assignments as specified at the beginning of the semester

Module Exam(s):
- CS4295-L1: Deep Learning, exam, 90 min, 50% of the module grade
- CS4575-L1: Sequence Learning, exam, 90 min, 50% of the module grade

Last changes:

07.02.2024