
Open Theses

Forward-Forward Algorithms for Self-Supervised Training of Deep Networks

Title Forward-Forward Algorithms for Self-Supervised Training of Deep Networks
Description

Forward-Forward algorithms [1,2] offer an alternative to training deep networks with traditional backpropagation. With backpropagation, distributing the computation across multiple devices requires synchronisation and introduces waiting times [3]. In forward-forward optimization, training targets are not propagated backwards through the network, which eliminates these waiting times during training.
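The layer-local training idea can be sketched as follows. This is an illustrative numpy implementation loosely after [1]: each layer is trained on its own "goodness" objective (sum of squared activations), pushed up for positive data and down for negative data, with no error signal flowing backwards between layers. The goodness threshold, learning rate, and toy data are assumptions made for the sketch, not part of the thesis description.

```python
import numpy as np

rng = np.random.default_rng(0)

class FFLayer:
    """One layer trained locally with a forward-forward goodness objective
    (sketch after Hinton [1]; hyperparameters here are illustrative)."""

    def __init__(self, n_in, n_out, lr=0.03, theta=2.0):
        self.W = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_out, n_in))
        self.b = np.zeros(n_out)
        self.lr, self.theta = lr, theta

    def forward(self, x):
        # Length-normalise the input so only the direction, not the
        # goodness of the previous layer, is passed on.
        x = x / (np.linalg.norm(x, axis=-1, keepdims=True) + 1e-8)
        z = x @ self.W.T + self.b
        return x, z, np.maximum(z, 0.0)

    def train_step(self, x, positive):
        x, z, h = self.forward(x)
        goodness = np.sum(h ** 2, axis=-1)                  # layer-local objective
        p = 1.0 / (1.0 + np.exp(-(goodness - self.theta)))  # P(sample is positive)
        y = 1.0 if positive else 0.0
        # Purely local gradient: no error signal from later layers is needed.
        dg = (p - y)[:, None] * 2.0 * h * (z > 0)
        self.W -= self.lr * dg.T @ x / x.shape[0]
        self.b -= self.lr * dg.mean(axis=0)
        return h, float(np.mean(goodness))

# Two layers trained purely locally: positive (stand-in "real") data should
# end up with higher goodness than negative (random) data.
layers = [FFLayer(20, 32), FFLayer(32, 32)]
pos = rng.normal(1.0, 0.5, (64, 20))
neg = rng.normal(0.0, 0.5, (64, 20))
for _ in range(200):
    hp, hn = pos, neg
    for layer in layers:
        hp, _ = layer.train_step(hp, positive=True)
        hn, _ = layer.train_step(hn, positive=False)
```

Because each `train_step` only touches its own layer's weights, the layers could in principle be updated on different devices without waiting for a backward pass.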

 

Qualification
  • Preliminary knowledge of machine learning and data science
  • Python programming
  • Good understanding of the mathematical foundations of optimization and linear algebra

 

If you are interested, please send a meaningful email that addresses your previous experience, interests, and strengths.

Proposal

In this thesis, we explore the connections between forward-forward training and self-supervised learning with contrastive losses [4].
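As background for the proposal, the contrastive objective of [4] (the NT-Xent loss of SimCLR) can be sketched as follows: two augmented views of the same sample are pulled together, all other samples in the batch are pushed apart. This is an illustrative numpy sketch; the batch size, embedding dimension, temperature, and the noise-based "augmentation" are assumptions, not part of the thesis description.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent contrastive loss as in SimCLR [4] (numpy sketch).

    z1[i] and z2[i] are embeddings of two views of the same sample."""
    z = np.concatenate([z1, z2], axis=0)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # cosine similarity
    sim = z @ z.T / temperature
    n = z1.shape[0]
    np.fill_diagonal(sim, -np.inf)                    # exclude self-pairs
    # Each view's positive is the other augmentation of the same sample.
    targets = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    return float(np.mean(logsumexp - sim[np.arange(2 * n), targets]))

rng = np.random.default_rng(0)
anchor = rng.normal(size=(8, 16))
views = anchor + 0.05 * rng.normal(size=(8, 16))  # weakly perturbed "views"
loss_matched = nt_xent_loss(anchor, views)        # aligned positive pairs
loss_random = nt_xent_loss(anchor, rng.normal(size=(8, 16)))
```

With aligned views the positive similarities dominate the denominator, so `loss_matched` comes out lower than `loss_random`; connecting this kind of objective to forward-forward goodness functions is the subject of the proposal.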

 

  1. Hinton, G. "The Forward-Forward Algorithm: Some Preliminary Investigations." arXiv abs/2212.13345 (2022). https://arxiv.org/abs/2212.13345
  2. Kohan, A. A., et al. "Signal Propagation: The Framework for Learning and Inference in a Forward Pass." IEEE Transactions on Neural Networks and Learning Systems (2022). https://arxiv.org/abs/2204.01723
  3. PyTorch pipeline parallelism documentation: https://pytorch.org/docs/stable/pipeline.html
  4. Chen, T., Kornblith, S., Norouzi, M., & Hinton, G. E. "A Simple Framework for Contrastive Learning of Visual Representations." arXiv abs/2002.05709 (2020). https://arxiv.org/abs/2002.05709
Thesis type Bachelor thesis
Second Tutor Pfahler, Lukas
Professor Pfahler, Lukas
Status Open