CondMat for Dummies: Attention Is All You Need: Transformers and the Physics They're Starting to Solve
- Date
- Apr 17, 2026
- Time
- 5:00 PM - 6:00 PM
- Speaker
- Giovanni Cemin
- Affiliation
- MPIPKS
- Language
- en
- Main Topic
- Physics
- Other Topics
- Physics
- Description
- In 2017, a paper with an unusually confident title proposed replacing recurrent neural networks with a mechanism called self-attention — and quietly started a revolution. In this talk I will explain what attention is, why it works, and how it became the dominant paradigm in machine learning. Starting from the limitations of RNNs, I will introduce the transformer architecture. Beyond the applications in large language models, I will discuss how this machinery is finding its way into physics: the variational description of strongly correlated quantum systems, and the operator-learning approach to solving partial differential equations.
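The self-attention mechanism the talk centers on can be sketched in a few lines. The following is a minimal, illustrative NumPy implementation of scaled dot-product self-attention as described in the 2017 paper; the function and variable names are chosen here for clarity, not taken from the announcement.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention (a minimal sketch).

    x: (seq_len, d_model) token embeddings
    w_q, w_k, w_v: (d_model, d_k) learned projection matrices
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v             # queries, keys, values
    scores = q @ k.T / np.sqrt(k.shape[-1])          # pairwise similarity, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ v                               # attention-weighted mix of values

# Toy usage with random weights (no training involved)
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
x = rng.standard_normal((seq_len, d_model))
w = [rng.standard_normal((d_model, d_k)) for _ in range(3)]
out = self_attention(x, *w)
print(out.shape)  # (4, 8)
```

Each output row is a weighted average of all value vectors, with weights set by how strongly that token's query matches every key — this all-to-all mixing is what replaces the sequential state of an RNN.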
Last modified: Apr 17, 2026, 7:38:57 AM
Location
Max-Planck-Institut für Physik komplexer Systeme, Nöthnitzer Straße 38, 01187 Dresden
- Phone
- +49 (0)351 871 0
- MPI-PKS
- Homepage
- http://www.mpipks-dresden.mpg.de
Organizer
Max-Planck-Institut für Physik komplexer Systeme, Nöthnitzer Straße 38, 01187 Dresden
- Phone
- +49 (0)351 871 0
- MPI-PKS
- Homepage
- http://www.mpipks-dresden.mpg.de
