BEGIN:VCALENDAR
VERSION:2.0
PRODID:www.dresden-science-calendar.de
METHOD:PUBLISH
CALSCALE:GREGORIAN
X-MICROSOFT-CALSCALE:GREGORIAN
X-WR-TIMEZONE:Europe/Berlin
BEGIN:VTIMEZONE
TZID:Europe/Berlin
X-LIC-LOCATION:Europe/Berlin
BEGIN:DAYLIGHT
TZNAME:CEST
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
DTSTART:19810329T030000
RRULE:FREQ=YEARLY;INTERVAL=1;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZNAME:CET
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
DTSTART:19961027T030000
RRULE:FREQ=YEARLY;INTERVAL=1;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
UID:DSC-22820
DTSTART;TZID=Europe/Berlin:20260417T170000
SEQUENCE:1776404337
TRANSP:OPAQUE
DTEND;TZID=Europe/Berlin:20260417T180000
URL:https://dresden-science-calendar.de/calendar/en/detail/22820
LOCATION:MPI-PKS\, Nöthnitzer Straße 38\, 01187 Dresden
SUMMARY:Cemin: CondMat for Dummies: Attention Is All You Need: Transformers
  and the Physics They're Starting to Solve
CLASS:PUBLIC
DESCRIPTION:Speaker: Giovanni Cemin\nInstitute: MPIPKS\nTopics:\nPhysics\
 n Location:\n  Name: MPI-PKS\n  Street: Nöthnitzer Straße 38\n  City: 0
 1187 Dresden\n  Phone: +49 (0)351 871 0\nDescription: 
 In 2017\, a paper with an unusually confident title proposed replacing rec
 urrent neural networks with a mechanism called self-attention — and quie
 tly started a revolution. In this talk I will explain what attention is\, 
 why it works\, and how it became the dominant paradigm in machine learning
 . Starting from the limitations of RNNs\, I will introduce the transformer
  architecture. Beyond the applications in large language models\, I will d
 iscuss how this machinery is finding its way into physics: the variational
  description of strongly correlated quantum systems\, and the operator-lea
 rning approach to solving partial differential equations.
DTSTAMP:20260515T150135Z
CREATED:20260416T053620Z
LAST-MODIFIED:20260417T053857Z
END:VEVENT
END:VCALENDAR