21–22 Oct 2024
Humanistiska Teatern, Engelska Parken. Uppsala
Europe/Stockholm timezone

When BART Writes Lagrangians: Towards a HEP-LLM (Large Lagrangian Model)

22 Oct 2024, 13:45
15m
Humanistiska Teatern, Engelska Parken. Uppsala

Thunbergsvägen 3C

Speaker

Yong Sheng Koay (Uppsala University)

Description

By drawing parallels between linguistic structure and the formulation of Lagrangians — where fields, terms, and symmetry conservation resemble words, sentences, and grammatical structures — we explore the use of transformer architectures, specifically BART, for generating Lagrangians in particle theory. Trained on datasets of approximately 300,000 Lagrangians each, our models generate symmetry-conserving Lagrangians containing up to six fields with high accuracy, and they learn to recognise important features such as dummy indices, spins, field conjugations, and more. While still exploratory, this work demonstrates the potential of applying machine learning to formal theoretical tasks in particle physics and lays the groundwork for developing foundation models in this field.
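The abstract does not specify the authors' tokenization, training pipeline, or symmetry encoding. As a rough, non-authoritative illustration of the general idea (a BART-style sequence-to-sequence model mapping a field-content specification to a Lagrangian-like token string), the sketch below uses the Hugging Face transformers API with the generic facebook/bart-base checkpoint; the prompt format and all identifiers are hypothetical placeholders, not the authors' actual setup.

```python
# Minimal sketch (NOT the authors' code): prompt a BART seq2seq model with a
# field-content specification and decode a Lagrangian-like token string.
# Assumes the Hugging Face `transformers` package and the generic
# `facebook/bart-base` checkpoint; the real models would be fine-tuned on
# ~300,000 symbolic Lagrangians with a task-specific tokenizer.
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# Hypothetical input encoding: fields with their spins and gauge quantum
# numbers serialized as plain text (the actual encoding is not given in
# the abstract).
prompt = "FIELDS: scalar phi (1,2,1/2) ; fermion psi (1,2,-1/2) ; fermion chi (1,1,1)"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    inputs["input_ids"],
    max_length=128,     # cap the length of the generated Lagrangian string
    num_beams=4,        # beam search for more constrained decoding
    early_stopping=True,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In such a setup, checking that the generated terms actually conserve the stated symmetries would be a separate, symbolic post-processing step; the abstract reports that the trained models achieve this to high accuracy.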

Primary authors

Eliel Camargo-Molina (Uppsala University), Rikard Enberg (Uppsala University), Stefano Moretti (Uppsala University), Yong Sheng Koay (Uppsala University)

Presentation materials