BEGIN:VCALENDAR
VERSION:2.0
PRODID:talks.ox.ac.uk
BEGIN:VEVENT
SUMMARY:Does neural learning have an inbuilt Occam's razor? - Prof. Ard Lo
 uis (University of Oxford)
DTSTART;VALUE=DATE-TIME:20260122T150000Z
DTEND;VALUE=DATE-TIME:20260122T170000Z
UID:https://talks.ox.ac.uk/talks/id/f74cfa03-ab85-4c51-ac78-d71d87737501/
DESCRIPTION:The coding theorem from algorithmic information theory is one
  of the most profound and underappreciated results in science. It can be
  viewed as a computational reformulation of the infinite monkey theorem:
  monkeys on universal computers instead of on typewriters. The theorem
  predicts that many natural processes are exponentially biased toward
  highly compressible outputs\, that is\, toward outcomes with low
  Kolmogorov complexity. I will discuss applications of this principle to
  biological evolution\, where it implies a strong preference for symmetry
  [1]\, and to machine learning\, where it predicts an Occam's razor-like
  bias that helps explain why deep neural networks can generalize
  effectively despite being heavily overparameterized [2]. The central
  question I would like to ask you is how this generic principle extends
  to neural learning.\n\n[1] Symmetry and simplicity spontaneously emerge
  from the algorithmic nature of evolution. Iain G. Johnston et al.\,
  PNAS 119\, e2113883119 (2022).\n[2] Deep neural networks have an
  inbuilt Occam's razor\, C. Mingard et al.\, Nat. Commun. 16\, 220
  (2025).\nSpeakers:\nProf. Ard Louis (University of Oxford)
LOCATION:Sherrington Library\, off Parks Road OX1 3PT
X-WR-TIMEZONE:Europe/London
URL:https://talks.ox.ac.uk/talks/id/f74cfa03-ab85-4c51-ac78-d71d87737501/
BEGIN:VALARM
ACTION:DISPLAY
DESCRIPTION:Talk:Does neural learning have an inbuilt Occam's razor? - Pro
 f. Ard Louis (University of Oxford)
TRIGGER:-PT1H
END:VALARM
END:VEVENT
END:VCALENDAR
