Structure in the randomness of trained recurrent neural networks
Recurrent neural networks are an important class of models for explaining neural computations. Recently, there has been progress both in training these networks to perform various tasks and in relating their activity to recordings from the brain. In particular, these models seem to capture the complexity of realistic neural responses. Despite this progress, many fundamental gaps remain on the way to a theory of these networks. What does it mean to understand a trained network? What types of regularities should we search for? How does the network reflect the task and its environment? I will present several examples of such regularities, in both the structure and the dynamics that arise through training.
Date:
15 November 2019, 14:00 (Friday, 5th week, Michaelmas 2019)
Venue:
Le Gros Clark Building, off South Parks Road, OX1 3QX
Venue Details:
Lecture Theatre
Speaker:
Dr Omri Barak (Technion – Israel Institute of Technology)
Organiser:
Dr Chaitanya Chintaluri (University of Oxford)
Organiser contact email address:
chaitanya.chintaluri@dpag.ox.ac.uk
Host:
Dr Tim Vogels
Booking required?:
Not required
Audience:
Members of the University only
Editor:
Chaitanya Chintaluri