How does the structure of a neural network shape its function? In this talk I will introduce partially recurrent neural networks (pRNNs): a model family in which a set of connection pathways can be combined combinatorially to generate a complete taxonomy of architectures between feedforward and fully recurrent. I will present two functional explorations across these structures. First, using closed-form solutions, I will demonstrate that linear pRNNs exhibit surprisingly diverse temporal dynamics, including transient amplification and oscillations, which are approximately invariant to network size. Second, using nonlinear pRNNs trained with deep reinforcement learning, I will show that distinct architectures differ in their learning speed, peak performance, and robustness to perturbations. I will conclude by mapping these functional differences onto specific network traits, illustrating how pRNNs can illuminate structure-function principles relevant to both neuroscience and machine learning.
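To make the taxonomy concrete, here is a minimal numerical sketch of a linear pRNN under one plausible formalisation (the talk's actual definition may differ): hidden units are split into two populations, and a binary mask over four candidate pathways (feedforward, two within-population recurrent loops, and feedback) selects one of 2^4 architectures, from purely feedforward to fully recurrent. The names make_prnn, transient_gain, and PATHWAYS are illustrative, not taken from the talk.

# Illustrative sketch only; one hypothetical formalisation of a linear pRNN.
import numpy as np

rng = np.random.default_rng(0)
N = 50  # units per population

# Candidate connection pathways between two hidden populations.
PATHWAYS = ("ff", "rec1", "rec2", "fb")  # 1->2, 1->1, 2->2, 2->1

def make_prnn(mask, scale=0.9):
    # Build a 2N x 2N linear weight matrix for one pathway mask.
    # Present blocks are Gaussian; the matrix is rescaled so its spectral
    # radius is `scale` < 1, giving stable but possibly nonnormal dynamics.
    def block(on):
        return rng.normal(0.0, 1.0 / np.sqrt(N), (N, N)) if on else np.zeros((N, N))
    W = np.block([[block(mask["rec1"]), block(mask["fb"])],
                  [block(mask["ff"]),   block(mask["rec2"])]])
    radius = np.max(np.abs(np.linalg.eigvals(W)))
    if radius > 1e-8:           # nilpotent masks (e.g. feedforward-only)
        W = scale * W / radius  # have radius ~0 and are left unscaled
    return W

def transient_gain(W, T=30):
    # Peak state norm over T steps from a random unit-norm start; values
    # above 1 indicate transient amplification despite asymptotic decay.
    x = rng.normal(size=W.shape[0])
    x /= np.linalg.norm(x)
    norms = []
    for _ in range(T):
        x = W @ x
        norms.append(np.linalg.norm(x))
    return max(norms)

# Enumerate all 2^4 pathway combinations, from no connections at all
# to fully recurrent, and compare their transient dynamics.
for bits in range(2 ** len(PATHWAYS)):
    mask = {p: bool((bits >> i) & 1) for i, p in enumerate(PATHWAYS)}
    active = ",".join(p for p in PATHWAYS if mask[p]) or "none"
    print(f"{active:>20s}  gain={transient_gain(make_prnn(mask)):.2f}")

One design note on this sketch: normalising by the spectral radius (rather than the spectral norm) keeps every architecture asymptotically stable while leaving the weight matrix nonnormal, which is precisely the regime in which the transient amplification mentioned in the abstract can arise.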