How does the structure of a neural network shape its function? In this talk I will introduce partially recurrent neural networks (pRNNs): a model family in which a set of connection pathways can be combined combinatorially to generate a complete taxonomy of architectures spanning feedforward to fully recurrent. I will present two functional explorations across these structures. First, using closed-form solutions, I will demonstrate that linear pRNNs exhibit surprisingly diverse temporal dynamics, including transient amplification and oscillations, which are approximately invariant to network size. Second, using nonlinear pRNNs trained with deep reinforcement learning, I will show that distinct architectures differ in their learning speed, peak performance, and robustness to perturbations. I will conclude by mapping these functional differences onto specific network traits, illustrating how pRNNs can illuminate structure-function principles relevant to both neuroscience and machine learning.
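
The abstract does not spell out how the pathways are parameterized, so the following is a minimal sketch under one natural reading, not the speaker's implementation: each candidate pathway is a block of the recurrent weight matrix gated by a binary mask, and choosing a subset of pathways selects one architecture in the taxonomy. The function and variable names (pathway_mask, prnn_step) are illustrative assumptions.

```python
import numpy as np

def pathway_mask(n_layers, units_per_layer, pathways):
    """Build a block connectivity mask. `pathways` is a set of
    (source_layer, target_layer) pairs: {(0, 1), (1, 2)} is purely
    feedforward, adding (i, i) enables within-layer recurrence, and
    (j, i) with j > i adds feedback; all pairs give full recurrence."""
    n = n_layers * units_per_layer
    M = np.zeros((n, n))
    for src, dst in pathways:
        rows = slice(dst * units_per_layer, (dst + 1) * units_per_layer)
        cols = slice(src * units_per_layer, (src + 1) * units_per_layer)
        M[rows, cols] = 1.0
    return M

def prnn_step(h, u, W, M, W_in):
    """One update of a nonlinear pRNN: the mask gates which of the
    candidate recurrent pathways are active."""
    return np.tanh((W * M) @ h + W_in @ u)

# Example: 3 layers of 10 units, a feedforward chain plus recurrence
# in the middle layer (one of the 2**k combinatorial choices).
rng = np.random.default_rng(0)
n_layers, k = 3, 10
M = pathway_mask(n_layers, k, {(0, 1), (1, 2), (1, 1)})
W = rng.normal(scale=1.0 / np.sqrt(k), size=(n_layers * k, n_layers * k))
W_in = rng.normal(size=(n_layers * k, 5))
h = np.zeros(n_layers * k)
for t in range(20):
    h = prnn_step(h, rng.normal(size=5), W, M, W_in)
```

In the linear case described in the abstract (identity nonlinearity, no input), the dynamics reduce to h_t = (W * M)^t h_0, so transients and oscillations can be read off in closed form from the spectrum of the masked weight matrix.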