Humans can discover and exploit shared structure in a problem domain to improve learning performance, to the point of being able to learn from a very limited amount of data. The theory of meta-learning hypothesizes that such fast learning is supported by slower learning processes which unfold over many problem instances. While a number of artificial meta-learning algorithms have been proposed, the biological mechanisms that support this form of learning are largely unknown. Here, we present a biologically plausible meta-learning rule in which synaptic changes are buffered and contrasted across more than one problem before being consolidated. Our rule is theoretically justified and, unlike standard machine learning methods, it does not require reversing learning trajectories in time or evaluating second-order derivatives, two operations that are difficult to conceive in neural circuits. Experiments reveal that our meta-learning rule enables deep neural network models to learn new tasks from few labeled examples. We conclude by discussing a systems model where the hippocampus plays the role of an instructor which is in charge of prescribing auxiliary learning problems to the cortex. Our theory suggests that the concerted action of hippocampus and cortex may enable meta-learning to be implemented using a simple synaptic plasticity rule.
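The core idea — buffer task-specific synaptic changes, relate them across more than one problem, and only then consolidate them into slow weights — can be illustrated with a first-order toy sketch. The code below is not the paper's rule: it is a minimal Reptile-style scheme on a hypothetical toy regression family, where "contrasting" is approximated by averaging buffered weight deltas from two problems, and all task names and hyperparameters are illustrative assumptions. It does show the two properties highlighted in the abstract: no second-order derivatives and no replaying of learning trajectories in reverse.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task():
    # Hypothetical task family: 1-D linear regression y = w * x whose slope w
    # varies per task, so tasks share the structure "linear through the origin".
    w_true = rng.normal()
    x = rng.uniform(-1.0, 1.0, size=20)
    return x, w_true * x

def inner_update(theta, x, y, lr=0.1, steps=10):
    # Fast, ordinary gradient descent on one task, starting from the
    # consolidated (slow) weight theta.
    w = theta
    for _ in range(steps):
        grad = 2.0 * np.mean((w * x - y) * x)
        w -= lr * grad
    return w

theta = 0.0      # slow, consolidated weight (meta-parameter)
meta_lr = 0.5
for _ in range(200):
    deltas = []
    for _task in range(2):           # buffer changes across more than one problem
        x, y = make_task()
        w_task = inner_update(theta, x, y)
        deltas.append(w_task - theta)
    # Consolidate: fold the buffered changes (here simply averaged, as a stand-in
    # for the paper's contrasting step) into the slow weight. This is purely
    # first-order and never rewinds a learning trajectory.
    theta += meta_lr * np.mean(deltas)

# After meta-training, a few inner steps from theta should reduce the loss
# on a freshly sampled task.
x, y = make_task()
w_new = inner_update(theta, x, y)
```

The separation between a fast buffered weight (`w_task`) and a slow consolidated weight (`theta`) mirrors the abstract's distinction between fast learning on individual problems and the slower process that unfolds over many problem instances.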