Learning in cortical networks through error back-propagation
To learn efficiently from feedback, cortical networks need to update synaptic weights on multiple levels of the cortical hierarchy. An effective and well-known algorithm for computing such weight changes is error back-propagation, which has been used successfully both in machine learning and in modelling the brain's cognitive functions. However, in the back-propagation algorithm the change in a synaptic weight is a complex function of the weights and activities of neurons that are not directly connected to the synapse being modified. Hence it has not been known whether it could be implemented in biological neural networks. Here we analyse the relationship between the back-propagation algorithm and the predictive coding model of information processing in the cortex, in which changes in synaptic weights depend only on the activity of the pre-synaptic and post-synaptic neurons. We show that when the predictive coding model is used for supervised learning, it performs computations very similar to those of the back-propagation algorithm. Furthermore, for certain parameters, the weight changes in the predictive coding model converge to those of the back-propagation algorithm. This suggests that cortical networks with simple Hebbian synaptic plasticity can implement efficient learning algorithms in which synapses on multiple levels of the hierarchy are modified to minimize the error on the output.
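The convergence described in the abstract can be illustrated numerically. The sketch below is not part of the talk: it is a minimal numpy illustration of a standard Rao & Ballard-style predictive coding network, in which the layer sizes, the tanh nonlinearity, the step sizes, and the small-output-error regime (one setting in which the two updates are expected to agree) are all illustrative assumptions. It clamps the output layer to a target near the network's current prediction, relaxes the hidden activity on the prediction-error energy, and compares the resulting local Hebbian weight changes with the back-propagation gradients.

```python
import numpy as np

rng = np.random.default_rng(0)
f = np.tanh
df = lambda x: 1.0 - np.tanh(x) ** 2   # derivative of tanh

# Layer sizes: input -> hidden -> output (illustrative)
n0, n1, n2 = 4, 8, 3
W0 = rng.normal(0.0, 0.5, (n1, n0))    # predicts layer 1 from layer 0
W1 = rng.normal(0.0, 0.5, (n2, n1))    # predicts layer 2 from layer 1

x0 = rng.normal(size=n0)               # input (clamped)

# Forward pass, and a target close to the current output: the
# correspondence with back-propagation holds in this small-error regime.
x1_fwd = W0 @ f(x0)
x2_fwd = W1 @ f(x1_fwd)
t = x2_fwd + 0.01 * rng.normal(size=n2)

# --- Back-propagation updates (loss = ||t - x2||^2 / 2) ---
d2 = t - x2_fwd
d1 = df(x1_fwd) * (W1.T @ d2)          # error propagated back through W1
dW1_bp = np.outer(d2, f(x1_fwd))
dW0_bp = np.outer(d1, f(x0))

# --- Predictive coding updates ---
# Clamp the output layer to the target, then relax the hidden activity
# x1 by gradient descent on the prediction-error energy
# F = ||e1||^2/2 + ||e2||^2/2.
x1 = x1_fwd.copy()
x2 = t.copy()
for _ in range(200):                   # inference (relaxation) phase
    e1 = x1 - W0 @ f(x0)               # prediction error at layer 1
    e2 = x2 - W1 @ f(x1)               # prediction error at layer 2
    x1 += 0.1 * (-e1 + df(x1) * (W1.T @ e2))

# Hebbian updates: each depends only on the activity of the pre-synaptic
# neurons and of the error node adjacent to the post-synaptic layer.
dW1_pc = np.outer(e2, f(x1))
dW0_pc = np.outer(e1, f(x0))

rel = lambda a, b: np.linalg.norm(a - b) / np.linalg.norm(b)
print("relative difference in dW1:", rel(dW1_pc, dW1_bp))  # close to 0
print("relative difference in dW0:", rel(dW0_pc, dW0_bp))  # close to 0
```

Note the locality of the predictive coding updates: each weight change is an outer product of quantities available on the two sides of the synapse, yet after relaxation it approximates the corresponding back-propagation gradient.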
Date: 10 February 2016, 13:00 (Wednesday, 4th week, Hilary 2016)
Venue: Tinsley Building, Mansfield Road, OX1 3TA
Speakers: Speaker to be announced
Organising department: Centre for Neural Circuits and Behaviour
Organiser: Dr Rui Ponte Costa (University of Oxford)
Part of: Oxford Neurotheory Forum
Booking required?: Not required
Audience: Public
Editor: Rui Costa