Acoustic contexts – the sounds that characterize a space or an event – are rich in information. Our auditory systems readily detect and learn the acoustic context, often implicitly, and use it to build expectations that facilitate change detection or background suppression. We know little about where and how in the brain this learning takes place. Using neuronal activity recorded from subcortical and cortical stations in the auditory system of mice exposed to predictable and unpredictable sound contexts, I will illustrate how sensitive the auditory system is to predictability in the surrounding acoustic context, whether it takes the form of a passive sound stream or an interactive environment.