Neural coding in the auditory cortex - Emergent Scientists Seminar Series

After the talk, at 5.15 pm, everyone is welcome to join a virtual "pub chat" with Dr Jennifer Lawlor and Aleksandar Ivanov. Details on how to join the talk and the informal "pub chat" will be released via our mailing list (info on how to subscribe on

Dr Jennifer Lawlor

Title: Tracking changes in complex auditory scenes along the cortical pathway

Complex acoustic environments, such as a busy street, are characterised by their ever-changing dynamics. Despite this complexity, listeners can readily tease apart relevant changes from irrelevant variations. This requires continuously tracking the appropriate sensory evidence while discarding noisy acoustic variations. Despite the apparent simplicity of this perceptual phenomenon, the neural basis of the extraction of relevant information from complex continuous streams for goal-directed behavior is currently not well understood. As a minimalistic model of change detection in complex auditory environments, we designed broad-range tone clouds whose first-order statistics change at a random time. Subjects (humans or ferrets) were trained to detect these changes. They faced the dual task of estimating the baseline statistics and detecting a potential change in those statistics at any moment. To characterize the extraction and encoding of relevant sensory information along the cortical hierarchy, we first recorded the brain electrical activity of human subjects engaged in this task using electroencephalography. Human performance and reaction times improved with longer pre-change exposure, consistent with improved estimation of the baseline statistics. Change-locked, decision-related EEG responses were found at a centro-parietal scalp location; their slope depended on change size, consistent with sensory evidence accumulation. To further this investigation, we performed a series of electrophysiological recordings in the primary auditory cortex (A1), secondary auditory cortex (PEG) and frontal cortex (FC) of fully trained, behaving ferrets. A1 neurons exhibited strong onset responses and change-related discharges specific to neuronal tuning. The PEG population showed reduced onset-related responses but more categorical change-related modulations.
Finally, a subset of FC neurons (dlPFC/premotor) presented a generalized response to all change-related events, but only during behavior. Using a Generalized Linear Model (GLM), we show that the same FC subpopulation encodes both sensory and decision signals, suggesting that FC neurons could convert sensory evidence into perceptual decisions. Altogether, these area-specific responses suggest a behavior-dependent mechanism for the extraction and generalization of task-relevant events.
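The abstract does not specify the GLM's design matrix, but the core idea of one FC subpopulation jointly encoding sensory and decision signals can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the study's actual analysis): spike counts of a simulated neuron are fit with a Poisson GLM containing both a sensory regressor ("change size") and a decision regressor ("choice"); recovering nonzero weights on both indicates a mixed sensory/decision code. All variable names and parameter values are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated trials: a "sensory" regressor (change size) and a "decision"
# regressor (the subject's report), plus an intercept. These regressors
# are illustrative guesses, not the study's actual design matrix.
n_trials = 500
change_size = rng.uniform(0.0, 1.0, n_trials)
choice = (change_size + rng.normal(0.0, 0.3, n_trials) > 0.5).astype(float)
X = np.column_stack([np.ones(n_trials), change_size, choice])

# Ground truth: this simulated neuron carries both signal types.
w_true = np.array([0.5, 1.0, 0.8])
spikes = rng.poisson(np.exp(X @ w_true))

# Fit a Poisson GLM (log link) by Newton's method (IRLS).
w = np.array([np.log(spikes.mean()), 0.0, 0.0])
for _ in range(50):
    rate = np.exp(X @ w)
    grad = X.T @ (spikes - rate)          # gradient of the log-likelihood
    hess = X.T @ (rate[:, None] * X)      # Fisher information matrix
    w += np.linalg.solve(hess, grad)

# w should lie close to w_true: positive weights on both the sensory and
# the decision regressor indicate a mixed sensory/decision code.
print(np.round(w, 2))
```

In the actual study, comparing such fits across areas (A1, PEG, FC) and across behaving vs. passive conditions is what supports the claim that the decision-related component is specific to FC during behavior.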

Aleksandar Ivanov

Title: How does the auditory system adapt to different environments? A song of echoes and adaptation

Almost every natural sound is accompanied by many delayed and distorted copies of itself, known as echoes or reverberation, caused by reflections from nearby surfaces. Unless the environment is very echoic (imagine a big cave), our brains cope effortlessly with reverberation. In contrast, reverberation can cause severe difficulties for speech recognition algorithms and for hearing-impaired people. How might the healthy auditory system cope so well with reverberation? To answer this question, we used a rich data set of anechoic natural sounds, including speech, textures and other environmental sounds, and made versions of them with different amounts of reverberation. We then asked: what would be the optimal algorithm for recovering the clean anechoic sounds from their reverberant counterparts? To this end, we trained a generalised linear model (GLM), built so that its model "neurons" can be directly compared to the properties of real auditory cortical neurons. We find that the model retrieves some known properties of neurons, such as frequency tuning and temporally asymmetric auditory filters (STRFs) with excitation followed by inhibition. The model also makes two novel predictions: (1) the inhibitory part of the neuronal filter scales with the amount of reverberation so as to cancel the excess echoes; (2) this inhibitory scaling is frequency dependent and tracks the profile of the acoustic space. We verified these predictions by recording neuronal responses in ferret auditory cortex using Neuropixels electrodes. We also explore putative mechanisms by which local inhibitory interneurons could implement this adaptation. Thus, our data show that auditory cortical neurons adapt to reverberation by adjusting their filtering properties so as to minimize its negative impact.
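The optimal-filter question posed in the abstract can be illustrated with a toy one-channel sketch (this is not the authors' actual model, and all signals and parameter values below are invented): simulate reverberation by convolving a sparse "clean" envelope with a decaying-exponential impulse response, then solve by least squares for the linear filter over past reverberant samples that best recovers the clean envelope. The recovered filter shows exactly the excitation-followed-by-inhibition shape the abstract describes, with the delayed inhibition cancelling the echo tail.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for one frequency channel of a cochleagram: a sparse
# "clean" envelope, and a reverberant version made by convolving it with
# a decaying exponential (a crude room impulse response).
T = 20000
clean = rng.random(T) * (rng.random(T) < 0.05)
rir = np.exp(-np.arange(40) / 10.0)
reverb = np.convolve(clean, rir)[:T]

# Ask the optimal-filter question directly: which linear filter over the
# last 40 samples of the reverberant input best recovers the clean
# envelope? Solve by least squares over lagged copies of the input.
n_lags = 40
X = np.stack([np.roll(reverb, k) for k in range(n_lags)], axis=1)[n_lags:]
y = clean[n_lags:]
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# The recovered filter has an immediate excitatory tap followed by
# delayed inhibition: excitation followed by inhibition, as in the
# temporally asymmetric STRFs the abstract describes.
print(np.round(w[:3], 2))
```

In this toy case a slower-decaying impulse response (stronger reverberation) yields proportionally stronger inhibition, mirroring the abstract's first prediction; running separate channels with different decay rates would mirror the frequency-dependent second prediction.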