Mapping sounds into space

In the first part of this talk I will focus on how auditory cortex represents sound sources in space during free movement. The spatial location of a sound source must be reconstructed from sound localisation cues, principally available through comparison of the timing and level of the sound at the two ears. These cues are extracted by dedicated brainstem nuclei but, at least in primates and carnivores, auditory cortex is required for sound localisation. Until recently, studies of spatial hearing were conducted exclusively in head-fixed animals. By recording in freely moving animals, we were able to explore how neurons map sounds across coordinate frames to provide both head-centred and head-independent representations of sound source location. Training animals to localise sounds in head- or world-centred coordinates demonstrated that, like humans, ferrets can map sounds in multiple reference frames.

In the second part of the talk I will present data from a study in which ferrets were trained to switch between localising the visual and auditory elements of an audiovisual stimulus while recordings were made in auditory, frontal and parietal cortex. Our goal was to understand how the representation of sounds differed when sounds were to be attended versus ignored, and how network interactions might shape the processing of sound in different contexts. Preliminary analysis reveals a rich repertoire of sensory and motor response components in both auditory and parietal cortices.