Improving temporal coding in cochlear implants: what can animal models tell us?

Cochlear implants (CIs) are remarkable prosthetic devices which have allowed over one million severely deaf individuals to regain sufficient auditory perception to hold conversations on the phone, but they are not without their shortcomings. For example, patients often have difficulty using temporal cues for spatial hearing or pitch perception. The underlying causes of this temporal processing deficit may be attributable to both biological and technological factors. However, investigating the exact causes of poor temporal processing in patients is challenging, because their clinical needs place confounding constraints on experimental design.

To address this, we conducted a series of studies on neonatally deafened rats, examining their ability to use interaural time differences (ITDs) to localize CI stimuli. Notably, our CI rats achieved substantially better outcomes than human patients typically do: they were sensitive to ITDs as small as a few tens of microseconds, a level of performance normally seen only in normally hearing individuals. If such high sensitivity to binaural temporal cues could be replicated in human patients, it would likely greatly improve their ability to hear in noisy environments.

Importantly, our research shows that ITDs must be delivered in the timing of stimulus pulses, rather than in stimulus envelopes, for effective ITD detection. We also observed that suboptimal pulse timing can conflict with the processing of other spatial cues, such as interaural level differences (ILDs). The problems commonly seen in current human patient populations are therefore likely due to the fact that their brains must adapt to inadequacies in the technology in current clinical use. Based on our results, we predict that CI processing strategies that effectively deliver crucial temporal information in pulse timing could lead to significantly better patient outcomes.