Vishal Choudhari, Maximilian Nentwich, Sarah Johnson, Jose L. Herrero, Stephan Bickel, Ashesh D. Mehta, Daniel Friedman, Adeen Flinker, Edward F. Chang, Nima Mesgarani
(in press at Nature Neuroscience)
Understanding speech in noisy environments is difficult for many people, and current hearing aids often fail because they amplify all sounds rather than the talker of interest. Auditory Attention Decoding (AAD) offers a potential solution by using the listener's brain signals to identify and enhance the attended speaker, but it has been unclear whether this can provide real‑time perceptual benefits. Here, we used high-resolution intracranial EEG in neurosurgical participants to implement a closed-loop system that achieves the decoding fidelity necessary to dynamically amplify the attended talker. Across multiple experiments, the system improved speech intelligibility, reduced listening effort, and was consistently preferred by participants. It also tracked both instructed and self-initiated attention shifts. By providing direct evidence that a real-time, brain-controlled hearing system can enhance perception, this work establishes a key performance benchmark for future auditory brain-computer interfaces and advances AAD from a theoretical concept to a validated solution for personalized assistive hearing.
In Experiment 1, the brain-controlled system was activated mid-conversation under challenging listening conditions. Neural decoding drove real-time gain modulation, producing large improvements in intelligibility, reduced listening effort, and strong user preference. This provides the first direct behavioral evidence that closed-loop auditory BCIs can enhance real-world speech perception.
In Experiment 2, participants were instructed to switch attention between competing conversations while the system remained active. The decoder reliably detected these shifts and rapidly enhanced the newly attended conversation, rebalancing amplification within seconds. This demonstrates that neural steering enables seamless, uninterrupted listening in dynamic conversations.
In Experiment 3, without external cues, participants freely switched their attention between competing conversations. The system reliably tracked these self-initiated attention shifts and dynamically enhanced the newly attended conversation in real time, demonstrating fully autonomous neural control in natural listening.
After completing Experiments 1 and 2, participants were debriefed and learned for the first time that their brain activity had been controlling the system in real time. Their genuine reactions were recorded, as seen in the video, offering a direct window into user experience and perceived agency in brain-controlled hearing. With this new understanding, they then proceeded to Experiment 3, where they intentionally interacted with the system, freely shifting their attention while knowing how the technology worked.
Paper and code coming soon! Please check back for updates.
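Until the code is released, here is a minimal conceptual sketch of the closed-loop idea the abstract describes: an envelope reconstructed from neural activity is compared against each talker's speech envelope, the best-matched talker is taken as attended, and gains are rebalanced in its favor. All function names, the correlation-based decoder, and the gain values are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def decode_attention(neural_env, talker_envs):
    """Illustrative AAD step: pick the talker whose speech envelope
    correlates best with the envelope reconstructed from neural data."""
    corrs = [np.corrcoef(neural_env, env)[0, 1] for env in talker_envs]
    return int(np.argmax(corrs))

def apply_gains(talker_audio, attended_idx, boost_db=9.0):
    """Amplify the attended talker and attenuate the others
    (boost_db is an arbitrary choice for this sketch)."""
    gains = np.full(len(talker_audio), 10 ** (-boost_db / 20.0))
    gains[attended_idx] = 10 ** (boost_db / 20.0)
    return [g * a for g, a in zip(gains, talker_audio)]

# Toy demo: the simulated "neural" envelope tracks talker 1.
rng = np.random.default_rng(0)
env0, env1 = rng.random(500), rng.random(500)
neural = env1 + 0.3 * rng.random(500)  # listener attends talker 1
idx = decode_attention(neural, [env0, env1])
mixed = apply_gains([np.ones(500), np.ones(500)], idx)
```

In a real closed-loop system this decode-and-regain cycle would run continuously on short windows of data, which is what allows the amplification to follow attention shifts within seconds.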