Project: Immersive Multitalker Remote Microphone System

Our team from the Augmented Listening Laboratory at the University of Illinois will be participating in the ASA challenge.

Many hearing aids can connect to remote microphones that are worn by a distant talker and transmit sound directly to the hearing aid. However, most remote microphone systems are limited to a single talker, and the remote signal does not have realistic room acoustics or spatial cues. We plan to implement a real-time version of the multiple-talker binaural adaptive filter from our forthcoming WASPAA paper. It processes the low-noise signal(s) from one or more remote microphones to match the magnitude and phase of the earpiece microphone signals, thereby preserving spatial cues. If it works well, the listener will hear the remote talkers as if listening through their own ears, but with much less noise.
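
To give a flavor of the idea, here is a minimal single-channel NLMS sketch (a stand-in for illustration, not the algorithm from the paper): it adapts an FIR filter so the low-noise remote-mic signal tracks an earpiece reference in level and delay. The class and parameter names are made up here; the actual system would need a binaural, multiple-talker version of this.

```cpp
// Illustrative single-channel NLMS filter (not the WASPAA algorithm):
// adapt an FIR filter so the remote-mic signal matches an earpiece reference,
// so the output keeps the ear signal's spatial cues with the remote mic's low noise.
#include <vector>
#include <cstddef>

class NlmsFilter {
public:
    // taps must be >= 1; mu is the adaptation step size.
    NlmsFilter(std::size_t taps, float mu = 0.1f, float eps = 1e-6f)
        : w_(taps, 0.0f), x_(taps, 0.0f), mu_(mu), eps_(eps) {}

    // remote: low-noise remote-mic sample; ear: earpiece reference sample.
    // Returns the filtered remote sample.
    float process(float remote, float ear) {
        // Shift the new remote sample into the delay line.
        for (std::size_t i = x_.size() - 1; i > 0; --i) x_[i] = x_[i - 1];
        x_[0] = remote;

        // Filter output and input energy.
        float y = 0.0f, energy = eps_;
        for (std::size_t i = 0; i < w_.size(); ++i) {
            y += w_[i] * x_[i];
            energy += x_[i] * x_[i];
        }

        // Error against the earpiece reference, then normalized LMS update.
        float e = ear - y;
        float g = mu_ * e / energy;
        for (std::size_t i = 0; i < w_.size(); ++i) w_[i] += g * x_[i];
        return y;
    }

private:
    std::vector<float> w_, x_;
    float mu_, eps_;
};
```

In practice you would run one such filter per remote talker per ear, calling `process()` sample by sample inside the audio callback.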

We plan to implement it on the Tympan using the earpieces, codec shield, and two wireless microphones paired with a stereo receiver. The adaptive filter will have four inputs: two from the earpieces and two from the wireless microphones. It will produce two outputs to the earpieces. We might also experiment with the smartphone app to let the user control the relative levels of the two talkers and mix in some ambient sound from the earpieces.
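
For the mixing stage, a rough sketch of what the per-sample routing might look like is below. The gain names and structure are hypothetical (not the Tympan library or app API); the idea is just that each ear gets the adapted remote-talker signals plus a user-controlled amount of raw earpiece sound.

```cpp
// Hypothetical per-sample routing for the 4-in / 2-out configuration:
// two adapted remote-talker signals mixed with some ambient sound from the
// earpiece mics, with user-adjustable gains (names are illustrative only).
struct MixGains {
    float talkerA = 1.0f;   // relative level of remote talker A
    float talkerB = 1.0f;   // relative level of remote talker B
    float ambient = 0.1f;   // how much raw earpiece sound to blend back in
};

// yA_left/yA_right and yB_left/yB_right are the adaptive-filter outputs for
// each talker, matched to the corresponding earpiece; earL/earR are the raw
// earpiece microphone samples.
inline void mixToEarpieces(float yA_left, float yA_right,
                           float yB_left, float yB_right,
                           float earL, float earR,
                           const MixGains& g,
                           float& outL, float& outR) {
    outL = g.talkerA * yA_left  + g.talkerB * yB_left  + g.ambient * earL;
    outR = g.talkerA * yA_right + g.talkerB * yB_right + g.ambient * earR;
}
```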


For those who weren’t at the ASA conference, we put together a blog post describing our results, including a demo video:

https://publish.illinois.edu/augmentedlistening/tympan-asa/

I loved the demo in this presentation showing the adaptive left / right filter. Very cool stuff!