How AI is revolutionizing hearing assistance
When you hear the phrase “hearing assistance,” what’s the first thing that comes to mind? If you said hearing aids, you’re in the majority. However, hearing assistance is broader than that: it encompasses prescription devices such as hearing aids and cochlear implants; consumer devices such as earbuds, headphones, and other assistive wearables; and technologies that enhance those devices, including apps like HeardThat. Artificial intelligence (AI) is improving how all of these devices and technologies help users hear better.
Three ways AI is used in hearing assistance
Personalization - AI analyzes data from the user’s environment to fine-tune the hearing-assistive device’s settings, choosing the most appropriate values for things like amplification or microphone directionality in each situation.
Speech recognition - AI is the basis for speech recognition software that transcribes spoken words into text. For example, Otter.ai uses AI to record and transcribe meetings in real time, making notes shareable and searchable on iOS, Android, and desktop via the Chrome extension.
Scene detection - Some hearing-assistive devices use AI to automatically switch between different programs (e.g., restaurant mode or outdoor mode) based on the user's environment and listening needs. This can help users better understand speech in different situations without the need for manual adjustments. Many manufacturers brand this capability; Miracle-Ear hearing aids, for example, call it M-E Mode™.
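To make scene detection a little more concrete, here is a deliberately simplified Python sketch. Real devices run trained neural classifiers on rich acoustic features; the feature choices, thresholds, and mode names below are illustrative assumptions only, not any manufacturer's actual algorithm.

```python
import math

def classify_scene(samples):
    """Toy scene detector: picks a listening program from simple
    loudness and peakiness features of one audio frame.

    Illustrative only - real hearing devices use trained AI models,
    and these thresholds are made-up values for the sketch."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    peak = max(abs(s) for s in samples)
    # Crest factor: steady loud sound (e.g., restaurant chatter)
    # has a low peak-to-RMS ratio; sparse transient sound
    # (e.g., outdoor traffic bursts) has a high one.
    crest = peak / rms if rms > 0 else 0.0
    if rms < 0.05:          # very little energy overall
        return "quiet"
    return "restaurant" if crest < 3.0 else "outdoor"
```

A device would run something like this on short frames of microphone audio many times per second, then switch programs when the detected scene changes.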
How does HeardThat leverage AI?
Without getting too technical, HeardThat uses AI to separate speech from background noise. This improves speech intelligibility and reduces listening effort. A slider lets users choose how much ambient sound to let back in. To learn more, check out our How it works page. You can also try it out for yourself. Download HeardThat here.
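The slider idea can be sketched in a few lines of Python. This assumes the AI separation step has already split the audio into speech and noise signals; the function name and the 0.0–1.0 slider scale are assumptions for the sketch, not HeardThat's actual implementation.

```python
def mix_with_slider(speech, noise, ambient_level):
    """Recombine AI-separated speech and background noise.

    ambient_level is the slider position: 0.0 keeps speech only,
    1.0 restores the full original mix. The separation itself
    (a neural network, in HeardThat's case) is assumed to have
    already produced the two input signals."""
    if not 0.0 <= ambient_level <= 1.0:
        raise ValueError("ambient_level must be between 0.0 and 1.0")
    # Add back a user-chosen fraction of the background, sample by sample.
    return [s + ambient_level * n for s, n in zip(speech, noise)]
```

Sliding toward 0.0 suppresses the background almost entirely, while sliding toward 1.0 keeps the room's natural ambience audible alongside the cleaned-up speech.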