AI can now help you hear speech better

Livio Edge AI

Achin Bhowmik and his team at Starkey have a new recipe for “soup”, and it offers help and hope for people with severe hearing loss who struggle to understand speech, especially in noisy environments.

Acoustic “soup” is the analogy Bhowmik, Starkey’s Chief Technology Officer, uses for a complex hearing environment, such as a noisy restaurant. Somewhere in that “soup”, amid all the clatter, are the morsels of human speech.

Identifying and isolating those morsels is something the brain normally does. But with severe hearing loss, the brain doesn’t get enough information to do the job.

But now Starkey has unveiled some remarkable new artificial intelligence technology that can sift through that “soup”, pluck out the speech, enhance it and then deliver the cleaned-up result to the hearing aid speakers.

In effect, Starkey’s AI does the job the brain of a person with normal hearing does. And it promises to be a game changer.

This new technology, dubbed IntelliVoice, is now available on the latest versions of Starkey’s Livio Edge AI line of hearing aids.

It’s designed for people with a hearing loss of 50 decibels or greater. Meeting their needs requires two key components. The first is a powerful hearing aid that can produce the necessary amplification. “So we developed a custom hearing aid with 2.4 gigahertz Bluetooth connectivity that is rechargeable and provides up to 70 decibel gain,” Bhowmik says, calling it an industry first.

The second component, and the real magic, is the deep neural network AI that, Bhowmik says, “has the intelligence to understand what is speech in a complex acoustic environment and enhance it”.

The complex algorithms are “taught” to recognize speech and “learn” in different environments. Needless to say, this requires a lot of computing horsepower, more than can currently be packed into a hearing aid chip. Instead, IntelliVoice uses the processing power of a smartphone to do the heavy lifting.

One drawback to that is a delay caused by the back-and-forth between the phone and the hearing aids. People with relatively mild hearing loss would notice an echo effect.

But as Bhowmik emphasizes, IntelliVoice is only for people with severe hearing loss.  Their hearing aids would almost completely occlude the ear canal, so little outside sound can enter. In other words, they only hear what is coming from the hearing aids and not any external sounds, so the echo effect wouldn’t be noticeable.

He points out that hearing aid chips are becoming more and more powerful; before long they will be able to do the job themselves, and the lag will become virtually nonexistent.

“We continue to push hearing aid technology forward”, Bhowmik says. That means deepening our understanding of how hearing works and diving further into the “soup”.

In that sense, IntelliVoice is a taste of the future.

(For more on AI and hearing aids see my interview with Achin Bhowmik from January 2020)

Author: Digby Cook

2 thoughts on “AI can now help you hear speech better”

  1. I lost my hearing. After suffering from encephalitis in 2005, I was fitted with two cochlear implants which don’t work at all.
    Is there hope for me now? I am totally recovered nowadays but still deaf.

    1. Sorry that you have lost your hearing. I can only suggest consulting with doctors. Sadly there are no miracle cures.

Comments are closed.