An important new study reveals a tantalizing clue about hearing loss.
Iron deficiency anemia (IDA) has been linked to hearing loss in a major study conducted by researchers at Penn State’s Milton S. Hershey Medical Center.
They studied over 300,000 people, ranging from the young to the elderly. Among the findings: the risk of sensorineural hearing loss was 82% higher in those with low iron levels in their blood.
Your bone marrow needs iron to produce hemoglobin for the red blood cells that carry oxygen throughout the body. Iron deficiency can cause symptoms ranging from fatigue to muscle weakness and maybe, just maybe, play a role in hearing loss.
But the study’s lead author, Kathleen P. Schieffer, emphasizes that “our study does not say that iron deficiency causes hearing loss, but only that there is a link between the two.”
She also does not recommend that anyone take iron supplements without first consulting a doctor.
The reason for the link is unknown but one theory is gaining ground. We know from animal studies that iron deficiency reduces the flow of hemoglobin to the cochlea and that the auditory nerve cells need a lot of oxygen.
The report concludes that “further research is needed to better understand the potential links between IDA and hearing loss and whether screening and treatment of IDA in adults could have clinical implications in patients with hearing loss.”
Another promising development in the quest to restore lost hearing.
The key to restoring lost hearing is finding a way to re-grow hair cells in the cochlea. We’re born with about 30,000 of these tiny sound detectors, and exposure to noise, aging, and some types of antibiotics kill them off.
The good news: researchers around the world are working to develop techniques to regenerate hair cells. Now comes word that a team at MIT, Brigham and Women’s Hospital, and Massachusetts Eye and Ear has discovered a combination of drugs that does just that. At least it works in mice.
“There have been a couple patients with hearing improvement, so we are definitely encouraged.” – Dr. Lawrence Lustig
There is no cure for sensorineural hearing loss, the type that most of us with aging ears suffer from. At least not yet. But as I’ve written about in earlier posts, a pioneering treatment may be on the way. It’s called CGF166. It’s the only gene therapy for hearing loss now undergoing human trials in the U.S., and early reports are promising. Here’s a progress report.
What’s going on between your ears when you’re losing your hearing? Emerging research is beginning to give us some revealing, and disturbing answers about brain function and hearing loss. But it also offers some hope.
First the bad news: Take a look at this picture.
On the left is the brain of a person with normal hearing; the areas that process sound are lit up as they should be. But on the right is the brain of a person with mild hearing loss. As you can see, there’s less activity, and what there is has shifted to other areas.
Will your next hearing aids have cameras? New research on lip reading by artificial intelligence suggests they might, and more is on the way.
“Read my lips.” That’s a lot easier said than done. It’s a difficult skill to master, in part because only about 30% of speech is considered “visible”. Even the best lip readers understand only somewhere between 40% and 60% of what is said, and those figures are open to question.
Put another way, it means that about half the time they are wrong. A point illustrated by an episode of “Seinfeld” where Jerry is dating a deaf woman who relies on lip reading. He asks her out and offers to pick her up: “How about six?” She looks angry and offended, then leaves. Jerry discovers later that she thought he had said “sex” instead of “six”.
Now comes news that the University of Oxford, in partnership with Google’s DeepMind artificial intelligence lab, has come up with a system that may help clear up the confusion.
Their AI system was taught to lip read using some 5,000 hours of BBC television clips. The system scanned people’s lips as they spoke, getting better and better at reading them. In fact, the AI correctly “read” about 47% of what was being said. By comparison, human lip readers barely managed to get 12% right.
You can try it yourself. Here is one of the silent BBC clips.
The AI scanned inside the red square and produced these captions.
It’s a breakthrough that opens up some intriguing possibilities according to another team of researchers at Oxford who are working on a similar system called LipNet.
“Machine lip readers have enormous practical potential, with applications in improved hearing aids, silent dictation in public spaces, covert conversations, speech recognition in noisy environments, biometric identification, and silent-movie processing.” – LipNet Research Report
Parsing that statement, you can imagine a few scenarios: One day you may be able to look at your phone and silently mouth a command to Siri. Or you might point your phone’s camera at someone in a noisy room and have what they are saying dictated directly into your hearing aids via Bluetooth.
More ominously, it may offer new covert surveillance tools that can “listen” in on distant conversations. Combine that with facial recognition software and you have a great plot twist for a spy thriller.
In the meantime, a few tips on lip reading. Actually, the correct term these days is “speech reading” because it involves reading not just lips but facial expressions and gestures.
Anyone with hearing loss is already something of a speech reader, since your brain is constantly searching for clues about what is being said. You may notice how much easier it is to understand someone who is facing you directly in a well-lit space.
So to a large extent it’s intuitive, but it’s also a skill that can be improved. For online training, try Lipreading.org.
“MEMS microphones have the potential of providing significant performance improvements in hearing aids”
There’s nothing more frustrating to me than trying to decipher speech in a place where my hearing aids are overwhelmed by the noisy hubbub in the background. A party and a crowded restaurant are two vexing examples.