In the United States one in eight people aged 12 years or older has hearing loss in both ears. The challenges of living with hearing loss are profound and wide-ranging and can affect both personal and work life.
One of the biggest issues with hearing loss is that it is mostly irreversible. The most common form, sensorineural hearing loss, occurs when the delicate hair cells of the inner ear (which are responsible for picking up sound) are destroyed. Once gone, these sound collectors do not regenerate.
The incurability of sensorineural hearing loss is the reason millions of dollars are spent annually on hearing aid research. After years of work, the hearing aid industry is currently pinning its hopes on one technology: artificial intelligence.
The same technologies underpinning consumer tech, such as Bluetooth wearables, geo-enabled smartphone applications, and artificial intelligence (AI), are now making their way into hearing aids.
No longer will hearing aids be stigmatized as a last-resort medical device. Thanks to AI, they are transitioning into powerful wearables for the tech-savvy crowd.
But this is not just about appearing modern and cutting-edge: this new AI hearing aid technology is highly sophisticated, not merely an improvement on previous know-how. Where the previous approach relied only on external sound capture and amplification via microphones, the new technology also monitors the listener's own brain waves.
Researchers previously discovered that when two people are talking, the brain waves of the listener begin to resemble those of the speaker. Building on this finding, scientists created a system that first separates out the voices of individual speakers in a group, and then compares each voice to the brain waves of the person listening. The speaker whose voice pattern most closely matches the listener's brain waves is then amplified over the rest. After rigorous testing and refinement, the algorithm can recognize and decode a voice, any voice, right off the bat.
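The matching step described above can be illustrated with a simplified sketch. This is not the actual research system; it assumes the voices have already been separated, represents the neural signal as a single time series, and uses a plain correlation between each voice's amplitude envelope and that signal to pick the attended speaker:

```python
import numpy as np

def amplify_attended_speaker(separated_voices, brain_signal, gain=3.0):
    """Pick the voice whose amplitude envelope best correlates with the
    listener's neural signal, and boost it relative to the others.
    A toy illustration, not a production attention-decoding algorithm."""
    def envelope(x):
        # Crude amplitude envelope: rectify, then smooth with a moving average.
        rect = np.abs(x)
        kernel = np.ones(50) / 50
        return np.convolve(rect, kernel, mode="same")

    # Score each separated voice by its correlation with the brain signal.
    scores = []
    for voice in separated_voices:
        r = np.corrcoef(envelope(voice), brain_signal)[0, 1]
        scores.append(r)

    attended = int(np.argmax(scores))
    # Remix the scene: amplify the attended voice, leave the rest unchanged.
    mix = sum(
        (gain if i == attended else 1.0) * v
        for i, v in enumerate(separated_voices)
    )
    return attended, mix
```

In a real device the "brain signal" would come from electrodes and the separation step from a dedicated neural network, but the core idea, rank candidate voices by how well they track the listener's neural activity, is the same.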
The merging of AI and hearing aids presents some amazing possibilities for the millions of people living with hearing impairment around the world.
Although the technology mentioned above is no doubt exciting, today's hearing aids are still some way from achieving automated voice isolation in noisy environments. However, hearing aid manufacturers are already doing amazing things with AI. Here we feature some of the latest AI hearing aids and explain how they use the technology to improve the hearing aid experience for the user.
The Starkey Livio series is the first to feature body and brain health monitoring capabilities. Its embedded sensors and artificial intelligence scan the wearer's surroundings and make adjustments on the spot to support you when you need it most. The devices also monitor brain and body well-being and can detect when you have fallen, automatically alerting one of your loved ones.
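Fall detection of this kind typically looks for a sharp acceleration spike followed by a period of stillness. The sketch below is a generic illustration of that pattern, not Starkey's actual algorithm; the thresholds and window size are made-up values for demonstration:

```python
import math

def detect_fall(accel_samples, impact_g=2.5, stillness_g=0.3, window=20):
    """Toy fall detector over (x, y, z) accelerometer samples in g-units.
    Flags a fall when a large impact spike is followed by `window` samples
    of near-stillness (magnitude close to 1 g, i.e. lying motionless).
    Illustrative only; thresholds are hypothetical."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel_samples]
    for i, m in enumerate(mags):
        if m >= impact_g:
            after = mags[i + 1 : i + 1 + window]
            # Still = total acceleration stays near 1 g (gravity only).
            if len(after) == window and all(
                abs(a - 1.0) < stillness_g for a in after
            ):
                return True
    return False
```

A production system would add orientation checks and machine-learned filtering to cut false alarms, but the spike-then-stillness heuristic captures the basic signal.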
Although these features are no doubt impressive, they are only a taste of what's to come. The Starkey Livio series could set the stage for hearing aids that track all sorts of health indicators in the future. AI hearing aids could potentially monitor and send data to your smartphone in the same way a smartwatch updates you with details about your heart rate or breathing.
Building on the progress made with this 'healthable' hearing aid, audiologists hope future devices will also be able to monitor body temperature and blood sugar levels, so your hearing aids could help you stay healthy if you have a fever or are struggling to stay on top of your diabetes.
Widex Moment is one of the first hearing aids to enhance real-time listening using dual artificial intelligence engines. Widex has produced the first and only hearing aid in the world powered by real-time machine learning, one that can learn how a person needs to hear right in the moment, helping them personalize their sound experience.
The Widex AI engine analyzes hearing aid users' preferences through the data they provide to SoundSense Learn (an app feature) and quickly delivers a tailor-made sound. The technology collects a range of anonymous data, such as how much the wearer changes the volume, the sound presets they use most often, and the number of custom settings they make. A full, customized listening experience is produced for the wearer based on these on-the-fly preferences.
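One common way to learn preferences from this kind of feedback is pairwise A/B comparison: present two candidate settings, keep the one the user prefers, and narrow the search around it. The sketch below illustrates that general idea; it is not Widex's actual SoundSense Learn algorithm, and the three-band equalizer, the round count, and the shrink factor are all hypothetical:

```python
import random

def ab_personalize(prefer_score, n_rounds=20, seed=0):
    """Toy A/B preference search over three equalizer band gains (in dB).
    `prefer_score` stands in for the user's choice: higher means preferred.
    Illustrative only; not a real hearing aid tuning algorithm."""
    rng = random.Random(seed)
    best = [0.0, 0.0, 0.0]  # low / mid / high band gains
    spread = 6.0            # initial search radius around the current best
    for _ in range(n_rounds):
        # Propose two candidate settings near the current best.
        a = [g + rng.uniform(-spread, spread) for g in best]
        b = [g + rng.uniform(-spread, spread) for g in best]
        winner = a if prefer_score(a) >= prefer_score(b) else b
        # Keep the winner only if the user likes it at least as much.
        if prefer_score(winner) >= prefer_score(best):
            best = winner
        spread *= 0.85      # narrow the search each round
    return best
```

Each round needs only a single "which sounds better?" judgment from the wearer, which is why this style of learning works well inside a phone app.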
The future of AI and hearing aids is bright. From advanced speech recognition in noisy environments to crowd-sourced sound adjustments, it’s clear we have only scratched the surface of what is possible.