AI AND HEARING AIDS - a balance to push possibilities

Hearing aid technology: manufacturers reveal where they stand on artificial intelligence. Now we want to hear from professionals on how they see the pros and cons of AI in hearing instruments. Write to editor-in-chief Peter Wix with your views, and tell us which companies you work with and how their AI is working out for you.

Published on 14 February 2024


AI is all around us. AI is all over our future. The January 2024 Consumer Electronics Show (CES) was dominated by this technology, which is being applied in design, building and manufacture, marketing, you name it… if you can, before an AI application does it for you!

There is no doubt, however, that while claims and counter-claims of the benefits and threats of artificial intelligence in other fields are being hotly debated, the advantages of complex computing are being embraced rapidly in areas of medical research, diagnostics, and treatment. Where do major manufacturers of hearing aids currently stand on the role of AI in their products? Audiology Worldnews asked them.

While the development and deployment of AI's potential may have seemed gradual to those bringing it to the marketplace, it is now cascading in, and perhaps confusing some audiologists and other professionals trained to criteria that predate the use of computer algorithms in the equipment they learned to use.

AI is not going away. Hearing aids from Starkey, for example, have for some time carried those formidable two vowels in the names of their devices (Livio AI, Evolv AI, Genesis AI…), and this US producer admits to an 11/10 attention to the importance of AI in its devices.

The technological leaders at Demant have also spent many a year developing Oticon devices to include deep neural networks.

Machine learning is part of the modern hearing aid. How long before a hearing aid name doesn’t need to spell it out?

There is no need to underline that AI is as much a part of our lives and our life equipment as the computer. But the devil is in the details, so if you're wondering where things stand in January 2024, this is the picture among the world's leading hearing aid brands.


We asked four questions:

1. In what functions of your hearing aids is artificial intelligence technology currently deployed?

2. In terms of importance in both hearing aid composition and product marketing, what rating (out of 10) does AI have for your brand?

3. Is the future of hearing aids all about AI: Yes? No? Explain.

4. Please put into words your brand's stance on artificial intelligence in hearing devices.





Thomas Behrens, Vice President Audiology at Demant (Hearing Aids Group):

Oticon’s BrainHearing approach to technology starts with the premise that the brain needs access to the full sound scene. Speech, environmental sounds and background noise should be balanced such that the most useful sounds are kept in the foreground, while the noise is pushed into the background. Traditional approaches to noise reduction use man-made algorithms to instruct hearing aids to reduce background noise, but these algorithms lack the precision to balance the sound scene beyond a simple speech vs noise classification. In Oticon Polaris and Polaris R platform hearing aids (currently Real, Own and Play PX), a Deep Neural Network (DNN) was used to push the possibilities of noise reduction, providing hearing aid users with more precise, effective noise reduction and at the same time more detail in sound.

A DNN is a special class of artificial intelligence where an advanced algorithm structure that resembles brain networks learns how to do a task through repeated exposure to relevant situations in ways that are similar to human learning, rather than being given an instruction. DNNs enable far more complex tasks to be deployed, in this case balancing the sound scene. The DNN in Oticon hearing aids has been trained with 12 million sound scenes and has learnt to perfectly balance sound environments, before the hearing aids even reach the ears of users. Because the DNN is embedded on the chip and fully trained, hearing aid users can enjoy the benefits of precise noise reduction right from the first time they put them on, and in any situation they find themselves.
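The learning-by-exposure idea can be made concrete with a toy sketch. The data, dimensions and training loop below are all invented for illustration; this is in no way Oticon's on-chip DNN or its 12 million training scenes. A tiny two-layer network learns, purely from labelled example scenes, to keep louder, speech-dominant frequency bands in the foreground and attenuate the rest, instead of following a hand-written rule:

```python
import numpy as np

# Toy network that learns per-band gains from example "sound scenes" rather
# than following a hand-written noise-reduction rule. Purely illustrative.
rng = np.random.default_rng(0)
n_bands, n_scenes = 8, 2000

X = rng.random((n_scenes, n_bands))   # per-band energies in [0, 1]
Y = (X > 0.5).astype(float)           # target gain: keep the loud bands

W1, b1 = rng.normal(0, 0.5, (n_bands, 16)), np.zeros(16)
W2, b2 = rng.normal(0, 0.5, (16, n_bands)), np.zeros(n_bands)

def forward(X):
    H = np.tanh(X @ W1 + b1)                      # hidden layer
    return H, 1 / (1 + np.exp(-(H @ W2 + b2)))    # per-band gains in (0, 1)

_, P = forward(X)
loss_before = np.mean((P - Y) ** 2)

for _ in range(2000):                 # plain full-batch gradient descent
    H, P = forward(X)
    G = (P - Y) / n_scenes            # logistic-loss gradient at the output
    GH = (G @ W2.T) * (1 - H ** 2)    # backpropagate through tanh
    W2 -= H.T @ G
    b2 -= G.sum(axis=0)
    W1 -= X.T @ GH
    b1 -= GH.sum(axis=0)

_, P = forward(X)
loss_after = np.mean((P - Y) ** 2)
print(f"MSE before training: {loss_before:.3f}, after: {loss_after:.3f}")
```

The point of the sketch is the one made in the text: nobody wrote a rule saying "attenuate quiet bands"; the behaviour emerges from repeated exposure to labelled scenes, and once the weights are fixed ("embedded on the chip"), applying them to new input is cheap.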

Oticon      Oticon Real includes an on-board Deep Neural Network (DNN).


GN Hearing

AI has been integrated into hearing aids for many years, and drives many of our technological developments. A prime example is our Environmental Classifier that is constantly scanning and adjusting to the environment, recognising and classifying and adapting to complex listening environments in real time so that the wearer can hear their best as they go through their day. The accuracy and speed of this classification is key to ensuring our hearing aids are activating the right features at the right time.

Another example of our use of AI in our hearing aids is our 360 All-Around directionality. This unique approach to directionality provides the user with the ease of automatic adjustments that support the way the brain processes all different types of sound environments, from listening in quiet to the most difficult speech-in-noise situations. Together, the Environmental Classifier and 360 All-Around underpin our Organic Hearing philosophy, to provide all the auditory information our wearers need for their brains to instinctively make the right decisions on what they want to focus on at any given time.

This, for us, encapsulates the role of AI in hearing: to empower and enable natural hearing, rather than trying to guess the intentions of the wearer and make decisions on their behalf.



At Signia, we have been utilising intelligent learning in our signal processing since launching BestSound Technology in 2010. This technology introduced SoundLearning, an automated intelligent learning algorithm, resulting in the instant personalisation of hearing aid parameters based on user preference. These were the first steps toward integrating AI into our hearing aids.
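The principle behind learning hearing aid parameters from user behaviour can be sketched in a few lines. This is an invented toy, not Signia's SoundLearning algorithm: each time the wearer manually adjusts the gain in a given sound scene, the stored preference moves a fraction of the way towards that adjustment, so repeated tweaks gradually become the default.

```python
from collections import defaultdict

class PreferenceLearner:
    """Toy preference learner; NOT Signia's SoundLearning algorithm."""

    def __init__(self, rate=0.2):
        self.rate = rate
        self.offset_db = defaultdict(float)   # learned gain offset per scene

    def record_adjustment(self, scene, adjustment_db):
        # Exponential moving average towards the wearer's manual adjustment.
        self.offset_db[scene] += self.rate * (adjustment_db - self.offset_db[scene])

    def personalised_gain(self, scene, base_gain_db):
        return base_gain_db + self.offset_db[scene]

learner = PreferenceLearner()
for _ in range(10):            # the wearer keeps adding +2 dB in restaurants
    learner.record_adjustment("restaurant", 2.0)
print(round(learner.personalised_gain("restaurant", base_gain_db=20.0), 1))
```

The moving average is a deliberate design choice in the sketch: it converges towards the wearer's repeated preference instead of growing without bound, and a one-off adjustment only nudges the default slightly.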



Dave Fabry, Chief Innovation Officer at Starkey:

1. Detecting and dealing with challenging listening environments

Using an onboard deep neural network, Genesis aids calculate and adjust incoming sounds at a rate of 80 million adjustments an hour.

This makes it far easier to hold conversations in busy social environments and outdoors, where wind noise can be reduced more than ever.

In addition, Edge Mode+ is an AI-driven feature that scans the listening environment. When selected by the hearing aid wearer, the processor automatically fine-tunes parameters beyond the preprogrammed settings to optimise speech clarity and comfort in challenging listening environments. This makes hearing aid use far easier than juggling multiple listening programs.

2. Fall Detection.

The link between age and risk of falling is well established, and hearing loss increases that risk significantly. Using onboard sensors and AI to detect the characteristic motion of a fall, whether the wearer is standing or seated, the hearing aid can immediately trigger an intelligent alert showing family members or caregivers (selected by the hearing aid user) that a fall has occurred, along with the wearer's location.
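The sensing principle can be illustrated with a toy threshold detector (the thresholds and window length are invented; Starkey's actual fall-detection model is not public): a fall shows up in accelerometer data as a sharp impact spike followed by a period of stillness.

```python
# Illustrative fall detector on accelerometer data; NOT Starkey's algorithm.
def detect_fall(accel_g, impact_g=2.5, still_g=0.3, still_samples=50):
    """accel_g: |acceleration| deviations from 1 g, one value per sample."""
    for i, a in enumerate(accel_g):
        if a >= impact_g:  # sudden impact
            window = accel_g[i + 1 : i + 1 + still_samples]
            # Stillness after the impact suggests the wearer is down.
            if len(window) == still_samples and max(window) <= still_g:
                return i  # sample index of the suspected fall
    return None

# Normal walking: small fluctuations, no alert.
walking = [0.2, 0.1, 0.3, 0.2] * 30
# A fall: impact spike, then lying still.
fall = [0.2, 0.1, 3.1] + [0.05] * 60
print(detect_fall(walking))  # None
print(detect_fall(fall))     # 2
```

Requiring stillness *after* the spike is what separates a fall from, say, energetically sitting down; production systems refine this with learned models rather than fixed thresholds.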

3. Digital Assistant.

Genesis hearing aids use their microphones to actively process speech commands that drive the hearing aid controls. A Genesis user can simply ask their hearing aids to turn up the volume, change programmes or start an accessory by speaking. They can also ask a question of the web and receive the answer via the hearing aids.
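Stripped to its core, such a pipeline recognises a phrase, matches it to an intent and dispatches a control action. The sketch below illustrates only the matching step; the trigger phrases and intent names are invented, and Starkey's assistant is not public:

```python
# Toy intent matcher for spoken hearing aid commands (all names invented).
INTENTS = {
    "volume_up":       ("turn up", "louder", "volume up"),
    "volume_down":     ("turn down", "quieter", "volume down"),
    "next_programme":  ("change programme", "change program", "next programme"),
    "start_accessory": ("start my", "connect my"),
}

def dispatch(utterance):
    """Return the first intent whose trigger phrase appears in the utterance."""
    text = utterance.lower()
    for intent, phrases in INTENTS.items():
        if any(p in text for p in phrases):
            return intent
    return None

print(dispatch("Please turn up the volume"))   # volume_up
print(dispatch("Change programme to music"))   # next_programme
```

A real assistant would of course put a speech recogniser in front of this and a learned intent classifier in place of the substring match, but the control flow (utterance in, device action out) is the same.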

4. Translation.

In conjunction with their smartphone, Genesis owners can use their hearing aids to receive translations of many different languages – handy when on holiday or a foreign business trip.

5. Activity tracking.

Genesis devices are equipped with accelerometers and gyroscopes that can detect motion and, in combination with AI, are able to track daily activities and also the type of activity users have participated in, including indoor biking, aerobics and running.

6. Reminders.

A great AI-driven feature to help remember important daily routines such as taking medication, drinking more fluids, and cleaning, or even wearing, your hearing aids!



Phonak hearing aids have utilised artificial intelligence technology for more than two decades. For many years, AI using machine learning (ML) has performed two tasks in hearing aids: 1) classification of sound environments and 2) application of parameters for sound processing. Phonak introduced this technology in 1999 with the launch of Claro, featuring Auto Select, which distinguished between two situations. At that time, the algorithms used were not as sophisticated as today's, and far less data was used to train the system.

Phonak’s latest implementation is the use of machine learning to train our hearing aid operating system, AutoSense OS. This technology was developed and trained using thousands of real-world sound recordings. AutoSense OS scans the environment 700 times per second, automatically detecting and analysing incoming sound in real-time based on the listening environment. It then instantly activates the appropriate blend of gain, programmes, noise reduction and other features, intelligently choosing from more than 200 unique setting combinations. The result is a truly personalised hearing experience.
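Stripped to its essentials, a classify-then-configure loop of this sort might look like the following sketch. The scene labels, thresholds and preset values are all invented for illustration; the real AutoSense OS blends more than 200 setting combinations rather than picking from three:

```python
# Hypothetical sketch of a classify-then-configure loop; every label and
# value here is invented, not Phonak's AutoSense OS.
from dataclasses import dataclass

@dataclass
class Settings:
    gain_db: float
    noise_reduction: float   # 0..1
    directionality: str

PRESETS = {
    "quiet":           Settings(gain_db=15, noise_reduction=0.1, directionality="omni"),
    "speech_in_noise": Settings(gain_db=20, noise_reduction=0.8, directionality="beam"),
    "music":           Settings(gain_db=12, noise_reduction=0.0, directionality="omni"),
}

def classify(speech_level, noise_level, tonality):
    """Crude rule-based stand-in for an ML sound-scene classifier."""
    if tonality > 0.7:
        return "music"
    if speech_level > 0 and noise_level > 0.5:
        return "speech_in_noise"
    return "quiet"

def configure(speech_level, noise_level, tonality):
    # Run once per analysis frame; the real system does this ~700x per second.
    return PRESETS[classify(speech_level, noise_level, tonality)]

print(configure(speech_level=0.8, noise_level=0.7, tonality=0.2))
```

The value of the ML version over hand-written rules like these is precisely what the text describes: trained on thousands of real-world recordings, the classifier generalises to messy in-between scenes that simple thresholds misjudge.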





Sonova      The AutoSense OS used by Phonak on hearing aids such as its new Phonak Lumity range has been trained by AI to accurately identify a sound environment, orchestrating Lumity’s comfort and speech understanding features.

For Phonak, any technology that helps drive our mission to improve the quality of life for people with hearing loss will always be a 10 out of 10. When it comes to integrating technology like AI into our hearing aids and product marketing, we believe how it’s integrated is more important than how much or even how fast.



GN Hearing

5/10. AI is important as a means to an end, but it is not an end in itself. Our ultimate goals are to provide wearers with the best experiences and the highest levels of satisfaction, and these are driven by the ability to hear speech in noise, provide natural sound quality and offer great connectivity to devices such as phones. Our philosophy of Organic Hearing is based on providing all the auditory information the brain needs to make natural hearing decisions, and AI enables this.

GN Group      A marketing video for GN Resound Nexia highlights, among other things, the benefit of its Hear in Noise programme. But GN’s focus on AI is as part of the balance with human intelligence.



Dave Fabry, Chief Innovation Officer at Starkey:

I would say 10/10. We have been turning it up to 11 since 2018, when we launched Livio. We incorporate AI to provide the best speech intelligibility and sound quality, to promote health and wellness, and to act as a personal digital assistant.



Thomas Behrens, Vice President Audiology at Demant (Hearing Aids Group):

8/10. More and more hearing aid manufacturers are employing artificial intelligence in some way in their hearing aids, so the use of AI in hearing aid marketing could mean less differentiation between brands if creators are not strategic in their choice of training material and teaching algorithms. At Oticon, we are very specific about defining the goals of our AI, carefully selecting training material that is representative of real life and using teaching algorithms capable of making strong, meaningful progress towards the given goal. Whilst AI technologies are an important part of our R&D, Oticon's unique BrainHearing approach will always be central to our hearing solutions and the way we explain their benefits to HCPs and users.



Signia, the WS Audiology brand, pointed out that it does not use a rating system for AI.





GN Hearing

No. We do not believe the future of hearing aids is ALL about AI. AI represents huge opportunities and will play an important role in advancements, but we believe, and have always believed, that the role of a great hearing aid is to provide the brain with all the information it needs so that the wearer can direct their focus and pick out the signal they want to hear, rather than the hearing aid making these decisions on behalf of the wearer. AI should be viewed as a method of achieving these goals, without losing sight of the need to empower the wearer to hear what they want to hear in any listening situation.



Phonak recognises AI as a significant catalyst for innovation in certain aspects of hearing aid technology in the short term. However, we also acknowledge the importance of other technological advancements in meeting the needs of hearing care professionals (HCPs) and their patients, especially in areas like speech understanding, sound quality, and comfort. Instead of incorporating unnecessary functionalities, we aim to enhance their hearing experience in real-life situations, particularly in challenging environments with background noise. Our objective is to provide practical and effective solutions that truly benefit our users.



Thomas Behrens, Vice President Audiology at Demant (Hearing Aids Group):

Yes and no! To push the boundaries of what is possible in hearing technology, more and more complex algorithms will be required. Oticon is at the forefront of this trend in technology development. However, the technology itself is not the only consideration; it’s how you use it and what you want to achieve with it. Oticon will continue to use cutting-edge technologies in a range of fields to bring even more BrainHearing benefits to our hearing aid users.



PW      “We know AI will provide further opportunities for hearing health and overall health using hearing instruments,” says Dave Fabry, Starkey’s Chief Innovation Officer, photographed at the UK launch of Genesis AI in Manchester, October 2023.

Dave Fabry, Chief Innovation Officer at Starkey:

No, it’s all about caring. We believe that optimal patient benefits are delivered with our technology in the professional’s hands. AI just provides further capabilities to improve patient outcomes and hearing aid success. Without the professional this is not possible to achieve.



Signia – WS Audiology

Hearing needs and preferences are highly individual and subject to change over time. Having an AI companion that learns from you and can adapt with you is therefore a very helpful tool: one of AI's main benefits to the wearer is providing personalised optimisations in real time. Such personalised solutions allow wearers to reduce travel time to the clinic and spend less time with the hearing aids performing sub-optimally, as the user has support with simple troubleshooting. By providing feedback to the Hearing Care Professional (HCP) on these personalisation efforts, such solutions also facilitate a more informed conversation between wearer and HCP about challenges and expectations.




Dave Fabry, Chief Innovation Officer at Starkey:

AI plus human interaction and expertise from hearing care providers delivers optimal patient outcomes. We know AI will provide further opportunities for hearing health and overall health through hearing instruments, and our goal is to provide these tools to extend the benefit of hearing devices through the expert work of hearing professionals globally.



In the realm of hearing aids, Phonak firmly believes that the integration of artificial intelligence (AI) and its subsets, such as machine learning and deep neural networks, holds immense potential. By leveraging AI, significant advancements in functionality and user experience can be achieved. Phonak is committed to exploring the untapped capabilities of AI, particularly in the field of signal processing.

Responsible marketing of AI: Phonak’s approach to AI in hearing aids is grounded in responsible marketing. While embracing the excitement surrounding AI technology, Phonak remains steadfast in the belief that transparency is key. It is essential to openly acknowledge the current limitations of AI and avoid making exaggerated claims packed with buzzwords simply for the sake of trendiness. The focus is on delivering genuine advancements and leveraging AI to enhance the overall patient experience.

The role of skilled professionals in AI implementation: At Phonak, we recognise the vital role of hearing care professionals in achieving the best patient outcomes. While AI technology continues to evolve, it is important to underscore the significance of the combined efforts of innovation and skilled hearing care professionals. The integration of advanced technology with expert care and guidance ensures the optimal utilisation of AI capabilities in hearing aids, ultimately benefiting the end-users.



GN Hearing

We believe the balance between human intelligence and artificial intelligence is critical, and we strive to balance nature with science to create solutions that help people connect to what matters most to them.



Thomas Behrens, Vice President Audiology at Demant (Hearing Aids Group):

Oticon      How Oticon imagines its miniRITE, Real: an image that bespeaks the complex computing at the heart of its functions.

Artificial intelligence is just one of the important developments that will help improve benefits for hearing aid users. Sensor technology is another, but we have yet to see sensors used to create new ways of solving unmet needs for people with hearing loss. Researching how we may create new solutions to unmet needs is therefore one of our primary goals, so that we can add further layers of precision to the reduction of background noise and improve precision amplification. New measurement and fitting tools, such as the Audible Contrast Threshold test, will equip HCPs to provide even more personalised fittings.



Signia – WS Audiology      Marketing for the Signia Integrated Xperience emphasises group conversation benefits.

Today, the application of AI in modern hearing aids is a lever for even more precise help in real-life situations outside the clinic, leading to more personalised and optimised hearing aid settings for the individual. AI is also a lever to provide real-life insights about the wearer to the hearing care professional, leading to efficient and successful fine-tuning. It can consider and account for how the wearer feels, their primary concerns, and their responses to amplification. All of this results in highly personalised hearing care, focusing on what matters most in the wearer’s life with relevant counselling and support.


Source: Audiology News UK Issue 06 January – February 2024