AirPods Pro Become Hearing Aids in iOS 14
Apple tackles hearing loss accessibility with “Headphone Accommodations” and AirPods Pro
Doctor of Audiology
The long-awaited feature—Headphone Accommodations in Transparency Mode—is now available (as of the September 14 release of AirPods Pro firmware version 3A283). After upgrading, restart your devices to enable the new functionality.
I have now tested Headphone Accommodations in Transparency Mode and can confirm that it works, within limits. The amplification is not perfect, and AirPods won’t help with more severe hearing losses, but for those with mild-to-moderate hearing loss, they should provide some benefit.
Adam Carlan reviews the new AirPods Pro functionality. Fast forward to about 5:30 to see Adam's walkthrough of the new settings. Closed captions are available on this video.
When the news broke
Huge news from Apple on the accessibility front! Buried near the bottom of their iOS 14 press release, under “Additional iOS Features”, Apple included a mention of the new “Headphone Accommodations” accessibility feature, which “amplifies soft sounds and tunes audio to help music, movies, phone calls, and podcasts sound crisper and clearer.”
In the iOS 14 feature preview, Apple states that “Headphone Accommodations also supports Transparency mode on AirPods Pro, making quiet voices more audible and tuning the sounds of your environment to your hearing needs.” This is the extremely exciting part, as it indicates that AirPods can now provide some of the same functionality that you might expect from a hearing aid: personalized amplification (and noise reduction) to make it easier to hear those around you.
Dr. Cliff Olson, Audiologist and founder of Applied Hearing Solutions in Phoenix, Arizona, discusses the news that iOS 14 will give the Apple AirPods Pro the ability to amplify sound like hearing aids. Closed captions are available on this video.
Fixing the audio lag problem
While there are plenty of third-party apps that already offer customized AirPods amplification for everyday sounds, the big issue with hearing-aid-like amplification has been the latency (or lag) introduced by processing audio on the phone and transmitting it to the AirPods for playback. “The challenge of using smartphone processing is that auditory information must be presented within 80 milliseconds,” says Chad Ruffin, MD. “If processing and relaying this information cannot occur during this time, it will make communication harder. This is because the lipreading cues can become out of sync with the amplified audio.”
Today’s announcement doesn’t state whether audio processing will happen on the iPhone or onboard the AirPods Pro, but given Apple’s track record on audio so far, it would only make sense to release an important accessibility feature like this if the latency were low enough to keep the amplified audio in sync with lipreading cues. [See update below; we have an answer on this.]
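For developers curious about where that 80-millisecond budget goes, AVFoundation exposes the relevant numbers. Below is a minimal Swift sketch (ours, not Apple’s implementation) that adds up hardware input latency, output latency, and one I/O buffer to estimate whether a phone-processed path fits the budget; the helper name and the default budget value are our own.

```swift
import AVFoundation

// Minimal sketch (ours, not Apple's): estimate whether a phone-processed
// audio path fits inside the ~80 ms budget cited above.
func fitsLatencyBudget(budgetSeconds: TimeInterval = 0.080) -> Bool {
    let session = AVAudioSession.sharedInstance()
    // Hardware input latency + output latency + one I/O buffer of
    // processing headroom. (Bluetooth transport adds more on top.)
    let estimate = session.inputLatency
                 + session.outputLatency
                 + session.ioBufferDuration
    print(String(format: "Estimated path latency: %.1f ms", estimate * 1000))
    return estimate <= budgetSeconds
}
```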
Setting up Headphone Accommodations
For those with early access to iOS 14, go to Settings ⇾ Accessibility ⇾ Audio/Visual to turn on Headphone Accommodations. You will see a “Custom Audio Setup” link which—based on the leaked screenshots below—appears to guide the user through a series of listening tests. The results of the listening tests presumably enable clearer speech for phone calls, media, and real-world conversation (AirPods Pro only).
Detailed walkthrough instructions
For a more detailed walkthrough of setting up Headphone Accommodations in Transparency Mode, check out this new video from Adam Carlan.
Closed captions are available on this video.
Using an “audiogram” to customize amplification
An option to use “an audiogram from [Apple] Health to customize your audio” is also displayed in the leaked screenshots. This seems to indicate that the AirPods Pro will be capable of providing a very fine-tuned custom amplification experience, based on the audiogram (pitch-by-pitch hearing abilities) unique to the user. With third-party apps like Mimi, you can test your hearing and generate an audiogram, and with iOS 14, it looks like that audiogram can serve as the foundation for personalized amplification.
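For context, audiograms stored in Apple Health are exposed to developers through HealthKit as HKAudiogramSample objects. The sketch below is our illustration, not Apple’s tuning code: it reads the most recent audiogram and prints the per-frequency hearing thresholds that a feature like this could draw on.

```swift
import HealthKit

// Minimal sketch (illustrative only): fetch the newest audiogram stored
// in Apple Health and print its per-frequency hearing thresholds.
let store = HKHealthStore()
let audiogramType = HKObjectType.audiogramSampleType()

store.requestAuthorization(toShare: nil, read: [audiogramType]) { granted, _ in
    guard granted else { return }
    let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate,
                                       ascending: false)
    let query = HKSampleQuery(sampleType: audiogramType,
                              predicate: nil,
                              limit: 1,
                              sortDescriptors: [newestFirst]) { _, samples, _ in
        guard let audiogram = samples?.first as? HKAudiogramSample else { return }
        // Each sensitivity point is a hearing threshold (dB HL) at one frequency.
        for point in audiogram.sensitivityPoints {
            let hz = point.frequency.doubleValue(for: .hertz())
            if let left = point.leftEarSensitivity?
                .doubleValue(for: .decibelHearingLevel()) {
                print("\(Int(hz)) Hz: left ear threshold \(left) dB HL")
            }
        }
    }
    store.execute(query)
}
```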
Skipping “Custom Audio Setup”
If you don’t feel like going through the listening tests, or loading in an audiogram, you’re in luck. Apple is also providing some general-purpose audio enhancement options that should help those with milder forms of hearing loss. In the “Headphone Audio” settings, you will have the option to “tune audio” based on the following options:
- Balanced Tone
- Vocal Range
- Brightness
Apple will also introduce a slider to increase the “boost” for soft sounds, from “slight” to “strong”. The new audio settings will work with Transparency mode on AirPods Pro, and with phone calls and media on AirPods Pro, AirPods (2nd generation), EarPods, Powerbeats, Powerbeats Pro, and Beats Solo Pro.
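Apple hasn’t published how the slider maps to gain, but conceptually a “boost soft sounds” control behaves like level-dependent gain (upward compression): quiet inputs get more amplification than loud ones. Here is a purely illustrative sketch; the threshold, scaling factor, and cap are invented for the example.

```swift
// Purely illustrative: one way to model a "boost soft sounds" control as
// level-dependent gain. Apple has not published its actual mapping; the
// -40 dB threshold, 0.5 scaling, and 25 dB cap are invented values.
func softSoundGain(inputLevelDB: Double, boost: Double) -> Double {
    let threshold = -40.0                 // levels below this count as "soft" (assumed)
    guard inputLevelDB < threshold else { return 0 }  // loud sounds: no extra gain
    // The quieter the sound, the more gain, scaled by the slider (0...1).
    let deficit = threshold - inputLevelDB
    return min(deficit * boost * 0.5, 25) // cap the added gain (assumed)
}

// Example: a -60 dB sound with the slider at "strong" (1.0) gets +10 dB.
let gain = softSoundGain(inputLevelDB: -60, boost: 1.0)
```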
What we have learned since the announcement
After the announcement in June, I was in direct contact with some folks over at Apple to learn more about the upcoming Headphone Accommodations feature. I learned a lot about Apple’s history of supporting the Deaf and Hard of Hearing communities, and had some of my burning AirPods questions answered.
Apple’s accessibility history
Apple has a long history of supporting the Deaf and Hard of Hearing communities. Apple was the first to enable Teletype (TTY) and real-time text (RTT) calling directly on device. For the uninitiated, TTY and RTT provide a text-based chat alternative to voice for those who are not able to hear clearly on the phone. With TTY, the user must hit the send button before the chat is transmitted, and with RTT, chats are sent in real-time, as text is typed.
In 2014, Apple launched the Made for iPhone hearing aid program and designed a completely new Bluetooth Low Energy protocol for the hearing industry. This enabled—for the first time—seamless audio streaming connectivity between a hearing device and smartphone. Hearing aid wearers could easily stream phone calls, FaceTime calls, music, Siri, etc., with clear sound. Apple licensed this technology for free to hearing aid manufacturers. Today, Apple technology is built into more than 100 hearing aid and cochlear implant models.
In 2018, Apple introduced Live Listen on AirPods, enabling customers to use their iPhone as a remote directional microphone. Remote microphones are widely endorsed by the hearing healthcare industry and provide better hearing in noisy settings, lecture halls, or when sitting far away from the speaker. Live Listen also works with Made for iPhone hearing aids.
That brings us to today, where we are eagerly awaiting Apple’s latest accessibility marvels—Headphone Accommodations and personalized amplification for everyday sounds via AirPods Pro.
Personalized amplification onboard AirPods Pro
We learned from the iOS 14 announcement that Apple would be supporting personalized amplification in Transparency mode for AirPods Pro only. This led me to wonder whether Apple would be processing the customized amplification onboard the AirPods Pro with the embedded H1 chip. Via my communications with Apple, I have confirmed that the H1 chip, with 10 audio cores, will indeed be used to locally process the sound, and Apple is promising “incredibly low audio processing latency.” This is huge news for millions of AirPods Pro owners!
Other important accessibility updates from Apple
While Headphone Accommodations and personalized amplification on AirPods Pro may have stolen the show, there are plenty of additional hearing accessibility updates coming this fall. Here’s a quick summary of some of the other announcements:
Sound Recognition
A new setting that will notify a user on their device about sounds or alerts detected by their iPhone, iPad, or iPod Touch. These cover a range of categories, such as sirens, smoke alarms, doorbell chimes, and appliance beeps.
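Sound Recognition itself is a system setting with no public API, but Apple’s SoundAnalysis framework illustrates the underlying pattern: stream microphone audio through a sound classifier and react to high-confidence detections. The sketch below assumes a placeholder Core ML sound-classification model; “SirenClassifier.mlmodelc” is hypothetical, not a real Apple asset.

```swift
import AVFoundation
import CoreML
import SoundAnalysis

// Sketch of the general pattern only (playground-style; wrap the `try`s
// in do/catch in a real app). Requires microphone permission.
final class AlertObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        // A real feature would post a visible or haptic notification here.
        print("Detected \(top.identifier) (confidence \(top.confidence))")
    }
}

let engine = AVAudioEngine()
let format = engine.inputNode.outputFormat(forBus: 0)
let analyzer = SNAudioStreamAnalyzer(format: format)
let observer = AlertObserver()

// "SirenClassifier.mlmodelc" is a hypothetical placeholder model.
let modelURL = URL(fileURLWithPath: "SirenClassifier.mlmodelc")
let model = try MLModel(contentsOf: modelURL)
let request = try SNClassifySoundRequest(mlModel: model)
try analyzer.add(request, withObserver: observer)

engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
    analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
}
try engine.start()
```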
Group FaceTime
Group FaceTime now detects when a participant is using sign language and makes the person prominent in a video call.
RTT Improvements
Apple has made it simpler for RTT users to engage with calls and incoming RTT messages through notifications — even when they’re not in the Phone app and don’t have RTT conversation view enabled.
Hearing health
Following the introduction of the Noise app in watchOS 6 that measures ambient sound levels and duration of exposure, watchOS 7 adds further support for hearing health with headphone audio notifications.
Through iOS 14 and watchOS 7, users can now see how loudly they are listening to media through their headphones on iPhone, iPod touch, or Apple Watch, and whether those levels may impact hearing over time. The hearing control panel shows in real time whether the current playback level exceeds recommended limits, and when total exposure reaches the safe weekly listening amount, Apple Watch notifies the wearer.
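The exposure data behind these notifications is logged to HealthKit, where apps can read it as headphone audio exposure samples. A minimal sketch, again our illustration rather than Apple’s notification logic:

```swift
import HealthKit

// Minimal sketch: read the headphone audio exposure samples that iOS and
// watchOS log to HealthKit. The system handles the actual notifications;
// this only shows where the underlying data lives.
let store = HKHealthStore()
let exposureType = HKQuantityType.quantityType(
    forIdentifier: .headphoneAudioExposure)!

store.requestAuthorization(toShare: nil, read: [exposureType]) { granted, _ in
    guard granted else { return }
    let query = HKSampleQuery(sampleType: exposureType, predicate: nil,
                              limit: 10, sortDescriptors: nil) { _, samples, _ in
        for case let sample as HKQuantitySample in samples ?? [] {
            let level = sample.quantity.doubleValue(
                for: .decibelAWeightedSoundPressureLevel())
            print("Headphone level: \(level) dBA starting \(sample.startDate)")
        }
    }
    store.execute(query)
}
```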