
Apple initially wanted AirPods Pro 2’s Adaptive Audio to work with GPS

The AirPods Pro (2nd-Gen) currently monitor your environment in real time and make Adaptive Audio decisions intelligently

Apple’s 2nd-Gen AirPods Pro offer several top-of-the-line features, but they truly shine thanks to ‘Adaptive Transparency,’ which protects your hearing by automatically lowering the volume of loud noises around you.

With iOS 17, however, Apple’s 2nd-Gen AirPods Pro (USB-C and Lightning) gain new Adaptive Audio functionality, which uses adaptive noise control, personalized volume and conversation awareness to automatically fine-tune your audio experience based on your surroundings.

Adaptive Noise Control combines Active Noise Cancellation and Transparency mode to customize the level of noise control based on the conditions in your environment, while Personalized Volume automatically adjusts the volume of the AirPods Pro (2nd-Gen) based on your surroundings and your volume preferences.

Lastly, Conversation Awareness essentially lowers the volume of your media when you start talking to someone, and automatically enhances the volume of the voice of the person you’re talking to. Once the conversation ends, the feature automatically turns the volume of your media back up.

These new Adaptive Audio features are available on the iPhone XS, iPhone XR, iPhone SE (2nd-Gen) and later models running iOS 17 or later, when paired with the AirPods Pro (2nd-Gen).

Now, in an interview with TechCrunch, Apple’s Vice President of Sensing and Connectivity, Ron Huang, revealed that Apple initially wanted the feature to be influenced by the user’s GPS location. For example, the device would be able to detect that you’re on a busy road and make sure you hear sounds like horns, whereas when it detects that you’re in your house, it would focus more on noise cancellation and conversation awareness.

However, in testing, the GPS-enabled feature failed to impress. According to Huang:

“During early exploration for Adaptive Audio, we basically put you in ANC versus transparency, based on where you are. You can imagine the phone can give a hint to the AirPods and say, ‘hey, you’re in the house’ and so forth. That is a way to do that, but after all our learnings, we don’t think that is the right way to do it, and that is not what we did. Of course, your house is not always quiet and the streets are not always loud. We decided that, instead of relying on a location hint from the phone, the AirPods monitor your environment in real time and make those decisions intelligently on their own.”

Read the complete interview here.

Source: TechCrunch Via: MacRumors

