Apple iOS 16 allows users to personalise spatial audio for AirPods using the iPhone’s camera

(Image credit: Apple)

Apple's WWDC conference has officially started and there's already some interesting Apple spatial audio news to report.

During the keynote address from senior vice president (SVP) of software engineering Craig Federighi, it was announced that in iOS 16, users will be able to use the iPhone's TrueDepth camera to create a personalised spatial audio profile.

Apple spatial audio is designed to deliver surround sound and 3D audio through headphones – optimally through the AirPods 3, AirPods Pro, AirPods Max and Beats Fit Pro – and first arrived as part of iOS 14.

Essentially, it takes 5.1, 7.1 and Dolby Atmos signals and applies directional audio filters, adjusting the frequencies each ear hears so that sounds can be placed virtually anywhere in 3D space – in front of you, to the sides, behind you and even above – to recreate the experience of listening in a real room.
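For readers who like to see the principle in code, here is a minimal sketch using AVFoundation's AVAudioEnvironmentNode, which can render a mono source with an HRTF-based algorithm and place it anywhere around the listener. To be clear, this is not Apple's AirPods spatial audio pipeline, just the same underlying idea; "voice.caf" is a placeholder name for a mono clip assumed to be in the app bundle.

```swift
import AVFoundation

// Sketch: position a mono source in 3D space using HRTF-style rendering.
func playSpatialised() throws {
    let engine = AVAudioEngine()
    let environment = AVAudioEnvironmentNode()
    let player = AVAudioPlayerNode()

    engine.attach(environment)
    engine.attach(player)

    // Placeholder mono file assumed to exist in the app bundle.
    guard let url = Bundle.main.url(forResource: "voice", withExtension: "caf") else { return }
    let file = try AVAudioFile(forReading: url)

    // Mono sources routed through the environment node can be spatialised.
    engine.connect(player, to: environment, format: file.processingFormat)
    engine.connect(environment, to: engine.mainMixerNode, format: nil)

    // Request HRTF rendering and place the source above and behind the listener.
    player.renderingAlgorithm = .HRTFHQ
    player.position = AVAudio3DPoint(x: 0.0, y: 1.0, z: 2.0)

    try engine.start()
    player.scheduleFile(file, at: nil)
    player.play()
}
```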

To achieve this feat, Apple uses Head-Related Transfer Functions (HRTFs), a mathematical way to describe and offset the physical differences between listeners that affect their perception of space, from the shape of a person's ears to the width and shape of their head.
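As a simplified illustration of what an HRTF filter does in practice (the impulse responses below are invented placeholders, not measured data): the same mono signal is filtered differently for each ear, and the resulting timing and level differences between ears are what the brain reads as direction.

```swift
// Filter a mono signal with a separate impulse response per ear.
func convolve(_ signal: [Float], with impulse: [Float]) -> [Float] {
    var output = [Float](repeating: 0, count: signal.count + impulse.count - 1)
    for (i, s) in signal.enumerated() {
        for (j, h) in impulse.enumerated() {
            output[i + j] += s * h
        }
    }
    return output
}

// Made-up responses for a source off to the listener's right:
// the right ear hears it slightly earlier and louder than the left.
let rightEarHRIR: [Float] = [0.9, 0.2, 0.05]
let leftEarHRIR: [Float]  = [0.0, 0.0, 0.5, 0.15, 0.04] // delayed and attenuated

let monoSignal: [Float] = [1, 0, 0, 0, 0.5, 0, 0, 0]
let leftChannel  = convolve(monoSignal, with: leftEarHRIR)
let rightChannel = convolve(monoSignal, with: rightEarHRIR)
```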

Apple currently uses physical data from thousands of different people, measuring how their ears respond to sound arriving from many different directions, to create a generic HRTF that is closest to the average person's perceptual response. This is done so spatial audio can work for as many people as possible.
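A very rough sketch of that averaging idea, with made-up numbers standing in for real measurements, looks something like this: take each listener's measured impulse response for a given direction and average them into one "generic" response.

```swift
typealias HRIR = [Float] // impulse response samples for one ear and direction

// Average many listeners' measured responses into a single generic one.
func genericHRIR(from measurements: [HRIR]) -> HRIR {
    guard let length = measurements.first?.count, length > 0 else { return [] }
    var average = [Float](repeating: 0, count: length)
    for response in measurements where response.count == length {
        for i in 0..<length {
            average[i] += response[i] / Float(measurements.count)
        }
    }
    return average
}

// e.g. three (invented) listeners' right-ear responses for one direction:
let measured: [HRIR] = [[0.9, 0.3, 0.1], [0.8, 0.35, 0.12], [1.0, 0.25, 0.08]]
let generic = genericHRIR(from: measured) // ≈ [0.9, 0.3, 0.1]
```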

But where does the personalised element come in, exactly? In today's announcement, Apple said that users would now be able to create a personal sonic profile using the TrueDepth camera on their iPhone. Presumably, the user will have to take photos of their ears from various angles before they're analysed.

We've seen companies use similar technology before. Most notably, Sony's Headphones Connect app lets users scan their ears with their phone's camera to optimise the performance of its WH-1000XM4, WF-1000XM4 and WH-1000XM5 models when listening to content in its own spatial audio format, Sony 360 Reality Audio, while Creative's SXFI app also asks for ear and head photos to improve its sonic delivery.

Apple iOS 16 brings a host of improvements to the iPhone, including an overhauled lock screen and new privacy features. The public beta will arrive in July, with a full rollout slated for later in 2022 for the iPhone 8 and newer.

More

11 of the best spatial audio tracks in Dolby Atmos on Apple Music

Apple spatial audio: what is it? How do you get it? And is it like Dolby Atmos?

These are the best wireless earbuds money can buy

Mary is a staff writer at What Hi-Fi? and has over a decade of experience working as a sound engineer mixing live events, music and theatre. Her mixing credits include productions at The National Theatre and in the West End, as well as original musicals composed by Mark Knopfler, Tori Amos, Guy Chambers, Howard Goodall and Dan Gillespie Sells.