Apple iOS 16 allows users to personalise spatial audio for AirPods using the iPhone’s camera

(Image credit: Apple)

Apple's WWDC conference has officially started, and there's already some interesting spatial audio news to report.

During the keynote address, Apple's senior vice president of software engineering, Craig Federighi, announced that in iOS 16 users will be able to use the iPhone's TrueDepth camera to create a personalised spatial audio profile.

Apple spatial audio is designed to deliver surround sound and 3D audio through headphones – optimally through the AirPods 3, AirPods Pro, AirPods Max and Beats Fit Pro – and first arrived as part of iOS 14.

Apple currently uses physical data from thousands of different people, measuring how sound reaches their ears from multiple directions, to create a generic head-related transfer function (HRTF) that most closely matches the average person's perceptual response. This is done so spatial audio can work for as many people as possible.
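For the technically curious, here is a minimal sketch in Swift of the averaging idea behind a generic HRTF. Everything in it is hypothetical – Apple has not published its method, and real pipelines are far more sophisticated – but it shows the gist: for each measured direction, average the impulse responses captured from many subjects.

import Foundation

// Hypothetical sketch: deriving a "generic" HRTF by averaging measured
// head-related impulse responses across many subjects, per direction.
// Data, types and function names are illustrative, not Apple's.

struct Direction: Hashable {
    let azimuth: Int    // degrees around the listener, 0 = straight ahead
    let elevation: Int  // degrees above ear level
}

/// Averages every subject's impulse response for each measured direction.
/// All responses for a given direction are assumed to be the same length.
func genericHRTF(from subjects: [[Direction: [Float]]]) -> [Direction: [Float]] {
    var sums: [Direction: [Float]] = [:]
    var counts: [Direction: Int] = [:]

    for subject in subjects {
        for (direction, response) in subject {
            if var running = sums[direction] {
                // Accumulate sample by sample.
                for i in 0..<min(running.count, response.count) {
                    running[i] += response[i]
                }
                sums[direction] = running
            } else {
                sums[direction] = response
            }
            counts[direction, default: 0] += 1
        }
    }

    // Divide each accumulated response by its number of contributors.
    var generic: [Direction: [Float]] = [:]
    for (direction, sum) in sums {
        let n = Float(counts[direction] ?? 1)
        generic[direction] = sum.map { $0 / n }
    }
    return generic
}

// Example: two subjects, one direction, four-sample responses.
let ahead = Direction(azimuth: 0, elevation: 0)
let subjectA: [Direction: [Float]] = [ahead: [1.0, 0.5, 0.2, 0.0]]
let subjectB: [Direction: [Float]] = [ahead: [0.8, 0.7, 0.0, 0.1]]
print(genericHRTF(from: [subjectA, subjectB])[ahead] ?? [])
// [0.9, 0.6, 0.1, 0.05]

Crude as it is, the sketch illustrates the limitation Apple is addressing: an averaged profile can never quite match the geometry of any one listener's ears, which is where personalisation comes in.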

But where does the personalised element come in, exactly? In today's announcement, Apple said that users would now be able to create a personal sonic profile using the TrueDepth camera on their iPhone. Presumably, users will have to photograph their ears from various angles so the images can be analysed.

