I tried spatial audio with head-tracking on Android – and there's good news and bad news

Dynamic Spatial Audio
(Image credit: Future)

3D audio has been more of a buzzword than 3D anything else for some years now, and since Apple introduced its own 3D audio technology – called ‘spatial audio’ – in 2021, the term has been bandied around at nearly every Apple device launch event. Apple’s spatial audio has been an enjoyable addition to the iOS and AirPods experience… for iOS and AirPods users. But what about Android users? 

Well, many non-Apple devices support immersive sound technologies like Dolby Atmos and 360 Reality Audio, and from next year we will start seeing other phone and headphone manufacturers launch models with a spatial audio experience. That’s because, as well as Android 13 laying down the framework for spatial audio on Android, telecoms giant Qualcomm is launching new chips that support its own version of the technology – Dynamic Spatial Audio – for the first time, and they will begin appearing in devices in the coming months. The new Snapdragon 8 Gen 2 phone chip will likely come to many flagship Android phones, while the new S3 or S5 Gen 2 audio chip is the one headphone and speaker brands will need to snap up (and Bose looks set to be one of those committed brands).

I attended the company’s annual conference and got a taster of what to expect from Qualcomm’s Dynamic Spatial Audio. So firstly, what is it? Well, like any and all 3D audio technologies and formats, Dynamic Spatial Audio delivers audio content – straight-up music, or soundtracks for films or games – in a broader, more dimensional soundscape in an effort to immerse the listener in sound. The ‘Dynamic’ part refers to something rather crucial – that the soundfield can move and adapt as the listener’s head turns so that the audio is always ‘glued’ to the location of the screen. So, if you’re wearing headphones with your phone in front of you while watching a film and you twist your head to the right, the headphones’ soundfield between your ears shifts more to the left earbud – just as, in a real-life situation, your left ear would hear more of what someone in front of you was saying than your right ear if you turned your head to the right. The idea is that it feels more natural.
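To make the head-tracking principle concrete, here’s a minimal sketch of the idea in Python – my own illustration, not Qualcomm’s implementation (real spatial audio renderers use HRTF filtering and sensor fusion rather than simple gain panning). A source anchored at the screen appears at the negative of the head’s yaw angle, and a constant-power pan law keeps overall loudness steady as the balance shifts between ears:

```python
import math

def head_tracked_gains(head_yaw_deg: float) -> tuple[float, float]:
    """Toy head-tracked stereo panning for a source fixed at the screen.

    Turn your head right (+yaw) and the source appears to move left,
    so the left-ear gain rises -- just as described in the article.
    Illustrative only; not how any shipping renderer actually works.
    """
    # The source stays put, so its apparent angle is the negative of the yaw.
    apparent_deg = -head_yaw_deg
    # Map [-90, +90] degrees to a pan position in [0, 1] (0 = hard left).
    pan = (max(-90.0, min(90.0, apparent_deg)) + 90.0) / 180.0
    # Constant-power pan law: left^2 + right^2 == 1 at every position.
    left = math.cos(pan * math.pi / 2)   # pan = 0 -> left gain 1.0
    right = math.sin(pan * math.pi / 2)  # pan = 1 -> right gain 1.0
    return left, right
```

With the head facing the screen (yaw 0°) both ears get equal gain; turn 90° to the right and the signal collapses fully into the left ear, mirroring the real-life behaviour the paragraph above describes.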

Now, I’ve heard spatial audio through an iPhone/iPad and AirPods 3/Pro before, so I knew roughly what to expect. It seems a bit strange – redundant, even – to have the head-tracking feature for moments when you’re sitting comfortably looking straight ahead at a screen, though the adjustments do kick in for even the subtlest head movements – not just when you’re mimicking an owl for demonstration’s sake (or spinning around and around on the spot, as one other journalist decided to do for his demonstration…). On previous occasions, I have found the effect more worthwhile when casually watching something on a screen in a kitchen while moving around the room, and I can imagine it being even better through a wireless speaker that tracks your movement across a room – a scenario where you don’t want the isolation that headphones inherently provide. That could well materialise, though I imagine the spatial audio experience will first and foremost be offered through headphones.

This particular demo was admittedly modest, comprising a phone in a fixed position on a stand – playing an Adele track and then an instructional walk-through video on the technology – connected over Bluetooth to a pair of unbranded earbuds (which would have housed the sensors necessary to detect the phone’s position). Swap out the demo material for a blockbuster film, the unbranded earbuds for a pair of commercially available earbuds (or, better yet, over-ear headphones), and the journalist-packed conference room for a living room or train carriage, and you have an idea of the real-life setup and use case that will be available from next year. Is it likely you’ll experience this as soon as next year? That rather depends on how many phone and headphone manufacturers opt to use one of Qualcomm’s new Dynamic Spatial Audio-supporting chips, and on whether you’re planning to replace both your phone and earbuds next year. Indeed, both the screen device and the headphones need to have the chips (again, Snapdragon 8 Gen 2 for the phone, and S3 or S5 Gen 2 for the headphones) for them to play ball, so what you’re using now is off the menu, I’m sorry to say.

It’s an opinion I apparently share with many people on the internet who have experienced spatial audio songs on Apple Music through headphones or speakers: spatial audio can sound pretty odd with music, and it can sound pretty cool with it too. That can depend on the way the song was mixed, or just on your preference, really. The Adele song came through a soundstage that sounded as ‘spatialised’ as other times I’ve experienced the Apple variant of the technology: it sounded opened up, and her spotlight vocal and the sparse instrumental accompaniment had more space around them than they do in good ol’ (unspatialised) stereo. Anyone who has heard a Dolby Atmos- or 360 Reality Audio-mixed song through headphones should know what I mean. I turned to the left to avoid the event’s photographer snapping my face and the audio moved to keep aligned with the static screen, Adele’s vocals shifting more into the right ear. It worked; I just don’t know how much that music experience will really catch on.

What’s less up for debate – in my book, at least – is how effective the technology is with film soundtracks (and, I would imagine, gaming too, given the similarly cinematic nature of game soundtracks and effects), which is to say it’s very effective. Indeed, the video clip’s audio had more elements to it than the music, and the spatialising and head-tracking seemed better suited to it, even if it was far from the best demo material or advert for the technology. Like everything, its implementation will be more or less effective depending on the devices’ sound quality and perhaps even their sensors, but done right, this should draw a crowd.

My colleague not long ago wrote about how compelling a movie experience you can get from 'just' an iPad and a pair of AirPods Max, partly thanks to the duo’s spatial skills, and this on-the-go movie experience is where I can see spatial audio thriving. Look out for the technology brandished on headphones, earbuds and speakers next year, particularly if you’re someone who watches a lot of TV or films, or plays games, on portable screens.



Becky Roberts

Becky is the managing editor of What Hi-Fi? and, since her recent move to Melbourne, also the editor of Australian Hi-Fi magazine. During her 10 years in the hi-fi industry, she has been fortunate enough to travel the world to report on the biggest and most exciting brands in hi-fi and consumer tech (and has had the jetlag and hangovers to remember them by). In her spare time, Becky can often be found running, watching Liverpool FC and horror movies, and hunting for gluten-free cake.