
Coaxial vs optical vs HDMI: which is the best audio connection to use?

(Image credit: Chord Company)

You’ve seen the sockets, you’ve probably bought the relevant audio cables, but which digital audio connection should you be using? Which gives you the best AV performance? Allow us to give you a brief overview.

If you’ve ever owned a TV, DVD player, set-top box or soundbar, chances are you’ll have come across either a coaxial, optical or, more recently, an HDMI connection. Those of you with a full-blown surround sound system almost certainly will. 

All three are digital, of course. Coaxial and optical can only transmit audio data, while HDMI brings the added bonus of supporting both audio and video. If you're not quite sure which connection to take advantage of, we've created this page to help guide you.

Coaxial digital connection


Probably the least common connection when it comes to modern AV kit, coaxial digital uses electricity to transmit audio. 

The connector is a standard, circular RCA connector - the kind that’s found at either end of a pair of analogue audio cables (or 'interconnects').

But don’t be tempted to use a standard RCA phono cable in place of a dedicated coaxial digital cable. The two look similar and may even work, but an analogue interconnect has a different characteristic impedance from a digital cable (typically 50 ohms versus the 75 ohms digital audio calls for), so it won’t perform as well. An entry-level cable such as the QED Performance Coaxial will do a fine job for most.

Coaxial might not be as widespread as its rival optical connection these days, but you'll still find it at the back of certain AV receivers, integrated amplifiers and TVs.

And, in our experience, compared to optical, a coaxial connection does tend to sound better. That's because it has greater bandwidth available, meaning it can support higher quality audio up to 24-bit/192kHz. Optical is usually restricted to 96kHz.
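As a quick sanity check on those figures, the raw bitrate of an uncompressed PCM stream is simply channels x bit depth x sample rate. Here's a minimal sketch (illustrative arithmetic only - real S/PDIF links also carry framing overhead, and actual device limits vary):

```python
# Raw PCM payload in megabits per second (audio data only,
# ignoring S/PDIF framing overhead).
def pcm_bitrate_mbps(channels, bit_depth, sample_rate_hz):
    return channels * bit_depth * sample_rate_hz / 1_000_000

# The two stereo ceilings quoted above:
print(f"24-bit/192kHz stereo: {pcm_bitrate_mbps(2, 24, 192_000):.2f} Mbps")  # 9.22
print(f"24-bit/96kHz stereo:  {pcm_bitrate_mbps(2, 24, 96_000):.2f} Mbps")   # 4.61
```

In other words, a link carrying 24-bit/192kHz stereo needs roughly twice the payload of one limited to 96kHz.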

The main downside to a coaxial digital connection is the potential transfer of electrical noise between your kit. Noise is bad news when it comes to sound quality, but it exists in all AV components to one degree or another. Unfortunately, using a coaxial connection enables noise to travel along the cable from the source to your amplifier.

Also, coaxial doesn't have the bandwidth required to support high-quality surround sound formats such as Dolby TrueHD, DTS-HD Master Audio, Dolby Atmos and DTS:X. So, in a modern home cinema setting, its uses are quite limited.

Optical digital connection


An optical digital connection uses light to transmit data through a cable’s optical fibres (which can be made from plastic, glass or silica). Unlike coaxial, an optical cable doesn’t allow electrical noise to pass from the source to the DAC circuitry, so it makes sense to use this socket when connecting straight into the DAC of a soundbar or AV receiver.

Traditionally, in a home cinema environment, optical connections tend to be used to transmit compressed Dolby Digital and DTS surround sound. Optical cables with a Toslink (Toshiba Link) connector slot into a matching socket on both source and receiver. Something like the QED Performance Graphite Optical is a good entry-level option.

Although HDMI has taken over as the main socket of choice for many manufacturers, optical outputs are still common on games consoles, Blu-ray players, set-top boxes and televisions. Optical inputs are found at the amplification or DAC end, e.g. on soundbars and AV receivers.

Like coaxial, optical doesn’t have enough bandwidth for lossless audio formats such as the Dolby TrueHD and DTS-HD Master Audio soundtracks found on most Blu-rays and 4K Blu-rays. An optical connection also can’t carry more than two channels of uncompressed PCM audio. And there’s the risk of damage if an optical cable is bent too tightly.
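Some back-of-the-envelope arithmetic shows why: the raw PCM payload grows linearly with channel count, and multichannel streams quickly outgrow a two-channel link (an illustrative sketch only - framing overhead and codec specifics are ignored):

```python
# Raw PCM payload (Mbps) for common channel layouts at 24-bit/96kHz.
def pcm_mbps(channels, bits, rate_hz):
    return channels * bits * rate_hz / 1_000_000

layouts = {"2.0 stereo": 2, "5.1 surround": 6, "7.1 surround": 8}
for name, channels in layouts.items():
    print(f"{name} @ 24-bit/96kHz: {pcm_mbps(channels, 24, 96_000):.1f} Mbps")
```

A roughly 18 Mbps 7.1 stream is far beyond what a two-channel S/PDIF link is designed to carry, which is why these soundtracks arrive downmixed or compressed over optical.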

What about HDMI?


Launched in 2002, HDMI’s biggest benefit is that it’s a one-size-fits-all connection for video and audio. It boasts much higher bandwidth than optical, allowing for playback of lossless audio formats such as Dolby TrueHD and DTS-HD Master Audio. Unlike optical and coaxial, it has no real rival.

You'll find HDMI inputs and outputs a firm fixture on TVs, Blu-ray players, AV receivers and, increasingly, soundbars. An entry-level cable like the AudioQuest Pearl HDMI will suit a wide range of systems.

HDMI is a constantly evolving standard too, with new and improved versions offering more bandwidth and greater capacity to carry more channels of audio, such as Dolby Atmos and DTS:X soundtracks. It also supports current and emerging video formats - including Ultra HD 4K resolution and the various HDR formats - plus additional features such as high frame rate (HFR) and eARC (which can deliver up to 32 channels of audio).

The majority of TV and AV products launched over the last few years support HDMI version 2.0, but HDMI 2.1 (which supports 8K resolution content) is slowly making its way onto the market.
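For a sense of scale, here are the headline maximum link bandwidths of recent HDMI versions (nominal figures from the published specifications; usable throughput after encoding overhead is lower):

```python
# Nominal maximum HDMI link bandwidth by version, in Gbps.
HDMI_BANDWIDTH_GBPS = {
    "HDMI 1.4": 10.2,
    "HDMI 2.0": 18.0,
    "HDMI 2.1": 48.0,
}

for version, gbps in HDMI_BANDWIDTH_GBPS.items():
    ratio = gbps / HDMI_BANDWIDTH_GBPS["HDMI 1.4"]
    print(f"{version}: {gbps:.1f} Gbps ({ratio:.1f}x HDMI 1.4)")
```

Even the oldest of these offers orders of magnitude more headroom than the few Mbps an audio-only S/PDIF link provides.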

So, which connection should you use?

The answer to this will depend on the kit you’re using. If it’s a straight choice between coaxial and optical, we’d go for the former. In our experience, a coaxial connection tends to produce better audio quality than optical, allowing for a higher level of detail and greater dynamics.

But, we live in an age where convenience is king. HDMI is now the go-to connection for all things AV and it’s hard to argue against it if all the kit in your system chain sports that socket.

HDMI’s feature set, upgradability and the fact it can handle both audio and video means you don’t need to worry about too many wires clogging up your system. And, best of all, you won't sacrifice performance.


Andy Madden

Andy is Deputy Editor of What Hi-Fi? and a consumer electronics journalist with nearly 20 years of experience writing news, reviews and features. Over the years he's also contributed to a number of other outlets, including The Sunday Times, the BBC, Stuff, and BA High Life Magazine. Premium wireless earbuds are his passion but he's also keen on car tech and in-car audio systems and can often be found cruising the countryside testing the latest set-ups. In his spare time Andy is a keen golfer and gamer.

  • daddyo
    This is a very well written article for neophyte audiophiles like me. Specific comparison with just enough practical technical information. A couple of years ago a "professional installer" used coaxial cable to connect my 5.1.2 system and I had difficulty convincing him that coax wasn't suitable for that application. It took extensive internet searches to gather the info that this article provides in a succinct manner. Well done.
    Reply
  • spk
    To be frank, it seems so strange that in this day an article with so many misconceptions would be written. Not only can optical support 192k bandwidth (I was literally listening to a 192k FLAC over optical when I read this), that wouldn’t have any effect whatsoever on streams at a lower sample rate. It isn’t like the bits are frolicking more freely in copper, making those treble notes more sweet.

    Furthermore, the question of interference: because the signal is digital, it’s either there or it isn’t. While some massive amount of interference might potentially flip a low bit to a high bit, if you have that level of interference in your system then something else is certainly very, very wrong, and your cable choice is the least of your worries. If such digital interference is audible, it’s going to completely wreck the stream - not just some sort of subtle effect - that’s how digital works, it either is or it isn’t.

    While 20 years ago these sorts of misconceptions about digital signals were understandable, today it should be common knowledge that digital interference is not the same as analog interference and behaves completely differently.

    Now I think the biggest advantage of coaxial over consumer optical is that coaxial will work better over distances greater than ten feet, though I could be wrong about that.
    Reply
  • Nick_B
    spk said:
    To be frank, it seems so strange that in this day an article with so many misconceptions would be written. Not only can optical support 192k bandwidth (I was literally listening to a 192k FLAC over optical when I read this), that wouldn’t have any effect whatsoever on streams at a lower sample rate. It isn’t like the bits are frolicking more freely in copper, making those treble notes more sweet.

    Furthermore, the question of interference: because the signal is digital, it’s either there or it isn’t. While some massive amount of interference might potentially flip a low bit to a high bit, if you have that level of interference in your system then something else is certainly very, very wrong, and your cable choice is the least of your worries. If such digital interference is audible, it’s going to completely wreck the stream - not just some sort of subtle effect - that’s how digital works, it either is or it isn’t.

    While 20 years ago these sorts of misconceptions about digital signals were understandable, today it should be common knowledge that digital interference is not the same as analog interference and behaves completely differently.

    Now I think the biggest advantage of coaxial over consumer optical is that coaxial will work better over distances greater than ten feet, though I could be wrong about that.

    Hey SPK this thread was kind of timely for me.
    Not sure if you have an opinion on this but:
    I have a panasonic ub9000 dvd player which I also use for cds, and a yamaha RX-A2080 av amp.
    Question is -- is it 'better' to use the DAC in the player, or the DAC in the receiver for CD playback? Assuming they're broadly similar, my thought was, do the conversion close to the source but I'm a total noob so that could be nonsense.
    If better to use the receiver; I have an optical cable already. I take it from your opinion that there shouldn't really be anything in it between that and a similar quality coaxial cable? (makes sense to me).
    If better to use the panasonic dac, then should I go ordinary analogue interconnects, or the balanced pin cables? why?
    Might be I'm overthinking it all at the 'level' of equipment i have...so if you say 'just run what you have' I get it too... :)
    Cheers
    Reply
  • spk
    Nick_B said:
    Hey SPK this thread was kind of timely for me.
    Not sure if you have an opinion on this but:
    I have a panasonic ub9000 dvd player which I also use for cds, and a yamaha RX-A2080 av amp.
    Question is -- is it 'better' to use the DAC in the player, or the DAC in the receiver for CD playback? Assuming they're broadly similar, my thought was, do the conversion close to the source but I'm a total noob so that could be nonsense.
    If better to use the receiver; I have an optical cable already. I take it from your opinion that there shouldn't really be anything in it between that and a similar quality coaxial cable? (makes sense to me).
    If better to use the panasonic dac, then should I go ordinary analogue interconnects, or the balanced pin cables? why?
    Might be I'm overthinking it all at the 'level' of equipment i have...so if you say 'just run what you have' I get it too... :)
    Cheers

    So assuming the DACs are equal in quality, I would convert at the receiver. I could be mistaken, but my impression is that analog signals, especially unbalanced, low-level signals, are susceptible to noise whereas digital signals, especially optical, are not. So you want to keep your analog signal path as short as possible.

    Also, think of your DAC as the source, not the player (transport) since the DAC is the source of the analog signal. So in this mindset keeping your source in close proximity still applies.
    Reply
  • spk
    Btw- over thinking it is how we audiophiles roll :)
    Reply
  • Nick_B
    Sweet, thanks for that. I *THINK* it sounded subtly better using the receiver dac anyway, but wasn't sure. Good to have a rational explanation for it tho. cheers.
    Reply
  • spk
    Nick_B said:
    Sweet, thanks for that. I *THINK* it sounded subtly better using the receiver dac anyway, but wasn't sure. Good to have a rational explanation for it tho. cheers.

    This alone likely will not matter much, but this is, in my opinion, your best practice. With analog everything is cumulative - a lot of little things tend to add up - so it's best to get as much right as practical, at least within reason.
    Reply
  • ginandbacon
    They could easily update both to support full bandwidth and all formats, but it's all about HDCP, which is a shame. Also, Toshiba may have invented the original Toslink connector, but both coaxial and optical carry S/PDIF, which was created by Sony and Philips (see below).

    It's a shame because there is no way anyone can convince me that gig fiber internet and 4K SDI connectors exist but somehow neither coax nor optical can be updated to support maybe 5 to 10Mbps (500KB/s to 1MB/s) for a typical 7.1 Atmos mix bedded in a Dolby TrueHD signal. (Atmos isn't a sound format, it's metadata embedded in a DD+ or Dolby TrueHD signal, both of which have been around since Blu-rays came out.) Heck, they originally used unused CEC bandwidth to do ARC. My highest bitrate music is SACD and it's typically 4 to 6Mbps for material actually remastered for 5.1. I just don't get it because, while I would never condone piracy, HDCP doesn't really protect anything - it's out there if you want it. Now, I purchase or subscribe to services for my music/movies to support them, but it's not like HDCP is really protecting anything.

    S/PDIF (Sony/Philips Digital Interface) is a type of digital audio interface used in consumer audio equipment to output audio over relatively short distances. The signal is transmitted over either a coaxial cable with RCA connectors or a fiber optic cable with TOSLINK connectors.
    Reply
  • ginandbacon
    spk said:
    So assuming the DACs are equal in quality, I would convert at the receiver. I could be mistaken, but my impression is that analog signals, especially unbalanced, low-level signals, are susceptible to noise whereas digital signals, especially optical, are not. So you want to keep your analog signal path as short as possible.

    Also, think of your DAC as the source, not the player (transport) since the DAC is the source of the analog signal. So in this mindset keeping your source in close proximity still applies.

    Pretty much this, outside the fact that almost all HDMI cables shorter than 10 to 15ft max are copper. Really, you just need a well-shielded cable, although it's always a good idea to keep all cable runs as short as possible in my experience.

    Even fiber HDMI cables typically have 2 to 4 copper wires, which is why they are still shielded. Pretty sure Toslink has no metal outside the connections, which are often plastic also. Noise is hard to pin down sometimes because it depends on your house/apartment wiring, potential ground loops, etc.

    If you ran everything off battery that would pretty much eliminate most noise, but it would be crazy expensive. Even high-end sinewave AV battery backups can't keep a system up that long. This will obviously depend on the setup and power used. It would not be crazy expensive to keep a DAC and a 100W amp going to passive speakers up on a sinewave battery, but a 500W amp and other power-hungry equipment will drain the battery quickly.

    It also has to be implemented correctly so it's always running off battery. Many claim they do, but in reality they really don't. Sinewave units can produce stable clean power when done correctly, which is often very expensive.
    Reply