The road to TV-viewing perfection is paved with new technologies. And acronyms. Lots and lots of acronyms.

The first high-definition broadcasts in the UK began ten years ago, when Sky began broadcasting HD channels in May 2006. Now, if you have the necessary kit, you can watch live 4K TV, thanks to BT Sport Ultra HD, or stream 4K content via Amazon, Netflix and YouTube.

And 4K Ultra HD has kick-started a slew of innovations that should be coming to your TV soon. If you want the best possible video (and audio) quality, you'd better brace yourself for some changes.

From HDR to WCG, 4K to 8K, Yoeri Geutskens, the font of knowledge behind the @UHD4K Twitter account, looks at the key television technology that will improve your viewing experience over the coming months and years...

MORE: How to watch 4K content online and on TV

4K Ultra HD resolution

Around ten years ago, the TV industry started making the switch from standard definition (SD) TV to high definition (HD) TV. This was all about increasing resolution: from 480 lines (USA and Japan) and 576 lines (UK and Europe) to 720 or 1080 lines. With this came the move from interlaced to progressive video, giving HD TV its ultimate resolution of 1920 × 1080p.

But it didn't stop there. The last few years have seen the steady rise of 4K Ultra HD, with a resolution of 3840 × 2160. This delivers twice the horizontal and vertical resolution of HD, and four times as many pixels.
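
If you want to check the maths, the pixel counts are easy to tot up. Here's a quick Python sketch (the frame sizes are nominal; real broadcast formats vary a little):

```python
# Nominal frame sizes for each broadcast resolution.
formats = {
    "SD (576 lines, UK/Europe)": (720, 576),
    "HD (1080p)": (1920, 1080),
    "4K Ultra HD": (3840, 2160),
    "8K / UHD-2": (7680, 4320),
}

hd_pixels = 1920 * 1080
for name, (width, height) in formats.items():
    pixels = width * height
    print(f"{name}: {width} x {height} = {pixels:,} pixels "
          f"({pixels / hd_pixels:.1f}x HD)")
```

Run it and 4K comes out at 8,294,400 pixels, exactly four times HD's 2,073,600 - and 8K at sixteen times.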

Ultra HD TVs came to the market in 2013, but they have gained adoption more quickly than HD TV sets did a decade earlier. This is thanks in part to consumers' habit of buying ever-larger screens, many of which are now 4K.

MORE: 4K Ultra HD TV explained

High Dynamic Range (HDR)

The hottest buzzword in TV technology in 2016 is High Dynamic Range (HDR). The HDR technology in your TV isn't the same as HDR in photography, but it aims for a similar goal: better pictures thanks to brighter whites, darker blacks and a wider range of colours.

Confusingly for consumers, there are various technical solutions for implementing HDR, which means different versions of HDR on the market. The most prominent ones are HDR10, Dolby Vision, the ‘Prime Single’ joint proposal by Technicolor and Philips, and Hybrid Log Gamma (HLG) by the BBC and NHK, Japan’s public broadcaster. 

Dolby Vision is already being used by some streaming services, such as Netflix. For live TV, however, many broadcasters and equipment makers favour HLG, which is more compatible with existing equipment and workflows.

However you get it, HDR is seen as a very big deal. And it's already available, both in terms of products and content.

MORE: What is HLG?

Connectivity and HDMI

HDMI remains the standard connection for sending HD and 4K video. HDMI 1.4a was good enough for HD and even 4K, as long as the frame rate was no higher than 30fps. HDMI 2.0, introduced in 2013, raised the capability to 4K at 60fps. It also enables a newer version of HDMI's content protection technology, HDCP 2.2. Ultra HD Blu-ray players and BT's Ultra HD box require HDCP 2.2-compliant TV sets.

To support HDR, you need HDMI 2.0a. Fortunately, TVs can be upgraded from HDMI 2.0 to 2.0a with a software update, though it's not possible to upgrade from HDMI 1.4a. A future version of HDMI will no doubt be needed to support further upgrades to video technology (some of which we'll get to later). Whether HDMI keeps that role, or loses it to DisplayPort over USB-C or to MHL, as found on mobile phones, remains to be seen.
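
To see why each HDMI revision matters, it helps to tot up the raw numbers. Below is a minimal Python sketch; it ignores blanking intervals and link overhead, so it understates the true bandwidth requirement, but the relative jumps are what count:

```python
def raw_video_gbps(width, height, fps, bits_per_subpixel=8):
    """Approximate uncompressed video data rate in Gbit/s.

    Ignores blanking intervals, audio and link-coding overhead,
    so the real HDMI bandwidth needed is somewhat higher.
    """
    bits_per_pixel = bits_per_subpixel * 3  # one value each for R, G and B
    return width * height * fps * bits_per_pixel / 1e9

print(f"4K at 30fps, 8-bit: {raw_video_gbps(3840, 2160, 30):.1f} Gbps")      # ~6.0 - fine for HDMI 1.4a
print(f"4K at 60fps, 8-bit: {raw_video_gbps(3840, 2160, 60):.1f} Gbps")      # ~11.9 - needs HDMI 2.0
print(f"4K at 60fps, 10-bit: {raw_video_gbps(3840, 2160, 60, 10):.1f} Gbps")  # ~14.9 - an HDR-style signal
```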

Encoding and decoding standards

Ultra HD has brought with it new compression technology. Although Ultra HD encoding is possible with MPEG-4 AVC (also known as H.264), a more advanced and efficient codec called HEVC (or H.265) has become the new standard.

Why does this matter? Because 4K needs a lot more bits, so better compression helps broadcasters, satellite operators and cable operators make best use of limited bandwidth.
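
A rough sum shows the scale of the problem, and the saving HEVC brings. The bitrates below are ballpark assumptions (a plausible ~15 Mbps for a 4K HEVC broadcast, and the commonly quoted ~40 per cent saving over AVC), not figures from any particular broadcaster:

```python
# Rough illustration of why HEVC matters for broadcasters.
raw_4k60_mbps = 3840 * 2160 * 60 * 24 / 1e6   # uncompressed 8-bit 4K60, ~11,944 Mbps
hevc_4k_mbps = 15                             # plausible 4K HEVC broadcast bitrate (assumption)
avc_4k_mbps = hevc_4k_mbps / 0.6              # AVC needs roughly 1.7x the bits (assumption)

print(f"Uncompressed 4K60: {raw_4k60_mbps:,.0f} Mbps")
print(f"4K with HEVC: ~{hevc_4k_mbps} Mbps "
      f"(about {raw_4k60_mbps / hevc_4k_mbps:,.0f}:1 compression)")
print(f"Same quality with AVC: ~{avc_4k_mbps:.0f} Mbps")
```

On a satellite transponder or cable multiplex with fixed capacity, that gap can be the difference between fitting a 4K channel in or not.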

Decoding chips for TVs and set-top boxes, as well as encoding hardware and software for broadcasters, are now available - so expect all future UHD transmissions (and many HD ones) to use this new HEVC standard.

MORE: BBC aims to broadcast 4K as standard by 2016

Immersive audio: Atmos and more

The best quality level for broadcast surround sound is currently 7.1-channel Dolby Digital Plus. Although higher channel counts are possible for a more immersive experience, Dolby and DTS have taken a different approach.

Pioneered in the cinema, but more recently adapted for home use, object-based audio allows sounds to be better placed around the room, thanks to a wider range of speaker positions. 

Dolby Atmos for the home allows for up to nine speakers at base level, up to four height channels and an LFE channel for a subwoofer. Dolby Atmos is already available on selected titles on streaming video platform Vudu, and on a few dozen Blu-ray discs.

It’s not restricted to the new Ultra HD Blu-ray standard but works with the ‘old’ Blu-ray standard as well. The next step for this next-gen audio will be more content, and compatibility with broadcast TV. Watch this space.

MORE: Dolby Atmos: What is it? How can you get it?

MORE: DTS:X: What is it? How can you get it?

Wide Colour Gamut (WCG)

Now we get into future-gazing territory. Wide Colour Gamut means the adoption of a larger colour space than the one specified for HD TV, which was known as BT.709. For 4K TV, the industry has standardised on a colour space called BT.2020 (also known as Rec.2020). The aim is to give ample room for future picture improvement: the spec extends far beyond what current screens can handle.

HDR more or less requires Wide Colour Gamut, although the use of WCG does not automatically imply HDR. Still with us?

Expect most UHD TVs coming to the market from 2016 to support WCG even if they don't support HDR, though you'll rarely see it mentioned. And when it comes to creating the perfect picture in the future, it may yet play a crucial part.

Deep colour resolution

In theory, a wider colour gamut and higher dynamic range would be possible with the existing 8-bit colour resolution of HD TV, but the risk of colour banding would loom large. The reproduction of both WCG and HDR is greatly helped by the transition from 8-bit to 10-bit and 12-bit video. So this could be the next tech step.

12-bit is mostly used in production; 10-bit in distribution to consumers. Note the number here refers to the bit depth per subpixel, i.e. 8, 10 or 12 bits for each of R, G and B, resulting in 24, 30 or 36 bits per pixel. With two more bits per subpixel (six more per pixel), you get 64 times as many colours, allowing for much smoother gradients.
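
The arithmetic behind that '64 times' figure is easy to verify. A quick sketch:

```python
# Colours available at each bit depth. The depth applies per
# subpixel, so a pixel gets three times that many bits in total.
for bits in (8, 10, 12):
    shades_per_channel = 2 ** bits
    colours_per_pixel = shades_per_channel ** 3   # R x G x B combinations
    print(f"{bits}-bit: {shades_per_channel:,} shades per channel, "
          f"{colours_per_pixel:,} colours per pixel")

# 10-bit vs 8-bit: 2**30 / 2**24 = 64 times as many colours
```

That's 16.7 million colours at 8-bit, rising to 1.07 billion at 10-bit and 68.7 billion at 12-bit.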

Again, a more realistic representation of colours, without any sign of banding, is the benefit here.

High Frame Rate (HFR)

Initial Ultra HD TV sets and broadcasts operated at 25 frames per second (in Europe) or 30 fps (North America). The current state-of-the-art is double that – 50 or 60 fps.

With the increased resolution - i.e. 4K - the risk of motion blur increases if the frame rate can't keep up. Raising the frame rate to 100 or 120fps helps eliminate blur; this is called High Frame Rate (HFR).

However, it is estimated that doubling the frame rate leads to only a 50 per cent increase in bandwidth for the compressed video signal (not 100 per cent, because compression efficiency improves as the frame rate rises). Nonetheless, expect HFR to arrive in the next few years.
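
To put that rule of thumb into numbers, here's a tiny sketch; the 15 Mbps base figure is a hypothetical 4K50/60 HEVC stream, purely for illustration:

```python
# Doubling the frame rate doubles the raw pixel data, but consecutive
# frames at 100/120fps are more alike, so the compressed stream only
# grows by about half (the estimate quoted above).
base_mbps = 15                 # hypothetical 4K60 HEVC stream
raw_growth = 120 / 60          # 2.0x the raw data
compressed_growth = 1.5        # the ~50 per cent estimate

print(f"Raw data at 120fps: {raw_growth:.0f}x")
print(f"Compressed stream at 120fps: ~{base_mbps * compressed_growth:.0f} Mbps, "
      f"up from {base_mbps} Mbps at 60fps")
```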

8K or Ultra HD 2 (UHD-2)

Even with UHD, HDR, WCG and HFR, progress doesn't come to an end. In fact, some people are already arguing that 4K resolution isn't enough. The Japanese broadcast industry is pushing ahead with a resolution of 7680 × 4320, which it calls 8K. Others refer to this standard as Ultra HD 2 (UHD-2), as opposed to the existing UHD-1.

Moving to UHD-2 would have huge repercussions throughout the entire production and delivery chain, so few outside Japan are prepared to even talk about it. LG, Samsung, Sharp and several Chinese manufacturers showcased '8K' displays at CES 2016 in Las Vegas earlier this year, but these were technology statements; the only practical use for such products in the foreseeable future is likely to be digital signage.

An outside bet could be the return of 3D TV (remember that?). 8K allows for autostereoscopic 3D, i.e. glasses-free 3D, with an effective resolution of 4K.

MORE: Japan plans for 8K TV broadcasts

MORE: LG reveals first production 8K TV

Modular displays

The switch from analogue, 4:3, SD TV sets to digital, 16:9, HD TV sets was driven by the flat-panel form that plasma and LCD displays enabled - in stark contrast to their CRT predecessors. Could there be another such transition in the future to drive adoption of 8K 120fps displays? Certainly, if TV dimensions keep growing, the sets could become too large to handle.

One approach that manufacturers such as Sony, Samsung and Google have started working on is modular screens. These potentially giant displays would be made from compact, stackable building blocks.

If this technology becomes available commercially at affordable prices, it will be time to make space for a wall-to-wall, floor-to-ceiling, immersive display in your living room. Less TV screen, more TV wall. Or perhaps we'll all be watching in VR by then...

Of course, not all this technology is in place yet. Nor is it likely to arrive with a big bang; instead, it will trickle down to consumer TVs one step at a time.

So should you wait until all this dust has settled before buying a new TV? In a word, no. You'd be waiting a very long time. And plenty of innovation - from 4K to OLED to HDR - is already available.

But if there's one thing we know about consumer technology, it's that there's always something new on the horizon.