I’ve done the measuring, and your TV's Filmmaker Mode isn’t just great for picture quality – it’s great for saving energy, too

The 65-inch Sony A95L OLED TV pictured on a white shelf. On the screen is a still of Adam Sandler in Happy Gilmore 2, and in the corner of the photo is a green badge that says 'Sustainability Week'.
(Image credit: What Hi-Fi? / Netflix (Happy Gilmore 2))

I have spent the last three days measuring TVs.

That's obviously not unusual for a TV reviewer, but this time I wasn't measuring dimming zones, input lag or peak brightness – I was measuring energy use.

It's something I've wanted to do for a long time, and the plan is to introduce power draw figures to some of our reviews in the future, but our first annual Sustainability Week was the kick up the backside that I needed to actually make a start.

Now, measuring the energy use of something like a TV is a very time-consuming process – you need to connect an energy meter and let it run for a while, and I wanted to test various different aspects – so this is only the start of the research, but it has already produced some results that surprised me.

SDR vs HDR

The 65-inch LG G5 OLED TV pictured on a wooden rack. On the screen is a still from Netflix F1 series Drive to Survive.

(Image credit: What Hi-Fi? / Netflix (Drive to Survive))

I first wanted to find out whether different content types affected power draw, so I played seven different 10-minute clips through the 65-inch LG G5 and Sony Bravia 8 II in our test room, and the 65-inch Sony A95L that I use at home.

Those clips were the 4K Blu-ray of the super-bright Pan, which I ran in HDR10; the far-less-bright Blade Runner 2049 on 4K Blu-ray, also in HDR10; Toy Story 4 in 4K and Dolby Vision from the integrated Disney Plus app; the 1080p, SDR Blu-ray of True Grit; an episode of the latest series of QI from BBC iPlayer in HD; the very first episode of the Mr. Bean TV show, in standard-def from Amazon Prime Video; and 10 minutes of Sky Sports News, sent from a Sky Stream puck.

That gave me a lot of data, but I'm not going to go into that in detail now, partly because I'm saving it for a future feature, but mostly because it's only really interesting because of where it sent me next.

You see, the measurements suggest that the source and resolution make little difference to the amount of power the TV uses – the only thing that really makes a difference is whether the content is in HDR or SDR, with HDR content using more power in the least processed modes, but SDR catching up with (and even overtaking) it in modes such as Dynamic or Vivid.

That makes sense when you think about it. It's brightness that uses power, and in modes such as Filmmaker (or Professional in the case of Sony), HDR will be brighter than SDR.

But in broadly brighter modes, particularly Standard or Dynamic/Vivid, the TV will bring SDR content up to HDR-like brightness levels, and will employ additional processing for that task (as well as all of the additional processing those modes tend to involve), hence SDR content will then use more power than HDR.

Filmmaker Mode (or equivalent) can save energy (and money)

OLED TV: LG OLED42C3

(Image credit: Future)

What I really got obsessed by, though, was the overall power draw of different picture presets.

It was clear that the brighter modes would use more power, but how much more? And how much more would that extra energy cost?

So, I did yet more testing on my Sony A95L at home, as well as some research on how much the average household uses its TV, what the HDR/SDR content split is like, and how much electricity currently costs.

First up, the power draw of the four main presets, which are, from least to most processed, Professional, Cinema, Standard and Vivid:

Preset          HDR energy per hour (kWh)    SDR energy per hour (kWh)
Professional    0.096                         0.084
Cinema          0.096                         0.108
Standard        0.096                         0.114
Vivid           0.156                         0.156

According to Ofcom, in 2024, the average Brit spent just over 4.5 hours watching TV and video content per day.

Of those 4.5 hours, 84 per cent was through the TV set (as opposed to a smartphone, tablet, etc), which works out at about 3.75 hours (3 hours and 45 minutes) of viewing through the TV per day.

So, in a normal year, we're looking at 1369 hours of TV viewing.

Ofcom doesn't produce figures on HDR vs SDR (at least, not that I've seen), but Philips last year told me that its data shows that only 4-8 per cent of viewing through its "high-end" range, which includes all of its OLED models, is in HDR.

Let’s say, because the What Hi-Fi? audience is undoubtedly more passionate than average, that our readers watch 8 per cent HDR content. That would be 110 hours of HDR viewing per year, and 1259 of SDR.
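If you want to sanity-check those numbers, or plug in your own viewing habits, the sum is simple enough to run yourself. Here's a rough sketch in Python, with the 8 per cent HDR share treated as the assumption it is:

```python
# Rough annual viewing-hours estimate, based on the Ofcom and Philips figures quoted above
HOURS_PER_DAY = 3.75   # roughly 4.5 hours of video per day x 84 per cent watched on the TV set
HDR_SHARE = 0.08       # assumed HDR share of viewing (Philips suggests 4-8 per cent)

annual_hours = round(HOURS_PER_DAY * 365)    # 1369 hours per year
hdr_hours = round(annual_hours * HDR_SHARE)  # 110 hours of HDR
sdr_hours = annual_hours - hdr_hours         # 1259 hours of SDR

print(annual_hours, hdr_hours, sdr_hours)
```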

Let's look at the energy use figures for the different modes on an annual basis, then, based on 110 hours of HDR and 1259 hours of SDR:

Preset          HDR energy per year (kWh)    SDR energy per year (kWh)
Professional    10.560                        105.756
Cinema          10.560                        135.972
Standard        10.560                        143.526
Vivid           17.160                        196.404
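Those annual figures are simply the per-hour measurements multiplied by the viewing hours worked out above. If you'd like to reproduce them, or run them against your own viewing time, here's the whole sum in Python:

```python
# Annual energy use per preset = measured kWh per hour x annual viewing hours
HDR_HOURS, SDR_HOURS = 110, 1259

per_hour_kwh = {                  # (HDR, SDR) kWh per hour, as measured on my Sony A95L
    "Professional": (0.096, 0.084),
    "Cinema":       (0.096, 0.108),
    "Standard":     (0.096, 0.114),
    "Vivid":        (0.156, 0.156),
}

for preset, (hdr, sdr) in per_hour_kwh.items():
    print(f"{preset}: HDR {hdr * HDR_HOURS:.3f} kWh, SDR {sdr * SDR_HOURS:.3f} kWh per year")
```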

I don't know about you, but kWh figures mean very little to me – what I really wanted to know was how this translates into monetary cost.

For the purposes of the UK energy price cap, the current average electricity cost is 25.73p per kWh.

So let's turn those above energy figures into pounds and pence by multiplying them by £0.2573:

Preset          HDR cost per year    SDR cost per year    Total cost per year
Professional    £2.72                £27.21               £29.93
Cinema          £2.72                £34.99               £37.70
Standard        £2.72                £36.93               £39.65
Vivid           £4.42                £50.53               £54.95
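Again, if your electricity tariff differs from the 25.73p per kWh price cap average, the conversion is a single multiplication, so you can easily substitute your own unit rate:

```python
# Annual running cost per preset = annual kWh x electricity unit rate
PRICE_PER_KWH = 0.2573            # GBP per kWh; swap in your own unit rate here

annual_kwh = {                    # (HDR, SDR) kWh per year, from the table above
    "Professional": (10.560, 105.756),
    "Cinema":       (10.560, 135.972),
    "Standard":     (10.560, 143.526),
    "Vivid":        (17.160, 196.404),
}

costs = {preset: (hdr + sdr) * PRICE_PER_KWH for preset, (hdr, sdr) in annual_kwh.items()}
for preset, cost in costs.items():
    print(f"{preset}: £{cost:.2f} per year")

print(f"Vivid vs Professional saving: £{costs['Vivid'] - costs['Professional']:.2f} per year")
```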

As you can see, over the course of a year, the Professional preset of my 65-inch Sony A95L (which, again, is very similar to the Filmmaker Mode of other TVs) will cost roughly 25 per cent less than the Standard preset that many people will default to.

But spare a thought for those people who think they need to turn everything up to 11 and choose the Vivid mode: they could save £25 a year by switching to Professional.

As an aside, this testing reminded me just how horrible modes such as Vivid and Dynamic are. They're garish, blinding and horribly noisy; you couldn't pay me £25 to use one on my TV for a week, let alone a year.

Professional/Filmmaker is where you get the most cinematically authentic delivery with most TVs. So, to my mind at least, it's better as well as cheaper.

But is your TV cheaper to run than your kettle?

A photo of a kettle next to a large OLED TV in a living room

(Image credit: What Hi-Fi? / Netflix (Happy Gilmore 2))

Now, while I know I should be, I'm not someone who really has much idea of how much energy is used by the various appliances around my home, so I was taken aback when Ketan Bharadia (What Hi-Fi?'s Technical Editor) suggested that a TV only uses about as much power as a kettle.

Well, I set about doing another round of testing, and he's broadly correct.

While the data doesn’t seem terribly reliable, it’s suggested that Britons boil the kettle an average of 4 times per day, so that would be 1460 times in a typical year.

Let’s say (optimistically) that they’re boiling just 500ml each time – that’s the minimum for a lot of kettles and generally considered enough for two mugs of tea or coffee.

My kettle (a Kenwood Mesmerine ZJM811OR, if you must know) uses 0.069kWh to boil 500ml of tap water.

Multiplied by 1460 boils, that’s 100.74kWh per year, at a cost of £25.92 using the price cap average.

That's less than running my Sony A95L in Professional mode for a year, but only by £4.
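For completeness, here's the kettle sum alongside the TV one; remember that the four-boils-a-day figure is a fairly shaky assumption, so treat this as a ballpark rather than gospel:

```python
# Kettle vs TV: rough annual running cost comparison, using the figures above
PRICE_PER_KWH = 0.2573        # GBP per kWh (price cap average)

KWH_PER_BOIL = 0.069          # measured: 500ml boiled in my Kenwood Mesmerine
BOILS_PER_YEAR = 4 * 365      # assumed four boils per day

kettle_cost = KWH_PER_BOIL * BOILS_PER_YEAR * PRICE_PER_KWH   # about £25.92
tv_cost = 29.93               # Sony A95L in Professional mode, from the cost table above

print(f"Kettle: £{kettle_cost:.2f} per year vs TV: £{tv_cost:.2f} per year")
print(f"Difference: £{tv_cost - kettle_cost:.2f}")
```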

The A95L is a 65-inch, flagship-grade OLED, too, so will use more power than most. There's every chance that many people are spending more each year running their kettle than their TV.

That's something that will require lots more data to prove, though, so expect to see more energy usage figures in our TV reviews in the future.

In the meantime, give Filmmaker Mode (or your TV's equivalent) a go – it might save you some money as well as give you the best, most authentic picture quality.

MORE:

Here are the best TVs you can buy right now

Check out all of our Sustainability Week coverage

‘Great sound shouldn’t cost us the planet’ – how Cambridge Audio wants to make hi-fi green

This new TV trend championed by Hisense and Samsung is a huge win for sustainability – now I wish every TV manufacturer would join in

Tom Parsons

Tom Parsons has been writing about TV, AV and hi-fi products (not to mention plenty of other 'gadgets' and even cars) for over 15 years. He began his career as What Hi-Fi?'s Staff Writer and is now the TV and AV Editor. In between, he worked as Reviews Editor and then Deputy Editor at Stuff, and over the years has had his work featured in publications such as T3, The Telegraph and Louder. He's also appeared on BBC News, BBC World Service, BBC Radio 4 and Sky Swipe. In his spare time Tom is a runner and gamer.
