The cost of a 4K TV might be more than you think. Preliminary figures from a British Gas Home Energy report with the Centre for Economics and Business Research (CEBR), due to be released in January 2017, reveal that since 2014 powering 4K TVs has required an additional 11 gigawatt hours (GWh) of electricity compared with their HD (high definition) counterparts. The report claims 4K TVs use up to a third more energy than HD models.
While this currently equates to just £1.8m of extra cost, by 2019 that energy usage is predicted to rise by 4264 per cent to 480 GWh, an increase worth around £82m across the UK.
Approximately two million households are expected to own a 4K television by the end of 2016, and in three years that figure is estimated to reach around nine million homes.
According to the report, the average cost of powering a TV in 2001 was £14 a year, and by 2008 that figure had increased by 44 per cent to £20 a year. There was subsequently a decline over the next seven years to £18 a year as TVs became more energy efficient, but that trend now seems likely to be reversed with the increasing popularity of 4K models.
British Gas has issued a number of tips on how to reduce the energy consumption of a 4K TV, including not leaving it in standby mode when it isn't in use and choosing a model with energy-efficiency settings.
Daniel Colford, smart energy expert at British Gas, says: “TV has long been considered the nation’s favourite pastime and as such people will always look to upgrade to the latest technology to improve their viewing experience. With living rooms now awash with technology and entertainment gadgets, many of which routinely use power even if on standby, we recommend taking a closer look at each device to see how its energy use can be reduced and getting smart meters installed to monitor overall household energy consumption.”
In Japan, 8K televisions are already in development ahead of the 2020 Olympics, with Panasonic and Sony partnering with Japanese national broadcaster NHK.