If you’ve read any of our TV, smartphone, tablet or projector reviews, it's likely you will have come across the terms “nits” and “lumens” in the specifications and picture sections. While we often list these figures and use them to compare specifications between products, it's worth knowing what the terms actually mean, and what role they play in how these displays work.
What is a Nit?
Nits are used specifically when we talk about TVs and handheld devices such as smartphones and tablets. A nit is a unit of measurement for the brightness of visible light (luminance) emitted over a specific area.
A nit is equivalent to the light from a traditional candle per square metre, meaning that a display that is rated at 600 nits is equivalent to the light emitted from 600 candles in an area of one square metre.
You may have also come across the term candela – the two measurements are directly related, with nit effectively acting as a shorthand for “candela per square metre”. Nit has become the standardised term in the world of displays, so you're unlikely to see TV or phone screens described in terms of candela.
Why are Nits important?
At the most basic level, a nit is a unit to measure how bright a screen can go. Brightness plays an important role in the visibility of displays, as well as a vital role in the viewing of HDR content.
Films and television shows are generally mastered to a minimum of 400 nits, but many are mastered at a much higher 1000 nits. There are, in fact, some movies that are mastered as high as 4000 nits, though those are far less common. Ultimately, any display that isn’t at least as bright as a movie's peak brightness level won't be able to display it exactly as intended – though most employ clever processing to make the most of the brightness available.
While 400 nits is technically HDR, you generally want a TV that reaches a brightness of over 600 nits if you want a genuinely impactful HDR experience. If possible, it's worth going for a premium HDR TV that can produce 1000 nits or more, seeing as that is a common peak brightness level for HDR movies on streaming services and 4K Blu-ray. One of the best TVs at that level is the LG G2, which can hit that level of brightness, but also combines that with perfect blacks on account of its OLED panel.
TVs with even higher brightness levels are on the horizon, with the likes of the LG OLED G3 and Panasonic MZ2000 utilising Micro Lens Array (MLA) technology to reach the dizzying heights of between 1500 and 2000 nits. This could have a serious impact on the HDR performance we see from these next-generation TVs, with higher nit figures theoretically meaning we see the brightest and most advanced HDR pictures yet.
However, it is important to remember that more nits doesn't automatically mean a better picture, as the increased luminance can result in other aspects of the picture being sacrificed. Most notably, particularly with standard OLED models, colour volume is at stake, with brighter content potentially becoming over-exposed and washed-out as the brightest, white OLED element overpowers the less bright, coloured OLEDs.
With backlit models, increased brightness comes with a risk to black depth, as boosting the brightness of certain parts of the picture regularly results in light seeping into areas that should be dark.
What is a Lumen?
Lumens refer to the amount of light directly emitted from a source, such as a lightbulb. In the realm of AV, lumens tie into projectors, which use a light source to power an image, whether a traditional lamp or laser system. A higher lumen count means a brighter light source, and therefore a brighter image.
In the world of projectors, you may find the term ANSI lumens (ANSI standing for the American National Standards Institute). This refers to a more rigorous, standardised method of measuring lumens that accounts for many more variables. Not all projector specifications quote both figures, but ANSI lumens are considered the more accurate measure.
A rough guide to estimating the ANSI lumen figure from regular (often LED) lumens is to divide the lumen count by 2.4. For example: 3000 LED lumens / 2.4 = 1250 ANSI lumens.
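The rule of thumb above can be expressed as a simple helper. This is a minimal sketch assuming only the 2.4 divisor mentioned here – the function name is our own, and real ANSI figures come from the standardised measurement procedure, not this approximation:

```python
def led_to_ansi_lumens(led_lumens: float) -> float:
    """Estimate ANSI lumens from an LED/marketing lumen figure.

    Uses the rough industry rule of thumb of dividing by 2.4;
    this is an approximation, not a substitute for a measured
    ANSI lumen specification.
    """
    return led_lumens / 2.4

print(led_to_ansi_lumens(3000))  # 1250.0
```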
Why are Lumens important?
Projectors work very differently from TVs, with the light produced from the source reflecting off a surface instead of being emitted directly. Therefore, depending on the type of projector and its positioning, the light source needs to be bright enough to remain visible in various conditions. It's no secret that projectors work best in dark rooms – just take your local cinema as an example – as both natural and ambient light in the room can cause visibility issues.
Home cinema projectors should output at least 1000 lumens, though they often sit around the 1500 to 2000 mark. Top-tier projectors, such as Sony's VPL-XW7000ES, output over 3000 lumens in order to produce a dazzlingly bright and punchy picture.
Some projectors extend to the 4000 lumens range, but these are often meant for commercial use, meaning they probably won't be a good fit for your home cinema setup. These are instead designed to work in large offices or multi-purpose spaces during daylight hours for business use, and not for watching films and TV shows.
Much like nits, higher lumen counts don't always mean a better picture, as over-exerted brightness can compromise colour volume, weaken black levels and impact contrast.
Lumens vs Nits: what's the difference?
The key difference is that nits and lumens measure brightness in different ways: a nit describes luminance over a given area, while lumens describe the total light output from a source. As a rough rule of thumb, one nit equates to approximately 3.426 lumens, meaning you can compare the brightness of a projector and a TV by multiplying or dividing as needed. For example:
A TV classed at 900 nits emits light equal to a projector rated at 3083.4 lumens. As a point of reference with real products, this makes the LG OLED65G2 (around 900 – 1000 nits) roughly equivalent to the Sony VPL-XW7000ES, which has a claimed brightness of 3200 lumens.
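The arithmetic behind that comparison can be sketched with a pair of small conversion helpers. These are our own illustrative functions built around the approximate 3.426 factor quoted here; it is a rough comparison heuristic, not a precise optical conversion:

```python
# Rough rule-of-thumb factor: 1 nit ≈ 3.426 lumens (approximation, not exact)
NIT_TO_LUMEN_FACTOR = 3.426

def nits_to_lumens(nits: float) -> float:
    """Estimate an equivalent lumen figure for a display rated in nits."""
    return nits * NIT_TO_LUMEN_FACTOR

def lumens_to_nits(lumens: float) -> float:
    """Estimate an equivalent nit figure for a projector rated in lumens."""
    return lumens / NIT_TO_LUMEN_FACTOR

print(nits_to_lumens(900))    # ≈ 3083.4 lumens
print(lumens_to_nits(3200))   # ≈ 934 nits, in the 900 – 1000 nit range
```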