You see it in the headlines. You see it in reviews. The brightness wars waged among TV brands have never been more intense. But what is with this obsession with brightness on TVs? How bright does a TV even need to be? How much should you even care about brightness ratings when shopping for a new TV?
Whether you are a bona fide TV enthusiast or you’ve just started your journey into researching a new TV to buy, I think it’s important that we have a conversation about TV brightness.
And there’s no better time. Take a look at what went down at CES in January. On the grandiose side of the scale, we had TCL announcing a 115-inch TV with peak brightness poking up to the 5,000-nit level. Not to be outdone, Hisense announced a 10,000-nit, 110-inch TV.
Meanwhile, Samsung and LG touted new OLED panels that promised to be just a smidge brighter than last year, which was enough to get folks excited. In the video super-nerd world, some of us are very excited about a new 4,000-nit Sony mastering monitor that video pros could soon be using.
So, what is with all this brightness talk? How and why are the brightness wars a thing? What is a nit, and should you even care?
We have a nit-uation here
For those of you who didn’t get that objectively terrible joke, a nit is a unit of brightness, and it’s used to express how bright a display can get. Nit is shorthand for the much more scientific candela per square meter: one nit equals one candela per square meter.
Now, you may have heard about lumens as a measurement of brightness and wondered why we don’t apply those to TVs, especially when you see projectors with brightness capabilities expressed in lumens. The reason is that lumens are applied to things like lamps or light bulbs and their ability to shine light in all directions. Nits, by contrast (see what I did there?), are used when the light is highly directional. This is why projector lamps are measured in lumens, but the brightness coming off a projector screen is measured in nits — just like a TV, laptop, computer monitor, or phone.
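If you like to see the math, here’s a rough back-of-the-napkin sketch of how projector lumens relate to nits coming off the screen. It assumes a perfectly diffuse screen with a gain of 1.0 and that every lumen actually lands on the screen, which real rooms and real screens never quite manage; the numbers and the function name are illustrative, not the spec of any particular projector.

```python
import math

def approx_screen_nits(projector_lumens: float, screen_area_m2: float, gain: float = 1.0) -> float:
    """Rough estimate of the brightness coming off a projection screen, in nits.

    Assumes every lumen lands on the screen and that the screen is a perfectly
    diffuse (Lambertian) reflector; a gain above 1.0 models screens that bounce
    more light back toward the viewer.
    """
    illuminance_lux = projector_lumens / screen_area_m2  # lumens per square meter
    return illuminance_lux * gain / math.pi              # nits (candelas per square meter)

# A 2,500-lumen projector filling a 100-inch, 16:9 screen (about 2.8 square meters)
# lands at roughly 284 nits -- nowhere near the highlight numbers TVs are quoting.
print(round(approx_screen_nits(2500, 2.8)))
```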
How many nits a display can put out has, for better or worse, become a yardstick for quality. In some ways that’s kind of fun. But in other ways, it’s a problem. I’ll explain why, and we’re going to get into whether you should even give a nit (no, I’m not done yet) about all this. And by the time we’re done here, you’re going to be TV brightness experts. But before we get into the nits and bolts, let’s talk about how we got to this point.
And, spoiler alert: It’s me. Hi. I’m the problem. It’s me. Well, at least a little nit.
Looking on the bright side
First, we must acknowledge that brightness is exciting, right? We know this. Humans love bright, shiny things. It’s just built into us. I’m sure there’s a deep explanation for that. But for now, all we have to do is acknowledge this truth.
When it comes to TVs and other displays, brightness plays a role in visibility, yes. But in terms of what makes a picture look attractive to us, it’s the brightness against darkness, or contrast, that really lights up our excitement. This is because contrast is the element of image quality that has, by far, the highest degree of impact. You know who knew this perhaps better than anyone? Ansel Adams — an American photographer who made a career out of producing stunning black-and-white photos. Just look at some of his work and it becomes clear that color, while quite enjoyable, is not required for beautiful imagery. These black-and-white images are gorgeous because of contrast.
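If you want to put a number on contrast, it’s usually expressed as a ratio of peak white to black level. The figures below are made-up, ballpark examples rather than measurements of any particular TV, but they show why a display that can push its blacks close to zero looks so striking even at modest brightness.

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Native contrast ratio: how many times brighter peak white is than black."""
    return peak_nits / black_nits

# Made-up, ballpark numbers: a typical LED-backlit LCD versus an OLED,
# which can drive its black level very close to zero.
lcd = contrast_ratio(peak_nits=500, black_nits=0.05)
oled = contrast_ratio(peak_nits=500, black_nits=0.0005)
print(f"LCD:  about {lcd:,.0f}:1")    # about 10,000:1
print(f"OLED: about {oled:,.0f}:1")   # about 1,000,000:1
```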
Once you understand that contrast has such a high impact on picture quality, and that brightness plays a huge role in achieving stunning contrast, you can see why TV brands and TV fans get excited at the prospect of TVs capable of ever-higher levels of brightness. And that’s where folks like me — TV reviewers and tech journalists — come in.
We have a sort of chicken-or-egg situation going on here. I don’t know if TV brands started it or if TV reviewers and tech journalists started it, but at some point, how many nits a TV could put out became the most exciting spec to talk about. (I know I’ve been responsible for trumpeting nit ratings in articles and videos for years now.) In fact, the nit has become so inextricably linked with enthusiasm for TVs that I nicknamed fans of my content Nit Nerds and then set about creating a line of merch around the brand.
How bright is bright enough?
But where does it end? How bright is bright enough? Do we really need all this brightness? At what point will a TV’s brightness morph from asset to liability?
To answer those questions, we first need to talk about the different ways brightness is used in televisions.
Average Picture Level — or APL, as we say in the industry — refers to the average brightness of an image on a display. From the content creation side of things, we can point to TV shows like Game of Thrones, House of the Dragon, and The Witcher as shows that have a lot of low-APL scenes. In other words, they look dim to a lot of folks. Some may say — and they have — that they are sometimes too dark to be watchable.
On the opposite side of the spectrum, we have high-APL content such as Burn Notice and Death in Paradise — sun-drenched shows that are bright, vivid, and colorful by nature.
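If it helps to see APL as arithmetic rather than vibes, here’s a quick sketch. It simply averages per-pixel luminance across a frame and expresses it as a percentage of the display’s peak; the two example “scenes” are made-up, uniform frames standing in for a torch-lit dungeon and a sunny beach.

```python
import numpy as np

def average_picture_level(frame_nits: np.ndarray, peak_nits: float) -> float:
    """Average Picture Level: a frame's mean luminance as a percentage of peak white.

    `frame_nits` is a 2D array of per-pixel luminance values, in nits.
    """
    return float(frame_nits.mean() / peak_nits * 100.0)

# Two made-up frames shown on a hypothetical 1,000-nit display.
dungeon_scene = np.full((2160, 3840), 20.0)   # most pixels sitting around 20 nits
beach_scene = np.full((2160, 3840), 450.0)    # most pixels sitting around 450 nits

print(f"Low-APL scene:  {average_picture_level(dungeon_scene, 1000):.0f}%")   # 2%
print(f"High-APL scene: {average_picture_level(beach_scene, 1000):.0f}%")     # 45%
```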
A TV that’s properly set up will make these shows look as the creators intended. But lots of us just like bright images. So we may crank up the brightness to get a higher APL out of our TV. On the other hand, some of us need our TVs to have a lower APL because we watch in a dedicated dark media room or maybe at night in our bedrooms, where a TV set to put out high APL would simply be uncomfortable to watch. You don’t want to be squinting the whole time you’re watching shows, movies, or games, right?
Most TVs today can be dimmed so they have a low enough APL to be comfortable. But for those of us who need our TV to be really bright, perhaps because the TV is often watched during the day in a sun-soaked room, how bright a TV can get in terms of APL is important.
I think we’ve all gone outside on a sunny day and had difficulty seeing our phone or laptop screens if they weren’t set to be bright enough, right? Same deal with your TV inside — or outside, come to think of it. You need your TV to be bright enough to be visible — let alone have any contrast — in bright environments. That’s why outdoor TVs, especially, market their high brightness as an attribute.
And then there’s brightness for the purpose of displaying really intense, dazzling HDR highlights. HDR stands for high dynamic range and, relatively speaking, is rather new to the world of TVs. (If you don’t know HDR from TVs, you might well know it from phone cameras.)
Some folks want a TV to have a high peak brightness capability, not because they want the whole screen to sear their eyeballs — though some out there do want that and always use the Vivid picture preset on their TVs to get it — but because they want to see that super-contrasty image that can only come from having very high-brightness objects in the image. We call the small high-brightness areas “specular highlights.” Examples might be the sun gleaming off the chrome of a car, or the stadium lights gleaming off a football helmet. It could also be a super-bright candle in an otherwise dark room.
Some folks even want a TV that can have a high average picture level and still have enough power in reserve to put out high-impact specular highlights.
So, we have basically four scenarios that illustrate how brightness can be used. One may be a small, bright object against an otherwise dark background. Or a small, bright object in a well-lit scene.
Or, it could be a large, bright object on a dark background — like a moon, or the Death Star. Or it could be a large, bright object on a well-lit background.
That last one — a large bright object on a well-lit background — can be especially hard to pull off. But when done right, it can be downright dazzling. Especially when you can get that HDR highlight impression in a room with the lights on.
I explain all of that in support of two important points about TV brightness. One: A TV’s high peak brightness ability or high nit rating isn’t just about whether it can light up your whole neighborhood on a dark night or sear your retinas like the sun itself. It’s about being able to replicate the kind of sparkle and contrast we often see in real life. And, two: When we worry about whether a TV will be “too bright,” we’re really worried about how the TV will use its brightness power. This leads us to the next critical point of this discussion.
You know that classic Stan Lee line from Spider-Man: “With great power comes great responsibility.” And, yeah, perhaps I’ve overused that trope when talking about TV brightness, but it really does apply. When we look at TV brands’ claims about how bright their TVs can get, it’s fair to get excited. But we should also mix in a healthy amount of skepticism. Because the real question is this: Will a TV use its power for good? Or for evil?
OK, good and evil may not be the right metaphor here. What I really mean is will a TV be responsible about how it uses its high-brightness power? Will it put the power into just the right spots? Or will it be a hot, uncomfortable mess when you watch said TV?
When Dolby created the spec for Dolby Vision HDR, it wanted the platform to be scalable all the way up to 10,000 nits. That number is not arbitrary. Some scientists probably spent no small amount of time determining that 10,000 nits was a magic number for HDR highlights in demanding situations. Situations for the home, I mean. We’re not talking about commercial signage here.
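That 10,000-nit ceiling is baked right into the math. Dolby Vision (and HDR10) ride on the PQ transfer function standardized as SMPTE ST 2084, and the curve it defines tops out at exactly 10,000 nits. Here’s a small sketch of the reference EOTF so you can see where that ceiling lives.

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: map a normalized signal value (0.0 to 1.0) to nits.

    This is the transfer function underneath Dolby Vision and HDR10; the curve
    is defined all the way up to a 10,000-nit ceiling.
    """
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875

    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(round(pq_eotf(0.5)))   # about 92 nits at half signal
print(round(pq_eotf(1.0)))   # 10,000 nits -- the very top of the PQ curve
```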
Until very recently — think late 2023 — a TV achieving up to 10,000 nits peak output was a pipe dream. I honestly didn’t think we’d see a TV that could punch that hard for a few more years. I was as shocked as anyone that Hisense said it would deliver a 2024 TV that could do so. And I’ll remain skeptical until I’ve seen it myself.
It’s important to understand that such a brightness claim is tied to a standard that involves taking measurements from a very small spot on a screen.
Clockwise from top left, you’ll see what a 10% white window looks like on a 65-inch TV, then a 5% window, then a 3% window, and, finally, a 1% window.
When TCL says its TV will do 5,000 nits, I believe that claim is based on a testing methodology that sees the TV doing nothing other than displaying a white window covering just 3% or even 1% of the screen. Same for Hisense and its claim of 10,000 nits – I’m willing to bet that’s a 1% window with all of the TV’s power going to that one small patch.
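To put those window sizes in perspective, here’s some quick arithmetic showing how big a 10%, 5%, 3%, or 1% white window actually is on a 4K panel. It assumes a square patch, which is close enough for illustration.

```python
import math

def window_side_px(percent: float, width_px: int = 3840, height_px: int = 2160) -> int:
    """Approximate side length, in pixels, of a square white test window
    covering `percent` of the panel's total area."""
    area_px = width_px * height_px * (percent / 100.0)
    return round(math.sqrt(area_px))

for pct in (10, 5, 3, 1):
    side = window_side_px(pct)
    print(f"{pct:>2}% window on a 4K panel: roughly {side} x {side} pixels")

# 10% -> ~911 x 911, 5% -> ~644 x 644, 3% -> ~499 x 499, 1% -> 288 x 288
```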
Light up the whole TV with an image of the beach, sun sparkling off the water, and those specular highlights are unlikely to get up to 5,000 or 10,000 nits on those TVs I just mentioned.
But that’s true of any TV’s brightness rating. That peak brightness claim is based on ideal conditions, with virtually no other demands placed on the TV.
However, when we talk about high peak brightness capability, we also know that the TV can achieve a super-high average picture level relative to TVs that are rated with a lower peak brightness number. So we can assume these new flagship TVs from TCL, Hisense, and others will be able to have both punchy HDR highlights and generally super-bright screens.
The question is: Will they do the job well? And the “job” I’m referring to is something called tone mapping.
Tone mapping
Tone mapping has to be thought of differently than it used to be.
Until now, the content we got could contain brightness information that was beyond a TV’s capability. For example, the movie Batman v Superman: Dawn of Justice was mastered to 4,000 nits. The disc has information on it that spans from pure black all the way up to 4,000 nits of peak brightness. That 4,000-nit information is scant — it probably comes up just a few times in the movie.
The thing is, up until recently, there weren’t any TVs that could do 4,000 nits. So, then, what does the TV do when it gets instructions it can’t execute? It does the tone-mapping job.
Tone mapping is the process by which a TV’s processor takes the brightness information it gets from a piece of content – whether that’s a show on Netflix or a 4K Blu-ray disc – and makes it work within that particular TV’s capabilities.
I think the best illustration I can offer uses a ruler or tape measure. Let’s say a brightness range of zero to 4,000 nits can be represented by four feet’s worth of a tape measure. That’s the signal coming from a Blu-ray disc to the TV, for example. And each of the little lines on the tape measure marks an increment of brightness – so, we’ll say each centimeter up the tape measure is just a bit brighter than the one before.
Now, let’s say we have a TV that has a peak brightness of 2,500 nits. On our tape measure, that TV tops out at about the 2.5-foot mark.
The challenge here is to take that 4,000 nits’ worth of tape measure — not just the total length, but the space in-between each of those little increments — and scale it down so that it fits within a shorter space.
We have to figure out how to fit 4,000 nits of range into a 2,500-nit space. Not only are we going to have to drop the ceiling, so to speak, but because we want to keep all those increments, we’ll have to make the distance between them shorter. We’ll have to make the centimeters closer together.
That’s what a TV’s tone mapping does — or at least that’s one of the things it does. It also has to do this job in reverse. Most of the content we get is mastered to 1,000 nits. And there are lots of TVs that can get brighter than that. So the art here is in taking this 1,000-nit signal and expanding it UP to 2,500 nits. So, we now raise the ceiling, and we make the space between the increments — the distance between centimeters — greater so that we can have equal-sized steps from the bottom to the top.
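Here’s the tape-measure version of tone mapping in code, and it’s deliberately naive: it squeezes or stretches every brightness increment by the same factor. Real TVs use far smarter curves that protect shadows and midtones and compress mostly the highlights, so treat this as a sketch of the concept, not how any actual processor does the job.

```python
def naive_tone_map(luminance_nits: float, source_peak: float, display_peak: float) -> float:
    """Rescale a content luminance value so the source's full range fits the display's range.

    Every increment gets squeezed (or stretched) by the same factor -- exactly
    the tape-measure analogy, and much cruder than what real TVs actually do.
    """
    return luminance_nits * (display_peak / source_peak)

# Squeezing a 4,000-nit master onto a 2,500-nit TV (dropping the ceiling)...
print(naive_tone_map(4000, source_peak=4000, display_peak=2500))  # 2500.0
print(naive_tone_map(1000, source_peak=4000, display_peak=2500))  # 625.0

# ...and stretching a 1,000-nit master up to that same 2,500-nit TV (raising the ceiling).
print(naive_tone_map(1000, source_peak=1000, display_peak=2500))  # 2500.0
print(naive_tone_map(200, source_peak=1000, display_peak=2500))   # 500.0
```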
Now, take that idea and scale it up. Way up. If most of the content we can get is at 1,000 nits, then a TV with 5,000 nits is going to have to do a lot of stretching and scaling – tone mapping. A TV with 10,000 nits? It’s doing an even bigger version of that job, and the bigger the stretch, the trickier the tone mapping gets.
We’ve already seen that how good a TV is at this tone-mapping job can vary wildly. Some TVs are exceptionally good at it, while others have left much to be desired. And while companies like Sony and LG have historically done a really impressive job of tone mapping, it was only recently that Hisense and TCL stepped up their tone-mapping game, and this was when the stakes were much lower.
The future looks bright
So, again, the question shouldn’t be about whether a TV is made to be too bright. The question is whether that TV can use its power judiciously so that we get a good experience out of that brightness, not one that makes us need to squint or put on shades.
I know many of you may have thought: “I feel like my TV is too bright as it is. Why in the world would I want a brighter TV? I’ll just get blasted out of the room!”
To you, I would say this: Your TV needs adjustment, or it needs to do a better job of tone mapping. When a TV is doing its thing well, it will keep the APL at a comfortable level while delivering sparkling HDR highlights, so you get a gorgeous, high-contrast, yet comfortable viewing experience.
Time will tell whether 2024 TVs will do right by us with all this brightness power.