
Darknight

"I'd buy that for a dollar!"
Member
Oct 25, 2017
22,765
Everything you said is reasonable. I just wish people didn't hype this like we just cured the video game equivalent of cancer, given how early the implementation was. Making it the selling point of modern GPUs undermined the importance of raytracing a bit, I think, given how poorly it has been implemented so far. We're going to wait a few years regardless, but hopefully I'll have more to be happy about then.

Are there even 20 games out that utilize ray tracing? There's so little content out there now that it really is premature to throw the towel in on it. I think with the next gen consoles having some form of ray tracing hardware in them, you're going to see a lot more game developers implement it to some degree which should accelerate how often it's used as well as start getting developers into creating hopefully good habits in how to use it. That alone should start the ball rolling more, but even then, this will still be first gen level, so it's best to keep expectations in check a bit. Once it's standard across all GPUs and game development, that's when it should really take off.
 

leng jai

Member
Nov 2, 2017
15,114
I can only hope, because the current implementation in GPUs left me very underwhelmed. The only good thing is the labor it will save developers, so I can be happy on their behalf. HDR has just been disappointing to me in general: between the work needed to get a good television, the time needed to find good settings, and hoping the game or show even does HDR properly, it's not worth the effort.

Games are still hit and miss for HDR, PC especially. UHD films are where the most reliable and best implementations are. 100% support, and 80% of them look incredible.
 

Davilmar

Member
Oct 27, 2017
4,264
Are there even 20 games out that utilize ray tracing? There's so little content out there now that it really is premature to throw the towel in on it. I think with the next gen consoles having some form of ray tracing hardware in them, you're going to see a lot more game developers implement it to some degree which should accelerate how often it's used as well as start getting developers into creating hopefully good habits in how to use it. That alone should start the ball rolling more, but even then, this will still be first gen level, so it's best to keep expectations in check a bit. Once it's standard across all GPUs and game development, that's when it should really take off.

Naw bro, I'm not throwing in the towel on it. I'm just saying it was incredibly stupid to make it a focal point given how young and experimental the technology is, and that it likely warped some people's opinions and expectations of what the technology could do (myself included initially). I wish they hadn't made raytracing a forced implementation on the higher-end GPUs like NVIDIA did; then I wouldn't have been so disappointed initially. I know we need adoption to help with costs and maturation, but they pushed the damn thing out the door when it was clearly not ready. I would have been far less harsh if these GPUs had come three or four years in the future, when the technology was more adopted and we could implement it better. My wallet took that giant L with raytracing GPUs last year, with little to show for it.

Games are still hit and miss for HDR, PC especially. UHD films are where the most reliable and best implementations are. 100% support, and 80% of them look incredible.

The moment I had to look up forums to get an idea of what I had to do to get HDR done well on my television, only to find out my console also didn't do HDR well, was when I told myself "this is too much damn work for some weak ass visual effects. I'm out." and just played the game normally.
 

Darknight

"I'd buy that for a dollar!"
Member
Oct 25, 2017
22,765
Naw bro, I'm not throwing in the towel on it. I'm just saying it was incredibly stupid to make it a focal point given how young and experimental the technology is, and that it likely warped some people's opinions and expectations of what the technology could do (myself included initially). I wish they hadn't made raytracing a forced implementation on the higher-end GPUs like NVIDIA did; then I wouldn't have been so disappointed initially. I know we need adoption to help with costs and maturation, but they pushed the damn thing out the door when it was clearly not ready. I would have been far less harsh if these GPUs had come three or four years in the future, when the technology was more adopted and we could implement it better. My wallet took that giant L with raytracing GPUs last year, with little to show for it.

But they can't mature the tech and get the cost down without implementing it in some form. At some point, the first generation was going to be painful for someone. What they have now is a reasonable starting point I think.
 

Gunship

Member
Oct 28, 2017
428
Playing the CoD MW campaign in HDR was quite something.

This was the first big game I played on my new 4K HDR set - the luminosity and brightness of the signage in Piccadilly, the intensity and glare given off by explosions and fire, the way laser scopes cut through fog, the HDR just looks amazing.
 

Pargon

Member
Oct 27, 2017
11,970
Man TV ERA is super elitist haha.
It's not elitist to say that a display which doesn't meet the minimum standards for HDR is not a good representation of what HDR can do.
  1. ≥1000 nits peak brightness and less than 0.05 nits black level (20,000:1 contrast).
  2. ≥540 nits peak brightness and less than 0.0005 nits black level (1,080,000:1 contrast).
Both are required to have at least a 10-bit panel, and display at least 90% of the Display-P3 color gamut.

Details from RTINGS' review of the Samsung NU6900:
  • 320 nits peak brightness
  • 6100:1 contrast
  • 10-bit panel (but with banding near black)
  • 65% Display-P3 gamut coverage
Falls a bit short, doesn't it?
Can you still be happy with a TV that doesn't meet the HDR specifications, and still see some kind of difference in games from enabling HDR? Sure. But it's not really a representation of what HDR is supposed to be.
It sounds like it's still a pretty good TV for $230. You'd have to pay significantly more for far less performance just a few years ago. But that doesn't make it a great TV for HDR.
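To put those numbers side by side, here's a quick back-of-the-envelope check - just a sketch using the figures quoted above, with the tier thresholds from that list and a made-up function name for illustration:

```python
# Rough sketch only: compare a set's measured numbers against the two HDR
# tiers listed above (UHD Alliance "Premium"-style). The NU6900 figures are
# the RTINGS measurements quoted in this post; the function is illustrative.

def hdr_tier(peak_nits, black_nits, bit_depth, p3_coverage):
    """Return which tier (if any) a display's measurements satisfy."""
    common = bit_depth >= 10 and p3_coverage >= 0.90
    tier1 = peak_nits >= 1000 and black_nits < 0.05     # >=20,000:1 contrast
    tier2 = peak_nits >= 540 and black_nits < 0.0005    # >=1,080,000:1 contrast
    if common and tier1:
        return "meets tier 1 (>=1000 nits, <0.05 nit black)"
    if common and tier2:
        return "meets tier 2 (>=540 nits, <0.0005 nit black)"
    return "does not meet either tier"

# Samsung NU6900, per the RTINGS review: 320 nits peak, ~6100:1 contrast
# (so a black level around 320/6100 ~= 0.052 nits), 10-bit, 65% P3.
print(hdr_tier(320, 320 / 6100, 10, 0.65))   # -> "does not meet either tier"
```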

Got a ks8000 and I can't tell the difference between hdr off and on
Your SDR settings are probably way out of spec; higher brightness, wider gamut etc.
The difference between HDR and SDR should be easily noticeable when calibrated.
But many people prefer to watch SDR boosted to a much higher brightness and in a wider color gamut than intended…

HDR and Raytracing are one of those things that I will never seemingly understand the huge hype for, and will continually roll my eyes at people […]
What is difficult to understand about the image being able to go much brighter, higher contrast, and display much more vivid colors?
It can produce much more real -or unreal- images than SDR is capable of.

Naw bro, I'm not throwing in the towel on it. I'm just saying it was incredibly stupid to make it a focal point given how young and experimental the technology is, and that it likely warped some people's opinions and expectations of what the technology could do (myself included initially). I wish they hadn't made raytracing a forced implementation on the higher-end GPUs like NVIDIA did; then I wouldn't have been so disappointed initially. I know we need adoption to help with costs and maturation, but they pushed the damn thing out the door when it was clearly not ready. I would have been far less harsh if these GPUs had come three or four years in the future, when the technology was more adopted and we could implement it better. My wallet took that giant L with raytracing GPUs last year, with little to show for it.
It was the perfect opportunity for NVIDIA to implement ray tracing when they did.
They were already so far ahead of the competition in performance that they could spend a lot of their silicon budget on adding completely new features while still offering the fastest GPUs available.
They'll be on their second generation of RT hardware -probably having learned a lot from the first- by the time next-gen consoles are here and AMD are releasing their first-gen RT hardware.
 

RivalGT

Member
Dec 13, 2017
6,385
The difference you are seeing must be something other than HDR, because the TV doesn't get bright enough for HDR. Regardless, enjoy the new TV.
 

Phil me in

Member
Nov 22, 2018
1,292
It's not elitist to say that a display which doesn't meet the minimum standards for HDR is not a good representation of what HDR can do.
  1. ≥1000 nits peak brightness and less than 0.05 nits black level (20,000:1 contrast).
  2. ≥540 nits peak brightness and less than 0.0005 nits black level (1,080,000:1 contrast).
Both are required to have at least a 10-bit panel, and display at least 90% of the Display-P3 color gamut.

Details from RTINGS' review of the Samsung NU6900:
  • 320 nits peak brightness
  • 6100:1 contrast
  • 10-bit panel (but with banding near black)
  • 65% Display-P3 gamut coverage
Falls a bit short, doesn't it?
Can you still be happy with a TV that doesn't meet the HDR specifications, and still see some kind of difference in games from enabling HDR? Sure. But it's not really a representation of what HDR is supposed to be.
It sounds like it's still a pretty good TV for $230. You'd have to pay significantly more for far less performance just a few years ago. But that doesn't make it a great TV for HDR.


Your SDR settings are probably way out of spec; higher brightness, wider gamut etc.
The difference between HDR and SDR should be easily noticeable when calibrated.
But many people prefer to watch SDR boosted to a much higher brightness and in a wider color gamut than intended…


What is difficult to understand about the image being able to go much brighter, higher contrast, and display much more vivid colors?
It can produce much more real -or unreal- images than SDR is capable of.


It was the perfect opportunity for NVIDIA to implement ray tracing when they did.
They were already so far ahead of the competition in performance that they could spend a lot of their silicon budget on adding completely new features while still offering the fastest GPUs available.
They'll be on their second generation of RT hardware -probably having learned a lot from the first- by the time next-gen consoles are here and AMD are releasing their first-gen RT hardware.

I followed quite a few guides to calibrate my PS Pro for HDR properly. In games where you can turn HDR off and on in game, I couldn't tell the difference. Unless it was a game that uses fake HDR, like Resident Evil 2, where it looked worse.
 

Pargon

Member
Oct 27, 2017
11,970
I followed quite a few guides to calibrate my PS Pro for HDR properly. In games where you can turn HDR off and on in game, I couldn't tell the difference. Unless it was a game that uses fake HDR, like Resident Evil 2, where it looked worse.
Okay, so HDR is calibrated properly, but SDR is probably way out of spec if it looks similar to HDR.
Or your TV does a bad job with HDR.
 

Bunga

Banned
Oct 29, 2017
1,251
Jesus Christ, the amount of tools in this thread just shitting on the OP. If you've got nothing nice to say, just don't even bother? Enjoy your TV dude, if you have noticed a big difference and are enjoying it, don't let anyone tell you otherwise.
 

Marble

Banned
Nov 27, 2017
3,819
Jesus Christ, the amount of tools in this thread just shitting on the OP. If you've got nothing nice to say, just don't even bother? Enjoy your TV dude, if you have noticed a big difference and are enjoying it, don't let anyone tell you otherwise.

What do you mean? The thread's title is "I have seen the light of HDR at last." I don't see what is wrong with pointing out that OP has not seen proper HDR at all, at least not with this TV. Nothing wrong with that. It's TV companies and their misleading marketing that lets people believe they are seeing real HDR, which they really aren't.

You, on the other hand, are basically saying 'Hey OP, if placebo lets you think your TV has great HDR, don't let anyone else tell you otherwise'. I'd rather have someone tell me the truth, personally.
 

Bunga

Banned
Oct 29, 2017
1,251
What do you mean? The thread's title is "I have seen the light of HDR at last." I don't see what is wrong with pointing out that OP has not seen proper HDR at all, at least not with this TV. Nothing wrong with that. It's TV companies and their misleading marketing that lets people believe they are seeing real HDR, which they really aren't.

You, on the other hand, are basically saying 'Hey OP, if placebo lets you think your TV has great HDR, don't let anyone else tell you otherwise'. I'd rather have someone tell me the truth, personally.

That's fine but some of the people in this thread have frankly been pretty rude. Take the first reply for instance. Just a drive by shitting on his TV. Not sure how that helps anyone. You can tell someone the truth without being rude about it is what I'm saying.
 

Pargon

Member
Oct 27, 2017
11,970
No.1 reason why many people can't see any difference in HDR.
In fact HDR looks dimmer than SDR at the same backlight level.
Yes, and it's very easy to see why when you realize what's happening:

SDR is mastered to be viewed at 100 nits brightness.
HDR TVs can now push SDR to 500-600 nits; 5-6x the intended brightness.

Meanwhile, a scene that is mastered for 100 nits brightness in HDR will be displayed at 100 nits whether your TV is capable of 1000, 2000, 4000 nits.
So a TV displaying SDR out-of-spec is going to be brighter and more contrasted than HDR.
Some TVs now have options to brighten the HDR picture by going out of spec, but this is achieved by… compressing the dynamic range. So as you brighten the HDR picture, you are making it closer and closer to that boosted SDR one.

It's the same thing for color gamut.
SDR is mastered to Rec.709 colors, so it's not possible for an SDR source to display reds, greens, blues etc. which are as bright and saturated as HDR.
But most TVs let you stretch those Rec. 709 colors to the Display-P3 gamut, which over-saturates the picture.
[Image: gamut expansion comparison - Rec.709 stretched to Display-P3]

A solid, completely saturated red is going to be the same in HDR or SDR -you can't go beyond the TV's capabilities- but while HDR will be displaying accurate color with all its subtleties, SDR in P3 will be displaying an image that is more saturated and vivid than HDR.
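Numerically, the asymmetry works out something like this - a toy sketch with an assumed boost factor and TV peak, only restating the point above, not how any real TV actually tone-maps:

```python
# Toy numbers only; real SDR "vivid" modes and HDR tone mapping are far more
# involved than a single multiplier, but the asymmetry is the same.

SDR_REFERENCE_NITS = 100   # SDR is mastered for roughly 100 nits
SDR_BOOST = 5.5            # assumed "cranked" TV setting (5-6x, per the post)

def displayed_sdr(mastered_nits, boost=SDR_BOOST):
    # The TV just scales the whole SDR signal upward.
    return mastered_nits * boost

def displayed_hdr(mastered_nits, tv_peak_nits=1000):
    # HDR is absolute: a 100-nit element is shown at 100 nits regardless of
    # how bright the TV can go, until the signal exceeds the TV's peak.
    return min(mastered_nits, tv_peak_nits)

scene = 90  # a midtone mastered at 90 nits in both versions
print(displayed_sdr(scene))   # 495.0 -> boosted SDR looks much brighter
print(displayed_hdr(scene))   # 90    -> HDR shows it as mastered
```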

EDIT: An example I forgot about - though not the best showcase for HDR.
[Image: Mad Max: Fury Road - calibrated SDR (left) vs HDR (right)]

Mad Max: Fury Road in calibrated SDR on the left, and HDR on the right.
Color is slightly different (deeper and more natural reds) in HDR, but the main difference is that the bright areas of the picture have detail in them (above 100 nits) instead of washing out to a flat white sky.
You can actually see the sun and the lens flare like you would in real life - though the rest of the picture looks quite similar because it's within the 100 nits range that SDR can display.
This is because HDR can be used to create a subtly more natural image than always having to go significantly brighter or more vivid - because that is not always the director's intent.

[Image: Mad Max: Fury Road - SDR with brightness cranked vs HDR]

And this is Fury Road in SDR with the brightness cranked vs HDR.
This displays a much brighter and more vivid picture than HDR; but there's still no detail in the sky at all. It's flat and featureless, and colors in the image look unnatural.
But this is how many people watch SDR, and then wonder why HDR looks "worse" or dimmer.
It's pretty clear in this example, but viewing SDR that isn't quite so far out of spec is why some people "can't tell the difference" when they switch HDR on.
 
Last edited:

Marble

Banned
Nov 27, 2017
3,819
That's fine but some of the people in this thread have frankly been pretty rude. Take the first reply for instance. Just a drive by shitting on his TV. Not sure how that helps anyone. You can tell someone the truth without being rude about it is what I'm saying.

Yeah well, people find everything "rude" nowadays on the internet. If you find that rude I would advise getting off the internet, to be honest. He's just stating facts. Absolutely nothing rude about it. People get insulted so fast these days, jeez.
 

FaceHugger

Banned
Oct 27, 2017
13,949
USA
That's fine but some of the people in this thread have frankly been pretty rude. Take the first reply for instance. Just a drive by shitting on his TV. Not sure how that helps anyone. You can tell someone the truth without being rude about it is what I'm saying.

That's unfortunately a problem with many gaming forums. Basic human decency and decorum is thrown out the window. The curt and unhelpful reply of "NU6900 only hits like 300 nits brightness, you still ain't even seen proper HDR " could have easily been "That model has a low nits count, <insert explaining what that even means here>. So chances are you're not seeing the full effect of HDR. I'd recommend <insert TV models here> and here is why". Again, and unfortunately, such a normal and helpful response is going to be rare on a forum.
 

Bunga

Banned
Oct 29, 2017
1,251
Yeah well, people find everything "rude" nowadays on the internet. If you find that rude I would advise getting off the internet, to be honest. He's just stating facts. Absolutely nothing rude about it. People get insulted so fast these days, jeez.

Guy buys something reasonably expensive, he's excited about the results, he comes to share his happiness about that with people and his purchase gets shit on immediately in the most curt way possible. Sorry, but that's the definition of rude. See below for a perfect example of what COULD have been posted. I'm not saying everyone has been unhelpful or rude by the way, many have explained to OP and suggested things etc. but yeah, lot of rude posts too unfortunately.

That's unfortunately a problem with many gaming forums. Basic human decency and decorum is thrown out the window. The curt and unhelpful reply of "NU6900 only hits like 300 nits brightness, you still ain't even seen proper HDR " could have easily been "That model has a low nits count, <insert explaining what that even means here>. So chances are you're not seeing the full effect of HDR. I'd recommend <insert TV models here> and here is why". Again, and unfortunately, such a normal and helpful response is going to be rare on a forum.
 
Last edited:

MazeHaze

Member
Nov 1, 2017
8,570
I bought this same TV as a bedroom TV and I just leave the HDR off tbh. It doesn't get bright enough, has no local dimming, and no wide color gamut. HDR on this TV looks the same as SDR with the brightness cranked.

Doesn't even look anywhere close to the same universe as any of my FALD displays, or my OLED.
 

TurdFerguson

Member
Oct 28, 2017
271
Norway
I was sooooo close to buying an LG C9 55" OLED on black friday, but held back since I have quit my job and will be going back to school in January. Sometimes real life can go fuck itself.

Edit: To be a little more relevant to the thread. I saw the light earlier this year when I purchased a Samsung C32HG70 PC monitor. It's hard to go back to my old non-HDR, 60Hz Sony TV at this point.
 
Last edited:

Sidebuster

Member
Oct 26, 2017
2,405
California
Yeah, I didn't know what makes HDR, HDR when I got my TV a couple years ago. It really pissed me off, because I consider Samsung to be walking the line of false advertising. If I remember right, my TV only gets 500 nits max, has only an 8-bit panel, and doesn't really have the lighting zones to make use of HDR. The "technically true" part was that it'll take an HDR input... That's it.

So I agree with the people warning that you aren't seeing real HDR, since I thought I was back when I got my TV, and thought HDR was some kind of joke because all it did was sap the saturation out of the picture and not much else.

Edit: I don't want these companies getting away with their "technically true" features on their cheaper TVs. I don't think we should let people go around saying a TV has a feature when it really doesn't, because it'll only serve to trick people into thinking they're getting something they're really not.
 
Last edited:

Deleted member 49319

Account closed at user request
Banned
Nov 4, 2018
3,672
Jesus Christ, the amount of tools in this thread just shitting on the OP. If you've got nothing nice to say, just don't even bother? Enjoy your TV dude, if you have noticed a big difference and are enjoying it, don't let anyone tell you otherwise.
I think this thread is going in a very positive direction, with some posters guiding OP to another model within the same price range but with better IQ.

Yes, and it's very easy to see why when you realize what's happening:

SDR is mastered to be viewed at 100 nits brightness.
HDR TVs can now push SDR to 500-600 nits; 5-6x the intended brightness.

Meanwhile, a scene that is mastered for 100 nits brightness in HDR will be displayed at 100 nits whether your TV is capable of 1000, 2000, 4000 nits.
So a TV displaying SDR out-of-spec is going to be brighter and more contrasted than HDR.
Some TVs now have options to brighten the HDR picture by going out of spec, but this is achieved by… compressing the dynamic range. So as you brighten the HDR picture, you are making it closer and closer to that boosted SDR one.

It's the same thing for color gamut.
SDR is mastered to Rec.709 colors, so it's not possible for an SDR source to display reds, greens, blues etc. which are as bright and saturated as HDR.
But most TVs let you stretch those Rec. 709 colors to the Display-P3 gamut, which over-saturates the picture.
[Image: gamut expansion comparison - Rec.709 stretched to Display-P3]

A solid, completely saturated red is going to be the same in HDR or SDR -you can't go beyond the TV's capabilities- but while HDR will be displaying accurate color with all its subtleties, SDR in P3 will be displaying an image that is more saturated and vivid than HDR.

EDIT: An example I forgot about - though not the best showcase for HDR.
[Image: Mad Max: Fury Road - calibrated SDR (left) vs HDR (right)]

Mad Max: Fury Road in calibrated SDR on the left, and HDR on the right.
Color is slightly different (deeper and more natural reds) in HDR, but the main difference is that the bright areas of the picture have detail in them (above 100 nits) instead of washing out to a flat white sky.
You can actually see the sun and the lens flare like you would in real life - though the rest of the picture looks quite similar because it's within the 100 nits range that SDR can display.
This is because HDR can be used to create a subtly more natural image than always having to go significantly brighter or more vivid - because that is not always the director's intent.

[Image: Mad Max: Fury Road - SDR with brightness cranked vs HDR]

And this is Fury Road in SDR with the brightness cranked vs HDR.
This displays a much brighter and more vivid picture than HDR; but there's still no detail in the sky at all. It's flat and featureless, and colors in the image look unnatural.
But this is how many people watch SDR, and then wonder why HDR looks "worse" or dimmer.
It's pretty clear in this example, but viewing SDR that isn't quite so far out of spec is why some people "can't tell the difference" when they switch HDR on.
I once made a similar comparison in RDR2 when the HDR fix came out.
The first image is a screenshot from RDR2's "cinematic mode," which is SDR packaged in HDR, and the second image is from "game mode," which is the game's proper HDR implementation. When compared in Photoshop, the only difference between the two is sky luminance.
[Image: RDR2 "cinematic mode" capture (SDR packaged in HDR)]
[Image: RDR2 "game mode" capture (proper HDR)]
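If anyone wants to do the same kind of check without Photoshop, something like this works - a rough sketch, assuming Pillow and NumPy are installed, with placeholder file names standing in for the two captures:

```python
# Rough sketch: compare the luminance of two same-resolution screenshots and
# see where they differ. File names are placeholders, not the actual captures.
import numpy as np
from PIL import Image

def luminance(path):
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
    # Approximate relative luminance using Rec.709 luma weights.
    return 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]

a = luminance("rdr2_cinematic_mode.png")   # placeholder name
b = luminance("rdr2_game_mode.png")        # placeholder name

diff = np.abs(a - b)
h = diff.shape[0]
# If the sky is the only thing that changed, the top half dominates the diff.
print("mean difference, top half (sky):", diff[: h // 2].mean())
print("mean difference, bottom half:", diff[h // 2 :].mean())
```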
 

Mendrox

Banned
Oct 26, 2017
9,439
Huh?

It's just one of those things. If you have a friend who tells you "omg I just got a new car and it has heated seats! It's amazing, I've always wanted to experience heated seats and now finally I get to have it!!!" and then you respond to their enthusiasm with "bro you have a Civic, I wouldn't even consider their seat heaters 'heated'", it's just a way to make someone feel bad about their enthusiasm.

Like, when someone is excited about something, it's bad taste to shit on what excited them. At least there are ways to say "your hdr sucks and mine is the best hdr" that aren't as douchey as the first reply. General life advice: don't go around yucking someone's yum.

No. Dropping straight facts about the purchase is better. Sometimes people even rethink their purchase and buy a different, better thing, because beforehand they didn't know better. In this case the OP got a good price, but it's still not really HDR. I guess the TV panel is just better than he is used to. Nothing wrong with telling them what they really bought.
 

MazeHaze

Member
Nov 1, 2017
8,570
What is actually probably happening is that the NU6900 has an amazing native contrast ratio for a budget TV, almost certainly better than OP's previous TV or monitor. The NU6900 is a great-looking set, but it doesn't do HDR; you have to turn the contrast enhancer on high to get any significant brightness out of it, which crushes and oversaturates the image. It gives it that "vivid preset on a Best Buy show floor" type look, which will certainly wow the uninformed, but the picture is inaccurate, and it isn't HDR. Imo.


Edit: and everybody hating on people dropping facts is ridiculous. This thread is the equivalent of somebody saying "I finally see the light of 3D movies" when they're talking about watching a 2D film while wearing the red and blue glasses from a 3D comic book or something. (For the record, I am not a fan of 3D movies.)
 
Last edited:

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
It's not elitist to say that a display which doesn't meet the minimum standards for HDR is not a good representation of what HDR can do.
  1. ≥1000 nits peak brightness and less than 0.05 nits black level (20,000:1 contrast).
  2. ≥540 nits peak brightness and less than 0.0005 nits black level (1,080,000:1 contrast).
Both are required to have at least a 10-bit panel, and display at least 90% of the Display-P3 color gamut.

Details from RTINGS' review of the Samsung NU6900:
  • 320 nits peak brightness
  • 6100:1 contrast
  • 10-bit panel (but with banding near black)
  • 65% Display-P3 gamut coverage
Falls a bit short, doesn't it?
Can you still be happy with a TV that doesn't meet the HDR specifications, and still see some kind of difference in games from enabling HDR? Sure. But it's not really a representation of what HDR is supposed to be.
It sounds like it's still a pretty good TV for $230. You'd have to pay significantly more for far less performance just a few years ago. But that doesn't make it a great TV for HDR.


Your SDR settings are probably way out of spec; higher brightness, wider gamut etc.
The difference between HDR and SDR should be easily noticeable when calibrated.
But many people prefer to watch SDR boosted to a much higher brightness and in a wider color gamut than intended…


What is difficult to understand about the image being able to go much brighter, higher contrast, and display much more vivid colors?
It can produce much more real -or unreal- images than SDR is capable of.


It was the perfect opportunity for NVIDIA to implement ray tracing when they did.
They were already so far ahead of the competition in performance that they could spend a lot of their silicon budget on adding completely new features while still offering the fastest GPUs available.
They'll be on their second generation of RT hardware -probably having learned a lot from the first- by the time next-gen consoles are here and AMD are releasing their first-gen RT hardware.

Hmm, then by these standards nearly every TV on the market falls short.

Very few TVs, even with FALD, hit a 20,000:1 contrast ratio.

Oh wait, nothing other than an OLED hits that metric. https://www.rtings.com/tv/tests/picture-quality/contrast-ratio

And OLEDs aren't bright enough according to lots of people on this board.

Oh, and then people can't legitimately even tell if a set is an 8-bit + FRC or a true 10-bit panel. Lots of high-end sets are 8-bit panels.

Guess we should all just accept that our TVs don't do HDR and be sad about it.

/s
 
Last edited:

MazeHaze

Member
Nov 1, 2017
8,570
Hmm, then by these standards nearly every TV on the market falls short.

Very few TVs, even with FALD, hit a 20,000:1 contrast ratio.

Oh wait, nothing other than an OLED hits that metric. https://www.rtings.com/tv/tests/picture-quality/contrast-ratio

And OLEDs aren't bright enough according to lots of people on this board.

Guess we should all just accept that our TVs don't do HDR.

/s
What? OLED and FALD LCDs both hit those metrics. I wouldn't consider any LCD without FALD to be great HDR, I think many would agree.

Edit: and I'd say you can totally have acceptable hdr on less than 20,000:1
 
Last edited:
Oct 27, 2017
1,377
HDR really does add a lot to the image when using a proper display. Unfortunately there's a lot of dubious marketing and information out there. A lot of lower-end TVs that support "HDR" really only support processing the image, not necessarily displaying it properly. On these TVs you may not even be able to tell a difference against SDR, and even worse, some may display an image that is less accurate and looks worse than SDR.

Unfortunately, TVs that do really good HDR are still prohibitively expensive for a lot of people. As an LG C9 OLED owner, HDR really does look great, but it's not as enormous a difference as others make it out to be. If I were looking at cheaper TVs, I would look for something that has a good/accurate 4K SDR image and not even worry about HDR.
 

Pargon

Member
Oct 27, 2017
11,970
Hmm, then by these standards nearly every TV on the market falls short.
Very few TVs, even with FALD, hit a 20,000:1 contrast ratio.
And OLEDs aren't bright enough.
Guess we should all just accept that our TVs don't do HDR.
/s
RTINGS measure LG's C9 OLED as having a peak brightness of 855 nits.
Most high-end LCDs should be capable of 1000 nits white and 0.05 nits black, even if they can't hit 20,000:1 ANSI contrast.
I don't set the standards, the UHD Alliance did.
 

Dizzy Ukulele

Member
Oct 28, 2017
3,013
43 inches is the maximum my gaming room can go to. I've had two HDR TVs for it. The first a Samsung, the second a Philips. They've both been hit and miss. Mostly miss with an occasional 'oh, that actually looks better than SDR, not worse'.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
What? OLED and FALD LCDs both hit those metrics. I wouldn't consider any LCD without FALD to be great HDR, I think many would agree.

Edit: and I'd say you can totally have acceptable hdr on less than 20,000:1

And I'd say you can have totally acceptable hdr with less than 600 nits :).

I have a b7, a pixio px277h (ips with around 600 nits), and a tcl 43 s525 (likely 400-500 nits in bright Dv/hdr modes).

They all do a pretty damn fine hdr. Yes the oled excels but I'm not sure that it's such a difference it's worth 5x as much as the tcl and pixio.

accept that people have budgets and some of these budget sets are actually quite good.
 

Terror-Billy

Chicken Chaser
Banned
Oct 25, 2017
2,460
first post immediately shits on the OPs purchase

classic ERA XD
He ain't wrong tho. I used to think HDR was ok because I had one of those screens that hit 300 nits at most. Then I saw HDR on one of those fancy Sony TVs and just said "fuck it, I'm never using HDR until I can afford one of those."
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Sure maybe, but you definitely can't have acceptable HDR with 300 nits, no wide color gamut, and no local dimming, like the TV in the OP.

I would agree that WCG is probably the biggest factor here. HDR even on an IPS looks noticeably better than without it, even though IPS panels are typically only 1000:1. The colors really do shine in gaming and movies even if you don't have a great contrast ratio or FALD.

Like, yes, you may need xyz criteria to get the best of what HDR has to offer, but that doesn't mean you don't get noticeable benefits if you're only hitting one or two of the marks fully.

OP has already said he's going to exchange for a TCL S525, which frankly, given his budget and size requirements, is near the best HDR experience he can get in a 43" model. The Q6, spec-wise, is not worth literally double the price. The Vizio M43 is maybe the only other 43" that makes sense, but it's actually inferior to the TCL S525 in a number of areas.
 
Last edited:
OP
Tahnit

Member
Oct 25, 2017
9,965
Make sure you get a TCL 6 series. Anything under that and you're just going to experience the same problems of not having something that displays HDR at a decent level. The TCL 6 series really is the cheapest TV anyone should consider if they're interested in HDR.

I don't think I can get a 6 series in 43 inches.
 
Oct 30, 2017
1,248
Uncharted LL was the first game I played when I got a B8 OLED and the HDR blew me away.

When it's done right it's gorgeous. Also AC Odyssey. Eyegasm in HDR.
 
OP
Tahnit

Member
Oct 25, 2017
9,965
And I'd say you can have totally acceptable hdr with less than 600 nits :).

I have a b7, a pixio px277h (ips with around 600 nits), and a tcl 43 s525 (likely 400-500 nits in bright Dv/hdr modes).

They all do a pretty damn fine hdr. Yes the oled excels but I'm not sure that it's such a difference it's worth 5x as much as the tcl and pixio.

accept that people have budgets and some of these budget sets are actually quite good.

Looking to get the TCL today. It can really go to 500 nits in bright HDR mode? That with wide color gamut sounds like a winner.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Looking to get the TCL today. It can really go to 500 nits in bright HDR mode? That with wide color gamut sounds like a winner.

I can't say for certain. Either way, RTINGS measured 374 nits for the medium mode. That's very close to the Samsung Q6 43 and Vizio M43 anyway; those are also in the 400-nit range.

The options are very limited at 43 inches, and imo the S525 is about the best you can do, and easily the best perf/$ in the segment. You spend $150 more for the Vizio and $250 more for the Samsung, in order to gain what? 50 nits maybe? Not worth it imo. The Vizio has only 12 dimming zones, which honestly is worse than just not doing FALD at all, as the zones are so big it's distractingly noticeable when they light up. The Samsung also has no FALD.
 

ohitsluca

Member
Oct 29, 2017
730
My first 4K TV with HDR was a Sony 43" and I was blown away too. And then my mom got a Sony 55" with local dimming and I was blown away. And then I got a 55" LG OLED and I was floored.

You have only seen the tip of the HDR iceberg, my friend
 

DixieDean82

Banned
Oct 27, 2017
11,837
I must have something wrong with my eyes. I have an OLED and I can't notice if HDR is on or off. It makes no difference to me. The whole thing is just the latest marketing gimmick.
 

Thera

Banned
Feb 28, 2019
12,876
France
I must have something wrong with my eyes. I have an OLED and I can't notice if HDR is on or off. It makes no difference to me. The whole thing is just the latest marketing gimmick.
Make a choice: is it your eyes, or a marketing gimmick?
It is either a settings problem, a game that isn't true HDR, or your eyes.
 

Bishop

Member
Nov 1, 2017
74
I must have something wrong with my eyes. I have an OLED and I can't notice if HDR is on or off. It makes no difference to me. The whole thing is just the latest marketing gimmick.
Proper calibration can go a long way; a lot of people run SDR in store-demo settings rather than as intended. Plus, HDR sources such as 4K Blu-rays or even movies streamed in 4K HDR are usually reference material. Properly calibrating your set should show you the big differences between SDR and HDR on a TV like yours. Not to mention there are games that don't do HDR properly, like RDR2.

With your set, you should easily be able to see the difference.
 

Ferrs

Avenger
Oct 26, 2017
18,829
I must have something wrong with my eyes. I have an OLED and I can't notice if HDR is on or off. It makes no difference to me. The whole thing is just the latest marketing gimmick.

What did you try? Are you sure you have your TV and console properly enabled for HDR? Is tone mapping on?

I have a C8 and I can't understand how that is possible.
 
Oct 25, 2017
1,575
If the faux-HDR TVs don't get bright enough, then what is the improvement the OP is seeing? Is it just placebo, or a difference in settings between SDR and HDR (e.g. higher contrast and/or color)?
The thing is, many people don't even realize that lots of non-HDR TVs weren't even presenting the full range of SDR properly. Mine, for instance, was considered a decent mid-range SDR TV to have, but there were SDR shows that presented more color data/variance on the new TV. I had previously thought it was just the show (I had my TV calibrated) and that the blown-out colors were an artistic choice, or there to show the intensity of the light in the scene, but on the new TV, despite it just being SDR, the colors resolved appropriately, resulting in a more dynamic, detailed and natural picture - and that wasn't even HDR.

So even though someone in OP's situation may not be getting proper HDR, it could still be a significant step up from their previous display in terms of colors and brightness.
 

DixieDean82

Banned
Oct 27, 2017
11,837
What did you try? Are you sure you have your TV and console properly enabled for HDR? Is tone mapping on?

I have a C8 and I can't understand how that is possible.
I've tried a lot. I even tried the insect demo on Xbox so I could see before and after. Gears 4 has a side-by-side HDR and non-HDR mode. I'll say in both examples the HDR looked a little brighter. But something I'd call 'mindblowing'? No way. Not even close to that.

If it were not side by side I would never know whether it was on or not.