Would you say it's brighter than the Samsung? In non-dark mode, that is?
Based on RTINGS, yes, it's a bit brighter in both SDR and HDR. Dolby Vision is also nice, as it does a better job than HDR10 on lower-nit sets.
should i return the samsung and go with the tcl? samsung doesnt have the wide color gamut. is the tcl brighter?
If the price is similar, I say switch it out for the TCL. Those two sets are pretty similar and don't get bright at all for HDR (around 300 nits is about the lowest you can get for an HDR-enabled set). But at least the TCL has WCG, so that'll get you some real, tangible improvement over SDR. It's also slightly better in a few other areas like motion, and it has proper judder-free 24p playback while the Samsung doesn't. This is important for watching movies, which are filmed at 24fps: on the Samsung they will have a weird judder/stutter due to improper cadence, while the TCL can play them in their proper 24fps judder-free cadence.
Samsung handles reflections slightly better though.
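To illustrate the cadence issue, here's a rough sketch of my own (not from the RTINGS reviews): a 60 Hz panel doesn't divide evenly into 24 fps, so it has to hold alternating film frames for 3 and 2 refreshes (the classic 3:2 pulldown), and that uneven hold pattern is the judder. A set with proper 24p support refreshes at a multiple of 24 Hz instead, so every frame is held equally.

```python
# Illustrative sketch of why 24 fps judders on a 60 Hz panel (3:2 pulldown)
# but plays smoothly at a multiple of 24 Hz. Numbers are for illustration.

def frame_hold_counts(film_fps, panel_hz, n_frames):
    """For each film frame, how many panel refreshes it stays on screen."""
    counts = []
    shown = 0  # panel refreshes consumed so far
    for i in range(1, n_frames + 1):
        # frame i is swapped out at the refresh nearest to i / film_fps
        # seconds (half-up rounding gives the classic 3:2 cadence)
        target = int(i * panel_hz / film_fps + 0.5)
        counts.append(target - shown)
        shown = target
    return counts

# 60 Hz panel: uneven 3,2,3,2,... hold pattern -> visible judder
print(frame_hold_counts(24, 60, 8))   # [3, 2, 3, 2, 3, 2, 3, 2]
# 120 Hz panel (a multiple of 24): every frame held equally -> smooth
print(frame_hold_counts(24, 120, 4))  # [5, 5, 5, 5]
```

Same average frame rate in both cases, but only the 120 Hz cadence presents every film frame for the same length of time.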
It's funny how Apple doesn't even mention that the iPad Pro has HDR support, when it can get up to 650 nits and supports WCG too.
For the TCL S525, only the 55" and above supports 24p playback with its native apps (via the Natural Cinema setting). The 43" and 49"/50" models should, however, support 24p playback with external devices that support it.
Edit: The TCL S405 supported 24p playback through its native apps until, I think, Roku OS 8. I think Roku stripped 24p playback out of Roku TVs, and I don't know why. I had a Hisense 50R6E Roku TV that didn't support 24p playback through its native apps but supported it through external devices like a Blu-ray player or an Xbox One X.
7.9 HDR Gaming
This is a good TV to play HDR games. This is mainly due to the very low input lag that makes it very responsive, and the fast response time that leaves only a small blur trail behind fast-moving content. The HDR performance of the TV is not very good as it can't get very bright and it does not have a wide color gamut, and thus lacks the means to deliver a great HDR picture.
Thanks guys. I think I'll return this set tomorrow and get that TCL. Wide color gamut sounds great and the sets are very similar.
Now I need more HDR showcase games. Any suggestions?

If you have MS Game Pass, some of the best HDR showcases are included: Gears 5, Forza Horizon 4 and Sea of Thieves all have great HDR implementations.
Mostly PC games. I don't really have a 4K console.
Did they say that about the 2017 model? It has basically the same screen. I'm surprised they barely even mention it or put it on the box anywhere.
It's a massive problem with HDR marketing right now. Why do you think there are so many posts here saying they can't see a difference? Every 4K TV under the sun has HDR branding, but half of them can barely get past 200-300 nits, which is nowhere near bright enough for proper HDR.
Huh?
It's just one of those things. If you have a friend who tells you "omg I just got a new car and it has heated seats! It's amazing I've always wanted to experience heated seats and now finally I get to have it!!!" And then you respond to their enthusiasm with "bro you have a Civic I wouldn't even consider their seat heaters 'heated'" it's just a way to make someone feel bad about their enthusiasm.
Like, when someone is excited about something, it's in bad taste to shit on what excited them. At least there are ways to say "your HDR sucks and mine is the best HDR" that aren't as douchey as the first reply. General life advice: don't go around yucking someone's yum.
That's why HDR sucks. There are just so many factors that play into it:
- Does your TV have enough nits?
- Does the game even have a good HDR implementation?
- Is your TV set up correctly?
- Can you navigate the multiple HDR standards that confuse consumers?
Fuck that.
Hey, good for those people that have the proper TV with the proper settings and calibration while playing a game that has a proper HDR implementation.
There are just too many variables to currently call this the game changer that some are claiming it to be.
I have this LG that I purchased back in late 2016 / early 2017. Can anyone tell me if I should be shopping for a new model for better HDR, or does it not really matter / not worth the cost: https://www.lg.com/us/tvs/lg-55UF7600-4k-uhd-led-tv
Even my tech manual doesn't list the complete specs. The best I could find was an old review stating that that year's line of 4K IPS models achieved over 700 nits, but no firm range or maximum.
Someone correct me if I'm wrong, but if an LED TV has no local dimming, isn't that still fake HDR? Without local dimming, the TV has no choice but to run the backlight at 100%. All you'd get is a bright AF image with greys instead of blacks. It doesn't matter which TV has more nits if neither has local dimming. Hell, the brighter one is probably ultimately worse.
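A back-of-the-envelope sketch of that point (my own assumed numbers, not measured specs from any set discussed here, assuming a typical-ish ~1000:1 native LCD contrast ratio): without local dimming, the black level scales with the backlight, so cranking brightness for HDR raises the blacks right along with the highlights.

```python
# Illustration: on a panel with no local dimming, the black level is
# roughly peak_brightness / native_contrast, so pushing the backlight
# for HDR also brightens the "blacks". Assumed numbers, not specs.

NATIVE_CONTRAST = 1000  # assumed typical LCD native contrast ratio

def black_level(peak_nits, native_contrast=NATIVE_CONTRAST):
    """Black level (nits) with the whole backlight at peak brightness."""
    return peak_nits / native_contrast

# SDR-ish backlight setting: blacks sit around 0.12 nits
print(black_level(120))   # 0.12
# Backlight maxed for "HDR": blacks rise to 0.5 nits -> washed-out greys
print(black_level(500))   # 0.5
```

Local dimming sidesteps this because zones behind dark regions can drop below the global backlight level, letting highlights and deep blacks coexist in the same frame.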
Your TV doesn't even have HDR. That said, wait until more HDMI 2.1 sets are common before upgrading at this point.
What it does is a "game changer" though. Just because we're in the transition phase of how it's implemented doesn't change what it actually does and the impact it has on video quality. I think people really forget how the HD transition went down. It had its own set of growing pains, which we now look past because the dust finally settled.
Do people forget that what people classify as HD has three different resolutions? How about how not every set supported all three? Or how, even if it did, you had to consider the native resolution of the screen because, just like with HDR, a set claiming 1080p didn't necessarily mean it was 1080p? Do people remember a time when HDMI wasn't the standard and some sets had it and some didn't? How, even if you had HDMI, you might be screwed for not having HDCP? Anyone remember component video, which was the other method to get an HD signal? Heck, does anyone remember how DVI was actually an input on TVs for HD?
People take for granted how things settled down and don't remember the transition period with all the growing pains involved. HDR is going through the same thing right now. This too will settle but it doesn't change the impact that HDR has as a feature moving forward.
Placebo and high contrast, most likely; some of the faux-HDR TVs come shipped with super high contrast and colour settings.
The most common analogue to this situation is probably the "HD Ready" TVs back in the mid-2000s, which would come with resolutions lower than 720p but would accept a 720p+ signal and scale it.
Well, I don't think it's just "placebo"... My LG 32UK550 monitor only reaches around 300 nits at maximum brightness, but the difference when using my PS4 is noticeable. I'm not saying the effect "blows me away", but so far I've tested it by turning it on and off in Deus Ex, The Last of Us and Hellblade, and the presence of HDR makes the scenes look richer... It definitely looks "flatter" with HDR off. So, there's a noticeable effect even if the brightness levels are suboptimal for 'proper HDR'.
I still think the HD transition was way, way smoother and easier for consumers to understand.
Basically you had HD Ready TVs supporting 720p and Full HD TVs supporting 1080p resolutions.
When you bought an HD Ready TV, you knew your games would run at 720p.
Your TV manufacturer didn't claim it supported Full HD while actually downscaling to 720p.
The closest comparison that could be made would be if all games ran at variable framerates... maybe... but it depends on what TV you have...
That's the problem with HDR right now. Every TV manufacturer claims support on basically every TV model but doesn't tell you that the panel is actually shit and you won't see the difference. Even if you have a good panel you might still not see a difference because games have a shitty HDR implementation or only fix / patch it in weeks or months after the initial release. Oh, and make sure you go on internet forums and hope that someone has the same model TV as you so that person can tell you how to configure your TV so that you might experience true HDR.
Nah, fuck that.
I had to turn up the contrast quite a bit. Mine's a different TV though (Samsung KS8005), but it looked washed out for the longest time.
HDR and ray tracing are two of those things I will seemingly never understand the huge hype for, and I'll continually roll my eyes at people arguing about not having televisions, computer monitors or other peripherals that can properly display or demonstrate their features. It just seems like something that keeps stringing me along again and again.
The claim that you knew what you were getting with an HD Ready TV is absolutely not true. You had to factor in the native resolution of the display, which wasn't obvious to your normal consumer. Hell, they still do this: you can find a TV that is "1080p" but all it does is accept the signal and then downscale it. TV manufacturers really did handle resolution in a fashion as misleading as HDR is now.
One problem is that people treat HDR as a binary checklist feature. They shouldn't. It doesn't help that when people try to point this out, others yell at them for being elitist.
Wait, you don't get the benefits of ray tracing? Forget the early implementation of it, but the general concepts of it?
Oh sure, the technical aspect of being able to achieve one of the crown jewels of rendering is fascinating. Aside from that? Meh. That comes down to the mediocre implementations, and the fact that it doesn't fundamentally change how I play the game itself. Get me a game that demonstrates how much light plays a role in design and gameplay (e.g., Splinter Cell), and that will impress me as a gamer.
We're just at the beginning of ray tracing, obviously the way it's done in games right now is limited and inefficient. Give it a few years and it will completely change how games are rendered. It is also potentially a huge time saver for developers themselves.
I have edited my post. I actually didn't know HD Ready didn't always mean 720p. That is indeed similar to the HDR situation but it's only a small part of the problem.
I still stand by my opinion that the HDR situation is way worse.
So you buy an HD Ready TV and you get a panel with a sub-720p resolution. That sucks but there's nothing you can do to fix that.
With HDR you need to fiddle with all kinds of settings and even if your TV supports HDR there are a bunch of other factors that decide whether your HDR experience is good or bad. Contrast, black levels, nits, edge lighting, etc. all play a role in this and if any of those factors is not up to par then it ruins the whole experience.
So yes, it might be similar to the HD adoption, but it's actually way worse.
Ever since I got my OLED I don't even care about 4K. Give me HDR.
Fair enough. I think we're in the very early stages of it, and the results are still clearly showing that, but over time, it's going to be something significant. The fact that we're getting it in some form at all is big enough. It has to start somewhere and mature, but the hype about ray tracing being a reality is totally a reasonable thing. We just need it to mature now.
I agree with you and have a 2018 LG, but I keep seeing people say that the nits on LG OLEDs are still too low.
Looks good to me, though. Alien in 4K, and in the dark scenes with the flamethrower my eyeballs are singed.
I need a better TV. I have a Sony 43X800D. HDR looks washed out no matter what settings people recommend to me. I've spent countless hours trying to get it to look good and can't do it.