
MazeHaze

Member
Nov 1, 2017
8,579
As for the PC specs: my understanding is that the 900-series cards can't do HDR
The 980 can at least, not sure about the others
I have the 43X800D too and it seems the only way I can get 4K HDR running properly is through the TV's native Android TV apps. The Grand Tour on the Amazon app looks amazing, as well as all of the 4K video you can watch on the YouTube app.

But when I'm using Windows 10 with 'HDR and advanced color' mode turned on, everything looks washed out, with some color bleeding on fonts. Watching 4K HDR video via YouTube in my browser with the feature turned off just does not have the same picture quality as the native TV apps.

All that said I barely use the TV's apps as the interface is kinda clunky and slow. So in a sense I gave up on HDR too, except when I really want to watch The Grand Tour.

You can't just turn on HDR mode in Windows and expect everything on your PC to be HDR. Your desktop and browser won't ever be HDR, that's why they look washed out when you turn HDR mode on. You're only supposed to enable HDR when you're feeding it an HDR source, like a game with HDR support. None of the web browsers support HDR.
 

Koklusz

Member
Oct 27, 2017
2,561
I think this is a good place to point out that the UHD version of Fury Road is kinda terrible, as it completely changes the color palette of the movie.

Top shot is the HD Blu-ray, bottom is the UHD Blu-ray tonemapped to SDR

desktop_2017_11_25_11uoq9s.png


87132422589028509617.png
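For context on "tonemapped to SDR": you can't post an HDR frame directly, so its luminance has to be squeezed into the SDR range first. Here's a minimal sketch of the idea using the classic Reinhard operator; real players and capture tools use fancier curves, so treat it as illustrative only.

```python
import numpy as np

def reinhard_tonemap(luminance_nits, sdr_white_nits=100.0):
    """Compress HDR luminance (in nits) into the 0..1 SDR range.

    Classic global Reinhard operator, L / (1 + L), with luminance
    first normalized to the SDR white point. Illustrative only;
    real tonemappers preserve hue and use smarter curves."""
    L = np.asarray(luminance_nits, dtype=float) / sdr_white_nits
    return L / (1.0 + L)

# A 1000-nit highlight still fits in 0..1, but it gets compressed
# far harder than the midtones do -- which is why tonemapped
# screenshots can shift the look of an HDR grade:
print(reinhard_tonemap([50.0, 100.0, 1000.0]))  # ~[0.33, 0.5, 0.91]
```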
 
Last edited:

Madness

Member
Oct 25, 2017
791
UQNd09d.jpg

6d785tu.jpg

JwkzYNc.jpg

BRB6vy5.jpg


I had these images on the other site last year. No changes except the base Dynamic mode for the first pictures, and auto HDR10 kicking in after being enabled. As you can see, HDR in fact massively improved the colors shown, as well as the highlight contrast.
 
Last edited:

Brau

Senior Artist
Verified
Oct 26, 2017
283
Finland
I'll take some photos of Fury Road on my OLED at the exact same spots you took. Just so you can see, OP, that a well-calibrated TV with a good panel will work wonders with HDR. :)
 

Madness

Member
Oct 25, 2017
791
jfWsEen.jpg
k9oas6J.jpg


Two images. The one above is SDR and the second has HDR10 enabled. Very subtle, but if you zoom into the desert and city you see better clarity and color.
 

III-V

Member
Oct 25, 2017
18,827
This is the problem right here.
The issue with SDR is that modern displays have capabilities which can far exceed the SDR spec in terms of brightness and color gamut.
Setting color space to "native" stretches out the film's BT.709 color to the display's native color space; i.e. as vivid and saturated as it can possibly make it, disregarding all color accuracy.
Since it's an HDR display, that means it can make things very vivid. Selecting the native color space option would not be nearly as vivid on a non-HDR display.

So you've taken content designed for a small color space and stretched it out to a much larger color space.
HDR, on the other hand, is designed to use that large color space. It gives artists a larger palette to work with, rather than automatically making everything more vivid.
And in fact, the HDR color space is so large that none of the displays available today have full coverage. Your KS8500 has 63% coverage, and their QLED displays still only have 73% coverage.
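To put numbers on that stretch, here's a sketch in linear-light RGB using the standard BT.709-to-BT.2020 conversion matrix (the one from ITU-R BT.2087); this is an illustration of the math, not anything a TV literally runs:

```python
import numpy as np

# Linear-light BT.709 -> BT.2020 conversion matrix (ITU-R BT.2087).
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

pure_709_red = np.array([1.0, 0.0, 0.0])

# Correct handling: pure BT.709 red lands well inside the wider gamut,
# preserving the color the film was graded with.
print(M_709_TO_2020 @ pure_709_red)  # ~[0.627, 0.069, 0.016]

# "Native" color space mode skips the conversion and reinterprets the
# same code values in the wider gamut, so [1, 0, 0] becomes the
# display's most saturated red -- more vivid than intended.
print(pure_709_red)  # [1. 0. 0.]
```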

HDR is about extending the range of brightness and saturation available.
Here's a good comparison of Mad Max: Fury Road on two properly calibrated displays:
madmax-highlight-larg7cjon.jpg

Overall scene brightness is the same, and if anything, the HDR image is somewhat muted compared to the SDR image. However, there is a lot more range.
The sky in the SDR image is flat and uniform because there's a limit on how bright things can get. In HDR, the sun and lens flare are very distinct against the sky due to the increased brightness range - and far more obvious in person.
The color of the sand is richer in HDR, rather than being more vivid, with deeper reds than you would get in SDR.
Now HDR content can get more vivid than SDR content is supposed to, but only where appropriate. It doesn't automatically make the picture more vivid than SDR.

Even if you wanted to, you don't have the option of oversaturating the image with HDR like you do with SDR.
Now it would appear that most manufacturers are giving people full control over picture settings in newer HDR displays, which really goes against a lot of what the spec intended to fix, so you are able to increase the color control to 100 and oversaturate the image that way.
But increasing the color control does not increase saturation the same way that increasing the color gamut does.
Increasing the color control will clip colors, so anything which is supposed to be bright and saturated loses all detail and turns into a solid mass of that color. Increasing saturation by expanding the color space does not do this.
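Here's a toy model of why that happens (my own illustration, not from any spec): a TV's color control scales saturation and then has to clip at the channel maximum, so distinct saturated colors collapse into one.

```python
import numpy as np

def boost_color_control(rgb, gain):
    """Crude model of a TV 'color' control: scale saturation around
    each pixel's average level, then clip to the displayable 0..1 range."""
    rgb = np.asarray(rgb, dtype=float)
    luma = rgb.mean(axis=-1, keepdims=True)
    return np.clip(luma + (rgb - luma) * gain, 0.0, 1.0)

# Two distinct saturated reds...
reds = np.array([[0.9, 0.1, 0.1],
                 [0.8, 0.2, 0.2]])

# ...collapse into the same solid color once the control is cranked:
print(boost_color_control(reds, gain=2.0))
# Both rows clip to [1, 0, 0] -- the detail separating them is gone.
# Expanding the gamut instead keeps them distinct, just further apart.
```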

It's a similar story with the backlight and dynamic contrast. SDR is intended to be viewed at 100 nits, but most displays can push that to 400 nits or higher. So you can push the brightness to 4x higher than intended.
The HDR spec for brightness is far beyond the capabilities of today's displays, so you don't have the option of using a display which is 4x brighter than the content is designed for.
Since you can't push the brightness as high, you can end up with SDR content that is much brighter on average than HDR content.
It's completely inaccurate and not how it is meant to look at all, but the option is there.
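The underlying reason: SDR signals are relative (100% just means "as bright as the backlight is set"), while HDR10's PQ curve (SMPTE ST 2084) ties each code value to an absolute luminance. A sketch of the PQ EOTF, with the constants from the spec:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized code value -> absolute nits.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal):
    """Map a 0..1 PQ code value to absolute luminance in nits."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# A PQ code value pins an exact brightness, so the display can't
# legitimately "overdrive" it the way it can stretch 100-nit SDR:
for v in (0.508, 0.752, 1.0):
    print(f"{v:.3f} -> {pq_eotf(v):8.1f} nits")  # ~100, ~1000, 10000
```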

So if you're looking for the brightest, most saturated, and most vivid picture that you can get, you probably want to stick to SDR sources on an HDR display.
Only an HDR display will let you push these things to ludicrous levels without running into issues like clipping, so you are still getting value out of having an HDR display. It's just not what was intended.
This answer is 100% accurate.
 

Tetsujin

Unshakable Resolve
The Fallen
Oct 27, 2017
3,465
Germany
Two images. The one above is SDR and the second has HDR10 enabled. Very subtle, but if you zoom into the desert and city you see better clarity and color.

To be honest, I don't really understand the purpose of these off-screen images people post?
There is a visible difference of course but it's not an accurate representation since the actual photo and most people's screens are SDR. So the supposed better colors are all represented in SDR. So I can see the colors look different in the second image but I could take the upper image and play around with the color saturation and get the same result since both are SDR anyway.

These comparison screens certainly show that there is *a* difference but they don't really show *the* difference.
 

III-V

Member
Oct 25, 2017
18,827
To be honest, I don't really understand the purpose of these off-screen images people post?
There is a visible difference of course but it's not an accurate representation since the actual photo and most people's screens are SDR. So the supposed better colors are all represented in SDR. So I can see the colors look different in the second image but I could take the upper image and play around with the color saturation and get the same result since both are SDR anyway.

These comparison screens certainly show that there is *a* difference but they don't really show *the* difference.
This is true. However, it was useful to post, as the OP's SDR images were mega-oversaturated. The truth then came out that he prefers color at 100.
 
OP

AegonSnake

Banned
Oct 25, 2017
9,566
This is true. However, it was useful to post, as the OP's SDR images were mega-oversaturated. The truth then came out that he prefers color at 100.
Lol the truth? Do you know how to read? Read the same sentence again. This time try to read the whole thing.

There is definitely something wrong with my TV because even a TV noob like me knows you cannot set color to 100. Maybe I will bring the color down to 75 or something because I can tell I am losing detail, but I am so shocked to see the color come back.
To be honest, I don't really understand the purpose of these off-screen images people post?
There is a visible difference of course but it's not an accurate representation since the actual photo and most people's screens are SDR. So the supposed better colors are all represented in SDR. So I can see the colors look different in the second image but I could take the upper image and play around with the color saturation and get the same result since both are SDR anyway.

These comparison screens certainly show that there is *a* difference but they don't really show *the* difference.

I felt it made sense to include the images because it just looked completely different from what I remember watching in the cinema, YouTube trailers, HBO, and on Blu-ray before I got an HDR TV.

Turns out, the HDR in Mad Max DOES change the look of this film, like a few people have pointed out in the last few pages. I am legit surprised so few have noticed the change, because regardless of how accurately dull the colors look in HDR, they certainly don't look like they did in the original movie.
 
Last edited:

jett

Community Resettler
Member
Oct 25, 2017
44,656
OMG. This did it. Setting Color to 100 brought back all the colors and now it has this great reddish hue I saw in the HDR test comparison. I cannot believe that worked. There is definitely something wrong with my TV because even a TV noob like me knows you cannot set color to 100. Maybe I will bring the color down to 75 or something because I can tell I am losing detail, but I am so shocked to see the color come back.

DPcCxnrXkAA5OjD.jpg


vs. what i had before at color 50.

DPbbBe0WsAAyGg8.jpg

This is insanity, that isn't what Mad Max is supposed to look like. The settings on your TV are completely fucked. I guess it's your preference, but fucked they are.

That aside, HDR is hugely overrated.
 

RoninChaos

Member
Oct 26, 2017
8,338
This is the problem right here.
The issue with SDR is that modern displays have capabilities which can far exceed the SDR spec in terms of brightness and color gamut.
Setting color space to "native" stretches out the film's BT.709 color to the display's native color space; i.e. as vivid and saturated as it can possibly make it, disregarding all color accuracy.
Since it's an HDR display, that means it can make things very vivid. Selecting the native color space option would not be nearly as vivid on a non-HDR display.

So you've taken content designed for a small color space and stretched it out to a much larger color space.
HDR, on the other hand, is designed to use that large color space. It gives artists a larger palette to work with, rather than automatically making everything more vivid.
And in fact, the HDR color space is so large that none of the displays available today have full coverage. Your KS8500 has 63% coverage, and their QLED displays still only have 73% coverage.

HDR is about extending the range of brightness and saturation available.
Here's a good comparison of Mad Max: Fury Road on two properly calibrated displays:
madmax-highlight-larg7cjon.jpg

Overall scene brightness is the same, and if anything, the HDR image is somewhat muted compared to the SDR image. However, there is a lot more range.
The sky in the SDR image is flat and uniform because there's a limit on how bright things can get. In HDR, the sun and lens flare are very distinct against the sky due to the increased brightness range - and far more obvious in person.
The color of the sand is richer in HDR, rather than being more vivid, with deeper reds than you would get in SDR.
Now HDR content can get more vivid than SDR content is supposed to, but only where appropriate. It doesn't automatically make the picture more vivid than SDR.

Even if you wanted to, you don't have the option of oversaturating the image with HDR like you do with SDR.
Now it would appear that most manufacturers are giving people full control over picture settings in newer HDR displays, which really goes against a lot of what the spec intended to fix, so you are able to increase the color control to 100 and oversaturate the image that way.
But increasing the color control does not increase saturation the same way that increasing the color gamut does.
Increasing the color control will clip colors, so anything which is supposed to be bright and saturated loses all detail and turns into a solid mass of that color. Increasing saturation by expanding the color space does not do this.

It's a similar story with the backlight and dynamic contrast. SDR is intended to be viewed at 100 nits, but most displays can push that to 400 nits or higher. So you can push the brightness to 4x higher than intended.
The HDR spec for brightness is far beyond the capabilities of today's displays, so you don't have the option of using a display which is 4x brighter than the content is designed for.
Since you can't push the brightness as high, you can end up with SDR content that is much brighter on average than HDR content.
It's completely inaccurate and not how it is meant to look at all, but the option is there.

So if you're looking for the brightest, most saturated, and most vivid picture that you can get, you probably want to stick to SDR sources on an HDR display.
Only an HDR display will let you push these things to ludicrous levels without running into issues like clipping, so you are still getting value out of having an HDR display. It's just not what was intended.
This is very informative. Do you have a suggestion for settings on the PS4 and Xbox One X so HDR looks correct?
 

Zing

Banned
Oct 29, 2017
1,771
That aside, HDR is hugely overrated.
I strongly disagree. HDR is the reason I kept my 4K TV after taking one for a test run. 4K was nice, but HDR was the feature that made me feel I could never go back to my old TV. For gaming in particular, HDR performance should be the main focus of any new TV purchase.

I suspect last year's TVs were just not up to the HDR task, both in hardware capabilities and software engineering. My Sony 900E makes a significant improvement to any and all HDR content I have seen on it. From the very first HDR content, straight out of the box with no change to settings, everything has looked great. I am still simply using the default picture settings. I have had to expend no effort whatsoever to have everything look correct.
 

airjoca

Banned
Oct 30, 2017
805
Portugal
I've seen Mad Max in SDR and HDR on my OLED, and the colors are similar, with mostly the same color settings in both modes.

You need help with proper calibration, OP.
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
This is insanity, that isn't what Mad Max is supposed to look like. The settings on your TV are completely fucked. I guess it's your preference, but fucked they are.

That aside, HDR is hugely overrated.

It's anything but overrated. I mean, I stopped playing Destiny 2 until the HDR update in Dec; it's stunning on PC.
 

Brinbe

Avenger
Oct 25, 2017
58,292
Terana
Yeah, I just upgraded to a TCL605 from an older 4K Hisense sans HDR and it's definitely as big a gamechanger as hyped. Spent yesterday watching a bunch of vids on that HDR YT channel and I couldn't go back to viewing shit without it now.
 

Jedi2016

Member
Oct 27, 2017
15,653
The biggest improvement for me is seeing the range of colors and illumination that were present in the theater. While theatrical presentations are 12-bit and UHD only 10-bit, it's a hell of a lot closer to showing us the full range of color than the old 8-bit Blu-rays and DVDs are. Billions of color variations instead of millions. The key word there being "variations".
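(If you want the "millions vs. billions" arithmetic spelled out, it's just the per-channel steps cubed; a quick Python scratchpad:)

```python
# Color steps per channel and total combinations at each bit depth:
for bits, medium in ((8, "DVD / Blu-ray"), (10, "UHD Blu-ray"), (12, "theatrical")):
    steps = 2 ** bits      # e.g. 256 shades per channel at 8-bit
    total = steps ** 3     # three channels: R, G, B
    print(f"{bits:>2}-bit ({medium}): {steps} steps/channel, {total:,} colors")

#  8-bit (DVD / Blu-ray): 256 steps/channel,  16,777,216 colors
# 10-bit (UHD Blu-ray):  1024 steps/channel, 1,073,741,824 colors
# 12-bit (theatrical):   4096 steps/channel, 68,719,476,736 colors
```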

While HDR can display true colors beyond the ability of 8-bit TVs (brighter reds/greens/blues), it's really the range of colors (hence the name, HDR) that makes the biggest difference. Let's take an example of a scene shot at dusk:

MV5BZDJhOGMwNzItM2QyOC00ZDg2LTk2N2QtOWYxMWM2YWQxMGI3XkEyXkFqcGdeQXVyNDQxNjcxNQ@@._V1_.jpg


Yes, this shot is at dusk... it's supposed to be dark as hell. This is what SDR does to movies... this shot is not supposed to look like this. But because it would look washed out, too dark, and with almost no remaining detail, they have no choice for the home release but to fuck with the levels and crank it up to the point where it's visible on an 8-bit display. It's something I came to call the "home video look" when comparing how things looked in theaters versus what they look like on DVD and Blu-ray. It just looks too artificial. Even in a scene like this, where I'm pretty sure they just used the natural light at the location, it ends up looking like it's been lit in a studio.

Now, bear in mind that what I'm posting below is still an SDR image that I tweaked in Photoshop, but the scene is actually supposed to look more like this:

weqQXI6.jpg


But, with HDR, you'd still be able to make out all of the detail that's present in the original; you don't get any of the black crush that you see here in SDR. That's why they have to adjust it the way they do. This looks artificially dark because it's SDR, but in HDR, it just looks natural... it looks like they just shot the scene right after sunset when it's starting to get dark (which is exactly what they did).

Way too many movies I've seen have shots at dawn/dusk/nighttime where a scene is supposed to be dark and gloomy, where I've said to myself in the theater that it's too bad the home video is going to lose all that, and it's confirmed when I get it home and those dim and gloomy shots don't look right. But with UHD and HDR, they do. And it's so nice to have that back on home video; it's been missing for so long that people didn't even realize it was gone.

That's the biggest improvement in HDR for me. It's not about vibrant colors, brighter whites, or darker blacks... it's about getting the natural lighting back that 8-bit home video has stolen from us for the last twenty years.
 

III-V

Member
Oct 25, 2017
18,827
Lol the truth? Do you know how to read? Read the same sentence again. This time try to read the whole thing.

There is definitely something wrong with my TV because even a TV noob like me knows you cannot set color to 100. Maybe I will bring the color down to 75 or something because I can tell I am losing detail, but I am so shocked to see the color come back.


I felt it made sense to include the images because it just looked completely different from what I remember watching in the cinema, YouTube trailers, HBO, and on Blu-ray before I got an HDR TV.

Turns out, the HDR in Mad Max DOES change the look of this film, like a few people have pointed out in the last few pages. I am legit surprised so few have noticed the change, because regardless of how accurately dull the colors look in HDR, they certainly don't look like they did in the original movie.
I apologize. It was a joke at your expense. If you are interested, we can work on getting your monitor set up properly at little or no cost - unless you prefer the saturated look.
 

Pagusas

Banned
Oct 25, 2017
2,876
Frisco, Tx
HDR is one of those amazing upgrades that the people who really care about video technology have been clamoring for, but major companies had to basically sneak it in on casual consumers with bright flashy tech demos that really don't show off the true improvements, because those improvements are (most of the time) subtle.
 

gigaslash

User requested ban
Banned
Oct 28, 2017
1,122
This thread makes me wanna forget about HDR for a couple of years until all the standards, settings, TVs, etc. are sorted out and getting a good picture out of your TV doesn't involve jumping around it with a tambourine for an hour.
 

deathsaber

Member
Nov 2, 2017
3,098
I know nothing of TC's TV, so I plead ignorance there.

But HDR does depend heavily on the TV's color range and ability to render deep blacks. If your TV simply can't do color depth well, despite having an HDR function, it truly won't do much. That's why they say to avoid the TCL 400 series (pretty much the cheapest 4K HDR set on the market, under $400 for 49/55 inch, but with poor color depth, so it really doesn't do much despite professing to be in HDR mode), and it's also what makes the P series (P605 or P607) so good: its great color gamut, local dimming, and deep blacks can really make that HDR pop.

Calibration is also key: look your TV up on Rtings and put in whatever settings they recommend. Sometimes the preset color modes are absolute shit, and calibrating can be night and day.
 

MazeHaze

Member
Nov 1, 2017
8,579
I owned OP's TV for a year, and honestly I didn't think the HDR on it was too great. The local dimming was terrible, and high-contrast nighttime scenes always looked washed out because the TV was pushing nits on highlights but the blacks would get blown out. It's an impressive set for sure, but it doesn't give me the same thing I see on my OLED, where the perfect black makes highlights and colors really stand out.
 

StuBurns

Self Requested Ban
Banned
Nov 12, 2017
7,273
It already does? HDR is just higher gamut displays.

I wish I could have seen Life of Pi in 4K HDR. That island scene would have been fantastic.
HDR doesn't have 'a' standard. There's HDR10, HDR10+, Dolby Vision, HLG, and probably more.

The TVs also don't have a standard. To get the 'Ultra HD Premium' badge they need to be 10-bit and hit 1000 nits peak if it's an LCD, or 540 if it's an OLED, but nothing stops a TV from supporting HDR if it doesn't meet the standard.
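If it helps, the badge criteria (as I understand them from the UHD Alliance announcement; treat the exact thresholds as my paraphrase) boil down to a simple check:

```python
def qualifies_ultra_hd_premium(width, height, bit_depth,
                               dci_p3_coverage, peak_nits, black_nits):
    """Rough check against the UHD Alliance 'Ultra HD Premium' criteria:
    4K panel, 10-bit, >90% DCI-P3, and either a bright-LCD or a
    dark-OLED luminance profile. My paraphrase, not the official test."""
    resolution_ok = width >= 3840 and height >= 2160
    depth_ok = bit_depth >= 10
    gamut_ok = dci_p3_coverage > 0.90
    lcd_profile = peak_nits > 1000 and black_nits < 0.05
    oled_profile = peak_nits > 540 and black_nits < 0.0005
    return (resolution_ok and depth_ok and gamut_ok
            and (lcd_profile or oled_profile))

# A 700-nit LCD with 0.02-nit blacks misses both luminance profiles:
print(qualifies_ultra_hd_premium(3840, 2160, 10, 0.95, 700, 0.02))  # False
```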
 

Deleted member 3345

User requested account closure
Banned
Oct 25, 2017
4,967
HDR doesn't have 'a' standard. There's HDR10, HDR10+, Dolby Vision, HLG, and probably more.

The TVs also don't have a standard. To get the 'Ultra HD Premium' badge they need to be 10-bit and hit 1000 nits peak if it's an LCD, or 540 if it's an OLED, but nothing stops a TV from supporting HDR if it doesn't meet the standard.

HDR is an explanation of what you can display.
You're talking about nits and I'm talking about color spaces. You can't have an LCD saying it can display HDR gamuts if it doesn't go above the typical RGB gamut.

Ultra HD is anything above the 1080p range. Again, that has nothing to do with color spaces.
 

FullMetalx

Banned
Oct 28, 2017
811
Two images. The one above is SDR and the second has HDR10 enabled. Very subtle, but if you zoom into the desert and city you see better clarity and color.

I read the following comments about how these are not clear representations of HDR, but am I the only one who thinks the first image looks a little more natural? The second looks kind of oversaturated, and isn't that what a lot of people complained about on the original Xbox One? (Microsoft oversaturates and crushes blacks, etc.)
 

StuBurns

Self Requested Ban
Banned
Nov 12, 2017
7,273
HDR is an explanation of what you can display.
You're talking about nits and I'm talking about color spaces. You can't have an LCD saying it can display HDR gamuts if it doesn't go above the typical RGB gamut.

Ultra HD is anything above the 1080p range. Again, that has nothing to do with color spaces.
That's not really the case at all.

HDR stands for High Dynamic Range, which in and of itself means nothing really. When people generally say 'HDR on my TV is...' they're referring to the TV's ability to display the recent HDR10 standard, the most widely used so far. HDR10 is the Rec. 2020 colour space, 10-bit colour and 1000 nit peak luminosity. The 'Ultra HD Premium' badge is given to TVs that are 4K and meet the criteria for HDR10 support.
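For what it's worth, the 'static metadata' side of HDR10 is tiny: SMPTE ST 2086 mastering display info plus two content light levels. A rough sketch of the fields; the names and example values are mine for illustration, not from any particular library or disc:

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Static metadata an HDR10 stream carries alongside the video:
    SMPTE ST 2086 mastering display info plus content light levels.
    Field names are descriptive; the numbers below are illustrative."""
    red_primary: tuple       # CIE (x, y) chromaticity of mastering display
    green_primary: tuple
    blue_primary: tuple
    white_point: tuple
    max_mastering_luminance_nits: float
    min_mastering_luminance_nits: float
    max_cll_nits: int        # brightest single pixel in the content
    max_fall_nits: int       # brightest frame-average light level

# Example: a title mastered at 1000 nits on a DCI-P3 display:
meta = HDR10StaticMetadata(
    red_primary=(0.680, 0.320), green_primary=(0.265, 0.690),
    blue_primary=(0.150, 0.060), white_point=(0.3127, 0.3290),
    max_mastering_luminance_nits=1000.0,
    min_mastering_luminance_nits=0.005,
    max_cll_nits=950, max_fall_nits=400,
)
```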
 

Deleted member 3345

User requested account closure
Banned
Oct 25, 2017
4,967
That's not really the case at all.

HDR stands for High Dynamic Range, which in and of itself means nothing really. When people generally say 'HDR on my TV is...' they're referring to the TV's ability to display the recent HDR10 standard, the most widely used so far. HDR10 is the Rec. 2020 colour space, 10-bit colour and 1000 nit peak luminosity. The 'Ultra HD Premium' badge is given to TVs that are 4K and meet the criteria for HDR10 support.

High Dynamic Range means it's a range wider than "typical", like how High Definition is more pixels than standard definition.
 

BLEEN

Member
Oct 27, 2017
21,890
Does it help to have Dynamic Contrast set to low when watching HDR? Or higher? Or off?
In general, you should always leave any post-processing off. The contrast should be as good as it can get if calibrated correctly. However, if you want a bit of 'pop!' I'd say leave it on Low. It's really up to you, but anything higher than Low is going to cause a bunch of issues (crushed blacks/loss of detail, for starters) that make scenes look much less natural. I personally calibrate my display to look like film, or as close as possible to what the director or whoever is in charge of the final mix intended. If it's for games, I completely understand leaving Dynamic Contrast on something other than Off; it helps make things feel more 3D and crisp.
 

Mario's Nipples

Banned for having an alt account
Banned
Nov 2, 2017
856
France
In general, you should always leave any post-processing off. The contrast should be as good as it can get if calibrated correctly. However, if you want a bit of 'pop!' I'd say leave it on Low. It's really up to you, but anything higher than Low is going to cause a bunch of issues (crushed blacks/loss of detail, for starters) that make scenes look much less natural. I personally calibrate my display to look like film, or as close as possible to what the director or whoever is in charge of the final mix intended. If it's for games, I completely understand leaving Dynamic Contrast on something other than Off; it helps make things feel more 3D and crisp.
I have a KS8000 UK model, which is the KS9000 in the US, I believe. Are there any good/trusted sites that can be used for best settings? I used Rtings, but the image doesn't pop as much as I'd want or expect it to with HDR.
 

BLEEN

Member
Oct 27, 2017
21,890
just like 4K, it's a gimmick. 144Hz 1ms is where it's at
For gaming, hell yeah. This is a gaming forum, but not everyone buys TVs to game on. I don't care about the ms or Hz because I only use mine to watch film. It is great to have the option to disable all frills and end up with an incredible response time. Options, options, options! The more, the merrier.
 

BLEEN

Member
Oct 27, 2017
21,890
I have a KS8000 UK model, which is the KS9000 in the US, I believe. Are there any good/trusted sites that can be used for best settings? I used Rtings, but the image doesn't pop as much as I'd want or expect it to with HDR.
Yeah. I gotchu. Just give me a moment.
http://www.avsforum.com/forum/139-display-calibration/2539897-samsung-ks8000-calibration.html
These settings are based off the Rtings ones. They used professional calibration equipment and software, so they should more than get you in the ballpark. If there's not enough pop, tweak the settings after you 1:1 the ones in post #3. A calibrated display almost always makes people scratch their heads going, "wtf? this looks odd/not satisfactory!" Especially when it comes to the removal of the out-of-the-box blue tint. The set will look warmer and whites won't look "as white". But I can assure you, your eyes are just deceiving you. Try those settings out for a full day or two and you will become accustomed to them and see an almost as-true-as-life picture. If there's really not enough pop, up the contrast to the max and/or set Dynamic Contrast to Low.

Let me know what you think. It's your set, so whatever makes you happy is what you should go for.

Edit* Forgot to put the link up top and also, if you browse around the thread below there should be multitudes of information on your set and more than likely a bunch of others' settings.
http://www.avsforum.com/forum/166-l...cial-samsung-ks9000-4k-uhd-owners-thread.html
Edit 2* lol sorry, I messed up the links. Should be fixed now.
 
Last edited:

OrionFalls

Oct 28, 2017
13,691
The biggest improvement for me is seeing the range of colors and illumination that were present in the theater. While theatrical presentations are 12-bit and UHD only 10-bit, it's a hell of a lot closer to showing us the full range of color than the old 8-bit Blu-rays and DVDs are. Billions of color variations instead of millions. The key word there being "variations".

While HDR can display true colors beyond the ability of 8-bit TVs (brighter reds/greens/blues), it's really the range of colors (hence the name, HDR) that makes the biggest difference. Let's take an example of a scene shot at dusk:

MV5BZDJhOGMwNzItM2QyOC00ZDg2LTk2N2QtOWYxMWM2YWQxMGI3XkEyXkFqcGdeQXVyNDQxNjcxNQ@@._V1_.jpg


Yes, this shot is at dusk... it's supposed to be dark as hell. This is what SDR does to movies... this shot is not supposed to look like this. But because it would look washed out, too dark, and with almost no remaining detail, they have no choice for the home release but to fuck with the levels and crank it up to the point where it's visible on an 8-bit display. It's something I came to call the "home video look" when comparing how things looked in theaters versus what they look like on DVD and Blu-ray. It just looks too artificial. Even in a scene like this, where I'm pretty sure they just used the natural light at the location, it ends up looking like it's been lit in a studio.

Now, bear in mind that what I'm posting below is still an SDR image that I tweaked in Photoshop, but the scene is actually supposed to look more like this:

weqQXI6.jpg


But, with HDR, you'd still be able to make out all of the detail that's present in the original; you don't get any of the black crush that you see here in SDR. That's why they have to adjust it the way they do. This looks artificially dark because it's SDR, but in HDR, it just looks natural... it looks like they just shot the scene right after sunset when it's starting to get dark (which is exactly what they did).

Way too many movies I've seen have shots at dawn/dusk/nighttime where a scene is supposed to be dark and gloomy, where I've said to myself in the theater that it's too bad the home video is going to lose all that, and it's confirmed when I get it home and those dim and gloomy shots don't look right. But with UHD and HDR, they do. And it's so nice to have that back on home video; it's been missing for so long that people didn't even realize it was gone.

That's the biggest improvement in HDR for me. It's not about vibrant colors, brighter whites, or darker blacks... it's about getting the natural lighting back that 8-bit home video has stolen from us for the last twenty years.

Great post, thanks for this. Making me want to pick up a new set now LOL
 

StuBurns

Self Requested Ban
Banned
Nov 12, 2017
7,273
1ms, even OLEDs don't have pixel switching that fast.
sorry for my mistype. my argument is nullified.
There isn't an argument. It's not an opinion. HDR refers to luminosity, and there is no standard, as the poster you attempted to correct stated.

There are multiple sets of standards that include HDR, such as HDR10, which also includes a wider colour gamut.
 

99Luffy

Banned
Oct 27, 2017
1,344
So how long before HDR has an actual standard that everyone follows? I'll jump in then.
HDR10 is pretty standard. HDR10+ will soon be out and will be a firmware update. But if you wanna be future-proof, you probably wanna wait for the next HDMI spec next year that has built-in FreeSync.
 

ItIsOkBro

Happy New Year!!
The Fallen
Oct 25, 2017
9,510
We should have called the combination of HDR, WCG, and >=10-bit color depth something else. Because right now it seems like we call that HDR, and that's derpy.
 

StuBurns

Self Requested Ban
Banned
Nov 12, 2017
7,273
We should have called the combination of HDR, WCG, and >=10-bit color depth something else. Because right now it seems like we call that HDR, and that's derpy.
That's what the manufacturers were trying to do with 'Ultra HD Premium': roll 4K, HDR, and WCG into a single term, but it's a mouthful.

I don't think HDR10 or DV is hard to use.
 

99Luffy

Banned
Oct 27, 2017
1,344
That's what the manufacturers were trying to do with 'Ultra HD Premium', roll 4k, HDR, and WCG into a single term, but it's a mouthful.
I think the problem was that only LG and Samsung really used that badge. It's a pretty silly badge too, when Samsung has Ultra HD Premium while they call their weak HDR displays something like 'HDR Pro.'
To the average consumer that shit is confusing.