
Schlomo

Member
Oct 25, 2017
1,133
Thanks to this thread I did some experimenting with my PS4 and OLED B7. Things I've learned:
- Tonemapping in HDR Game mode is almost the same as in HDR Cinema: it preserves detail in bright areas but is dimmer overall. HDR Standard clips brightness instead. In HDR Standard the sun in SotC is just a giant patch of super-bright sky, while in the other modes you can see it shining through the clouds.
- The Witness seems to have exactly the same problem as Monster Hunter: there are no blacks in HDR. Even the darkest cave is only gray, while in SDR the same spot looks pitch black.
- Testing HDR Standard in The Witness, I noticed image retention for the first time on this TV, after looking at a puzzle panel for only 10 seconds. This game is an extreme case due to its stark color contrasts, but considering you can easily stare at a puzzle for 15 minutes, you can see why Game mode isn't as bright.
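
To put the Standard-vs-Game/Cinema difference in concrete terms, here's a toy Python sketch of hard clipping versus highlight roll-off. The knee position and the 4,000-nit source peak are made-up illustration values, not any TV's real algorithm:

```python
def clip_tonemap(nits: float, panel_peak: float) -> float:
    """'HDR Standard'-style: track the source 1:1, then hard-clip.

    Everything above panel_peak collapses to one flat value, which is
    why the SotC sun becomes a single bright patch of sky.
    """
    return min(nits, panel_peak)

def rolloff_tonemap(nits: float, panel_peak: float, knee: float = 0.6,
                    source_peak: float = 4000.0) -> float:
    """'Game/Cinema'-style: compress highlights above a knee.

    Detail above the panel's peak survives (the sun through the clouds),
    but the image is dimmer overall because the top of the range is
    squeezed down.
    """
    k = knee * panel_peak
    if nits <= k:
        return nits
    t = min((nits - k) / (source_peak - k), 1.0)
    return k + t * (panel_peak - k)

# On a ~700-nit OLED: 1,500-nit and 4,000-nit highlights are identical
# after clipping, but remain distinct after the roll-off.
for nits in (300, 700, 1500, 4000):
    print(nits, clip_tonemap(nits, 700), round(rolloff_tonemap(nits, 700), 1))
```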
 

True Prophecy

Member
Oct 28, 2017
1,921
Thanks to this thread I did some experimenting with my PS4 and OLED B7. Things I've learned:
- Tonemapping in HDR Game mode is almost the same as in HDR Cinema: it preserves detail in bright areas but is dimmer overall. HDR Standard clips brightness instead. In HDR Standard the sun in SotC is just a giant patch of super-bright sky, while in the other modes you can see it shining through the clouds.
- The Witness seems to have exactly the same problem as Monster Hunter: there are no blacks in HDR. Even the darkest cave is only gray, while in SDR the same spot looks pitch black.
- Testing HDR Standard in The Witness, I noticed image retention for the first time on this TV, after looking at a puzzle panel for only 10 seconds. This game is an extreme case due to its stark color contrasts, but considering you can easily stare at a puzzle for 15 minutes, you can see why Game mode isn't as bright.

That seems to be very true. While doing my own research to find the best settings for my C7, I came across this thread on AVS Forum, where he seems to have come to the same conclusion as you: http://www.avsforum.com/forum/40-ol...-best-settings-hdr10-lg-oled-2016-series.html

  • I observed that in some movies/games (like FFXV), HDR Standard seems to have a really bad EOTF curve for tone mapping. In really bright scenes, like ones with a sun (you can see this especially with FFXV's sun), there's a kind of glare sphere, as if the TV can't decode the scene and it clips hard. I assume HDR Game doesn't have this problem because of how its EOTF curve/tone mapping works.
  • Even if I match the color clipping between HDR Standard and HDR Game, HDR Standard still clips detail while HDR Game does not.
  • The way HDR Game is done makes Dynamic Contrast something we can allow, unlike on HDR Standard. It looks better than on HDR Standard, with better gamma and fewer problems, so for dark movies it's WAY better than HDR Standard, which has huge gamma problems in the darkest IREs (127 has something like 140% gamma, insanely dark).

His English is a bit rough but I get his point, and he has some screenshot examples for films too. I don't run Dynamic on High because it loses detail; IMO, Medium is my usual go-to.
 
OP

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
Really exceptional work OP!

I'm still very confused about something though, and it's to do with the way some developers have their HDR settings set up. Assassin's Creed: Origins, for instance. Max luminance makes a lot of sense: just set it to the highest value your TV supports. But I'm confused by paper white, and how you know what to set it to without simply eyeballing it. I've got a 2017 LG OLED, so I'd set my max luminance to around 700 or so, but I have no idea what to set paper white to.

Paper white creates a real-world connection between the game and your room.
Paper white is meant to be the luminance of a piece of paper; it's otherwise known as diffuse white. The HDR PQ spec puts it at 100 nits, so if we are being strictly correct, that is what you would set it to.

As they have given you control over everything else, it makes sense that they let you change this too, especially when AC has particularly good dynamic tone mapping.
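
If you want to see where that 100-nit paper white actually sits in the signal, here's a minimal Python sketch of the PQ inverse EOTF. The constants are the published SMPTE ST 2084 values; the 10-bit quantisation at the end is my own addition for illustration:

```python
# SMPTE ST 2084 (PQ) constants
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Inverse EOTF: absolute luminance (0-10,000 nits) -> 0-1 PQ signal."""
    y = nits / 10000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

signal = pq_encode(100)
print(round(signal, 3), round(signal * 1023))  # ~0.508 -> 10-bit code ~520
```

In other words, diffuse white sits just above the middle of the 10-bit range; everything above it is highlight headroom.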
 

Alo81

Member
Oct 27, 2017
548
Looking at this thread is reminding me what a bummer it is how many games come to PC without the HDR support that was added to their console counterparts. Rise of the Tomb Raider, Forza Horizon 3, Gears of War 4, step your PC game up.
 

ThreepQuest64

Avenger
Oct 29, 2017
5,735
Germany
Can someone explain to me in a simple way what exactly HDR does? From what I understand, it makes bright scenes/sources/objects even brighter and dark areas even darker?

My cheap TV supports HDR, but it just changes how the game looks without specifically making it look better. It seems I can achieve similar results with ReShade on my PC monitor. Mass Effect: Andromeda looked almost the same as the HDR version with some ReShade tinkering.
 

Deleted member 3203

User requested account closure
Banned
Oct 25, 2017
271
sunny canada
Great thread, OP.

I have a love/hate relationship with HDR. I was excited for it, but having had it for three months now, it has been a mixed bag. Firstly, with HDR set to the recommended High setting, the light output started to cause headaches over extended periods of gaming (unsurprisingly, graders on films are encouraged to work for 4 hours vs. the standard 8 for this reason). Secondly, the quality of HDR varies wildly from title to title. Overall it's been quite good in games, FF15's hot desert setting being almost unbearably bright, though. In movies/television it's a mixed bag. Netflix seems to have no clue what they're doing, with each show blasting you with light levels that feel downright unnatural. Well-graded movies, like Blade Runner 2049, take a more subdued approach (seemingly keeping most of the image at SDR levels, with highlights, etc., reserved for higher light output).

Ultimately, I turned my TV's HDR setting down to Low. I find it's the best middle ground: I keep the contrast and some of the highlight effects but eliminate the insane light output. I don't get headaches anymore.
 

SK4TE

Banned
Nov 26, 2017
3,977
10,000 nits? My TV does like 1400 and it was supposed to be good. Sony X90. Fuck.
 

Alo81

Member
Oct 27, 2017
548
10,000 nits? My TV does like 1400 and it was supposed to be good. Sony X90. Fuck.

10k nits is mostly just future-proofing their games. Better to support far beyond what's possible (since there's no additional performance hit) and not have to update the game for support later on.
 

True Prophecy

Member
Oct 28, 2017
1,921
Can someone explain to me in a simple way what exactly HDR does? From what I understand, it makes bright scenes/sources/objects even brighter and dark areas even darker?

My cheap TV supports HDR, but it just changes how the game looks without specifically making it look better. It seems I can achieve similar results with ReShade on my PC monitor. Mass Effect: Andromeda looked almost the same as the HDR version with some ReShade tinkering.

Now, I'm probably not the best person to answer this, but the way I get it is that it gives access to a much wider color range and a much higher peak brightness relative to blacks. On my OLED TV it really stands out (even more so now that I've learned about that brightness-to-HDR-brightness slider change). I'm not sure how it goes on LCDs, but I assume it's much the same effect. How good or effective it is comes down to the implementation, but to me it gives a more realistic image in the way light shines or is reflected, etc.
 
Oct 25, 2017
4,427
Silicon Valley
Can someone explain to me in a simple way what exactly HDR does? From what I understand, it makes bright scenes/sources/objects even brighter and dark areas even darker?

My cheap TV supports HDR, but it just changes how the game looks without specifically making it look better. It seems I can achieve similar results with ReShade on my PC monitor. Mass Effect: Andromeda looked almost the same as the HDR version with some ReShade tinkering.
It comes down to the actual display for HDR to matter. Ever sat in front of a fire at night? Notice how, if you look away from it, you can see moonlit detail on the beach or in the forest, but when you look at or near the fire, even though it's emitting a lot of light, your eyes can adjust and see the hues of color within it, as well as the ember-hot wood. Or have you ever seen something that is colorful and vivid in person, but when you look at a photo of it later the color is much more muted, not as vibrant or detailed?

This is because, in general, display technology is limited to a minuscule amount of contrast and color compared to what our own eyes actually see. With HDR10 and Dolby Vision, screens use various technologies to target that massive range of contrast and color: either several dimming zones, where the backlights literally power off in dark scenes and are only lit behind bright parts of the video, or self-emitting pixels (as in OLED) that can turn off entirely and reach pitch black. That produces massive contrast and allows upwards of 1 billion potential colors to be represented through the arrangement of RGB subpixels (the different colored pixels that combine to make different colors).

Of course, you also need the source video/game/etc. to provide this color and brightness information, which 8-bit compressed streams and game engines don't. As we move forward, things like physically based rendering make it easier for developers to work with HDR and deliver incredible visuals that sometimes look night and day compared to SDR.

Maybe not the simplest way to explain it, but it's essentially the continued evolution of reproducing on a screen the brilliant colors and brightness/darkness that we see in real life.
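
The "1 billion potential colors" figure is just bit-depth arithmetic, e.g.:

```python
# Each channel has 2**bits levels; an RGB pixel combines three channels.
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels/channel -> {levels ** 3:,} colors")

# 8-bit:  256 levels/channel -> 16,777,216 colors
# 10-bit: 1024 levels/channel -> 1,073,741,824 colors
```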
 
Last edited:

Skyfireblaze

Member
Oct 25, 2017
11,257
Thanks for the awesome work! I don't have any HDR-capable display yet, but this thread made me excited for when I do! :D

As for Monster Hunter, seeing how bad its HDR is, I've been wondering about something: to me the game looks pretty washed out even in SDR mode. Does anyone feel the same?
 

Fredrik

Member
Oct 27, 2017
9,003
Looking at this thread is reminding me what a bummer it is how many games come to PC without the HDR support that was added to their console counterparts. Rise of the Tomb Raider, Forza Horizon 3, Gears of War 4, step your PC game up.
Yup, this is one reason why I mostly game on console now (Xbox One X) even though I have a very capable 1080 Ti rig as well. That, and the fact that I would have to drop my triple-screen setup to go 4K unless I planned to rob a bank.
 

ThreepQuest64

Avenger
Oct 29, 2017
5,735
Germany
Now, I'm probably not the best person to answer this, but the way I get it is that it gives access to a much wider color range and a much higher peak brightness relative to blacks. On my OLED TV it really stands out (even more so now that I've learned about that brightness-to-HDR-brightness slider change). I'm not sure how it goes on LCDs, but I assume it's much the same effect. How good or effective it is comes down to the implementation, but to me it gives a more realistic image in the way light shines or is reflected, etc.
Or have you ever seen something that is colorful and vivid in person, but when you look at a photo of it later the color is much more muted, not as vibrant or detailed?
This sounds almost like HDR in photography, which seems to differ from HDR in games, I guess? I'm familiar with HDR in photography producing/faking a higher contrast range, so I can see all the details in bright areas like the sky, every cloud and its details, but ALSO every detail in the dark areas, which aren't crushed, as opposed to normal photography exposed for the bright areas (or vice versa, when you expose for the dark areas and all the highlights are blown out).

I thought HDR on TVs/in games wants to achieve that, and your explanation sounds like it does, but every implementation I have seen hasn't shown that. Maybe due to the TV's HDR capabilities?
 
Oct 29, 2017
1,035
Wow, thanks OP. I saw the Ars article the other day but had missed the thread on here so I'm glad it was bumped. A lot of great work here, I understand a few things better now.
 

chaosaeon

Member
Oct 26, 2017
1,116
Great thread, OP. I have an anomaly I'm wondering if you'd be interested in checking out. The darker chapters in The Last of Us Remastered (Downtown or Museum, for example) look terrible and washed out with HDR on, no matter what I do with in-game brightness to try to compensate. Every other game's HDR looks fantastic to me. I've seen others comment about messed-up gamma or something in the game, but never examine it the way you've been doing. It looks so bad you'd think ND of all people would've patched it.
 

SleepSmasher

Banned
Oct 27, 2017
2,094
Australia
You 100% need Smart LED on a Samsung JS8500 for HDR, as this is what lets the set push light out in a specific zone of the screen or disable the LEDs to give you nice dark blacks.

Dynamic contrast isn't required, but many prefer the extra pop it gives you. I sometimes have it on, but I find I can see it adjusting when I'm watching movie content, so I turn it off.
Is dynamic contrast/brightness required for proper HDR in general, or specifically with this Samsung model? I have a TCL capable of HDR/WCG, but I believe it's an edge-lit model (although when dynamic is on, I can see patterns on the screen that resemble an array of lights), so I'm not really sure how effective it would be at dynamically adjusting everything anyway.
 

Daxa

Member
Jan 10, 2018
622
Great thread, OP. I have an anomaly I'm wondering if you'd be interested in checking out. The darker chapters in The Last of Us Remastered (Downtown or Museum, for example) look terrible and washed out with HDR on, no matter what I do with in-game brightness to try to compensate. Every other game's HDR looks fantastic to me. I've seen others comment about messed-up gamma or something in the game, but never examine it the way you've been doing. It looks so bad you'd think ND of all people would've patched it.
Are you using Full RGB range (instead of Limited)? TLOU:R was how I realized my settings had been wrong on the PS4 (for games); I was finally able to actually see stuff in the sewer-like levels and such after switching to Full.
 
Oct 25, 2017
4,427
Silicon Valley
This sounds almost like HDR in photography, which seems to differ from HDR in games, I guess? I'm familiar with HDR in photography producing/faking a higher contrast range, so I can see all the details in bright areas like the sky, every cloud and its details, but ALSO every detail in the dark areas, which aren't crushed, as opposed to normal photography exposed for the bright areas (or vice versa, when you expose for the dark areas and all the highlights are blown out).

I thought HDR on TVs/in games wants to achieve that, and your explanation sounds like it does, but every implementation I have seen hasn't shown that. Maybe due to the TV's HDR capabilities?
Technically, HDR photography is a way to combine different exposures into something you normally don't see IRL. There is also the HDR effect that changes exposure as you travel from brightly lit outdoors into dimmer indoor environments, or the inverse of that (see most modern games with realistic lighting engines).

What HDR displays attempt to achieve is actually reproducing the contrast and color that we are able to see in real life with our own eyes. That means light sources actually appear as if they are illuminating the scene, and the shadow areas are very dark, but not crushed if we look there (unless crushed as a stylistic choice in a movie, for instance).

If the HDR capabilities and/or calibration of your TV are not very good, it will certainly struggle to achieve this. You need either a display that can be both very bright and almost pitch black (usually an LED-backlit LCD with local dimming or similar) or something that can go to absolute black (like OLED) and is bright enough at maximum, to achieve that level of contrast per pixel/region.

Finally, you need a source with 10-bit color or greater (wide color gamut) that contains the information needed to display that kind of range.

10bit.png


As you seem to be aware, not all TVs are created equal, and not all lighting conditions suit every TV. Websites like www.rtings.com have pretty good reviews of 4K TVs that cover HDR quality, both overall picture quality and input lag for gaming.
 

Mr Punished

Member
Oct 27, 2017
597
OUTER HEAVEN
That seems to be very true. While doing my own research to find the best settings for my C7, I came across this thread on AVS Forum, where he seems to have come to the same conclusion as you: http://www.avsforum.com/forum/40-ol...-best-settings-hdr10-lg-oled-2016-series.html



His English is a bit rough but I get his point, and he has some screenshot examples for films too. I don't run Dynamic on High because it loses detail; IMO, Medium is my usual go-to.
The problem is OLEDs aren't capable of outputting enough overall screen brightness to accommodate such extreme tone mapping, hence one needs dynamic contrast to add screen vibrancy. However, even with dynamic contrast set to High on my C6, the updated game mode still isn't as vibrant as HDR Standard. The biggest problem with dynamic contrast is that it also messes up gamma levels, brightness, and contrast, and causes some really ugly posterisation. The Last Guardian is especially bad when it comes to posterisation; the sky is filled with nasty colour banding and all sorts, none of it noticeable in HDR Standard. So the question is really whether those extra details in highlights are worth the sacrifice of overall screen brightness and vibrancy. In my opinion, no. Keep in mind I have a C6, which is 3D capable, so my dim HDR game mode is especially dim. HDR clips for a reason; it's all about striking a balance that keeps a vibrant screen while still outputting more detail than your SDR counterpart.

The dim HDR game mode on my OLED at default crushes blacks and outputs an incredibly dim scene. Sapienza in Hitman goes from a beautiful sun-soaked Italian coastal town to an overcast day. Can I see more detail when I look at the sun? Yeah, but is the sacrifice to the rest of the screen worth it? Hell no. My C6 had an older firmware with a game mode that clipped as much as HDR Standard and was as vibrant. I went through a rather rigorous process to downgrade my firmware and have been happy since. Just about every game I play looks vibrant and beautiful (Horizon, my god), and I don't need to use dynamic contrast with its nasty side effects either. Also, no problems with image retention, and I'm playing a lot of HDR games.

LG provide an HDR Standard mode with a bright and vibrant screen, and a Cinema mode that doesn't clip highlights, so as a movie watcher you can pick your poison. But why is there no bright HDR game mode that imitates HDR Standard with low input lag, so one doesn't have to rely on dynamic contrast? It's obviously doable considering my now-ancient C6 handles it just fine, and I don't think IR is a real threat given how I use the TV; it just really confuses me. I won't be updating my C6 any time soon so that I can keep my vibrant game mode, but I'd love to update and would be happy if LG provided a bright HDR game mode option for me and the 2017-18 models.
 

chaosaeon

Member
Oct 26, 2017
1,116
Are you using Full RGB range (instead of Limited)? TLOU:R was how I realized my settings had been wrong on the PS4 (for games); I was finally able to actually see stuff in the sewer-like levels and such after switching to Full.
No, I've left the RGB range at Full and have never changed it. I know the washed-out Limited look you're talking about, though. Every other game's HDR looks fantastic for me, with proper deep blacks and rich colors, etc. Only in The Last of Us Remastered's dark chapters like Downtown does HDR look faded and washed out. I've read posts from people saying they think those whole chapters have had their lighting or gamma messed up on the dev side, which is surprising. That's why I'm curious to see what a test like the kind going on in here would show.
 

Kyle Cross

Member
Oct 25, 2017
8,431
I get strong color banding in HDR when using YUV422; I instead need to use YUV420, where I get much less banding (though it's still not completely gone). Thoughts on this? Is YUV420 that much worse than 422?
 

Mr Punished

Member
Oct 27, 2017
597
OUTER HEAVEN
No, I've left the RGB range at Full and have never changed it. I know the washed-out Limited look you're talking about, though. Every other game's HDR looks fantastic for me, with proper deep blacks and rich colors, etc. Only in The Last of Us Remastered's dark chapters like Downtown does HDR look faded and washed out. I've read posts from people saying they think those whole chapters have had their lighting or gamma messed up on the dev side, which is surprising. That's why I'm curious to see what a test like the kind going on in here would show.
The Downtown chapters have been washed out since the PS3 days. There are many other chapters in The Last of Us with proper blacks; I think washed out is just the intended look for those chapters. Your RGB range, however, shouldn't impact HDR: Limited RGB is the only option over current HDMI when it comes to HDR, so your PS4 should default to that setting no matter what you choose. With SDR content, though, you'll want to make sure your TV is set to Full if your PS4 is Full, and Limited if vice versa.
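
For reference, the Full-vs-Limited mismatch is just a levels remapping; a minimal sketch of the standard 8-bit conversion (16-235 is the nominal video range):

```python
def full_to_limited(v: int) -> int:
    """Map an 8-bit full-range value (0-255) into video range (16-235)."""
    return 16 + round(v * 219 / 255)

def limited_to_full(v: int) -> int:
    """Inverse mapping; values outside 16-235 are clipped first."""
    return round((min(max(v, 16), 235) - 16) * 255 / 219)

print(full_to_limited(0), full_to_limited(255))   # 16 235
print(limited_to_full(16), limited_to_full(235))  # 0 255

# The washed-out look: a device expecting Full range shows Limited
# black (code 16) as a dark gray instead of remapping it down to 0.
```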
 

Cantona222

Chicken Chaser
Member
Oct 30, 2017
1,136
Kuwait
After knowing the "Lore" of HDR I came to know that my newly purchased TV (65" SAMSUNG MU7000) can do a maximum of only 338nit HDR :(

The TV is arriving today by the way.
 

Petran

Member
Oct 29, 2017
2,034
After knowing the "Lore" of HDR I came to know that my newly purchased TV (65" SAMSUNG MU7000) can do a maximum of only 338nit HDR :(

The TV is arriving today by the way.
It also has one of the best contrast ratios, so depending on your lighting conditions and some fiddling with settings, you can get some very good results.
Sound and viewing angles, though, are not the best (those deep blacks look grayer from the sides), so sit straight in front of it and use a soundbar/home theater for best results. It's a good TV, enjoy.
 

nikasun :D

Member
Oct 30, 2017
3,166
So ideally I would have two HDR settings, one for playing with the lights on in the room and one for lights off? When I don't have the lights on, should I use darker HDR settings?
 

Cantona222

Chicken Chaser
Member
Oct 30, 2017
1,136
Kuwait
It also has one of the best contrast ratios, so depending on your lighting conditions and some fiddling with settings, you can get some very good results.
Sound and viewing angles, though, are not the best (those deep blacks look grayer from the sides), so sit straight in front of it and use a soundbar/home theater for best results. It's a good TV, enjoy.
Great tips. Thanks.
 

X1 Two

Banned
Oct 26, 2017
3,023
Technically, HDR photography is a way to combine different exposures into something you normally don't see IRL. There is also the HDR effect that changes exposure as you travel from brightly lit outdoors into dimmer indoor environments, or the inverse of that (see most modern games with realistic lighting engines).

What HDR displays attempt to achieve is actually reproducing the contrast and color that we are able to see in real life with our own eyes. That means light sources actually appear as if they are illuminating the scene, and the shadow areas are very dark, but not crushed if we look there (unless crushed as a stylistic choice in a movie, for instance).

If the HDR capabilities and/or calibration of your TV are not very good, it will certainly struggle to achieve this. You need either a display that can be both very bright and almost pitch black (usually an LED-backlit LCD with local dimming or similar) or something that can go to absolute black (like OLED) and is bright enough at maximum, to achieve that level of contrast per pixel/region.

Finally, you need a source with 10-bit color or greater (wide color gamut) that contains the information needed to display that kind of range.

10bit.png


As you seem to be aware, not all TVs are created equal, and not all lighting conditions suit every TV. Websites like www.rtings.com have pretty good reviews of 4K TVs that cover HDR quality, both overall picture quality and input lag for gaming.

That color comparison at the bottom is incredible. It's also misleading, stupid and outright wrong.
 
Oct 25, 2017
4,427
Silicon Valley
That color comparison at the bottom is incredible. It's also misleading, stupid and outright wrong.
There is no easy way to compare SDR to HDR displays for people on standard screens.

As for flat-out wrong, care to explain? Are you trying to tell me that 8-bit content can reproduce up to 1 billion colors like 10-bit can? Or that SDR TVs can achieve the same color and contrast as HDR sets?

I'm being serious, and I will remove the image if you can give me an appropriate explanation for your response, such as another way to visually communicate the facts.
 

Petran

Member
Oct 29, 2017
2,034
That color comparison at the bottom is incredible. It's also misleading, stupid and outright wrong.
There is no easy way to compare SDR to HDR displays for people on standard screens.

As for flat-out wrong, care to explain? Are you trying to tell me that 8-bit content can reproduce up to 1 billion colors like 10-bit can? Or that SDR TVs can achieve the same color and contrast as HDR sets?

I'm being serious, and I will remove the image if you can give me an appropriate explanation for your response, such as another way to visually communicate the facts.
There's no easy way to show it on an SDR screen, that's for sure.
On an HDR screen, though, if you save a screenshot on Scorpio and then look at it, there's a "SHOW IN SDR/HDR" toggle up top, and that's the easiest way I've found yet to compare on a single screen.
I get what the image above means; it's just that presenting that left-side gradient as a depiction of what people have been watching up to now is bound to cause some... questions :)
 
Oct 25, 2017
4,427
Silicon Valley
I think those are good questions, because threads like these bring answers about what the formats are trying to represent, what HDR is, and which games (and movies) are utilizing it to the fullest.
 

Smokey

Member
Oct 25, 2017
4,176
Awesome work OP!

I have one thing to add:
I currently use an HDFury Linker (here) to inject custom metadata for my LG C7. This is needed because both the PS4 and Xbox One send empty metadata to the TV (at least, that is what my Linker is receiving). Normally this would mean the TV won't do any tonemapping, but unfortunately the LG OLEDs try to tonemap the full HDR range (which means 10,000 nits!) down to the capabilities of the TV. This is the reason the HDR game mode appears dim: it literally reduces the brightness of the whole image (basically, the blue/green/yellow color-coded parts of your images will be displayed much dimmer on an LG OLED than intended).
This also means the HDR brightness sliders in games won't do anything on the LG OLEDs.

My solution is to send custom metadata (1,000 nits peak brightness) via the Linker, so that the APL (Average Picture Level) of the "SDR" part of the image is retained.

I've never heard of this device. So you're saying that when using HDR Game Mode with the HDFury, you regain the brightness that is lost between HDR Standard and HDR Game Mode?
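
A deliberately crude sketch of why the signalled peak matters. This is not LG's actual tonemapping algorithm, just the worst-case "scale everything to fit" behaviour the quote describes:

```python
def naive_scale_tonemap(nits: float, signalled_peak: float,
                        panel_peak: float) -> float:
    """Scale the whole signalled range so its peak lands on the panel's."""
    return nits * panel_peak / signalled_peak

# Empty metadata treated as the full 10,000-nit PQ range crushes
# 100-nit paper white to ~7 nits on a ~730-nit panel...
print(naive_scale_tonemap(100, 10_000, 730))  # 7.3
# ...while injected 1,000-nit metadata keeps it at a watchable 73 nits.
print(naive_scale_tonemap(100, 1_000, 730))   # 73.0
```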
 

LiK

Member
Oct 25, 2017
32,099
Someone hire this man.

Btw, how come Destiny 2 wasn't included? I thought the HDR PQ was incredible.
 
OP

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
One interesting thing about how Horizon is mastered is that it seems to compress the highlight range and use most of the contrast in the "normal" SDR range, without really abusing the super-low luma area. That's kind of what photographic film does, actually.

In theory, that is how all HDR10 content should be most of the time, as the majority of what you see doesn't fall into the range of the highlights. Those highlights are going to be used for things that we've traditionally found hard to represent accurately.
Neon lights are a really good example of this: you can look at a really great photo of a neon light and it doesn't look like the real thing, but with the additional luma and the additional colors that exist in the HDR colourspaces, you can represent them much more accurately. It's one of the reasons I think Agents of Mayhem looks really great; you can actually feel the benefit, as the game can display those colours that have previously been missing, alongside the extra brightness. The accurate representation of coloured LED and neon lights is really great.

Much like a camera, the dynamic range of what is being captured is limited by either the film or the sensor, which is uniformly sensitive. There is often more detail behind this, which can be expressed or controlled with dodging and burning, by controlling the input to the sensor/film with a grad filter, or by the post-processing that occurs afterwards. But this is all because the end display format, whether print or a screen, had no way to present the naked image as it would have appeared to the photographer's eye.

You'll see games already doing this type of real-time exposure adjustment in order to present the most relevant data to the player. This might be overexposing the sky to give the perception of brightness, or it might be raising the illumination of a dark area in order to let the player actually see.

With HDR displays, you don't need to pick out the slice of what the player is looking at and adjust it as aggressively at the expense of another part of the image.


Below is a theoretical histogram of SDR and HDR. They should be identical, apart from the HDR version having more overhead to capture that brightness.
hdr_range.png
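
If you want to eyeball that split yourself, here's a small sketch; the lognormal "frame" is stand-in data shaped like typical content (heavily skewed toward dim values), not a real capture:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for a frame of linear-light luminance values in nits.
frame = rng.lognormal(mean=3.0, sigma=1.3, size=(1080, 1920))

sdr_band = (frame <= 150).mean()   # rough "SDR" band used in this thread
highlights = (frame > 150).mean()  # pixels living in the HDR overhead

print(f"{sdr_band:.1%} of pixels at or below 150 nits, {highlights:.1%} above")
```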
 
OP

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
Someone hire this man.

Btw, how come Destiny 2 wasn't included? I thought the HDR PQ was incredible.


I've got some Destiny 2 footage; I just had a lot of examples to show across lots of games, and it was one I was a little less sure about. I'm still learning about this as I go along too.

Destiny is actually a perfect example of the above. It looks like the same game; you don't put it on and go "oooh, HDR has been enabled."
However, you see the HDR part of the image where you should be able to see it, giving you a more lifelike expression of light and darkness.

Like some of the other implementations that put aside the fact that the format was designed for theatrical use, you simply adjust the brightness to your preference. With the brightness on the default setting, you will hit a peak value of 4,000 nits, with most of it maxing out at 1,000.

(Destiny 2 HDR analysis images)
 

kanuuna

Member
Oct 26, 2017
726
Really useful thread, and really goes to show how important pushing peak brightness is. As much as I adore how OLED sets look today, the relatively low peak brightness really doesn't seem optimal for HDR going forward. Micro LED can't get here soon enough.
 
OP

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
Really useful thread, and really goes to show how important pushing peak brightness is. As much as I adore how OLED sets look today, the relatively low peak brightness really doesn't seem optimal for HDR going forward. Micro LED can't get here soon enough.

Kinda cool knowing that even now games have detail that we can't see; playing those games 5 years from now, they will actually look even better.

Let's hope all the manufacturers give us BC so we can continue to enjoy them.
 

EsqBob

Member
Nov 7, 2017
241
EvilBoris
What is the bit value for the maximum SDR brightness (100 or 150 nits?) in HDR? Does 10-bit HDR handle low-light detail better, worse, or the same compared to SDR video?
 

Gestault

Member
Oct 26, 2017
13,371
This is an interesting way to analyze the output more quantitatively, which is super useful. Great work, OP. So much of what I see from other HDR analysis just boils down to the presence of HDR10 and whether the game "looks good" and "gets bright", much of which is down to art style anyway.
 
OP

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
EvilBoris
What is the bit value for the maximum SDR brightness (100 or 150 nits?) in HDR? Does 10-bit HDR handle low-light detail better, worse, or the same compared to SDR video?

Technically, there isn't an exact nit value for SDR content, as it is handled entirely differently: cinema screens are 48 nits, and standard CRT TVs were around 100. When a regular Blu-ray is mastered, the reference screen will be between 80-120 nits.
For the sake of these comparisons, I've said we look at the values between 0-150 as roughly approximating the SDR part of the image.

And no, it doesn't really handle it better. There isn't a great deal of extra bandwidth allowed for the darker areas, and the HDR10 format actually dictates that those darker areas remain truly dark, which, for most viewers, means they cannot actually see that detail in their own viewing conditions.

It's a bit of a can of worms. Ultimately, you have to remember that Dolby Vision was designed for theatres, not for the home, which in turn means that HDR10, which uses the exact same output principles, is also pretty unsuitable.

The formats are designed to standardise the content, ensuring it is produced in a controlled manner and that the hardware displays it in a controlled manner. But as these formats (which have been adopted and pushed by TV manufacturers) ultimately weren't intended for TVs in people's homes, games and UHD movies are already fudging their outputs to try and bypass some of the "rules" dictated by the format.
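
To put numbers on "not a great deal of extra bandwidth for the darker areas", you can count how many 10-bit PQ codes each band gets. The constants are the published ST 2084 values; 0-150 nits is the rough SDR cut-off used in this thread:

```python
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Inverse EOTF: absolute luminance (0-10,000 nits) -> 0-1 PQ signal."""
    y = nits / 10000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

for band in (100, 150):
    codes = round(pq_encode(band) * 1023)
    print(f"0-{band} nits: ~{codes} of 1023 codes ({codes / 1023:.0%})")

# ~520 codes cover 0-100 nits, only about double the 256 codes that
# 8-bit SDR spends on the same range: roughly one extra bit of
# precision, not a dramatic win for shadow detail.
```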