HDR games analysed

EvilBoris

Member
Oct 29, 2017
5,644
I've spent the last couple of weeks investigating the HDR output of various games.
When it comes to SDR video, you may be familiar with RGB values 0-255, with 0 representing black and 255 representing white (if you've ever used a colour picker in MS Paint or a word processor, you may have seen this type of number).
HDR10/Dolby Vision is a little bit different, not just because it uses a scale of 0-1023, but because each of these data values represents not just black to white (or colour) but also a measure of luminance in nits, which is the intensity of the light (how bright it is).

Unlike previous video formats, these values are defined and absolute. A value of 0 will always represent no light at all (total black), a value of 1023 will always represent 10,000 nits of luminance, and a value of 769 will always represent roughly 1,000 nits.
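That fixed mapping from code values to nits is defined by the SMPTE ST 2084 "PQ" curve. As a rough illustration (a minimal sketch assuming a full-range 10-bit signal, not broadcast-accurate code):

```python
# SMPTE ST 2084 (PQ) EOTF: 10-bit code value -> absolute luminance in nits.
# Constants come straight from the ST 2084 specification.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(code: int) -> float:
    """Convert a 10-bit PQ code value (0-1023) to luminance in nits."""
    e = (code / 1023.0) ** (1.0 / M2)                    # undo the m2 power
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)  # undo the m1 power
    return 10000.0 * y                                   # PQ tops out at 10,000 nits

# pq_to_nits(0) -> 0 nits, pq_to_nits(1023) -> 10,000 nits,
# pq_to_nits(769) -> roughly 1,000 nits
```

Plugging in the values from the paragraph above: code 0 gives 0 nits, 1023 gives exactly 10,000, and 769 lands just under 1,000.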

So if a modern HDR TV is fed these values, it should output exactly the amount of light described by the value given.
HDR10 and Dolby Vision both use this system and can be referred to as PQ-based HDR.

Now as it stands, there aren't many TVs that get to the heady heights of 10k nits; you are lucky if you can get one that goes above 1,500 at the moment.
When the incoming signal goes beyond the hardware capability of the display, the TV chooses how to handle it. Most manufacturers simply clip the white values above a level chosen by them; they may also choose a soft roll-off and try to make the shift into the clipped values less obvious.
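Those two strategies can be sketched like this (the knee position is my own illustrative choice; real TVs use proprietary curves):

```python
def hard_clip(nits: float, peak: float) -> float:
    """Anything above the display's peak luminance is simply clipped."""
    return min(nits, peak)

def soft_rolloff(nits: float, peak: float, knee: float = 0.75) -> float:
    """Track the signal accurately up to knee*peak, then compress the
    rest of the range (up to the 10,000-nit PQ maximum) into the
    remaining headroom, so the transition into the limit is less obvious."""
    start = knee * peak
    if nits <= start:
        return nits
    # Fraction of the over-knee input range this value covers...
    excess = (nits - start) / (10000.0 - start)
    # ...mapped into the headroom between the knee and the panel's peak.
    return start + excess * (peak - start)
```

On a hypothetical 1,000-nit panel, a 4,000-nit highlight hard-clips to 1,000 nits, while the soft roll-off keeps everything below 750 nits untouched and squeezes 750-10,000 into the last 250 nits.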
To help with this, when content is mastered/produced for HDR10 and Dolby Vision, some additional information about the image content is specified in the form of metadata. This metadata usually says what the most intensely bright value seen anywhere in the game (or movie) is, and what the average luminance is across all of the content. These values are defined by the display the content was mastered on.
Most UHD content is currently being mastered for 1000-nit or 4000-nit screens.

The purpose of this metadata is so that an SDR image (or something in between SDR and HDR) can be derived from the original HDR content in the event that the content is viewed on a display that does not reach the peak luminance of the display the content was mastered on.

So if your movie has been mastered on a 1000-nit reference display and you are using an OLED screen with a 650-nit max output, the TV can use this metadata to decide how best to display the information that can't otherwise be shown due to hardware limitations.
Once you are using a display that meets or exceeds the peak brightness of the content, the metadata actually becomes irrelevant.
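The two static values described above correspond to what the HDR10 standard calls MaxCLL (the brightest single pixel anywhere in the content) and MaxFALL (the highest frame-average light level). Conceptually they are derived something like this, assuming per-pixel luminance in nits is already available:

```python
def content_light_levels(frames):
    """Compute static HDR10-style metadata from a sequence of frames,
    where each frame is a list of per-pixel luminances in nits."""
    max_cll = 0.0   # brightest individual pixel in the whole programme
    max_fall = 0.0  # highest frame-average light level
    for frame in frames:
        max_cll = max(max_cll, max(frame))
        max_fall = max(max_fall, sum(frame) / len(frame))
    return max_cll, max_fall

# Two tiny toy "frames":
# content_light_levels([[100, 4000], [50, 150]]) -> (4000, 2050.0)
```

A TV tone mapping a 650-nit panel against 1000-nit-mastered content only ever sees these two summary numbers, which is why the mapping has to be a one-size-fits-all guess.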


So with this in mind I've been looking at how games have been mastered: what options they present to the user to adjust the image, what these options actually do, and what the relationship is between these things and how the HDR looks.
Videogames have a big advantage over movies in that the image is generated in real time, so it can be adjusted at will.

Due to the nature of HDR content, this is actually really easy to measure: all we need is an un-tonemapped screenshot or video capture, and from this we can look at the code values that have been used in various parts of the image.
We can see if the game is actually outputting anything that is black (or whether a cinematic grade has been applied with raised blacks), and we can also see the very brightest value that the game is going to use to represent something like the sun.
So I've been looking at the makeup of various games for Xbox (which allows for HDR screenshot output) to try to understand what the in-game adjustments actually do and how I should be using them to ensure that I get the best from my display.

The goal of HDR is to transport more information to a display, so that more intense light can be displayed in parts of the image where it is required; typically you'll see these brightest points in specular highlights, explosions and the sun.

Let's have a look at a few really good examples of in-game HDR. They all have slightly different settings and different approaches to how the tone mapping is performed.

In order to better visualise the output of the game in a non-HDR10 (SDR) format, I've created a method of producing maps of the luminance.
Using this scale you can quickly get an overview of what is dark, what is light and what is really intensely light.



The vast bulk of what we see is going to sit between 0-150 nits, anything above that is the "Extra" luma that HDR offers.
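A luminance map of this kind is essentially a false-colour banding of each pixel's nit value. A toy version of the idea (the band edges and colours here are my own illustrative choices, not the exact scale used for the images in this thread):

```python
# Hypothetical false-colour bands for a luminance heat map.
# Upper bound of each band in nits, and the colour it maps to.
BANDS = [
    (150, "grey"),      # 0-150 nits: the bulk of the image, SDR-ish range
    (1000, "yellow"),   # bright highlights
    (4000, "orange"),   # very intense highlights
    (10000, "pink"),    # the extreme top end (sun, specular peaks)
]

def band_for(nits: float) -> str:
    """Classify a single pixel's luminance into a heat-map band."""
    for upper, colour in BANDS:
        if nits <= upper:
            return colour
    return "pink"  # anything at/above 10k clips into the top band
```

Run over every pixel of an un-tonemapped capture, this immediately shows which parts of the frame are using the "extra" range HDR provides.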


Star Wars: Battlefront 2
Actually, all of the HDR-compatible Frostbite games I looked at (Battlefield 1, Mass Effect) use the same setup.
The metadata output will be 10,000 nits and the actual tone mapping is performed by the game via the HDR slider, with 0 nits being the leftmost value and 10k nits the rightmost. Setting the brightness slider to its lowest allows for true 0 black levels.



As we can see, the sun itself is outputting at 10k nits, things that should be totally dark are as they should be, and the specular highlights reflected on the top of the gun are also hitting between 4,000 and 10,000 nits.
The DICE Frostbite games are actually really interesting in that you can turn the HDR slider down to one click from the left, which will give you 100/200 nits depending on which game it is, essentially toning the game to SDR. This gives you a really nice way to see where your fancy new TV is showing off.

Another fun thing you can do with the DICE games is to move the slider to 0% and literally turn off the lights; you can see just how real-time the lighting is when you tell the engine that the brightest any source of light can be is 0.

Rise of the Tomb Raider
So this game has a similar setup: a brightness slider which controls the black point (move it to its lowest point) and then a secondary HDR slider which controls the peak brightness.
Tomb Raider has been capped at 4,000 nits. Like the Frostbite games, either set the slider to the max to let your TV tone map, or follow the on-screen instructions to eyeball the peak brightness and let the game handle the output.

RotTR is particularly great as there are loads of specular highlights, not just in the normal places you'd expect to see them in real life, such as on the shiny ice and the twinkles in the snow as they reflect the sun.


But also in lower-light conditions on less obviously "shiny" surfaces, such as on this insanely high resolution boot.


It appears that once the output reaches 4,000 nits, any level above this jumps straight to 10k nits (which the display will clip anyway, as presumably the metadata is telling the display it's 4,000-nit content).
It's not data that is missing; it can occasionally be brought into visibility, which suggests that it is an artistic decision or part of the process of grading the game for HDR.



Assassin's Creed Origins

AC: Origins is another game that really does HDR well. Like Tomb Raider it is also capped at 4,000 nits (the pink bits) and offers a brightness slider, which should be set as low as possible based upon your viewing environment. It also offers you a Max Luminance setting, which is neatly labelled in nits.
There is also a "Paper white" scale: as well as having sliders that dictate the darkest something can be in game and the brightest it can be, the game gives you a scale that allows you to adjust one of the mid points: how bright is a piece of paper.



Ubisoft's recommendation for the paper white slider is to "adjust the value until the paper and hanging cloth in the image almost saturates to white."
However, like the brightness slider, this is here to allow you to adjust the output of the game to match your viewing conditions. If you are in a cinema-like, light-controlled environment, technically you would set it to 80 nits; as your surrounding light increases, though, you will prefer higher values.
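Paper white effectively pins where diffuse white sits on the absolute nit scale, and everything graded relative to it shifts with it. A rough sketch of the idea, assuming a scene-referred value where 1.0 means diffuse white (the paper):

```python
def to_nits(scene_value: float, paper_white: float = 80.0) -> float:
    """Map a scene-referred value (1.0 = diffuse white, e.g. paper)
    onto the absolute nit scale, capped at the 10,000-nit PQ maximum."""
    return min(scene_value * paper_white, 10000.0)

# At the reference 80-nit setting, paper outputs 80 nits.
# Raise paper white to 200 for a bright room and the same paper
# now outputs 200 nits, with the rest of the scene scaled alongside it.
```

This is why the "correct" 80-nit setting can look dim in a lit living room: the whole mid-range of the image is anchored to it.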



Setting the games to the technically correct settings also highlights how HDR10/Dolby Vision is too dim for many consumers. You'll also see how developers are still getting to grips with these new technologies; the HUD in AC becomes a little too dark as the main game image becomes correctly calibrated.


Moving on, we can see three Microsoft first-party offerings, which all work with a full 10,000-nit output.

Forza Horizon 3

A really great example of how HDR doesn't need to be used just for crazy-bright sun spots, magic and fires.
Cloudy days still have very bright skies, and photographers have had to use a few tricks to deal with the contrast issues they cause. Forza really shines here; it's almost enough to make me feel the sensation of cold and drabness I'd get viewing the scene with my own eyes.

You can see the sky reaching 4,000 or so nits, the tyres and grille being suitably unlit and dark, and then the headlights pumping out a full 10k-nit output.


In a night-time environment you can see the game making full use of the darker end of the scale, whilst the explosions and headlights still illuminate in a realistic fashion.


This is also a good example, even in SDR, of how a very dark image which adheres to the HDR10 standard correctly will be perceived as being too dark or dim, or "crushing blacks". As we can see from the luma map, the detail is all actually there, but human eyes cannot adjust to see such detail until they have been exposed to low levels of light for 10 minutes or so, when certain chemical reactions occur within the cells of the eye. This is obviously an issue for many consumers, who probably aren't in lighting conditions conducive to this happening.

Gears of War 4


INSECTS


All three games work with two sliders:
A brightness slider, which controls black levels but also controls an aspect of contrast, pushing max nit output up to 10,000.
A secondary HDR slider, which allows the max output to be set below what the other slider adjusts to.
Forza and Gears simply refer to Brightness and HDR; INSECTS, however, calls these HDR Contrast and HDR Brightness.

Now let's look at a few other implementations.

Shadow of Mordor
SoM offers a super-simplified approach: the game always outputs a 10k-nit maximum and the tone mapping is left entirely to the TV. This is interesting, as we know the developers have never actually seen the game outputting at 10k nits, because no such display exists.
There is a traditional Brightness slider which allows users to specify the black point to their taste/viewing conditions.

Here we can see the obvious places to look for 10k nits: the sun and the specular highlights. You can also see that, as it should be, the shadowed side of the player character is suitably dark.


Agents of Mayhem
Similar approach here, except the game is capped at 1,000 nits and has obviously been graded/toned with this in mind, as that's a fairly achievable output from a consumer display. I don't think it's a coincidence that they made the game with this in mind, and it's actually one of the games with a really great-looking HDR output.





Again, like other games, adjust the slider to the left to improve the black level; this will adjust the overall brightness and slightly impact the max brightness. Adjust to your lighting conditions/taste.

DEUS EX : Mankind Divided
Lots of raised blacks in this, perhaps as an artistic choice, but there's also a bug which causes the grading to totally fail if the in-game brightness slider goes below a value of 35%.


This looks like the result of some kind of flawed curve adjustment.


40-45% will give you a 1,000-nit output without raising the blacks too much.

Final Fantasy XV
From one Square Enix meh to a Square Enix wow.
A 1,000-nit fixed max output and a simple brightness slider to drop black levels.
Really fantastic grading throughout and in various lighting conditions.





Even the title screen has 2D elements optimised for HDR.


Monster Hunter World
Much like Deus Ex, Monster Hunter World appears to operate within 4,000 nits; however, also like Deus Ex, when HDR is enabled the game appears to have severe black-level problems. At the default brightness setting, this is what we are getting.



All mids and highlights, where are the shadows?

So with a quick and dirty levels adjustment, we can remove the extra HDR luminance from the data and take a look at the histogram.
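The "quick and dirty" check amounts to clipping everything above an SDR-ish range and bucketing what's left. A minimal sketch of that, assuming a flat list of per-pixel luminances in nits:

```python
def sdr_histogram(pixels, sdr_peak=100.0, buckets=10):
    """Clip luminances to an SDR-like peak, then count pixels per bucket.
    With a healthy grade the lowest buckets (the shadows) should be
    well populated; here, almost nothing lands there."""
    counts = [0] * buckets
    width = sdr_peak / buckets
    for nits in pixels:
        clipped = min(nits, sdr_peak)
        idx = min(int(clipped / width), buckets - 1)  # peak goes in the top bucket
        counts[idx] += 1
    return counts
```

Comparing a histogram like this for the HDR capture against one for the SDR shot is what exposes the shifted contrast and missing shadows.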



If we compare this to an in game SDR shot taken just moments later



we can see the significant shift between the SDR and the HDR toning. Contrast and black levels are totally out of sorts in HDR.
This can be remedied slightly by dropping the brightness down as low as it goes, but that is not enough of a change to make it right. It appears as if it is at least partly caused by some kind of eye adaptation that is occurring.


Whew!
That was a lot of images!

So this explores one side of how these games are made; what it doesn't explore is the metadata side of what the games are outputting. It wouldn't surprise me if there were titles with mismatched metadata, but because the metadata is static, as soon as you make any image adjustments it would technically become wrong anyway.

Update 1:
Figured out how to run it on video

Horizon Zero Dawn : Frozen Wilds

Even from this short clip we can see how to do HDR right: almost everything you see sits within the standard SDR range, while the highlights on Aloy's weapons glimmer in the sun and the sparkle that runs down her back heads towards the 10k-nit level. You can see the clouds are all highly illuminated and sit between 1,000-4,000 nits, with the sun itself hitting 10k.

That is hitting upwards of 4,000 nits from what I can see.

Uncharted 4

Got some HDR footage of Uncharted 4 too; another one that goes over 4,000 nits towards 10k.

 
Last edited:

anexanhume

Member
Oct 25, 2017
4,302
Nice work OP!

Is HDR something we’re going to regret developers having this type of control over? I’m also curious how these games will age as displays improve.

As far as image processing goes, I assume TVs don't just clip brightnesses above their max range. Don't they re-normalise to their capability, or is it more complex than that? Will formats with dynamic metadata becoming the norm ease any of the concerns with how HDR is handled?

Edit: and will HDR re-map mods for PC become a thing?
 

dieter

Member
Oct 27, 2017
115
Thanks for doing this, really interesting. I've been loving HDR in Origins, even on my "budget" HDR set it looks great, can't wait to one day get a decent set.
 

qwilman

Member
Nov 4, 2017
32
Savannah, GA
The Monster Hunter shot is interesting to me, as it almost feels like a conscious visual design choice for the home hub, rather than an oversight. Before I start talking too far out my ass, could you do the same tests in a few other areas that are intended to be darker? Maybe somewhere deep in the forest of the opening area, or heck, even the same area at night? I'm curious what results the cat canteen would give you late at night.
 

noyram23

The Fallen
Oct 25, 2017
5,746
That's a great read OP, so regarding MHW the closest we could get to a proper HDR is to lower the brightness to lowest? Also have you tried Horizon Zero Dawn? It's probably the best HDR I have seen.
 

GHG

Member
Oct 26, 2017
6,648
Great work EvilBoris

Are you going to be conducting this research for new HDR games as they come out or is that too much to ask :P ?
 

BBQ_of_DOOM

Member
Oct 25, 2017
6,878
Tremendous thread. Also helps folks to understand what HDR really does and what it can add.

That being said, MHW's HDR is awful. Probably the worst I've seen. PUBG's is barely better and that's notoriously awful.
 

MongeSemNome

Member
Jan 29, 2018
189
Brazil
Dude, awesome post!

But now I have to be aware of TVs' nit limitations, it seems. Are they even legally obligated to provide this data?

Also, your main point seems to be that the game's setting is much more important than the TV itself.
 

reid23

Member
Oct 27, 2017
57
Horizon: Zero Dawn and Shadow of the Colossus have the best HDR I have ever seen. SoTC is so drastic that, when paired with the Resolution mode, is like a remaster of a remaster. It’s like a different game compared to 1080/SDR.
 

Deleted member 17491

User requested account closure
Banned
Oct 27, 2017
1,099
Have you checked how much impact the chroma subsampling required at higher resolutions has, forcing the YCbCr colour space to be used and leaving only the 64-940/960 range usable versus the full 0-1023 range?
 

Lost

Banned
Oct 25, 2017
3,105
Phenomenal thread! Wow. I’m amazed at the amount of effort put into this.

This is what I was expecting Digital Foundry to do.

I’d love to see the results of games like Horizon or Call of Duty.
 

DarthWalden

The Fallen
Oct 27, 2017
2,080
I got nothing to add to the conversation but that was really interesting and I wanted to thank you for taking the time to put that together.
 
OP

EvilBoris

Member
Oct 29, 2017
5,644
Nice work OP!

Is HDR something we’re going to regret developers having this type of control over? I’m also curious how these games will age as displays improve.

As far as image processing goes, I assume TVs don't just clip brightnesses above their max range. Don't they re-normalize to their capability, or is it more complex than that? Will formats with dynamic metadata being the norm ease any of the concerns with how HDR is handled?
It depends on the TV, but when it comes to luminance, most of them (as far as I know) pretty much just clip. Some alter it in the run-up to the clip, arriving at the peak brightness of the hardware more quickly to provide additional perceptual contrast.

I'd imagine that dynamic metadata won't really become a thing for games, because as highlighted here, you can simply specify the peak brightness for the display in a menu and the game itself can ensure that no essential data is lost. Maybe at a later date you'll be able to specify the peak brightness in a dashboard setting and not even worry about it.

Dynamic metadata won't really make a great deal of difference, TVs will catch up with the peak value of reference monitors before we know it and then you don't need metadata at all.
 

LordofPwn

Member
Oct 27, 2017
2,436
Great work OP. A couple of extra details for ya: Dolby Vision supports up to 12-bit, or 0-4095, and I believe the minimum average nit level for an HDR display is 300 (I remember reading this but I haven't been able to get confirmation).
 

Lukemia SL

Member
Jan 30, 2018
5,575
Great thread, I absolutely love HDR. I hope more and more games support it down the line.
Shadow of the Colossus being my latest and God of War being my next. I think all games must implement it. And no it doesn’t only benefit realistic games.
 

anexanhume

Member
Oct 25, 2017
4,302
It depends on the TV, but when it comes to luminance, most of them (as far as I know) pretty much just clip. Some alter it in the run-up to the clip, arriving at the peak brightness of the hardware more quickly to provide additional perceptual contrast.

I'd imagine that dynamic metadata won't really become a thing for games, because as highlighted here, you can simply specify the peak brightness for the display in a menu and the game itself can ensure that no essential data is lost. Maybe at a later date you'll be able to specify the peak brightness in a dashboard setting and not even worry about it.

Dynamic metadata won't really make a great deal of difference, TVs will catch up with the peak value of reference monitors before we know it and then you don't need metadata at all.
Ah yeah, a dashboard setting with optional game override would be nice.
 

Genetrik

Member
Oct 27, 2017
796
My god amazing and cryptic thread to me. So I have a Sony 75 ZD9 and tweaked the picture settings but to be honest i would need an expert to tell me if I did it right. Soooo..... :D
 

Mullet2000

Member
Oct 25, 2017
2,043
Toronto
It is annoying. I've picked a bit where the grade is at its worst; it looks much better in other conditions, but you can see it's a long way from looking as it should.
It's noticeable enough in general that I turn off the HDR to play the game on my HDR TV, so while you picked the worst spot, yeah, it's still bad enough that I'm avoiding it.
 

TuMekeNZ

Member
Oct 27, 2017
594
Auckland, New Zealand
Love your work!!
Definitely find good use of HDR is more visually impressive than a res increase (obviously having both is the best). Just picked up a 1X yesterday and booted up The Witcher 3 to see what a difference the update made and wow!!! The colours just pop so much more now and the draw distance/detail is so much better.
Forza Horizon 3 has always impressed but with the 4K update the HDR now really shines, even after having my 1S since launch.
Will be trying Gears 4 tonight after work... saving the best for last I feel.

On the PS side GT Sport on my Pro has some of the best HDR implementation on any system.
 

qwilman

Member
Nov 4, 2017
32
Savannah, GA
It is annoying. I've picked a bit where the grade is at its worst; it looks much better in other conditions, but you can see it's a long way from looking as it should.
It's noticeable enough in general that I turn off the HDR to play the game on my HDR TV, so while you picked the worst spot, yeah, it's still bad enough that I'm avoiding it.
Can we see a couple better spots then? Not "nuh-uh"ing the post or anything, I'm just curious about the difference.
 

LightEntite

Banned
Oct 26, 2017
3,078
Great thread!

Experienced HDR with FFXV last year. Instantly became a believer. The game feels as though it actually gains more detail, it's crazy. Horizon was crazy too.

Getting an HDR television within the next 2 months, can't wait
 

golem

Member
Oct 25, 2017
1,548
Great work OP! I thought it would take developers a while to get to grips with HDR and figure out best practices, but it looks like a lot of companies are well on their way. How do you feel about HDR10 vs a dynamic metadata approach like DV in games? Is HDR10 good enough, as seen in today's results, or could things be pushed further with DV/new standards?
 

Spider-Man

Member
Oct 25, 2017
1,973
Thank you for this. I've tried nearly all of these in HDR and felt the same way about these. Of course I didn't go into this much crazy analysis but just impressions from playing.

FH3 is glorious. Drone mode is just a blast for eye candy.
 
OP

EvilBoris

Member
Oct 29, 2017
5,644
So there is a whole other selection
My god amazing and cryptic thread to me. So I have a Sony 75 ZD9 and tweaked the picture settings but to be honest i would need an expert to tell me if I did it right. Soooo..... :D
Generally, as a rule of thumb, for each game:

Brightness: as low as it will go.
HDR slider: either as high as it will go, or as high as it goes before the reference image disappears.

You may want to raise the in-game brightness slider up a little until you feel happy, if you are playing in an illuminated room.
 

FarisR

Member
Oct 25, 2017
1,317
Hopefully the next-gen consoles have some standards in place, and some adjustments possible on a system-level (that all developers work with).
 

Waikipedia

Member
Oct 26, 2017
71
Thanks for that. I did notice that the HDR in MH World is wonky; it looked washed out or had raised gamma.
 

MagicPhone

Member
Oct 25, 2017
151
Los Angeles, CA
This is a much-needed thread, thank you so much. I got a new TV this year with HDR (the TCL everyone loves), and although it's not the BEST for HDR, I spent a week thinking my TV setup was broken because games looked worse. After a few months, I've learned that some games look great and some look blah with HDR. For instance, I love Horizon Zero Dawn and AC: Origins' HDR, but think Everybody's Golf is awful and Forza Horizon 3's makes me a worse driver because it's too dark.

Again, thank you so much for this thread, finally I feel a bit sane, with science to prove it.
 

Ramjag

Member
Oct 25, 2017
3,728
This is great and I’m always looking for information on this topic. I hope the HDR on monster hunter gets improved.
 

Relix

Member
Oct 25, 2017
2,160
Amazing work OP. Love stuff like this. Still getting used to HDR in my LG OLED but man did it look great on FF and Origins. Uncharted looks beautiful as well.

Also... Altered Carbon on Netflix has an amazing Dolby Vision implementation. Even streaming it looked mind-blowing
 

Tilt_shift

Banned
Oct 29, 2017
201
Australia
Great thread and fantastic effort. Any chance you can do GT Sport? It's supposed to have one of the best HDR implementations in gaming too.
 

10k

Member
Oct 25, 2017
4,048
Toronto, Ontario, Canada
I've spent the last couple of weeks investigating the HDR output of various games.
When it comes to SDR video , you may be familiar with RGB values 0-255 , with 0 representing the colour black and 255 representing white (if you've ever used a colour picker in MS paint or a word processor, you may have seen this type of number).
HDR10 /Dolby Vision is a little bit different, not just because it uses a scale of 0-1023, but because each of these data values represent not just black to white (or colour), but also a measure of luminance in Nits, which isthe intensity of the light (how bright it is)

Unlike previous video formats, these values are defined and are absolute. A value of 0 will always represent no light at all (total black) , a value of 1023 will always represent 10000nits of luminance, a value of 769 will always represent 1000nits.

So if a modern HDR TV is fed these values, they should be exactly outputting the amount of light described by the value given.
HDR10 and Dolby vision both used this system, and can refereed to as PQ based HDR

Now as it stands, there aren't many TVs that get to the heady heights of 10k nits, you are lucky if you can get one that goes above 1500 at the moment.
When the signal being received goes beyond the hardware capability of the display, the TV chooses how it handles this, most manufacturers simply clip the white values above a level chosen by them. They may also choose a soft roll off and try to make the shift into the clipped values less obvious.
In order to do this, the when the content is mastered/produced for HDR10 and Dolby vision specify some additional information about the image content in the form of metadata, this metadata usually says what is the most intensely brightest value that will be seen in the game (or movie) and what the average luminance is across all of the content. These values are defined by the display the content was mastered on.
Most UHD content current is being mastered for 1000nit screens or 4000nit screens.

The purpose of this metadata is so an SDR image (or something inbetween SDR and HDR) can be derived from the the original HDR content in the event that the content is viewed on a display that does not reach the peak Luma of the display that the content was mastered on.

So if you movie has been mastered on a 1000nit reference display, and you are using an OLED screen with a 650nit max output the TV can use this metadata to try best decide how to display the information that can't otherwise be displayed, due to hardware limitations.
Once you are using a display that meets or exceeds the peak brightness of the content, the metadata actually becomes irrelevant.


So with this in mind I've been looking at how games have been mastered, what options do they present to the user to adjust the image , what do these options actually do and what is the relationship between these things and how the HDR looks.
Videogames have a big advantage over movies in that the image is generated in real time, so the image can be adjusted at will

Due to the nature of HDR content, this is actually really easy to measure, all we need is an un-tonemapped screenshot or video capture and from this we can look at the code values that have been used in various in parts of the image image.
We can see if the game is actually outputting anything that is black (or has a cinematic grade been applied with raised blacks) we can also see what is the very brightest value that the game is going to try and use to represent something like the sun.
So I've been looking at the make up for various different games for Xbox (which allows for HDR screenshot output) to try and understand what the in-game adjustments actually do and how I should be using them to ensure that I get the best from my display.

The goal of HDR is to to transport more information to a display , so that more intense light can be display in parts of the image where it is required, typically you'll see these brightest points in specular highlights, explosions and the sun.

Let's have a look at a few of really good examples of in game HDR. They all have slightly different settings and different approaches to how the tone mapping is going to be performed.

In order to better visualize the output of the game in a non HDR10 / SDR format, I've created a method of producing maps of the luminance.
Using this scale you can quickly get an overview of what is dark, what is light and what is really intensely light.



The vast bulk of what we see is going to sit between 0-150 nits, anything above that is the "Extra" luma that HDR offers.


Star Wars : Battlefront 2
Actually all of the HDR compatible Frostbite games I looked at (Battlefield 1,Mass Effect) all use the same setup.
Metadata output will be 100000 and the actual tone mapping is performed by the game via the HDR slider, with 0 nits being the very most left value and 10k nits being the most right value.
Setting the brightness slider to it's lowest allows for true 0 black levels.



As we can see, the sun itself is outputting at 10k nits, things that should be totally dark are as they should be, you'll see that the specular highlights reflected on the top of the gun are also hitting between 4000 and 10000 nits.
The Dice Frostbite games are actually really interesting in that you can turn the HDR slider down to 1 click from the left , which will given you 100/200 nits depending on which game it is, essentially toning the game to SDR. This gives you a really nice way to see where your new fancy TV is showing off.

Another fun thing you can do with the DICE game is to move the slider to 0% and literally turn off the lights, you can see just how real-time the lighting as you tell the engine the brightest any source of light can be is 0.

Rise of the Tomb Raider
So this game has a similar setup, a brightness slider which controls the black point (move to lowest point) and then a secondary HDR slider which controls the peak brightness.
Tomb Raider has been capped at 4000nits. Like the Frostbite games, either set the slider to the Max to let your TV tone map, or follow the on screen instructions to try to eyeball the peak brightness and let the game output.

RotTR is particularly great as there are loads of specular highlights, not just in the normal places you'd expect to see them in real life such as on the shiny ice and the twinkles in the snow as they reflect from the sun.


But also in lower light conditions on less obviously "shiney" surfaces, so as on this insanely high resolution boot.


It appears that once the output reaches 4000nits, any level above this jumps straight to 10k nits (which the display will clip anyway, as presumably the metadata is telling the display it's 4000nit)
It's not data that is missing, it can be brought into visibility occasionally, which suggests that it is an artistic decision or part of the process for grading the game for HDR.



Assassin's Creed Origins

AC:Origins is another game that really does HDR well, like Tomb Raider is also capped at 4000nits (the pink bits) and offers a brightness slider which should be set as low as possible , based upon your viewing environment. Also it offers you a Max Luminance, which is neatly labeled in nits.
There is also a "Paper white" scale, so as well as having sliders that dictate the darkest something can be in game, the brightest it can be, the game gives you a scale that allows you to adjust one of the mid points: How bright is a piece of paper.



Ubisoft's recommendation for the paper white slider is

However, like the brightness slider, this is here to allow you to adjust the output of the game to match your viewing conditions. If you are in a cinema-like environment with controlled lighting, technically you would set it to 80 nits; as your surrounding light increases you will prefer higher values.
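To see why 80 nits is the technically correct reference, you can run the PQ mapping in the other direction. A small sketch of the inverse ST 2084 formula (nits → 10-bit code value), showing where an 80-nit paper white actually lands on the scale:

```python
# Inverse of the ST 2084 (PQ) EOTF: luminance in nits -> 10-bit code value.
# Constants are from the SMPTE ST 2084 specification.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def nits_to_pq(nits: float) -> int:
    """Encode luminance in nits as a 10-bit PQ code value (0-1023)."""
    y = (nits / 10000) ** M1
    return round(1023 * ((C1 + C2 * y) / (1 + C3 * y)) ** M2)

# An 80-nit paper white lands just under half of the 10-bit code range,
# which is why a technically correct picture can look dim in a bright room.
print(nits_to_pq(80))
```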



Setting the games to the technically correct settings also highlights how HDR10/Dolby Vision can be too dim for many consumers. You'll also see how developers are still getting to grips with these new technologies: the HUD in AC becomes a little too dark once the main game image is calibrated correctly.


Moving on, we can look at three Microsoft first-party offerings, which are all working with a full 10,000-nit output.

Forza Horizon 3

A really great example of how HDR doesn't need to be used just for crazy bright sun-spots, magic and fires.
Cloudy days still have very bright skies, and photographers have had to use a few tricks to deal with the contrast issues they cause. Forza really shines here; it's almost enough to make me feel the cold and drabness as if I were seeing it with my own eyes.

You can see the sky reaching 4000 or so nits, the tyres and grille being suitably unlit and dark, and then the headlights pumping out a full 10k-nit output.


In a night time environment you can see the game making full use of the darker end of the scale, whilst the explosions and headlights are still illuminating in a realistic fashion.


This is also a good example, even in SDR, of how a very dark image which adheres to the correct HDR10 standards will be perceived as being too dark or dim, or as "crushing blacks". As we can see from the luma map, the detail is all actually there, but human eyes cannot adjust to see such details until they are exposed to low levels of light for 10 minutes or so, when certain chemical reactions occur within the cells of the eyes. This is obviously an issue for many consumers, who probably aren't in lighting conditions conducive to this happening.

Gears of War 4


INSECTS


All 3 games work with 2 sliders:
A brightness slider, which controls black levels but also an aspect of contrast, pushing max nit output up to 10,000.
A secondary HDR slider, which allows the max output to be set below whatever the first slider adjusts to.
Forza and Gears simply refer to these as Brightness and HDR, while INSECTS calls them HDR Contrast and HDR Brightness.

Now let's look at a few other implementations.

Shadow of Mordor
SoM offers a super-simplified approach: the game always outputs a 10k-nit maximum and the tone mapping is left entirely to the TV. This is interesting, as it means the developers have never actually seen the game outputting at 10k nits, because no such display exists.
There is a traditional Brightness slider which allows users to specify the black point to their taste/viewing conditions.

Here we can see the obvious places to look for 10k nits: the sun and the specular highlights. You can also see that the shadowed side of the player character is as dark as it should be.


Agents of Mayhem
Similar approach here, except the game is capped at 1000 nits and has obviously been graded/toned with this in mind, as it's a fairly achievable output for a consumer display. I don't think it's a coincidence that this is one of the games with a really great-looking HDR output.





Again, like other games, adjust the slider to the left to improve black level, although this does appear to have an effect on

DEUS EX : Mankind Divided
Lots of raised blacks in this, perhaps as an artistic choice, but there is also a bug which causes the grading to totally fail if the in-game brightness slider goes below a value of 35%.


This looks like the result of some kind of flawed curve adjustment.


40-45% will give you 1000nit output without raising the blacks too much.

Final Fantasy XV
From one Square Enix meh, to a Square Enix wow.
1000 nit fixed max output and a simple brightness slider to drop black levels.
Really fantastic grading throughout and in various lighting conditions.





Even the title screen has 2D elements optimised for HDR.


Monster Hunter World
Much like Deus Ex, Monster Hunter World appears to operate within 4000 nits; and also like Deus Ex, when HDR is enabled the game has severe black level problems. At the default brightness setting, this is what we are getting.



All mids and highlights, where are the shadows?

So with a quick and dirty levels amendment, we can remove the extra HDR luminance from the data and take a look at the histogram.
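That kind of check can be done in a few lines: decode the captured PQ code values to nits, then bucket the pixels into rough luminance bands. The frame below is synthetic stand-in data for illustration, not an actual Monster Hunter capture, and the band boundaries are my own choice:

```python
from collections import Counter

# ST 2084 (PQ) EOTF constants, per the SMPTE specification.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code: int) -> float:
    """Decode a 10-bit PQ code value (0-1023) to luminance in nits."""
    e = (code / 1023) ** (1 / M2)
    return 10000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

def luminance_bands(codes):
    """Bucket 10-bit PQ code values into rough luminance bands (nits)."""
    def band(nits):
        if nits < 1:
            return "shadows (<1)"
        if nits < 100:
            return "mids (1-100)"
        if nits < 1000:
            return "highlights (100-1000)"
        return "speculars (1000+)"
    return Counter(band(pq_to_nits(c)) for c in codes)

# Synthetic stand-in for a raised-black frame: the bulk of the image
# sits in the mids, with nothing at all in the shadow band.
frame = [260] * 900 + [600] * 80 + [800] * 20
print(luminance_bands(frame))
```

A histogram like the broken Monster Hunter one shows up immediately here: all mids and highlights, and an empty shadow band.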



If we compare this to an in-game SDR shot taken just moments later,



we can see the significant shift between the SDR and the HDR toning. Contrast and black levels are totally out of sorts in HDR.
This can be remedied slightly by dropping the brightness down as low as it goes, but the change isn't enough to make it right. It appears to be at least partly caused by some kind of eye adaptation that is occurring.


Whew!
That was a lot of images!
Incredible thread.


Echo

Member
Oct 29, 2017
5,319
Mt. Whatever
Wow this is amazing effort and great thread!

I knew Monster Hunter was fucked lol. I really hope they get that fixed. Even though the shadows are borked, I noticed that HDR pretty much eliminates color banding so I leave it on.

So to set up HDR properly, you first turn it on in-game and then adjust in-game brightness for black levels? See, I never understood that part. The games always ask you to adjust brightness to make something "barely visible", but I never knew if you were supposed to do that before or after turning HDR on. Now I know, and this will probably help a whole bunch with my perception of HDR.