Another couple of weeks of playing around and I've managed to refine my techniques a little further, found out how to apply them to video and have pulled out a few other great examples.
This is a follow-up to this thread:
https://www.resetera.com/threads/hdr-games-analysed.23587/
I've spent some more time with a couple of Frostbite games, as these have really great dynamic tone mapping systems that allow you to very specifically tune the picture so it works perfectly, without any clipping or harsh loss of detail, on any HDR TV.
Star Wars Battlefront 2
This is a game that can do a full 10000nit HDR output. The in-game HDR slider allows you to cap the lighting engine at a maximum value; everything between the SDR range and that point is then brilliantly scaled accordingly, removing your TV's need to attempt this without any context of what the image is.
In the example below I am moving the HDR slider from the top, 10000nits, to the lowest point, which is 0nits.
This is a really good example, as we have the natural phenomenon of the Sun, but also the lightsaber, which is also particularly illuminated.
You can see at the upper values the pinks and whites appearing in the sunspots (4000-10000nits) and the lightsaber, but as we move that slider downwards the lighting is reconfigured. However, you will see that the bulk of the image remains the same, as these tones all sit within the SDR range, which will always make up the bulk of what we are looking at.
That is, until you tell the lighting engine that 0nits is the brightest anything can be, and the screen essentially turns black.
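As a rough illustration of the slider behaviour described above, here is a crude linear sketch: tones in the SDR range are left untouched, and the highlight range is rescaled to fit under the cap. The constants and the linear compression are my assumptions for illustration; Frostbite's actual tone mapper will be a smoother curve than this.

```python
# Crude linear sketch of an in-game HDR peak-brightness cap.
# Assumptions (not from the game): the engine works in absolute nits,
# treats 0-100 nits as the "SDR range", and 10000 nits as its internal
# maximum. The real Frostbite curve is certainly smoother than this.

SDR_MAX = 100.0       # nits treated as the SDR portion of the image
ENGINE_MAX = 10000.0  # internal maximum the lighting engine can produce

def apply_peak_cap(nits: float, cap: float) -> float:
    """Rescale highlights so nothing exceeds `cap`; SDR tones untouched."""
    if cap <= SDR_MAX:
        # Degenerate case: slider at (or below) the SDR range leaves
        # no highlight headroom at all.
        return min(nits, cap)
    if nits <= SDR_MAX:
        return nits  # the bulk of the image stays exactly the same
    # Compress the 100-10000 nit highlight range into 100-cap.
    headroom = (cap - SDR_MAX) / (ENGINE_MAX - SDR_MAX)
    return SDR_MAX + (nits - SDR_MAX) * headroom

print(apply_peak_cap(50.0, 4000.0))     # SDR tone: unchanged -> 50.0
print(apply_peak_cap(10000.0, 4000.0))  # brightest highlight -> 4000.0
print(apply_peak_cap(10000.0, 0.0))     # slider at 0: everything black -> 0.0
```

The key point is the second branch: because everything under 100 nits passes through unchanged, moving the slider only reconfigures the highlights, which is why the bulk of the image looks the same at every slider position.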
What is particularly evident if you look at this on a TV is how the lightsaber (when the HDR is set too high) actually loses much of its colour as your TV clips out the data as unusable, producing a very light green, almost white, colour, not just in the central emissive part, but way out into the "glow" too.
The lightsaber is also a good example of how HDR is an artist's tool, just like any other aspect of colour, and the technology needs to be used correctly to ensure it looks as the artists intend.
Horizon Zero Dawn : Frozen Wilds
Even from this short clip we can see how to do HDR right: almost everything you see sits within the standard SDR range, but the highlights on Aloy's weapons glimmer in the sun, and the sparkle that runs down her back heads towards the 10k nit level at times. You can see the clouds are all highly illuminated and sit between 1000-4000nits, with the sun itself hitting 10k.
Horizon doesn't have any peak brightness slider, so, much like other games that handle this 10k nit output the same way (Shadow of Mordor, for example), we don't know what the game is communicating to the TV via metadata.
It may be telling it that it's a 10k nit piece of content (as the data is clearly there), or the game itself may have been graded/mastered to "target", say, 1000nits, but not hard capped by the engine.
Note how the sparkles on Aloy's back are 10k (white) at certain points, but for the vast majority of the time are around 1000nits (red).
This is something I would like to explore more, but it requires some hardware that can read the HDMI HDR metadata.
Uncharted 4
I got some HDR footage of Uncharted 4 too; this appears to have the same setup as Horizon and a few other games.
Steep
This recently got an HDR update and is another super pretty game. Everybody loves a sunset.
Steep has super amazing real time lighting and time of day that you can alter at the press of a button, so here is an example of cycling through lots of different lighting conditions.
Just like Uncharted and Horizon, we have no HDR brightness slider. However, unlike those two games, we see that nothing ever exceeds 4000nits.
The reason that 4000nits is an important number in several games, and has its own colour in my visualizations, is that it is the current maximum value for Dolby Vision (although the format supports up to 10k).
Dolby's reference monitor is a liquid-cooled behemoth of an LCD screen with RGB FALD, true blacks and a 4000nit output.
Sony has a reference OLED monitor which is 1000nit.
Essentially, a AAA game or movie is likely to have been graded on one of these two outputs, so when we see certain values appear we can make some assumptions that the game has actually been tested at those outputs, which may be why, when we see these numbers, the implementations typically look really natural.
There are a number of games that use 4000nits as an in-engine limitation (Steep, Tomb Raider, AC:Origins), including one of the soon-to-be undisputed kings of HDR output:
Sea of Thieves
Which looks incredible and natural at every turn
Halo 3 (Backwards Compatible)
I was really skeptical about what they were doing with the HDR in the BC games (partly because there was conflicting information about it), but it appears to be just as fantastic as in many other titles.
The underlying lighting engine was running in HDR, with the final image presented to the screen being tone mapped down to SDR. Now, with some amazing wizardry, they've gone back and, via the emulation, swapped out that tone mapping for an HDR workspace.
Halo 3 appears to target 1000nits as the peak brightness; on the default (normal) brightness setting you get a wonderfully exposed image with all the extra luma from the HDR.
It was always a good looking game, but it really is amazing how good it looks with 4K and HDR.
One of the most amazing scenes for showing off Halo 3's Lighting.
Bonus 360 vs X enhanced Halo 3 image
Mirror's Edge
Another HDR-compatible BC title. Initially I thought this was some kind of pseudo HDR, as I was constantly coming across clipped whites with very bright light sources, and this still may be the case. The HDR component of the game caps out at 1000nits.
But I think that is probably intentional, as it disappears as you get closer. Perhaps it's a combination of LOD and a stylistic choice of high-intensity surface glare, like you get on a road on a really sunny day. That is certainly the look of Mirror's Edge I have in my mind from 10 years ago...
Such a nice looking game, it really reminds me of being in Tokyo.
You can kind of see it happening in this shot here: the diagonal pillar in the centre is brighter and less detailed at a distance.
Claybook (Early Access)
So here is something a little different, Claybook launched last week and is HDR compatible.
It's a really beautiful game, quite unlike anything else, and it looks particularly great in 4K.
So far, rather unusually, the game appears to operate entirely within the SDR range, even with HDR enabled; from what I can see, it is capped at 200nits.
This may be an artistic choice or perhaps a workaround to enable the 10bit colour on the console.
It's likely a combination of the two: the game has a very matte, flat pastel look, with raised blacks and toned-down highlights, which is a very popular look in other media and the basis for several Instagram filters.
ReCore
For my next project I may look further at the specifics of how these sliders affect the output. Different games take different approaches.
Here is a shot of ReCore, for example, with the brightness as low as it can go.
It's still hitting 4000nits+ on the highlights, well in excess of what any screen will offer, so you could play it like this. However, I imagine the overall image is both less bright than intended and darker than you would like to see, so you would probably raise it above that.
If you go the other extreme and take the brightness to the top you will see that the whole image has become brighter and every value has become magnified, which will literally burn your eyes out!
ReCore actually has what appears to be a broken HDR calibration slider: it is asking you to adjust your peak brightness until it hits 200nits.
Destiny 2
Another really great example of how HDR is supposed to work. It looks like the same game; you don't put it on and go "oooh, HDR has been enabled".
However, you see the HDR parts of the image where you should be able to see them, giving you a more lifelike expression of light and darkness.
When brightness is set to 4, the game looks like it is targeting 1000nits, with very, very bright things sometimes breaching this limit.
Adjusting the in-game brightness will change the overall brightness, and you'll breach that 1000nit target more often.
Destiny 2 does have a couple of weird things going on
Just as it does in SDR mode, the game has a brightness control which defines the darkest point in the game, which would be fine. Except I suspect the actual calibration screen is broken, as the first four values, according to all the data, mostly sit below 0nits. This is actually another part of the image where a TV will typically tone map.
It would all be OK if you could set your black point close to true code value 0, but when you actually go and look at the game, it has a profound negative effect on the overall image, and the actual game output doesn't reflect the changes you've made.
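For context on "code value 0": HDR10 signals encode absolute nits using the SMPTE ST 2084 (PQ) curve, so every measured luminance corresponds to a 10-bit code value. The sketch below uses the published ST 2084 constants; the full-range 0-1023 quantisation is my simplification (broadcast video is usually limited-range, which shifts the numbers slightly).

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute nits -> 10-bit code value.
# Constants are from the ST 2084 specification; the full-range 0-1023
# quantisation is a simplification of real video signal ranges.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_signal(nits: float) -> float:
    """Normalised PQ signal (0..1) for a luminance in nits (0..10000)."""
    y = max(nits, 0.0) / 10000.0
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

def pq_code(nits: float) -> int:
    """10-bit full-range code value for a luminance in nits."""
    return round(pq_signal(nits) * 1023)

print(pq_code(0))       # true black -> 0
print(pq_code(100))     # SDR-ish reference white -> 520
print(pq_code(1000))    # common grading target -> 769
print(pq_code(10000))   # PQ maximum -> 1023
```

This is why a black point "below 0nits" is suspicious: code value 0 already represents true black, so there is nowhere darker for a calibration pattern to go.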
Below are three images: the top with the game set at 7, the middle at 4 and the bottom at 1.
As you can see, the lowest setting actually darkens the whole image, removing the HDR aspect from the game and giving you an overall darker image than its SDR counterpart.
According to the settings screen, the highest setting should still maintain black areas, but you can see that the shadows are actually now too bright.
4 appears to achieve dark areas and hits that 1000nit spot in the highlights.
Not only is the brightness settings screen possibly inaccurate, but the game itself can't switch between HDR and SDR the way every other game I've tested can; it causes the Xbox to stop displaying an image. I suspect that would normally be the type of thing certification would check.
Hitman
One of my favourite games this gen, at least in part due to how wonderful and varied the locations are.
Hitman is the only example of a game that maxes out at 2000nits when set via the in-game HDR setting.
Hitman has a settings anomaly, where the gamma setting which you can edit in SDR mode carries across into HDR, so if you have adjusted it you will need to set it back to 1.0.
Another really good game to test out your TV and even calibrate it.
If we take a trip into the HDR settings and take some measurements, we can see exactly what the game allows you to adjust.
The HDR slider is labelled 4.00 (400nits) at its lowest point and 20.00 (2000nits) at the top end. If you have calibrated your TV using one of the Frostbite games or AC:Origins, you can expect to end up with the same value here in Hitman.
Below is a comparison of looking at the sun (which will commonly be defined as one of the brightest things in a game) at 400nits vs 2000nits.
You can see here that the lighting engine behaves a little differently to other games: the bloom around the sun doesn't change so much, but the intensity of the sun itself does.
Again, you can see the same thing occurring in other places; the only things affected are those that are very bright. This explains why the game is consistently really good looking across various different displays: it doesn't wildly deviate from how it would look in SDR.