
EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
So you may have seen some of my other threads taking a look at the HDR output of various games, trying to understand why a game might look particularly good and which games are going to get the best out of a premium HDR TV.

There have been great games: Call of Duty WW2 and Sea of Thieves.
And some stinkers: Monster Hunter World and Deus Ex.

But for the most part, the developers that have gone to the effort of implementing HDR have done so successfully.


Last week Nier Automata launched on Xbox One, and alongside various improvements for the game on the X, HDR was an advertised new addition to the title as it belatedly came to Microsoft's platform.

Starting to play, it quickly became apparent that the game didn't have the same wow factor I have become attuned to whilst playing other HDR games; at first I thought this was maybe just the look of the game. Others online were commenting that the game looked dim, even comparing screenshots against other versions and noting that the colours were more muted than on PC/PS4, with the game looking a little washed out.

My first look at the game made it appear totally SDR, with not a glimmer of any peak brightness close to what we would expect.



Another look made it clear that the game was so low in its dynamic range that my current methods for determining the output didn't work on Nier, so I had to build some new tools to help identify the very low peaks.

Tuning my tools to find the highest peaks in an image confirmed that the game does have a very low contrast output, with the highest peaks I could find hitting around 600 nits. You can see them as red pixels in the image below.
[Image: pZ70DaB.jpg - HDR capture with the brightest (~600 nit) pixels marked in red]
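For anyone curious, here is a minimal sketch of how a peak finder along these lines can work. It's not my actual tool: it assumes the Xbox capture has already been converted to a 16-bit PNG holding the 10-bit PQ (ST 2084) code values, and the file names are just placeholders.

```python
import numpy as np
import imageio.v3 as iio

# SMPTE ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code):
    """Convert normalised PQ code values (0..1) to absolute luminance in nits."""
    p = np.power(np.clip(code, 0.0, 1.0), 1.0 / M2)
    return 10000.0 * np.power(np.maximum(p - C1, 0.0) / (C2 - C3 * p), 1.0 / M1)

# Hypothetical file name: a 16-bit PNG containing the 10-bit PQ capture.
img = iio.imread("nier_hdr_capture.png").astype(np.float64) / 65535.0

# Per-pixel brightest channel is a rough proxy for luminance
# (a proper measure would be a BT.2020-weighted sum of R, G and B).
nits = pq_to_nits(img[..., :3]).max(axis=-1)
peak = nits.max()
print(f"Peak luminance found: {peak:.0f} nits")

# Paint pixels within 5% of the peak red, like the image above.
marked = (img[..., :3] * 65535).astype(np.uint16)
marked[nits >= 0.95 * peak] = [65535, 0, 0]
iio.imwrite("nier_hdr_peaks_marked.png", marked)
```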



So I went on to accept that the game had actually been mastered for 600 nits, which seems like an odd value; however, it's not unheard of (HDR favourite Planet Earth 2 is mastered to a similar level).

As you progress further through the game the environments change, and as I started to reach greener, more foliage-laden areas the game triggered a cutscene which looked almost entirely different to the game I was just playing (apart from being a poor quality 1080p video): it was noticeably lighter, brighter and more colourful.
It's unusual for devs to encode in-game FMVs in HDR, and this one was likely to have been rendered from another version of the game, however I was confused as to why the tone and mood looked so significantly different.

So I disabled HDR.....
Oh. The game offered more contrast, was brighter and more saturated. Essentially the exact opposite of what I would expect to happen.


So I started digging deeper.
First of all I needed to identify whether this difference was simply a processing variation between SDR and HDR modes on my TV.
The simplest way to double check this is to take some raw screenshots with HDR enabled and disabled and compare them, which is what I did.
It became almost immediately obvious that the two screenshots I had taken were almost identical, with some very minor differences in black level, white level and exposure.

In the image below you can see the two raw screenshots from the Xbox (labelled SDR and HDR output). The SDR image is a PNG from the Xbox and the HDR image is a 10-bit JXR from the Xbox.

You can then see that I have converted each image to match the other output.
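As a rough sketch of that kind of comparison (not my exact process: it assumes both grabs have been converted to PNGs, uses a plain 2.2 gamma for the SDR image, ignores the BT.709 vs BT.2020 gamut difference, and scales SDR white to the ~600 nit peak measured above; file names are placeholders):

```python
import numpy as np
import imageio.v3 as iio

# SMPTE ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code):
    p = np.power(np.clip(code, 0.0, 1.0), 1.0 / M2)
    return 10000.0 * np.power(np.maximum(p - C1, 0.0) / (C2 - C3 * p), 1.0 / M1)

sdr = iio.imread("nier_sdr.png").astype(np.float64) / 255.0      # 8-bit SDR grab
hdr = iio.imread("nier_hdr.png").astype(np.float64) / 65535.0    # 16-bit PNG of the PQ data

# Bring both captures into linear light (nits) so they can be compared directly.
sdr_nits = (sdr[..., :3] ** 2.2) * 600.0   # assume SDR white lands around the measured 600 nits
hdr_nits = pq_to_nits(hdr[..., :3])

diff = np.abs(sdr_nits - hdr_nits)
print(f"mean |difference|: {diff.mean():.1f} nits, max: {diff.max():.1f} nits")
```

If the HDR output really is just the SDR image repackaged, the differences come out tiny, which is exactly what the screenshots above show.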



The game is entirely in SDR: it is expanded from 8-bit to 10-bit and then adjusted to fit into the appropriate place within an HDR output.
Nothing is gained; in fact, things that would previously have been darker now become lighter than they should be, resulting in a lower contrast, washed out image.
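To illustrate, here is a hedged sketch of what a "fake HDR" pass like this roughly amounts to. The numbers are assumptions chosen to line up with the ~600 nit peaks measured above, not the game's actual code.

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def nits_to_pq(nits):
    """ST 2084 inverse EOTF: absolute luminance in nits -> PQ code value (0..1)."""
    y = np.clip(nits / 10000.0, 0.0, 1.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

def sdr8_to_hdr10(sdr8, peak_nits=600.0):
    """Stuff an 8-bit SDR frame into a 10-bit PQ container (the 'fake HDR' pattern)."""
    linear = (sdr8.astype(np.float64) / 255.0) ** 2.2   # undo SDR gamma (assumed 2.2)
    pq = nits_to_pq(linear * peak_nits)                  # park the whole SDR range below ~600 nits
    return np.round(pq * 1023.0).astype(np.uint16)       # 10-bit code values

# SDR black, mid grey and white land at fixed points inside the HDR range;
# no pixel can ever go brighter than SDR white did, so nothing real is gained.
print(sdr8_to_hdr10(np.array([0, 128, 255])))
```

If on top of that the black level is lifted or the exposure nudged, you end up with exactly the lower contrast, washed out look described above.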

This is the HDR equivalent of upscaling a game and claiming it is a higher resolution than it actually is, but then doing the upscaling incorrectly and actually losing a bit of resolution in the process.

It's super disappointing that a developer has done this and we should expect better. HDR is already a total mess of a format with such varied support and capability. It's clearly possible at this point in time with such a new technology for a developer to misunderstand and produce poor results, but this feels more like an effort to deceive.
 

TemplaerDude

Member
Oct 25, 2017
2,204
Yeah, this is the kind of thing that makes people question HDR. Unacceptable implementation, to say the least.
 

Lukemia SL

Member
Jan 30, 2018
9,384
Can you check out Crash Bandicoot N Sane too? Something seems really off with it to me. I can't pinpoint it. The colours are hella weird too.

Yeah, this is the kind of thing that makes people question HDR. Unacceptable implementation, to say the least.

I don't question it; most of the games I have played have been remarkable, but implementation is everything with it.
Crash and, to some extent, The Witcher 3 are not the best. Now Nier too.
 

Deleted member 36578

Dec 21, 2017
26,561
This is a good critique. HDR is still a weird mythical beast to my eyes. I still don't know if I've actually witnessed it for myself. It's certainly a huge mess of a format and I guarantee a lot of consumers that have it enabled don't even realize what it is or what it's doing.
 

Jedi2016

Member
Oct 27, 2017
15,598
You'd think at least one person on the team would understand how HDR actually works, and that this is not it.
 

WillyFive

Avenger
Oct 25, 2017
6,976
That's pretty dirty.

I also think upscaling is super dirty, but since people were fine with that, people will probably be fine with this too.
 

SK4TE

Banned
Nov 26, 2017
3,977
This is a good critique. HDR is still a weird mythical beast to my eyes. I still don't know if I've actually witnessed it for myself. It's certainly a huge mess of a format and I guarantee a lot of consumers that have it enabled don't even realize what it is or what it's doing.
All I know is it makes lights and neon look really good.
 
Can you briefly explain what the actual parameters are for HDR concerning real-time rendering? Is it a specification that needs to meet certain numbers/variables to be considered "HDR"? Is it an actual process? Is it a protocol of some kind?
 

StuBurns

Self Requested Ban
Banned
Nov 12, 2017
7,273
This is a good critique. HDR is still a weird mythical beast to my eyes. I still don't know if I've actually witnessed it for myself. It's certainly a huge mess of a format and I guarantee a lot of consumers that have it enabled don't even realize what it is or what it's doing.
When done well, it makes a huge difference.

GT Sport, FFXV, TLG, it's a night and day experience for sure.
 

FiXalaS

Member
Oct 27, 2017
6,569
Kuwait.
You made the tools?

Interesting write-up. Disappointing because this is false advertising, but the game is really great; I hope people stay for the robots as well.
 

Deleted member 36578

Dec 21, 2017
26,561
When done well, it makes a huge difference.

GT Sport, FFXV, TLG, it's a night and day experience for sure.

I wish I could come over and see your setup lol. Whenever I look at HDR enabled games it just looks different to me. Not necessarily better or worse.
 

JahIthBer

Member
Jan 27, 2018
10,372
Amazing investigation skills. As a fan of Ultra HD Blu-rays, I can say some of them have piss poor HDR conversions too.
 

mjc

The Fallen
Oct 25, 2017
5,879
I'm not sure if I should be more miffed at Square for saying it's HDR or Microsoft for allowing it to be advertised as such.
 

StuBurns

Self Requested Ban
Banned
Nov 12, 2017
7,273
I wish I could come over and see your setup lol. Whenever I look at HDR enabled games it just looks different to me. Not necessarily better or worse.
I've had two HDR TVs, the Bravia XE80, and the LG C7, and yeah, the C7 is certainly the more impressive HDR showing, but the Bravia was still a big jump over SDR.
 

pswii60

Member
Oct 27, 2017
26,647
The Milky Way
At least fake HDR gives us 10-bit colour, so that's something.

But yeah, it's disappointing. But not surprising for this game tbh, and I wouldn't worry about it becoming a trend.
 

M1chl

Banned
Nov 20, 2017
2,054
Czech Republic
I'm not sure if I should be more miffed at Square for saying it's HDR or Microsoft for allowing it to be advertised as such.
MS probably look at the metadata output from the console, and if there is a flag saying it's an HDR10 game, they just slap "HDR Game" on the store and on that list. There are no physical copies; the outrage would probably be bigger if it were printed on the case.
 

Deleted member 36578

Dec 21, 2017
26,561
I've had two HDR TVs, the Bravia XE80, and the LG C7, and yeah, the C7 is certainly the more impressive HDR showing, but the Bravia was still a big jump over SDR.

That C7 looks like a hell of a TV. If I had a bunch of extra money right now I'd go out and get one immediately lol
 

JahIthBer

Member
Jan 27, 2018
10,372
MS probably look at the metadata output from the console, and if there is a flag saying it's an HDR10 game, they just slap "HDR Game" on the store and on that list. There are no physical copies; the outrage would probably be bigger if it were printed on the case.
I think it's more like Microsoft paid Square to have HDR exclusivity (Tomb Raider) or simply asked to have HDR support for the Xbox release, but Square kinda just shoehorned it in.
 

pswii60

Member
Oct 27, 2017
26,647
The Milky Way
MS probably look at the metadata output from the console, and if there is a flag saying it's an HDR10 game, they just slap "HDR Game" on the store and on that list. There are no physical copies; the outrage would probably be bigger if it were printed on the case.
Well yeah, you can't expect Microsoft to analyse the HDR output range for every game they certify. It's Platinum/Squenix that bear the responsibility here, obviously.
 

M1chl

Banned
Nov 20, 2017
2,054
Czech Republic
I think it's more like Microsoft paid Square to have HDR exclusivity (Tomb Raider) or simply asked to have HDR support for the Xbox release, but Square kinda just shoehorned it in.
Don't know obviously; it was just my take on how they quite simply determine whether it's HDR enabled or not. I didn't think about any exclusivity. Seems pointless at this scale.
 
OP

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
Can you briefly explain what the actual parameters are for HDR concerning real-time rendering? Is it a specification that needs to meet certain numbers/variables to be considered "HDR"? Is it an actual process? Is it a protocol of some kind?

The parameters are no different for real time rendering, but the actual thresholds for what is considered HDR are not enforced.

SDR / Standard Dynamic Range content (or "LDR", Low Dynamic Range content) was typically made to be displayed at around 100 nits; if you are at a theatre, this is about the brightest something could be.

As soon as you start producing content that goes above that, you are creating content with a larger dynamic range, and you need a display that is able to show it.
HDR movies are available that range anywhere from 500 nits to 4000 nits, with a view that in future this will go as high as 10,000.

TV manufacturers were desperate to launch new TVs and a new format, and in doing so they started producing displays that went beyond SDR technology, but there was huge variation in capability. Displays were being sold as HDR that were not really much brighter than an SDR display.

Consumers got confused, content didn't look much better (if better at all) and confidence was low. The manufacturers had started selling a new technology that was not standardised, which would make it difficult to sell an improved set later on down the line.

A few of the manufacturers got together in an attempt to set a threshold and standardise HDR capabilities.
[Image: sony-ultra-hd-premium-logo-1.jpg - Ultra HD Premium logo]

UHD Premium was created. In order to be an Ultra HD Premium display you needed to be able to display 1000 nits of brightness and less than 0.05 nits on screen simultaneously.
Now, because the manufacturers were also trying to sell OLED screens, which were not able to come close to this threshold, they bent the rules and lowered the peak requirement for OLED screens to 540 nits, because they could go darker (0.0005 nits) with ease. This was deceptive, as HDR content is all about the extra brightness and not about dark imagery.
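To put quick numbers on that (just a sketch using the figures above):

```python
lcd_contrast = 1000 / 0.05      # LCD tier: 1000 nit peak, 0.05 nit black floor
oled_contrast = 540 / 0.0005    # OLED tier: 540 nit peak, 0.0005 nit black floor
print(f"LCD tier:  ~{lcd_contrast:,.0f}:1 contrast with 1000 nit highlights")
print(f"OLED tier: ~{oled_contrast:,.0f}:1 contrast but only 540 nit highlights")
```

So the OLED tier wins massively on contrast ratio, but only because of the blacks; its required peak brightness, which is what HDR highlights actually rely on, is roughly half the LCD requirement.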

Sony famously refused to participate in the standard, as they were concerned that this would not leave enough low/mid range products capable of meeting these criteria, which is where they typically sell the vast majority of their displays.
So they went with their own format:
[Image: sony-4k-hdr-tv.jpg - Sony "4K HDR" TV branding]



This meant that the TV could accept a 4K and HDR signal, but it was no guarantee of anything else.

More recently they have backpedalled and started referring to their TVs as XDR / EDR (Extended Dynamic Range).
Because, as the others had previously, they became aware that they were going to sell a load of TVs as "HDR" and then consumers would not understand that the experience would be better with newer models down the line, and would not feel compelled to purchase an actual HDR TV from Sony.


So there has been loads of politics and bullshit from Day Zero of HDR and Displays.

That probably hasn't fully answered your question, but it's hard to keep it briefer than that :P
 

True Prophecy

Member
Oct 28, 2017
1,919
Thanks for this research; guess I won't double dip on the XB1X version. I really hope Digital Foundry will get to this level of HDR analysis one day.
 
OP

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
Is this something similar to what you mentioned TW3 does, just done poorly?

I *think* the Witcher 3 is doing something similar, except they have made some effort to have the output actually drive your TV to display some high peaks in vaguely appropriate places.
 

c0ld

Member
May 13, 2018
7
I hope HDR upscaling doesn't become a trend... Thank you for your analysis, I really enjoy the work you are doing.
 

Weltall Zero

Game Developer
Banned
Oct 26, 2017
19,343
Madrid
That's gotta be a bug. This is far worse than upscaling; an equivalent with resolution would be the game rendering natively on 720p TVs, but upscaling from 540p on 1080p TVs. And even then I think that wouldn't look as bad as whatever the fuck they're doing in HDR.
 

Conkerkid11

Avenger
Oct 25, 2017
13,943
How difficult is it for developers to implement proper HDR to the point where faking it made more sense to them? Like, what goes into making a game use proper HDR?
 

Falconbox

Banned
Oct 27, 2017
4,600
Buffalo, NY
Yeesh, that last tweet looks awful. The HDR output makes the scene look dark and you can barely see anything, while the SDR shot has normal lighting.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,928
Berlin, 'SCHLAND
This is a very important thread, as it talks about how content for HDR still has to be mastered and *tone mapped* to a degree for nit levels (presuming they are not fully user configurable, which would be best). It is not just the raw HDR image! It is a rough thing, and I really hope the HDR standards bodies and game devs start thinking about or using automated ways of solving this problem - this looks pretty bad as per the OP.
 
The parameters are no different for real time rendering, but the actual thresholds for what is considered HDR are not enforced. [...]

So wait, HDR is strictly a higher contrast ratio? Does that mean that the source is responsible for sending a signal that will look good when run through HDR?
 

Verelios

Member
Oct 26, 2017
14,876
The parameters are no different for real time rendering, but the actual thresholds for what is considered HDR are not enforced. [...]
Huh, this was an interesting read.
 

gabdeg

Member
Oct 26, 2017
5,954
Devs, please put HDR toggles into all games. The higher res on Nier looks great, but it stinks having to turn off HDR system-wide so the game doesn't turn into a dim gray mess.
 

CoLD FiRE

Banned
Nov 11, 2017
369
Thank you for pointing this out. Hopefully they fix this, that low contrast look is awful.

Yeah, that's not happening. They've left the PC version broken since release, with even bigger problems than a fake HDR implementation, and never fixed it.

It's really a shame how this game has been handled, since it's a great game and deserves more care from Platinum and Square Enix.