OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
Nier Automata is not easy on the eyes at the best of times... Adding HDR is like trying to polish a turd.

Shame, because the game is great.

It is a really great game, I've been really enjoying it. The lack of HDR doesn't dampen that; it's just important that we are aware that there are some poor examples of games labeled HDR, and this is one of the highest-profile games to have done this.
 

Pargon

Member
Oct 27, 2017
12,021
So what does HDR actually do? Do the colours become more colourful than ever before, like blue becoming a more specific sapphire blue, or pink turning into magenta or a livelier pink?
HDR allows displays to use an extended range of brightness, and an extended color gamut.
This may only help if you know how to read a histogram, but here's a comparison between HDR and SDR for the same scene:
[Image: luminance histogram of the same scene in HDR vs. SDR]


For all objects below 100 nits brightness in the scene, HDR and SDR look the same. But HDR has additional highlight details that extend beyond 100 nits.

[Image: Mad Max scene with the sun highlight visible against the sky, HDR vs. SDR]


That would be similar to the sun being visible against the sky in this example.
The SDR image on the left cannot go brighter than 100 nits, while the sun in HDR may be 500 nits against a 100 nit sky.
The foreground is roughly similar in brightness on both, since it falls below that 100 nit threshold.

Of course HDR also permits scenes that have a higher average brightness overall. Not all scenes are going to be kept below 100 nits, so you could have an outdoor scene that is twice as bright as an indoor scene in HDR, while they would both be a similar brightness in SDR due to that 100 nits limit.
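If it helps to see that threshold as numbers rather than histograms, here's a minimal sketch (plain Python with made-up scene values, not data from any real game) of how the same scene luminances come out on an SDR display versus an HDR display capable of 500 nits:

Code:
import numpy as np

# Hypothetical luminance values in the scene, in nits: shadow, midtone,
# the sky at reference white, and the sun highlight.
scene_nits = np.array([0.5, 20.0, 100.0, 500.0])

# SDR: everything above the ~100-nit reference white gets clipped.
sdr_nits = np.clip(scene_nits, 0.0, 100.0)

# HDR on a 500-nit display: the same values fit without clipping
# (a real display would tone-map anything above its peak instead).
hdr_nits = np.clip(scene_nits, 0.0, 500.0)

for scene, sdr, hdr in zip(scene_nits, sdr_nits, hdr_nits):
    print(f"scene {scene:6.1f} nits -> SDR {sdr:6.1f} nits, HDR {hdr:6.1f} nits")

Everything at or below 100 nits prints out identical for both; only the 500-nit sun differs, which is exactly what the histogram comparison above is showing.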

The color gamut is also expanded for HDR, though current displays cannot cover the full Rec.2020 gamut; they only cover the P3 colorspace.
[Image: CIE chromaticity diagram with the Rec.709, P3, and Rec.2020 gamut triangles]


The outer triangular shape covers the range of colors that the human eye can see, while the inner triangles show the color range covered by SDR (Rec.709), our current HDR displays (P3), and future displays covered by the HDR spec (Rec.2020).

With full P3 coverage, displays can show much deeper and more vibrant reds and greens, and slightly richer blues - but that depends on the content.
HDR content is not automatically more vibrant and saturated than SDR - that only applies if the content itself doesn't fit into the range of colors that Rec.709 can display.
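If you want rough numbers for those triangles, here's a small sketch using the published xy chromaticities of the Rec.709, P3 (D65) and Rec.2020 primaries. Triangle area in the xy diagram is only a crude proxy for perceived coverage, but it gets the ordering across:

Code:
# Shoelace formula for the area of a gamut triangle in the CIE xy diagram.
def triangle_area(points):
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

# Published xy chromaticities of the red, green and blue primaries.
gamuts = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "P3 (D65)": [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

areas = {name: triangle_area(points) for name, points in gamuts.items()}
for name, area in areas.items():
    print(f"{name:9s} area = {area:.3f}  ({area / areas['Rec.709']:.2f}x Rec.709)")

That works out to P3 covering roughly a third more of the xy diagram than Rec.709, and Rec.2020 nearly double, which is where the deeper reds and greens come from.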

But a lot of filmed content does exceed what the Rec.709 gamut can display, if the camera was capable of capturing it.
[Image: chromaticity diagram showing Pointer's gamut against the P3 and Rec.2020 gamuts]


Here we see a chart that shows surface colors (Pointer's gamut) which is essentially all the colors found in nature that are not direct light sources.
Rec.2020 covers most of it, but P3 is still missing a lot in the blue/green region.

It's also worth pointing out that, if you configure an HDR display incorrectly, it's possible to essentially stretch out Rec.709 colors across the P3 or Rec.2020 color space, resulting in an extremely vivid and saturated image at all times with SDR content.
That's not how it is supposed to look, but some people do that and prefer it (and then wonder why HDR looks "dull").
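To make that "stretching" concrete, here's a rough sketch of the difference between converting a Rec.709 colour into a Rec.2020 container properly and just reinterpreting the same RGB numbers as Rec.2020. The matrices are the linear-RGB-to-XYZ ones from the Rec.709 and Rec.2020 specs (rounded), and the test colour is made up:

Code:
import numpy as np

# Linear RGB -> CIE XYZ matrices (D65 white point), rounded from the specs.
M709_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
M2020_TO_XYZ = np.array([[0.6370, 0.1446, 0.1689],
                         [0.2627, 0.6780, 0.0593],
                         [0.0000, 0.0281, 1.0610]])

# A fairly saturated Rec.709 red, in linear light.
rgb_709 = np.array([1.0, 0.1, 0.1])

# Correct: keep the colour the same (same XYZ), re-expressed in Rec.2020 primaries.
rgb_2020_correct = np.linalg.solve(M2020_TO_XYZ, M709_TO_XYZ @ rgb_709)

# "Stretched": the Rec.709 code values are simply treated as Rec.2020 values,
# pushing the colour out towards the much more saturated Rec.2020 red primary.
rgb_2020_stretched = rgb_709

print("correct  :", np.round(rgb_2020_correct, 3))
print("stretched:", np.round(rgb_2020_stretched, 3))

The correct conversion ends up with less red and slightly more green and blue in Rec.2020 terms, which reproduces the original colour; feeding the untouched values to a wide-gamut display is what produces that permanently over-vivid look with SDR content.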
I've run across a couple of HDR versions that just feel like the devs have implemented a sunglasses filter. It's crazy frustrating.
That often means your SDR settings are significantly brighter than they're supposed to be, as I've previously demonstrated.
Using the Mad Max example again, let's say that you have a 500 nit HDR display.
The sky is supposed to be displayed at 100 nits, and the sun is supposed to be 500 nits. 500 nits really stands out against a 100 nit background.
But SDR allows you to turn up the brightness as much as 5x what it is supposed to be, so that now the entire sky is 500 nits.
[Image: the same Mad Max scene with the SDR brightness raised so the entire sky reaches 500 nits]


This is only possible because of the limited dynamic range of SDR.
If you were to increase the brightness of the HDR image by 5x to match it, you would need a display capable of 2500 nits for the sun, so that it still stands out against the sky.
Even if you did have the option to increase the brightness of the HDR picture by 5x, without the display having that 2500 nits brightness you would end up compressing/clipping the highlights in the HDR image so that it looks identical to the overbright SDR image.
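The arithmetic for that, as a minimal sketch (assuming the 500-nit display and the same sky/sun values as above):

Code:
DISPLAY_PEAK = 500.0   # nits, the display from the example above
SKY_REF = 100.0        # intended sky brightness
SUN_REF = 500.0        # intended sun brightness
BOOST = 5.0            # SDR brightness turned up to 5x reference

# Overbright SDR: the whole sky reaches the display's peak.
sdr_sky = min(SKY_REF * BOOST, DISPLAY_PEAK)       # 500 nits

# Boosting HDR by the same 5x would need 2500 nits for the sun to still
# stand out; on a 500-nit panel it just clips to the same level as the sky.
hdr_sun_needed = SUN_REF * BOOST                   # 2500 nits
hdr_sun_shown = min(hdr_sun_needed, DISPLAY_PEAK)  # 500 nits

print(f"SDR sky: {sdr_sky} nits; boosted HDR sun needs {hdr_sun_needed} nits, shown at {hdr_sun_shown} nits")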

But if you're looking for an overbright and oversaturated image, an HDR display is going to be more capable of it than an SDR display.
You're just going to be disappointed with how true HDR content looks in comparison to it, since that forces most displays into showing more accurate brightness and saturation.
There was a topic a while back where someone wanted just that:
I bought a KS8500 last year and have had a miserable time trying to figure out what HDR brings to video games. Long story short, I just don't see the stunning difference people claim to see with HDR.

I haven't seen any movies in HDR thanks to Sony's ridiculous decision to ship the Pro without a 4K drive. But with Vudu finally adding HDR10 support to some Samsung TVs and the Roku sticks, I picked up a Roku Stick+ on Black Friday and finally got to see Mad Max in HDR today, and yes, I could immediately tell the difference: it looked dull and flat, and just plain wrong.

The orange hue that made the movie so visually striking is replaced with this dull brown filter. I took some off-screen shots to compare, and I know some videophiles here think that's blasphemy, but the difference between HDR and non-HDR is so striking, it easily comes across in pictures.

[Images: off-screen comparison photos of the same scene with and without HDR]


Just look at how boring the so-called HDR shot looks. The orange color is just gone.

The next two shots are from the sandstorm scene; they are blurry but they get the point across.
[Images: off-screen comparison photos of the sandstorm scene with and without HDR]


I just don't understand HDR. With some games the difference is so subtle I go "is that it?" With some games, like Ratchet and Uncharted, I can't even tell the difference. I was really hoping for movies to add that wow factor that justifies this $1k TV and the $8 price tag for HDR movies, but the first impressions of Mad Max have completely soured me on spending another $8-10 on more HDR movies.

P.S. I believe there is something wrong with the HDMI plug that comes with these Samsung TVs; mine might be defective. I have seen demos on the YouTube app and some movies look incredible. This Roku stick goes into the HDMI slot, and while I get the HDR pop-up, I just don't get this eye-catching, stunning image everyone keeps raving about. Mad Max was supposed to be this life-altering experience, and it just looks wrong. I really hope it's an issue with my HDMI slots, but the PS4 Pro and Roku both say my TV supports HDR.

P.P.S. Bonus screenshots. Guess which one is HDR?
[Images: two more off-screen comparison photos]
The SDR images are ridiculously bright and oversaturated - but if that's what you want, the option is there on an HDR display (when viewing SDR content).
That's not the case; I own a pretty awesome OLED TV which makes everything look great. It can get pretty freaking bright. Movies like IT or Mad Max have plenty of super bright scenes and my TV doesn't do any funky ABL stuff with those, so that's not the issue. It's the games and their lighting exposure, which sucks. Maybe there's no other way to do it at the moment, but it still sucks.
OLEDs can do 150 nits full-screen brightness. Many HDR movies have scenes with an average brightness of 200, 300, 400 nits. Some discs have a maximum frame-average brightness of over 800 nits.
OLEDs are great for lower-brightness HDR content, but they fall short of a proper HDR presentation.
Since they have only gone from ~120 to ~150 nits in the last 5 years (though peak brightness has improved a lot), and they're still using that WRGB pixel structure to cheat brightness measurements, I'm not sure that it's a problem they will be able to solve.
We probably need µLED displays to get per-pixel brightness control and high-brightness HDR combined.

That doesn't mean OLEDs are bad for HDR - they're the best display for many HDR sources. But they fall short for high brightness HDR.
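A quick back-of-the-envelope sketch of where the full-screen limit starts to bite, using the numbers above (this ignores how any real ABL implementation actually behaves; it's only meant to show the size of the gap):

Code:
# Rough numbers from the post above, not measurements of any specific panel.
panel_fullscreen_nits = 150.0
scene_average_nits = [50, 150, 200, 300, 400, 800]

for avg in scene_average_nits:
    if avg <= panel_fullscreen_nits:
        print(f"{avg:3d}-nit average frame: within the panel's full-screen capability")
    else:
        factor = panel_fullscreen_nits / avg
        print(f"{avg:3d}-nit average frame: would need dimming to ~{factor:.0%} of the intended brightness")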
Do you know this old article by Nixxes/Nvidia about HDR in Rise of the Tomb Raider?
https://developer.nvidia.com/implementing-hdr-rise-tomb-raider
There they claim games have rendered internally in HDR for years. Maybe Halo 5 uses some tech which is incompatible with the type of conversion detailed there. But even Halo 3 via backwards compatibility managed to produce pretty good HDR, didn't it?
Yes, games have been rendering in HDR and with high bit-depth buffers for a long time now.
The conversion to an output suitable for an HDR display is not necessarily an easy one though, as these engines were designed with an output to SDR in mind.

It would be nice if game developers started to output more than 8 bits even in SDR, and actually dithered their conversions properly to eliminate banding.
Funnily enough, there's actually a mod for NieR:Automata on PC which does just that, replacing many shaders to add dither and minimize banding.
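As a toy illustration of those last two points (tone-mapping a high-range internal buffer down to SDR, then dithering before the 8-bit quantisation so gradients don't band), here's a minimal sketch. It uses a simple Reinhard curve and random dither purely for illustration; it is not the engine's or the mod's actual shader code:

Code:
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a row of an engine's internal HDR buffer: linear,
# scene-referred values that are allowed to go well above 1.0.
hdr_buffer = np.linspace(0.0, 4.0, 1920)

# Tone-map down to the 0..1 SDR range (simple Reinhard curve, for illustration).
sdr_linear = hdr_buffer / (1.0 + hdr_buffer)

# Gamma-encode for an SDR display.
sdr_encoded = sdr_linear ** (1.0 / 2.2)

# Quantise to 8 bits, with and without +/- half a code value of dither noise.
banded = np.round(sdr_encoded * 255.0)
dithered = np.round(sdr_encoded * 255.0 + rng.uniform(-0.5, 0.5, sdr_encoded.shape))

def longest_flat_run(values):
    return max(len(list(group)) for _, group in itertools.groupby(values))

# Without dither, neighbouring pixels collapse into long runs of identical
# 8-bit values (visible banding in a gradient); dither breaks those runs up.
print("longest flat run, no dither:  ", longest_flat_run(banded))
print("longest flat run, with dither:", longest_flat_run(dithered))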
 

Justsomeguy

Member
Oct 27, 2017
1,712
UK
Just to note, the B7 isn't the previous year's C7. It's the same year. It's actually basically the same TV. The B7 has a different stand and the trim around the outside is silver, not black like on the C7.

The C series used to be curved, but now the B and C are literally just an aesthetic difference. I wanted the C because the screen is black, so I wanted black trim, but it's the same TV. Actually it sucked, because in the UK the C7 was exclusive to a single retailer, so it didn't get quite the same price cut in the Black Friday deal that I grabbed it during.
Ahh yes cheers, the 7 is the year not the letter.
 

bulletyen

Member
Nov 12, 2017
1,309
Huh? You know they worked with Philips to make CD and DVD, right? Or a bunch of other people for BluRay?

The 'standard' isn't a format spec, it's a performance spec, and they still make TVs that operate within that performance spec, like the XD80 I had. Sony makes the highest-performing HDR TV right now; in fact, according to rtings.com they make the top three.

While it would be obscene to slap an HDR badge on a 150-nit TV just because it technically accepted an HDR signal and mapped it to the TV's low dynamic range, much like '1080p' TVs last gen often having 768p displays, it's not like Sony is actually doing that; they're just not saying they'll adhere to the 1000-nit requirement. The reality is, this idea of peak contrast performance is not a determining factor in HDR quality.

This is something that often gets lost in these threads. Yeah, it's super cool when you do a night race in a game, and it's very dark, and the car lights are really bright. It's neat. But that's not really what HDR is about; it's about gradation within bright and dark areas. In TLG, when you're outside you can resolve all the detail of the bright white cliffside, and in RE7 the darkness contains tons of detail instead of just the murky black mess it would be in SDR.

The bright on dark 'pop' is the big 'wow' moment if you want to demo HDR to someone, but it's not what HDR is really about in application, and any and all standards should reflect the actual value the technology offers.
Oh I know their history. I have nothing against Sony. I love them. But they're known for adhering to their own standards of quality, always trying to redefine existing technology on their own terms, whether better or worse. This is not a controversial statement. Look at their historical laundry list of both successful and failed attempts at redefining formats or creating semi-new ones. For every Blu-ray success they have a Betamax failure. Yet it is this very drive to achieve quality at high cost that makes them so admirable.

HDR is in its infancy, so the exact standards are not set in stone across the industry. It may be performance based, but it's still a standard with parameters that define the resulting hardware. All I'm saying is Sony is out again to define emerging technology on their own terms, like when HD consumer video was becoming possible and they wanted Blu-ray to be the standard instead of HD DVD. And I gladly welcome it. In fact I would have bought a Sony 4K TV instead of an LG, as I am well aware of the benefits, but it was simply out of my price range (again, classic Sony, offering higher quality than the competition but at a much higher price).
 

Ikaruga

Member
Oct 27, 2017
1,055
Austria
Tbh I have an HDR TV and I still don't know when it's being used and when it's not.
I have one too, but I know it isn't really good at displaying HDR. If you really want a good one you might want to look out for an OLED TV, because only they are able to power and light each of the 8 million pixels individually. They suck as monitors though (burn-in); that's why my 4K TV is using an LC display.
 
OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
Oof that Reddit thread.
OLED Defense force is on high alert.
 

Deleted member 16609

User requested account closure
Banned
Oct 27, 2017
2,828
Harlem, NYC
Post #153 is a prime example of why HDR has a long way to go with regular consumers. There is still confusion, and it is not intuitive enough yet when it comes to setting it up. Every time a thread about HDR pops up on this very forum, you still have people posting their confusion. And some of them have HDR sets. It's baffling to me.
 

Gestault

Member
Oct 26, 2017
13,371
It is a really great game, I've been really enjoying it. The lack of HDR doesn't dampen that; it's just important that we are aware that there are some poor examples of games labeled HDR, and this is one of the highest-profile games to have done this.

Bingo. And thanks for taking the time to do the analysis well enough that people have some clearer metrics for discussion.
 
OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
So is there any advantage of disabling HDR for NieR?
It'll likely look like it is meant to: the TV can process the SDR signal as an SDR image, rather than an SDR image as an HDR signal.

In HDR, due to a poor conversion with no real tone mapping, many of the midtones are disproportionately elevated.
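For what it's worth, one way a conversion with no real tone mapping can end up looking like that: if the SDR picture is just scaled up towards the HDR container's peak instead of keeping reference white around 100-200 nits and reserving the headroom for highlights, the midtones land far brighter than they were graded. A rough sketch with assumed numbers (not taken from NieR's actual implementation):

Code:
HDR_PEAK = 1000.0   # assumed peak of the HDR container
SDR_WHITE = 100.0   # reference white of the original SDR grade

# Display-referred brightness of a few SDR levels, in nits.
sdr_levels = {"shadow": 2.0, "midtone": 20.0, "bright": 60.0, "white": 100.0}

for name, nits in sdr_levels.items():
    proper = nits                          # keep SDR levels; use HDR headroom only for highlights
    naive = nits * (HDR_PEAK / SDR_WHITE)  # scale the whole range up to the container peak
    print(f"{name:7s}: graded at {proper:5.1f} nits, naive conversion shows ~{naive:6.1f} nits")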
 

MCD

Honest Work
Member
Oct 27, 2017
14,809
There was an update to Nier two days ago with zero info or changelog.

Perhaps it's related to this? Anyone wanna confirm?
 

Railgun

Member
Oct 27, 2017
3,148
Australia
He should be back today I believe.

Saw he tweeted about No Man's Sky.



I'm sure he will or would have taken a look at Nier after the patch.

He's taken a look at the patch, no difference. I too had a look the night the patch went live and there's no fix to the colour saturation in HDR or the performance drops. Not sure what the patch has done, probably fixed some crashes.
 

Mr Delabee

Member
Oct 25, 2017
1,165
UK
As suspected; he's usually pretty quick to test when there are updates to games, especially any that have had HDR issues.
 

ThreepQuest64

Avenger
Oct 29, 2017
5,735
Germany
What was the destiny 2 uproar?
To be honest, what Bungie did was something else and is not really comparable to this situation. But as soon as I smell the slightest sneaky move, I'm a little bit on edge. To answer your question: Bungie got caught having Destiny 2 show a wrong XP progression. It was found out that you get less XP in a session the longer you play, making it harder to level up and harder to gain loot boxes through normal play/progression; there was a scaling factor. After a redditor found out, Bungie replied that the system was not working as intended and removed the scaling. So they fixed it on one side, but raised the XP needed for a level up; in fact, they doubled it. Of course people found that out, too.
 

Maneil99

Banned
Nov 2, 2017
5,252
To be honest, what Bungie did was something else and is not really comparable to this situation. But as soon as I smell the slightest sneaky move, I'm a little bit on edge. To answer your question: Bungie got caught having Destiny 2 show a wrong XP progression. It was found out that you get less XP in a session the longer you play, making it harder to level up and harder to gain loot boxes through normal play/progression; there was a scaling factor. After a redditor found out, Bungie replied that the system was not working as intended and removed the scaling. So they fixed it on one side, but raised the XP needed for a level up; in fact, they doubled it. Of course people found that out, too.
Oh I thought they had something to do with their HDR. Sorry.