OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
How difficult is it for developers to implement proper HDR to the point where faking it made more sense to them? Like, what goes into making a game use proper HDR?

I don't think it's something you can just enable or turn on; various assets have to be built with it in mind.

At a very basic level, the values that control how bright the sun is, or the shaders that control how much light reflects back off a surface, all need to be controlled by artists to ensure that they look right.
Then you have other things like the LUTs that control color grading, which will have been created in SDR and perhaps cannot simply be converted to function in HDR while maintaining the artist's vision.
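To illustrate the LUT problem, here's a minimal sketch (in Python, with a 1D LUT standing in for the usual 3D grading LUT - the function is mine for illustration, not any engine's actual code):

```python
import numpy as np

def apply_sdr_lut(image, lut):
    """Apply a grading LUT authored for SDR. The LUT is only defined
    over [0,1], so HDR scene values above 1.0 have to be clamped -
    every highlight collapses to the LUT's top entry and the artist's
    grade no longer means what it did."""
    idx = np.clip(image, 0.0, 1.0) * (len(lut) - 1)  # HDR values clamp here
    return lut[np.round(idx).astype(int)]

# e.g. with a neutral 256-entry LUT, values at 1.0, 4.0 and 40.0
# (an HDR sun) all come out identical after grading:
lut = np.linspace(0.0, 1.0, 256)
print(apply_sdr_lut(np.array([1.0, 4.0, 40.0]), lut))  # [1. 1. 1.]
```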

For the most part, games need to have been built for HDR from the start. DICE have published a couple of presentations about their HDR workflows: they build the game for HDR, then make adjustments for SDR users.

When talking about there being no HDR on the cards for Halo 5, Stinkles said that every asset in the game would need to be "touched" in order for it to be HDR. I think that's a very simple way of putting it that suggests just how much effort it takes.
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
You forgot to add The Last Guardian to the poorly-mastered HDR list! Nice slam though; this is not what people need, and if this was someone's first introduction to HDR, then damn!
 

Pargon

Member
Oct 27, 2017
11,994
So you may have seen some of my other threads taking a look at the HDR output of various games, trying to understand why a game might look particularly good and which games are going to get the best out of a premium HDR TV.

There have been great games: Call of Duty WW2, Sea of Thieves;
and some stinkers: Monster Hunter World, Deus Ex.

But for the most part, the developers that have gone to the effort of implementing HDR have done so with success.


Last week Nier Automata launched on Xbox One. Alongside various improvements for the game on the X, HDR became an advertised new addition to the title as it belatedly came to Microsoft's platform.

Starting to play the game, it quickly became apparent that it didn't have the same wow effect that I have become attuned to whilst playing other games; at first I thought this was maybe just the look of the game. Others online were commenting that the game looked dim, even comparing screenshots against other versions and noting that the colours were further muted vs the PC/PS4, with the game looking a little washed out.

My first look at the game appeared as if the game was totally SDR, with not a glimmer of any peak brightness close to what we would expect.



Another look at the game made it clear that it was so low in its dynamic range that my current methods for determining the output didn't actually work on Nier, so I had to build some new tools to help identify the very low peaks.

Tuning my tools to hunt for the highest peak in an image revealed that the game does have a very low-contrast output, with the highest peaks I could find hitting 600-ish nits. You can see them as red pixels in the image below.
pZ70DaB.jpg
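For anyone curious, the core of such a tool is small: decode the PQ (ST.2084) signal back to absolute nits and scan for the maximum. A minimal sketch, assuming the JXR has already been decoded to normalized [0,1] code values (the function names are mine, not from any real tool):

```python
import numpy as np

# ST.2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code):
    """Normalized PQ code values [0,1] -> absolute luminance in nits."""
    e = np.power(np.clip(code, 0.0, 1.0), 1.0 / M2)
    return 10000.0 * np.power(np.maximum(e - C1, 0.0) / (C2 - C3 * e), 1.0 / M1)

def peak_nits(image):
    """image: HxWx3 array of normalized 10-bit values from a screenshot.
    Uses each pixel's brightest channel as a rough brightness proxy."""
    return pq_to_nits(image.max(axis=2)).max()
```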



So I went on to accept that the game had actually been mastered for 600 nits, which seems like an odd value; however, it's not unheard of (HDR favourite Planet Earth II is mastered to a similar level).

As you progress further through the game the environments change, and I started to reach greener, more foliage-laden areas. The game triggered a cutscene which looked almost entirely different to the game I was just playing (apart from being a poor-quality 1080p video): it was noticeably lighter, brighter and more colourful.
It's unusual for devs to encode in-game FMVs in HDR, and this was likely to have been rendered from another version of the game; however, I was confused as to why the tone and mood of the game looked so significantly different.

So I disabled HDR.....
Oh. The game offered more contrast, was brighter and more saturated. Essentially the exact opposite of what I would expect to happen.


So I started digging deeper.
First of all I needed to identify whether this difference was simply a processing variation between SDR/HDR on my TV.
The simplest way to double-check this is to take some raw screenshots with HDR enabled and disabled and compare them, which is what I did.
It became almost immediately obvious that the two screenshots I had taken were almost identical, with some very minor differences in black level, white level and exposure.

In the image below you can see the two raw screenshots from the Xbox (labeled SDR and HDR output). The SDR image is a PNG from the Xbox and the HDR image is a 10-bit JXR, also from the Xbox.

You can then see that I have converted each image to match the other output.



The game is entirely in SDR: it is expanded from 8-bit to 10-bit, adjusted to sit in the appropriate place within an HDR output, and the signal then tells the TV that the content is HDR.
Nothing is gained; in fact, things that would previously have been darker now become lighter than they should be, resulting in a lower-contrast, washed-out image.

This is the HDR equivalent of upscaling and claiming the game is a higher resolution than it actually is - but then actually upscaling it incorrectly and reducing the resolution a bit.
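As a rough sketch of what that repackaging looks like (the gamma value and white level here are my assumptions, not numbers pulled from the game):

```python
import numpy as np

# PQ inverse-EOTF constants (ST.2084)
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def nits_to_pq(nits):
    """Absolute luminance in nits -> normalized PQ code values [0,1]."""
    y = np.power(np.clip(nits, 0.0, 10000.0) / 10000.0, M1)
    return np.power((C1 + C2 * y) / (1.0 + C3 * y), M2)

def sdr_into_hdr_container(sdr8, white_nits=100.0):
    """A 'proper' SDR-in-HDR-container pass: undo the SDR gamma, place
    white at ~100 nits, re-encode as PQ. Every value stays <= white_nits,
    so nothing new is gained - and if (as the game appears to do) the
    gamma decode is skipped or mismatched, shadows get lifted and the
    image washes out."""
    linear = np.power(sdr8.astype(np.float32) / 255.0, 2.2)  # assumed gamma
    return nits_to_pq(linear * white_nits)
```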

It's super disappointing that a developer has done this and we should expect better. HDR is already a total mess of a format, with such varied support and capability. With such a new technology it's clearly possible at this point in time for a developer to misunderstand it and produce poor results, but this feels more like an effort to deceive.

It is not entirely clear from your post, but it sounds to me like the game is rendering with a fixed dynamic range 'camera' but is adjusting the exposure on an HDR scale?
So there may only be 100 nits of dynamic range at any one time, but depending on the exposure level, the peak brightness can still exceed 100 nits.

That is a perfectly valid method for SDR to HDR conversion.
It won't produce results as good as something natively rendering (or shot) in HDR, but should still be an improvement over SDR if done correctly.
Of course I'm not stating that NieR:Automata is doing it correctly - I haven't played it on Xbox.
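A rough sketch of the method I'm describing, with made-up numbers (a hypothetical illustration, not a claim about how any specific game does it):

```python
import numpy as np

def exposure_scaled_hdr(sdr_linear, shot_white_nits):
    """The tonemapped frame still only spans a fixed ~100:1 range at any
    instant, but where that window sits follows the scene exposure: a dim
    interior might put white at 100 nits, a sunny exterior at 600, so
    peaks can exceed 100 nits without true HDR rendering."""
    return np.clip(sdr_linear, 0.0, 1.0) * shot_white_nits  # then PQ-encode

interior = exposure_scaled_hdr(np.array([0.02, 1.0]), 100.0)  # [2, 100] nits
exterior = exposure_scaled_hdr(np.array([0.02, 1.0]), 600.0)  # [12, 600] nits
```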

How difficult is it for developers to implement proper HDR to the point where faking it made more sense to them? Like, what goes into making a game use proper HDR?
If a game is not using physically based rendering, and was not built for HDR to begin with, it may not be as easy as you might think it would be.
Unfortunately it's not just a case of disabling tone mapping for the SDR output and sending the high dynamic range image directly to the display.

[…] Whenever I look at HDR-enabled games it just looks different to me. Not necessarily better or worse.
This is often due to people's displays being improperly calibrated for SDR. When viewing SDR to spec, it is limited to 100 nits peak brightness.
With an improperly calibrated display, the brightness of SDR content can be pushed to 500 nits or more, and you can even make the image look more saturated than HDR if you set the color space incorrectly.

Many HDR sources do not change the average brightness much compared to SDR - they only extend the dynamic range with more highlight information (these are the ones that still look fine on OLED).
For example, here is a side-by-side comparison with Mad Max: Fury Road on Blu-ray being displayed in SDR on a calibrated display on the left, and in HDR on a calibrated display on the right.
madmax-highlight-larg7cjon.jpg


There are differences in color because they are mastered slightly differently, but HDR is capable of displaying richer colors than SDR permits. The ground is roughly all one color in SDR while there is more variation in HDR.
The main place that the two differ is in dynamic range. The overall image brightness is similar in both, but the HDR version allows for detail above 100 nits. The sun is much brighter in HDR and it, along with the lens flare, actually stands out against the sky instead of it turning into a solid bright area.

If you turn up the display brightness so that the SDR image is being displayed out-of-spec, the image is actually much brighter than HDR as the entire sky now matches the brightness of the sun in HDR.
madmax-2-largem1k0n.jpg


But making the display brighter doesn't extend its dynamic range, so the sky still lacks detail, and the image ends up washed-out - the ground has turned yellow.
This is why it's still common for people to say that HDR looks "dim" compared to SDR. It's not that HDR is dim, it's that SDR is completely blown-out with whatever settings they're using.
It's also why people get the wrong impression about HDR brightness. Higher brightness in HDR is not about raising the average screen brightness like it does for SDR - it's about being able to show more detail in bright highlights, and more vivid colors.
 
Last edited:

Edge

A King's Landing
Banned
Oct 25, 2017
21,012
Celle, Germany
I wish I could come over and see your setup lol. Whenever I look at HDR-enabled games it just looks different to me. Not necessarily better or worse.

I've never seen HDR on a TV so far, but I have a Samsung Galaxy S9+ and even I "see the light" when I watch HDR videos on YouTube, gaming or not. It is like day and night, just crazy.

Watching DF's Sea of Thieves video when the room is dark can actually give you a headache because it's so unbelievably bright when it flashes with every shot; it's mind-blowing.
 

Kyle Cross

Member
Oct 25, 2017
8,411
For such a great game, it is a mess on every platform. Platinum has a real problem on the technical side of game development: from all versions of Nier, to the Bayonetta games only running at 720p with a fluctuating framerate on the Switch, the exact same way it was on Wii U. They make great-playing games, but they do not make games that run or look as well as they should.

They need to hire some new tech guys.
 

oneils

Member
Oct 25, 2017
3,084
Ottawa Canada
It is not entirely clear from your post, but it sounds to me like the game is rendering with a fixed dynamic range 'camera' but is adjusting the exposure on an HDR scale?
So there may only be 100 nits of dynamic range at any one time, but depending on the exposure level, the peak brightness can still exceed 100 nits.

That is a perfectly valid method for SDR to HDR conversion.
It won't produce results as good as something natively rendering (or shot) in HDR, but should still be an improvement over SDR if done correctly.
Of course I'm not stating that NieR:Automata is doing it correctly - I haven't played it on Xbox.


If a game is not using physically based rendering, and was not built for HDR to begin with, it may not be as easy as you might think it would be.
Unfortunately it's not just a case of disabling tone mapping for the SDR output and sending the high dynamic range image directly to the display.


This is often due to people's displays being improperly calibrated for SDR. When viewing SDR to spec, it is limited to 100 nits peak brightness.
With an improperly calibrated display, the brightness of SDR content can be pushed to 500 nits or more, and you can even make the image look more saturated than HDR if you set the color space incorrectly.

Many HDR sources do not change the average brightness much compared to SDR - they only extend the dynamic range with more highlight information (these are the ones that still look fine on OLED).
For example, here is a side-by-side comparison with Mad Max: Fury Road on Blu-ray being displayed in SDR on a calibrated display on the left, and in HDR on a calibrated display on the right.
madmax-highlight-larg7cjon.jpg


There are differences in color because they are mastered slightly differently, but HDR is capable of displaying richer colors than SDR permits. The ground is roughly all one color in SDR while there is more variation in HDR.
The main place that the two differ is in dynamic range. The overall image brightness is similar in both, but the HDR version allows for detail above 100 nits. The sun is much brighter in HDR and it, along with the lens flare, actually stands out against the sky instead of it turning into a solid bright area.

If you turn up the display brightness so that the SDR image is being displayed out-of-spec, the image is actually much brighter than HDR as the entire sky now matches the brightness of the sun in HDR.
madmax-2-largem1k0n.jpg


But making the display brighter doesn't extend its dynamic range, so the sky still lacks detail, and the image ends up washed-out - the ground has turned yellow.
This is why it's still common for people to say that HDR looks "dim" compared to SDR. It's not that HDR is dim, it's that SDR is completely blown-out with whatever settings they're using.
It's also why people get the wrong impression about HDR brightness. Higher brightness in HDR is not about raising the average screen brightness like it does for SDR - it's about being able to show more detail in bright highlights, and more vivid colors.

Great explanation. This partly explains why so many folks are less than impressed by HDR. They are just used to very bright or saturated scenes in SDR.
 

StuBurns

Self Requested Ban
Banned
Nov 12, 2017
7,273
For such a great game, it is a mess on every platform. Platinum has a real problem on the technical side of game development: from all versions of Nier, to the Bayonetta games only running at 720p with a fluctuating framerate on the Switch, the exact same way it was on Wii U. They make great-playing games, but they do not make games that run or look as well as they should.

They need to hire some new tech guys.
I think it's less they aren't good at it, and more they're not being given the budget to do it well.

If SE say "here's five hundred grand, give us an XBO version with HDR", it's going to be underwhelming. If they say "here's five million", it should be perfect. It's hard to say.

The fact is, Bayo 1 on 360, and Vanquish on the two original platforms, were totally comparable to the strongest technological outings of those genres on those platforms, and both of those platforms were more technologically demanding to produce software for than PS4 and XBO.
 

Deleted member 36578

Dec 21, 2017
26,561
I've never seen HDR on a TV so far, but I have a Samsung Galaxy S9+ and even I "see the light" when I watch HDR videos on YouTube, gaming or not. It is like day and night, just crazy.

Watching DF's Sea of Thieves video when the room is dark can actually give you a headache because it's so unbelievably bright when it flashes with every shot; it's mind-blowing.

Wait really? That's the kind of phone I have and I don't see it as a day and night difference whatsoever.
 

LiK

Member
Oct 25, 2017
32,047
I think I already mentioned that Boris should be hired by DF, lol

The Durante of HDR.
 

Deleted member 36578

Dec 21, 2017
26,561
This is often due to people's displays being improperly calibrated for SDR. When viewing SDR to spec, it is limited to 100 nits peak brightness.
With an improperly calibrated display, the brightness of SDR content can be pushed to 500 nits or more, and you can even make the image look more saturated than HDR if you set the color space incorrectly.

Many HDR sources do not change the average brightness much compared to SDR - they only extend the dynamic range with more highlight information (these are the ones that still look fine on OLED).
For example, here is a side-by-side comparison with Mad Max: Fury Road on Blu-ray being displayed in SDR on a calibrated display on the left, and in HDR on a calibrated display on the right.
madmax-highlight-larg7cjon.jpg


There are differences in color because they are mastered slightly differently, but HDR is capable of displaying richer colors than SDR permits. The ground is roughly all one color in SDR while there is more variation in HDR.
The main place that the two differ is in dynamic range. The overall image brightness is similar in both, but the HDR version allows for detail above 100 nits. The sun is much brighter in HDR and it, along with the lens flare, actually stands out against the sky instead of it turning into a solid bright area.

If you turn up the display brightness so that the SDR image is being displayed out-of-spec, the image is actually much brighter than HDR as the entire sky now matches the brightness of the sun in HDR.
madmax-2-largem1k0n.jpg


But making the display brighter doesn't extend its dynamic range, so the sky still lacks detail, and the image ends up washed-out - the ground has turned yellow.
This is why it's still common for people to say that HDR looks "dim" compared to SDR. It's not that HDR is dim, it's that SDR is completely blown-out with whatever settings they're using.
It's also why people get the wrong impression about HDR brightness. Higher brightness in HDR is not about raising the average screen brightness like it does for SDR - it's about being able to show more detail in bright highlights, and more vivid colors.

Doesn't this really boil down to preference then? I know some people calibrate their sets so everything is super bright. If they're watching something well-lit then it's going to be tough for them to even notice the HDR to begin with. I do notice HDR displays being fairly dim myself, but the light sources in the images give off a more realistic tone to my eyes.
 

Kyle Cross

Member
Oct 25, 2017
8,411
I think I already mentioned that Boris should be hired by DF, lol

The Durante of HDR.
I believe Dark1x is already familiar with his efforts, so who knows?

The two things I want more of from DF are PC graphical settings comparisons to help with self-optimization, and more HDR coverage. Publishers, developers, and tech makers alike pay attention to DF, so them giving greater emphasis to HDR could be a very good thing. Lord knows I'm sick and tired of poor HDR implementations and PC versions not having HDR at all.
 
Oct 26, 2017
8,992
Funnily enough the other day I saw a thread on Xbox One Reddit praising the hell out of the HDR in Nier, wtf.

A damn shame though. So disabling HDR is the way to go then?
 

Deleted member 40872

user requested account closure
Banned
Mar 10, 2018
36
UK
Many HDR sources do not change the average brightness much compared to SDR - they only extend the dynamic range with more highlight information (these are the ones that still look fine on OLED).
For example, here is a side-by-side comparison with Mad Max: Fury Road on Blu-ray being displayed in SDR on a calibrated display on the left, and in HDR on a calibrated display on the right.
madmax-highlight-larg7cjon.jpg


There are differences in color because they are mastered slightly differently, but HDR is capable of displaying richer colors than SDR permits. The ground is roughly all one color in SDR while there is more variation in HDR.
The main place that the two differ is in dynamic range. The overall image brightness is similar in both, but the HDR version allows for detail above 100 nits. The sun is much brighter in HDR and it, along with the lens flare, actually stands out against the sky instead of it turning into a solid bright area.
I get what you are trying to say, but images that are captured with a camera to show the difference HDR makes on an SDR screen make no sense.
If anything this just proves that the tonemapper of whatever camera this was shot with is far superior to that of the TV/film (which honestly it almost always is).

Most games are already created in an "HDR-ready" way these days, as PBR has become standard practice, but if there aren't constant reviews in HDR during production, details like light/glow intensity, material response in the highlights, grading, exposure etc. might not look good once you see the whole range.
In addition to this, the game needs different output curves for SDR and HDR, with the HDR curve never being the simple job everyone expects, as there isn't a decent standard and the consumer needs enough control to make the image look good on their horrible 200-nit TV.
Another showstopper is exposure. While some studios have started to do fixed exposure in HDR, I personally think that makes no sense; the exposure should be mastered in HDR and then offset in the SDR tonemapper - something that obviously can't be done after the fact.
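To make the "different output curves" point concrete, here's a minimal sketch of one scene-referred render feeding two output transforms - the curves are illustrative Reinhard-style stand-ins, not anyone's shipping tonemapper:

```python
import numpy as np

def hdr_output(scene, paper_white=200.0, peak=1000.0):
    """HDR path: scene white mapped to a user-chosen paper white, with a
    simple Reinhard-style shoulder so highlights asymptote at the display
    peak instead of clipping. The paper_white/peak knobs exist precisely
    because consumer HDR displays vary so much."""
    nits = scene * paper_white
    return nits / (1.0 + nits / peak)   # output in nits, <= peak

def sdr_output(scene, exposure_offset=1.5):
    """SDR path: exposure mastered in HDR, offset here in the SDR
    tonemapper (per the post above), compressed into [0,1] and
    gamma-encoded."""
    exposed = scene * exposure_offset
    return np.power(exposed / (1.0 + exposed), 1.0 / 2.2)
```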
 

New Fang

Banned
Oct 27, 2017
5,542
Wonderful work OP. This kind of stuff needs to be called out in hopes that other devs see this and get it right.

I've noticed some games do a wonderful job with HDR, but many do not. Monster Energy Supercross on PS4 is one example I've personally seen where turning off HDR made the game look noticeably better.
 
OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
It is not entirely clear from your post, but it sounds to me like the game is rendering with a fixed dynamic range 'camera' but is adjusting the exposure on an HDR scale?
So there may only be 100 nits of dynamic range at any one time, but depending on the exposure level, the peak brightness can still exceed 100 nits.

That is a perfectly valid method for SDR to HDR conversion.
It won't produce results as good as something natively rendering (or shot) in HDR, but should still be an improvement over SDR if done correctly.
Of course I'm not stating that NieR:Automata is doing it correctly - I haven't played it on Xbox.

No, the SDR output has literally been dropped into HDR, complete with its SDR gamma curve.

No additional conversion, grading or ramp in brightness to try and make use of HDR has occurred, as it does in The Witcher 3 - which, as you say, is far from perfect, but is notably better than Nier.
 
OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
While some studios have started to do fixed exposure in HDR, I personally think that makes no sense; the exposure should be mastered in HDR and then offset in the SDR tonemapper - something that obviously can't be done after the fact.

When you say a fixed exposure in HDR, what do you mean?
 

Pargon

Member
Oct 27, 2017
11,994
Doesn't this really boil down to preference then? I know some people calibrate their sets so everything is super bright. If they're watching something well-lit then it's going to be tough for them to even notice the HDR to begin with. I do notice HDR displays being fairly dim myself, but the light sources in the images give off a more realistic tone to my eyes.
The process of calibration is setting up your display to match the appropriate specification as closely as it can.
For SDR, that's 100 nits peak brightness, 2.4 gamma, and BT.709 color space - though ~2.2 gamma may be more appropriate for games, as that is typical for computer/game content (close to the sRGB spec).
If you deviate from that, it's not calibrated. You can't be 'calibrated to a higher brightness'.

But the controls are user-accessible, so people can set them up however they like - it's just not accurate to how it is supposed to look.
Some people would rather the entire sky be the brightness of the sun than view SDR accurately, which means they will be disappointed with how HDR looks, since HDR does not give people the controls to do that - it hands most of the controls over to the content creators.

There are of course good reasons for viewing content out of spec though - the specs for SDR and HDR intend for content to be viewed in a dimly lit room (not pitch black, for what it's worth) and you don't want to be viewing a display set up for that in the middle of the day in a bright room. You'll want the brightness pushed up and gamma reduced so that you can actually see things clearly.
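To put numbers on that, a small worked sketch of the SDR spec just described (nothing more):

```python
def sdr_nits(code, peak=100.0, gamma=2.4):
    """Luminance of an 8-bit SDR code value on a display set up to spec
    (100-nit peak, 2.4 gamma). Raising 'peak' is the out-of-spec torch
    mode people often run - it scales the whole image, not the range."""
    return peak * (code / 255.0) ** gamma

sdr_nits(255)             # 100.0 - reference white at spec
sdr_nits(255, peak=500)   # 500.0 - same white, now five times too bright
sdr_nits(128)             # ~19   - mid grey at spec
```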
No, the SDR output has literally been dropped into HDR, complete with its SDR gamma curve.

No additional conversion, grading or ramp in brightness to try and make use of HDR has occurred, as it does in The Witcher 3 - which, as you say, is far from perfect, but is notably better than Nier.
Thanks for the confirmation. It didn't seem like that's what they were doing, but I just wanted to point out that you can still benefit from HDR without rendering 'true' HDR, by having an SDR image that has its exposure set on an HDR scale.
 
OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
For anybody interested on Xbox who has an HDR TV but hasn't bought Nier, the screenshots I've been using are on Xbox Live in my screenshots.

My tag is EvilBoris HDR.
 

Handicapped Duck

▲ Legend ▲
Avenger
May 20, 2018
13,661
Ponds
Count me in as one who's uncertain if they have seen a proper implementation of HDR. Got a Sony X850D in 2016, calibrated it to the RTINGS settings and did some comparisons with GoW and SotC, and only GoW looked noticeably different, with actual god rays looking like bright beams of light and the sky being distinct in its coloring. SotC was harder to tell, but I don't know if that was because of the area of the game I was in (forest section).
 

Syriel

Banned
Dec 13, 2017
11,088
As soon as you start producing content that goes above that, you are creating content with a larger dynamic range and you need the display to be able to view it.
HDR movies are available that range anywhere from 500 nits to 4000 nits, with a view that in future this will go as high as 10,000.

...

So there has been loads of politics and bullshit from Day Zero of HDR and Displays.

That probably hasn't fully answered your question, but it's hard to keep it briefer than that :P

Equally important to the brightness is the color space.

SDR is Rec.709. And there are still TVs on the low end of the market that can't properly reproduce that.

HDR is Rec.2020, with colors that simply can't be shown on a display that only supports Rec.709.

They share the same white point, but that's about it.

Oh, and as to your theater comment, a Dolby Vision theater can display more than 100 nits.
 

Massicot

RPG Site
Verified
Oct 25, 2017
2,232
United States
I have an HDR monitor, and I've enabled it on both my hardware and in games that support it (FFXV) and I still am not sure if it's working properly. I also was in the NVidia area at E3 where they were showing off both their big monitor displays and HDR and I still felt that it merely looked different, not better (they had two displays of Destiny 2 side by side, one with HDR). What a weird thing.
 
OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
Count me in as one who's uncertain if they have seen a proper implementation of HDR. Got a Sony X850D in 2016, calibrated it to the RTINGS settings and did some comparisons with GoW and SotC, and only GoW looked noticeably different, with actual god rays looking like bright beams of light and the sky being distinct in its coloring. SotC was harder to tell, but I don't know if that was because of the area of the game I was in (forest section).

When I looked at my friend's Sony the black point settings for HDR content weren't as good as they should have been, so you actually lost quite a bit of contrast to elevated black.
There were 2 settings in particular that made quite a large difference.
Contrast needed to be set at 86 and I think there is a gamma or blackpoint option that you could drop back to -2(?)
 
Oct 27, 2017
5,345
Don't know how bad HDR can be in this game, but even games with supposedly brilliant HDR implementations piss the hell out of me with that fake exposure thing they keep doing, which makes the games look like they're being lit by a dying lightbulb.

But yeah, it's an issue. HDR is confusing enough as it is; we don't need fake HDR to make things worse.
 

Handicapped Duck

▲ Legend ▲
Avenger
May 20, 2018
13,661
Ponds
When I looked at my friend's Sony the black point settings for HDR content weren't as good as they should have been, so you actually lost quite a bit of contrast to elevated black.
There were 2 settings in particular that made quite a large difference.
Contrast needed to be set at 86 and I think there is a gamma or blackpoint option that you could drop back to -2(?)
Definitely noticed blacks were not as good on the 850D model. I think I set the contrast to 90, brightness at 35 for non-HDR and max brightness for HDR, with gamma at -1. I'll go and check. Thanks for the help anyhow. Much appreciated.
 

Syriel

Banned
Dec 13, 2017
11,088
How difficult is it for developers to implement proper HDR to the point where faking it made more sense to them? Like, what goes into making a game use proper HDR?

You are setting up rendering and exposure for two very different color spaces. It's not just a matter of "flipping a switch." It has to be planned for from the start.

Or, to use a poor example, look at HDR photography. When you set "HDR" mode on your digital camera, you're not shooting a still image in the HDR color space. You're shooting a bracketed exposure and then combining parts of the multiple SDR images. This is done to simulate HDR in the SDR color space.

You can't just take a single SDR image and magically make it HDR. The data isn't there.
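A toy sketch of that bracketing idea (a much-simplified exposure fusion; real camera pipelines also align frames, de-ghost, and so on):

```python
import numpy as np

def fuse_brackets(exposures, sigma=0.2):
    """Blend bracketed SDR shots of the same scene, weighting each pixel
    by how well-exposed it is (values near mid-grey count most). The
    result is still an SDR image - it simulates HDR, it isn't HDR."""
    stack = np.stack(exposures)  # (N, H, W) or (N, H, W, 3), values in [0,1]
    weights = np.exp(-((stack - 0.5) ** 2) / (2.0 * sigma**2))
    weights /= weights.sum(axis=0, keepdims=True)
    return (weights * stack).sum(axis=0)
```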

So wait, HDR is strictly a higher contrast ratio? Does that mean that the source is responsible for sending a signal that will look good when run through HDR?

No. Color space and grading are also key components. It is not "just" contrast ratio. The display also matters, as you can have a board that properly handles HDR but a display panel that isn't even close to Rec.2020 and proper brightness/contrast.

Doesn't this really boil down to preference then? I know some people calibrate their sets so everything is super bright. If they're watching something well-lit then it's going to be tough for them to even notice the HDR to begin with. I do notice HDR displays being fairly dim myself, but the light sources in the images give off a more realistic tone to my eyes.

HDR is all about a more realistic image. It "pops" because it's closer to looking out a window. HDR is NOT oversaturated colors and making everything look like CSI Miami.
 
OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
Equally important to the brightness is the color space.

SDR is Rec.709. And there are still TVs on the low end of the market that can't properly reproduce that.

HDR is Rec.2020, with colors that simply can't be shown on a display that only supports Rec.709.

They share the same white point, but that's about it.

Oh, and as to your theater comment, a Dolby Vision theater can display more than 100 nits.

It was meant to be a simplified version to help with the question at hand.

I try to stay away from the colour space conversation, as most documentation refers to most games being expanded from REC709 to BT2020 anyway, so again there is no real gain from that point of view.
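For the curious, that expansion is just a change of primaries - the standard BT.2087 conversion - and the colours themselves don't move, which is exactly why there's no real gain. A minimal sketch:

```python
import numpy as np

# Linear Rec.709 RGB -> BT.2020 primaries (BT.2087 conversion matrix)
RGB709_TO_RGB2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def rec709_in_bt2020(rgb_linear):
    """Re-express Rec.709 colours in a BT.2020 container. Nothing outside
    the 709 gamut is created - the wider container carries the same
    colours."""
    return rgb_linear @ RGB709_TO_RGB2020.T
```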

As for Dolby Cinema - yes, it goes above 100 nits. 108 nits :P
 
Oct 29, 2017
1,100
This is what I hate about HDR.

Half the stuff I have becomes duller-looking and either darker or more washed out when I enable HDR output.

The other half looks amazing however.

Devs should never make things darker to implement HDR. But some do.
 

K' Dash

Banned
Nov 10, 2017
4,156
There is no "standard" implementation of HDR and that is the main issue; I have had to adjust my TV for each game, because using the same settings for all of them is a big fucking mistake.

At least 4K content looks good enough to justify the purchase of my new TV.
 

ghibli99

Member
Oct 27, 2017
17,704
That sucks, but the more good coverage of it like this there is, the more developers and other content creators will take the time to do it right going forward. Keep fighting the good fight.
 

Deleted member 36578

Dec 21, 2017
26,561
Look in the YT app, not the browser.





Or these, these will get ya.

https://youtu.be/tO01J-M3g0U

https://youtu.be/ygqzQ8XlHTM


I'm just not seeing it. It just looks different - colors, brightness, etc. Not better or worse. HDR is still just a bullet point on a TV box as far as I'm concerned.

Even in one of those Digital Foundry vids the guys admit to not knowing what they're looking for. They say there's some weird glitch that throws the colors off at times. If someone is looking at a set in HDR and convinces themselves it's better with it on, more power to 'em. Until I buy an OLED 4K HDR set myself and tool around with all the various settings for each individual game (and let's be honest here, I really don't want to take the time to change settings with every game, but it seems with HDR we have to) I'll never be convinced that it's the second coming like I see people claim it is on forums. I still believe there's a strong placebo effect for a lot of people. They can be super convinced their TV looks amazing, and then another person comes in who also has an HDR TV and needs to change settings to match their own preference.
 
Last edited:
OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
I'm just not seeing it. It just looks different - colors, brightness, etc. Not better or worse. HDR is still just a bullet point on a TV box as far as I'm concerned.

Even in one of those Digital Foundry vids the guys admit to not knowing what they're looking for. They say there's some weird glitch that throws the colors off at times. If someone is looking at a set in HDR and convinces themselves it's better with it on, more power to 'em. Until I buy an OLED 4K HDR set myself and tool around with all the various settings for each individual game I'll never be convinced that it's the second coming like I see people claim it is on forums. I still believe there's a strong placebo effect for a lot of people. They can be super convinced their TV looks amazing, and then another person comes in who also has an HDR TV and needs to change settings to match their own preference.

I think people talk about it like it's this big thing that everyone is going to spot immediately; however, I personally think it's an additive experience. You might not realise it is there when you first start watching HDR content, but you will start to notice when it's not there.
 

Deleted member 36578

Dec 21, 2017
26,561
I think people talk about it like it's this big thing that everyone is going to spot immediately; however, I personally think it's an additive experience. You might not realise it is there when you first start watching HDR content, but you will start to notice when it's not there.

That's a big takeaway I get from it too. I read people's experiences saying they don't notice a major difference at first, but afterwards, if they go back to non-HDR, they totally do.