OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
Take this with a healthy dose of salt because I'm not a professional display technology guy, but here's my understanding of the situation with the LG 7 series as a B7 owner with an HDFury Vertex. You don't need to inject any metadata for the TV to target 4000 nits; that's what it shoots for when it doesn't have any metadata to work with (all console games at this moment in time). If you have an HDFury device, your best bet is probably to inject 1000 nit metadata with most games: some games that have no controls for max luminance target that number by default. Other games that have adjustable settings for HDR might not give you an option to go as high as 4000 nits (Halo: MCC tops out at 1000, I think Battlefield V tops out at 2000), which means no part of the screen will ever hit the TV's peak brightness without injecting metadata, since it's expecting the content to have a max luminance of 4000 nits. In short, you're (at least currently) more likely to find games that work well with 1000 nits as your target than games that go for 4000 nits or beyond - though there's also a handful of games where sending in 1000 nit metadata can be to your detriment (e.g. skies in Horizon: Zero Dawn lose some definition due to clipping). Bottom line is, if you have one such device, you'll want to see which works best for you on a game-by-game basis.

I spent some time generating numerous different sets of metadata with the Vertex PC app at one point and ultimately landed on one that targeted a 780 nit max luminance, which worked best for me with most titles (with each game set to target 1000 cd/m2 peak brightness):
87:01:1a:fd:02:00:c2:33:c4:86:4c:1d:b8:0b:d0:84:80:3e:13:3d:42:40:0c:03:01:00:00:00:00:00
Despite that target number being significantly lower than 1000 nits, I actually found it to clip less on the top end than injecting 1000 nit metadata while still being suitably bright after comparing the two with several games.
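For anyone curious what's actually inside one of those Vertex strings, here's a rough decode sketch in Python. It assumes the string is a standard CTA-861 Dynamic Range & Mastering InfoFrame (type 0x87), which is my reading of the format rather than anything documented by the Vertex app; the field names are my own labels.

```python
# Rough decode of a Vertex-style metadata string, assuming it is a CTA-861
# Dynamic Range & Mastering InfoFrame (type 0x87): EOTF byte, ST 2086 display
# primaries/white point, mastering luminance, then MaxCLL/MaxFALL (0 = unknown).
import struct

RAW = "87:01:1a:fd:02:00:c2:33:c4:86:4c:1d:b8:0b:d0:84:80:3e:13:3d:42:40:0c:03:01:00:00:00:00:00"

def decode_drm_infoframe(raw: str) -> dict:
    data = bytes(int(b, 16) for b in raw.split(":"))
    assert data[0] == 0x87 and sum(data) % 256 == 0      # infoframe type + checksum sanity check
    eotf = data[4]
    fields = struct.unpack("<12H", data[6:30])            # twelve little-endian 16-bit values
    return {
        "eotf": {0: "SDR", 1: "traditional HDR", 2: "PQ (ST 2084)", 3: "HLG"}.get(eotf, eotf),
        "primaries_xy": [(fields[i] * 0.00002, fields[i + 1] * 0.00002) for i in (0, 2, 4)],
        "white_point_xy": (fields[6] * 0.00002, fields[7] * 0.00002),
        "max_mastering_nits": fields[8],                   # 780 for the string above
        "min_mastering_nits": fields[9] * 0.0001,
        "max_cll": fields[10],
        "max_fall": fields[11],
    }

print(decode_drm_infoframe(RAW))
```

Run on the string above, that reports a PQ EOTF and a max mastering luminance of 780 nits, which is where the 780 figure comes from.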

Pretty much this^


If your TV typically wants 4000nit content (as is the case for many LG displays with game consoles) then you want to try and give it that.

This *should* be reflected as the approximate point where a logo disappears in RDR2 or other games that have a labelled option.
However if you have changed brightness, contrast settings, gamma etc, then you may find you typically settle on a slightly different figure as you will be changing what your TV is doing with the input.

I think in the early days I was less aware of the way certain displays were tracking in a slightly strange fashion (hence the earlier 1000nit recommendations).


As for the reason why you might get a more poppy image with a 1000nit PQ curve:
Let's say your display can do 800 nits. It displays everything accurately up to, say, 600 nits. After that point, the difference between 600 and 1000 nit inputs is distributed across only the 200 nits of actual brightness your display has left to give.

So something that was intended to be displayed at 1000 nits gets pulled down to an actual value of 800 nits (the display's max).

Now consider that, generally, the higher the input value, the less common it will be.

So for a display that uses a 4000nit tone curve, in order to see your display's maximum capability you have to feed it 4000nit pixels, which are rarer still.
For a game that is internally capped at a 1000nit output, you might only ever see just over 600 nits, and the last 200 nits your display has to give go to waste.

This is where the linkers come in handy, as you can change the behaviour of the display and tell it to use a 1000nit curve.
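To put some made-up numbers on that (a simple linear squeeze, not whatever curve LG actually uses), here's a quick Python sketch of an 800 nit panel that tracks the signal accurately up to 600 nits and crams everything above that into its last 200 nits:

```python
# Toy numbers only: an 800 nit panel that tracks the signal faithfully up to 600 nits,
# then linearly squeezes everything from there up to the tone curve's assumed content
# peak (1000 or 4000 nits) into its remaining 200 nits of real brightness.
def displayed_nits(signal_nits: float, curve_peak: float,
                   faithful_to: float = 600.0, panel_peak: float = 800.0) -> float:
    if signal_nits <= faithful_to:
        return signal_nits
    t = (min(signal_nits, curve_peak) - faithful_to) / (curve_peak - faithful_to)
    return faithful_to + t * (panel_peak - faithful_to)

# A game internally capped at 1000 nits:
print(displayed_nits(1000, curve_peak=4000))   # ~624 nits; the panel's top ~175 nits never get used
print(displayed_nits(1000, curve_peak=1000))   # 800 nits; full panel brightness
```

Same 1000 nit signal, very different result depending on which curve the TV thinks it is tracking.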
 

Deleted member 49179

User requested account closure
Banned
Oct 30, 2018
4,140
I cannot test it right now, but I wonder what value I should set RDR2's Peak Brightness to for an x930e. This TV is rated as being able to hit ~1500 nits of peak brightness. Should I put the value at 1500, or go all the way to 4000 and let the TV tone map?
 
Dec 13, 2018
46
Who ya callin' A Goat?
You, sir.

ea3tN3F.jpg
 
Oct 25, 2017
13,246
I'm still sort of confused as to how you figure out what the target nits for your TV is. Or does that require professional tools?

I have a Vizio P 2018 (P-65) and it seems to look best on the default settings for most games. 750nits on DMCV but then something like 3000 nits on Division 2. It always confuses me.
 
Dec 13, 2018
46
I have an LG B8. Is there any reason to buy an HDFury Linker?
No, dynamic tone mapping in game mode should solve all brightness issues that you hear 2017 OLED owners talking about. Also, in game mode on 8 series even without dyn tone mapping enabled, it provides a brighter picture. Vincent shows the different eotf curves in the c8 vs b7 comparison video, I believe.

The reason, at least for me, that a linker was purchased was to improve the dim tone mapping on my C7. Your B8 shouldn't have an issue and you should not need a linker.
 
OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
I'm still sort of confused as to how you figure out what the target nits for your TV is. Or does that require professional tools?

I have a Vizio P 2018 (P-65) and it seems to look best on the default settings for most games. 750nits on DMCV but then something like 3000 nits on Division 2. It always confuses me.

Some of the confusion is down to games getting it wrong and TVs doing dynamic processing which gives different readings.

Generally a peak brightness setting won't affect the image in any majorly visible way, whether you ask the game to deal with it or your TV does the same thing, but the effect will not stack per se.

The only visible artefacts you would notice if you went too high might be some white clipping on occasion.

If in doubt, max it out™
 
Oct 25, 2017
13,246
Some of the confusion is down to games getting it wrong and TVs doing dynamic processing which gives different readings.

Generally a peak brightness setting won't affect the image in any majorly visible way, whether you ask the game to deal with it or your TV does the same thing, but the effect will not stack per se.

The only visible artefacts you would notice if you went too high might be some white clipping on occasion.

If in doubt, max it out™

That helps a lot, thanks!
 

Kyle Cross

Member
Oct 25, 2017
8,431
Pretty much this^


If your TV typically wants 4000nit content (as is the case for many LG displays with game consoles) then you want to try and give it that.

This *should* be reflected as the approximate point where a logo disappears in RDR2 or other games that have a labelled option.
However if you have changed brightness, contrast settings, gamma etc, then you may find you typically settle on a slightly different figure as you will be changing what your TV is doing with the input.

I think in the early days I was less aware of the way certain displays were tracking in a slightly strange fashion (hence the earlier 1000nit recommendations).


As for the reason why you might get a more poppy image with a 1000nit PQ curve:
Let's say your display can do 800 nits. It displays everything accurately up to, say, 600 nits. After that point, the difference between 600 and 1000 nit inputs is distributed across only the 200 nits of actual brightness your display has left to give.

So something that was intended to be displayed at 1000 nits gets pulled down to an actual value of 800 nits (the display's max).

Now consider that, generally, the higher the input value, the less common it will be.

So for a display that uses a 4000nit tone curve, in order to see your display's maximum capability you have to feed it 4000nit pixels, which are rarer still.
For a game that is internally capped at a 1000nit output, you might only ever see just over 600 nits, and the last 200 nits your display has to give go to waste.

This is where the linkers come in handy, as you can change the behaviour of the display and tell it to use a 1000nit curve.
Well for example you just recommended 1000nits for Sea of Thieves, not 4000. So I'm still confused.

Also with a Linker should I be feeding 1000 nits (PC metadata) or 4000 nits (The Great Gatsby)?
 

Robdraggoo

User requested ban
Banned
Oct 25, 2017
2,455
Does anyone have experience with the Samsung Q6FN? HDR doesn't look that great and I'm not sure if it's the TV or me not "getting" HDR.
 

EsqBob

Member
Nov 7, 2017
241
Pretty much this^


If your TV typically wants 4000nit content (as is the case for many LG displays with game consoles) then you want to try and give it that.

This *should* be reflected as the approximate point where a logo disappears in RDR2 or other games that have a labelled option.
However if you have changed brightness, contrast settings, gamma etc, then you may find you typically settle on a slightly different figure as you will be changing what your TV is doing with the input.

I think in the early days I was less aware of the way certain displays were tracking in a slightly strange fashion (hence the earlier 1000nit recommendations).


As for the reason why you might get a more poppy image with a 1000nit PQ curve:
Let's say your display can do 800 nits. It displays everything accurately up to, say, 600 nits. After that point, the difference between 600 and 1000 nit inputs is distributed across only the 200 nits of actual brightness your display has left to give.

So something that was intended to be displayed at 1000 nits gets pulled down to an actual value of 800 nits (the display's max).

Now consider that, generally, the higher the input value, the less common it will be.

So for a display that uses a 4000nit tone curve, in order to see your display's maximum capability you have to feed it 4000nit pixels, which are rarer still.
For a game that is internally capped at a 1000nit output, you might only ever see just over 600 nits, and the last 200 nits your display has to give go to waste.

This is where the linkers come in handy, as you can change the behaviour of the display and tell it to use a 1000nit curve.

Should we put 4000 nits in PC games too when using LG OLEDs?
 
OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
Should we put 4000 nits in PC games too when using LG OLEDs?

So on PC, metadata is actually sent; as far as I know Nvidia sends 1000nit MaxCLL metadata (I'm not sure what AMD cards do), so you may find that you get different results (which may solve some problems).
Hitman 2 has HDR support and a free demo. Its in-game peak brightness adjustment allows for up to 2000 nits, so you should easily be able to see whether you get the same behaviour there.
 
OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
Does anyone have experience with the Samsung Q6FN? HDR doesn't look that great and I'm not sure if it's the TV or me not "getting" HDR.

Those displays support HDR rather than doing it particularly well (as they have very limited local dimming).
I'd imagine games that have high average brightness (due to half the screen being sky), and things that aren't so reliant on black detail, will probably still look good.
Sea of Thieves, Forza H4, Horizon ZD and even Assassin's Creed will probably still look good.
Just make sure you have your brightness set correctly on your display, as sometimes out of the box they can raise blacks a little, beyond just the backlight causing an issue.
 

Robdraggoo

User requested ban
Banned
Oct 25, 2017
2,455
Those displays support HDR rather than doing it particularly well (as they have very limited local dimming).
I'd imagine games that have high average brightness (due to half the screen being sky), and things that aren't so reliant on black detail, will probably still look good.
Sea of Thieves, Forza H4, Horizon ZD and even Assassin's Creed will probably still look good.
Just make sure you have your brightness set correctly on your display, as sometimes out of the box they can raise blacks a little, beyond just the backlight causing an issue.
Thank you very much
 
OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
Well for example you just recommended 1000nits for Sea of Thieves, not 4000. So I'm still confused.

Also with a Linker should I be feeding 1000 nits (PC metadata) or 4000 nits (The Great Gatsby)?

Ah, Sea of Thieves.
That one does a non-standard adjustment which affects the HDR and makes everything in the image brighter.
Have a look at it at 4000nits and see if:
A: there is no white clipping on the horizon
B: night time still looks dark (you can make night time almost daylight if you adjust that too far)

If it looks ok, it looks ok.
If it doesn't, then this is perhaps an example where being able to inject 1000 and set the game to 1000 would give you the most optimal and most "correct" (if there is such a thing) image.
 

MrBenchmark

Member
Dec 8, 2017
2,034
I've spent the last couple of weeks investigating the HDR output of various games.
When it comes to SDR video, you may be familiar with RGB values of 0-255, with 0 representing the colour black and 255 representing white (if you've ever used a colour picker in MS Paint or a word processor, you may have seen this type of number).
HDR10 / Dolby Vision is a little bit different, not just because it uses a scale of 0-1023, but because each of these data values represents not just black to white (or colour) but also a measure of luminance in nits, which is the intensity of the light (how bright it is).

Unlike previous video formats, these values are defined and absolute. A value of 0 will always represent no light at all (total black), a value of 1023 will always represent 10000 nits of luminance, and a value of 769 will always represent 1000 nits.

So if a modern HDR TV is fed these values, it should output exactly the amount of light described by the value given.
HDR10 and Dolby Vision both use this system, and can be referred to as PQ-based HDR.
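For reference, the PQ (SMPTE ST 2084) curve that maps those code values to absolute nits can be written down in a few lines; here's a small Python sketch that reproduces the 0 / 769 / 1023 examples above.

```python
# Minimal sketch of the PQ (SMPTE ST 2084) EOTF: 10-bit code value -> absolute nits.
# Constants are the ones defined in the ST 2084 spec.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_code_to_nits(code: int, bit_depth: int = 10) -> float:
    """Convert a full-range PQ code value to absolute luminance in cd/m2 (nits)."""
    e = code / (2 ** bit_depth - 1)            # normalise to 0..1
    ep = e ** (1 / M2)
    y = max(ep - C1, 0.0) / (C2 - C3 * ep)     # normalised luminance, 0..1
    return 10000.0 * y ** (1 / M1)

for code in (0, 769, 1023):
    print(code, "->", round(pq_code_to_nits(code)))   # 0 -> 0, 769 -> ~1000, 1023 -> 10000
```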

Now as it stands, there aren't many TVs that get to the heady heights of 10k nits; you are lucky if you can get one that goes above 1500 at the moment.
When the signal being received goes beyond the hardware capability of the display, the TV chooses how it handles this: most manufacturers simply clip the white values above a level chosen by them. They may also choose a soft roll-off and try to make the shift into the clipped values less obvious.
To help with this, when content is mastered/produced for HDR10 and Dolby Vision, some additional information about the image content is specified in the form of metadata. This metadata usually says what the most intensely bright value that will be seen in the game (or movie) is and what the average luminance is across all of the content. These values are defined by the display the content was mastered on.
Most UHD content is currently being mastered for 1000nit screens or 4000nit screens.

The purpose of this metadata is so an SDR image (or something in between SDR and HDR) can be derived from the original HDR content in the event that the content is viewed on a display that does not reach the peak luma of the display that the content was mastered on.

So if your movie has been mastered on a 1000nit reference display and you are using an OLED screen with a 650nit max output, the TV can use this metadata to decide how best to display the information that can't otherwise be shown due to hardware limitations.
Once you are using a display that meets or exceeds the peak brightness of the content, the metadata becomes irrelevant.
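As a toy illustration of that clip/roll-off choice (a completely made-up curve, not any manufacturer's real algorithm): a display can track the signal up to some knee point, roll off towards its own peak using the content peak from the metadata, and skip the whole thing once its peak meets or exceeds what the content was mastered to.

```python
# Toy sketch of metadata-driven tone mapping (not a real TV's algorithm): track the
# signal up to a knee, softly roll off towards the panel's peak using the mastering
# peak from the metadata, and do nothing at all once the panel can cover the content.
def tone_map(scene_nits: float, panel_peak: float, content_peak: float) -> float:
    if panel_peak >= content_peak:
        return scene_nits                              # metadata is irrelevant here
    knee = 0.75 * panel_peak                           # faithful up to 75% of panel capability
    if scene_nits <= knee:
        return scene_nits
    t = min((scene_nits - knee) / (content_peak - knee), 1.0)
    return knee + (panel_peak - knee) * (1 - (1 - t) ** 2)   # ease out, clip past content peak

# A 1000 nit mastered film on a 650 nit OLED vs. a 1500 nit LCD:
print(round(tone_map(1000, panel_peak=650, content_peak=1000)))    # 650
print(round(tone_map(1000, panel_peak=1500, content_peak=1000)))   # 1000
```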


So with this in mind I've been looking at how games have been mastered, what options they present to the user to adjust the image, what those options actually do, and what the relationship is between these things and how the HDR looks.
Videogames have a big advantage over movies in that the image is generated in real time, so the image can be adjusted at will.

Due to the nature of HDR content, this is actually really easy to measure: all we need is an un-tonemapped screenshot or video capture, and from this we can look at the code values that have been used in various parts of the image.
We can see whether the game is actually outputting anything that is black (or whether a cinematic grade has been applied with raised blacks), and we can also see the very brightest value that the game is going to use to represent something like the sun.
So I've been looking at the make-up of various different games on Xbox (which allows for HDR screenshot output) to try and understand what the in-game adjustments actually do and how I should be using them to ensure that I get the best from my display.

The goal of HDR is to transport more information to a display, so that more intense light can be displayed in the parts of the image where it is required; typically you'll see these brightest points in specular highlights, explosions and the sun.

Let's have a look at a few really good examples of in-game HDR. They all have slightly different settings and different approaches to how the tone mapping is going to be performed.

In order to better visualise the output of the game in a non-HDR10 / SDR format, I've created a method of producing maps of the luminance.
Using this scale you can quickly get an overview of what is dark, what is light and what is really intensely bright.


WvSncK2.png

The vast bulk of what we see is going to sit between 0-150 nits; anything above that is the "extra" luma that HDR offers.
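The idea behind those maps is simple enough that a rough version fits in a few lines: decode each pixel's PQ code value back to nits and bucket it into bands. This assumes you already have the capture decoded into a numpy array of 10-bit PQ code values (one value per pixel, e.g. the brightest of R/G/B); the band edges and colours here are only illustrative, not the exact scale used for the shots below.

```python
# Rough sketch of a luminance map: PQ-decode each pixel to nits, then bucket into
# bands and paint a false-colour image. Assumes `frame_codes` is a numpy array of
# 10-bit PQ code values, one per pixel (how you get that depends on your capture tool).
import numpy as np

def pq_to_nits(code, bit_depth=10):
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    ep = (code / (2 ** bit_depth - 1)) ** (1 / m2)
    return 10000.0 * (np.maximum(ep - c1, 0) / (c2 - c3 * ep)) ** (1 / m1)

BAND_EDGES = [150, 400, 1000, 4000]                  # nit thresholds between bands
BAND_COLOURS = np.array([[40, 40, 40],               # 0-150: near-SDR, dark grey
                         [0, 0, 255],                # 150-400: blue
                         [0, 255, 0],                # 400-1000: green
                         [255, 255, 0],              # 1000-4000: yellow
                         [255, 0, 255]],             # 4000+: magenta
                        dtype=np.uint8)

def luminance_map(frame_codes: np.ndarray) -> np.ndarray:
    """Return an RGB false-colour image showing which nit band each pixel falls into."""
    nits = pq_to_nits(frame_codes.astype(np.float64))
    bands = np.digitize(nits, BAND_EDGES)            # 0..4 per pixel
    return BAND_COLOURS[bands]
```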


Star Wars : Battlefront 2
Actually, all of the HDR compatible Frostbite games I looked at (Battlefield 1, Mass Effect) use the same setup.
The metadata output will be 10,000 nits and the actual tone mapping is performed by the game via the HDR slider, with 0 nits being the leftmost value and 10k nits being the rightmost value.

AbcrCaO.jpg


As we can see, the sun itself is outputting at 10k nits, things that should be totally dark are as they should be, and the specular highlights reflected on the top of the gun are also hitting between 4000 and 10000 nits.
The DICE Frostbite games are actually really interesting in that you can turn the HDR slider down to 1 click from the left, which will give you 100/200 nits depending on which game it is, essentially toning the game to SDR. This gives you a really nice way to see where your fancy new TV is showing off.

Another fun thing you can do with the DICE games is to move the slider to 0% and literally turn off the lights; you can see just how real-time the lighting is as you tell the engine that the brightest any source of light can be is 0.

Rise of the Tomb Raider
So this game has a similar setup: a brightness slider which controls the black point (move it to the lowest point) and then a secondary HDR slider which controls the peak brightness.
Tomb Raider has been capped at 4000nits. Like the Frostbite games, either set the slider to the max to let your TV tone map, or follow the on-screen instructions to try to eyeball the peak brightness and let the game handle it.

RotTR is particularly great as there are loads of specular highlights, not just in the normal places you'd expect to see them in real life, such as on the shiny ice and the twinkles in the snow as they reflect the sun.
4ymuu1o.jpg


But also in lower light conditions on less obviously "shiny" surfaces, such as on this insanely high resolution boot.
TNdscjx.jpg


It appears that once the output reaches 4000nits, any level above this jumps straight to 10k nits (which the display will clip anyway, as presumably the metadata is telling the display it's 4000nit content).
It's not that the data is missing (it can occasionally be brought into visibility), which suggests that it is an artistic decision or part of the process of grading the game for HDR.
zMZVDn7.jpg



Assassin's Creed Origins

AC: Origins is another game that really does HDR well. Like Tomb Raider it is also capped at 4000nits (the pink bits) and offers a brightness slider, which should be set as low as possible based upon your viewing environment. It also offers a Max Luminance setting, which is neatly labelled in nits.
There is also a "Paper white" scale, so as well as sliders that dictate the darkest something can be in game and the brightest it can be, the game gives you a scale that allows you to adjust one of the mid points: how bright a piece of paper is.

servlet.ImageServer


Ubisoft's recommendation for the paper white slider is

However, like the brightness slider, this is here to allow you to adjust the output of the game to match your viewing conditions. If you are in a cinema-like controlled lighting environment you would technically set it to 80 nits, but as your surrounding light increases you will prefer higher values.

gVRFT4k.jpg


Setting the games to the technically correct settings also highlights how HDR10/Dolby Vision is too dim for many consumers. You'll also see how developers are still getting to grips with these new technologies: the HUD in AC becomes a little too dark as the main game image becomes calibrated correctly.


Moving on, we can see 3 Microsoft 1st party offerings which are all working with a full 10000nit output.

Forza Horizon 3

A really great example of how HDR doesn't need to be used just for crazy bright sun-spots, magic and fires.
Cloudy days still have very bright skies, and photographers have had to use a few tricks to deal with the contrast issues they cause. Forza really shines here; it's almost enough to make me feel the sensation of cold and drabness I would get viewing it with my own eyes.

You can see the sky reaching 4000 or so nits, the tyres and grille being suitably unlit and dark, and then the headlights pumping out a full 10k nit output.
RHItYfa.jpg


In a night time environment you can see the game making full use of the darker end of the scale, whilst the explosions and headlights are still illuminating in a realistic fashion.

EzvkDtP.jpg

This is also a good example, even in SDR, of how a very dark image which adheres to the correct HDR10 standards will be perceived as being too dark or dim or "crushing blacks". As we can see from the luma map, the detail is all actually there, but human eyes cannot adjust to see such details until they are exposed to low levels of light for 10 minutes or so, when certain chemical reactions occur within the cells of the eyes. This obviously is an issue for many consumers who probably aren't in lighting conditions conducive to this happening.

Gears of War 4
VUJV9lx.jpg


INSECTS
Me6YYCD.jpg


All 3 games work with 2 sliders:
A brightness slider, which controls black levels but also controls an aspect of contrast, pushing the max nit output up to 10,000.
Then a secondary HDR slider, which allows the max output to be set below whatever the other slider reaches.
Forza and Gears simply refer to these as Brightness and HDR, however INSECTS refers to them as HDR Contrast and HDR Brightness.

Now let's look at a few other implementations.

Shadow of Mordor
SoM offers a super simplified approach: the game is always outputting a 10k nit maximum and the tone mapping is left entirely to the TV. This is interesting, as we know that the developers have never actually seen the game outputting at 10k nits, since no such display exists.
There is a traditional brightness slider which allows users to specify the black point to their taste/viewing conditions.

Here we can see the obvious places to look for 10k nits: the sun and the specular highlights. You can also see that the shadowed side of the player character is as dark as it should be.
lDJUEgO.jpg


Agents of Mayhem
A similar approach here, except the game is capped at 1000nits and has obviously been graded/toned with this in mind, as that's a fairly achievable output from a consumer display. I don't think it's a coincidence that it's actually one of the games that has a really great looking HDR output.

m5MxA5i.jpg


dCfYAig.jpg


Again, like other games, adjust the slider to the left to improve the black level; this will adjust the overall brightness and impact slightly on the max brightness. Adjust to your lighting conditions/taste.

DEUS EX : Mankind Divided
Lots of raised blacks in this one, perhaps as an artistic choice, but there is also a bug which causes the grading to totally fail if the in-game brightness slider goes below a value of 35%.
Ov443zl.jpg


This looks like the result of some kind of flawed curve adjustment.

JP8Zqef.jpg

40-45% will give you 1000nit output without raising the blacks too much.

Final Fantasy XV
From one Square Enix meh to a Square Enix wow.
A 1000 nit fixed max output and a simple brightness slider to drop black levels.
Really fantastic grading throughout and in various lighting conditions.

jpKBjZw.jpg


xzvttv8.jpg


Even the title screen has 2D elements optimised for HDR.
7zGfh6M.jpg


Monster Hunter World
Much like Deus Ex, Monster Hunter World appears to operate within 4000nits; and also like Deus Ex, when HDR is enabled the game appears to have severe black level problems. At the default brightness setting, this is what we are getting.
ux3EHkT.jpg

aRiOgO6.jpg


All mids and highlights, where are the shadows?

So with a quick and dirty level amendment, we can remove the extra HDR luminance from the data and take a look at the histogram

61mOIOa.jpg


If we compare this to an in game SDR shot taken just moments later

g2thR9i.jpg


we can see the significant shift there is between the SDR and the HDR toning. Contrast and black levels are totally out of sorts in HDR.
This can be remedied slightly by dropping the brightness down as low as it goes, but there is not enough of a change to make it right. It appears as if this is at least partly caused by some kind of eye adaptation that is occurring.


Whew!
That was a lot of images!

So this explores one side of how games are made; what it doesn't explore is the metadata side of what the games are outputting. It wouldn't surprise me if there were titles that had mismatched metadata, but because the metadata is static, as soon as you make any image adjustments it would technically become wrong anyway.

Update 1:
Figured out how to run it on video

Horizon Zero Dawn : Frozen Wilds

Even from this short clip we can see how to do HDR right: almost everything you see sits within the standard SDR range, while the highlights on Aloy's weapons glimmering in the sun and the sparkle that runs down her back are heading towards the 10k nit level. You can see the clouds are all highly illuminated and sit between 1000-4000nits, with the sun itself hitting 10k.


That is hitting upwards of 4000nits from what I can see.

Uncharted 4

Got some HDR footage of Uncharted 4 too, another one that is going over 4000nits towards 10k


Nice work! I'm really curious about Destiny 2 as well. I play it a ton and the HDR in that game is odd and the settings are tricky; with HDR on, dark areas like caves go super dark and you can't see shit.
 

Kyle Cross

Member
Oct 25, 2017
8,431
Ah, Sea of Thieves.
That one does a non-standard adjustment which affects the HDR and makes everything in the image brighter.
Have a look at it at 4000nits and see if:
A: there is no white clipping on the horizon
B: night time still looks dark (you can make night time almost daylight if you adjust that too far)

If it looks ok, it looks ok.
If it doesn't, then this is perhaps an example where being able to inject 1000 and set the game to 1000 would give you the most optimal and most "correct" (if there is such a thing) image.
So SoT has a weird implementation of its nits slider, and most games should be set to 4000?

Does that mean I should inject Great Gatsby metadata (which I think is 4000 nits) into every game, or 1000 nits? PC is 1000 nits so I don't know which I should do.
 
OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684

Lots of games treat 400-500 nits as a very similar level of dynamic range to the SDR mode, so if you are playing a game that allows you to drop it down (without causing any immediately noticeable problems) you may find that this will prevent the TV from pushing the backlight too hard (which will grey up the blacks), but you'll still get some of the benefit of a 10bit output (less banding) and little gains in colour gamut which you wouldn't have in SDR mode.
 
OP
EvilBoris
Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
So SoT has a weird implementation of its nits slider, and most games should be set to 4000?

Does that mean I should inject Great Gatsby metadata (which I think is 4000 nits) into every game, or 1000 nits? PC is 1000 nits so I don't know which I should do.

I think I would go for the lower 1000nit value as it is closer to the true peak of your display.

If you use 4000nit data on a game which is internally capped at 1000 or 2000nits, then you will still have an issue with perhaps losing out a little.

1000/2000/4000 are the most common that I see as fixed peak caps.
Even games with a 4000nit cap may only go to 2500 in the game engine for very specific high intensity light sources.

The only time I would switch out for a 4000nit game is for some of the extremely bright titles (God of War, Metro Exodus both spring to mind)

Ultimately you want to preserve the intensity of brightness for the things you see most often.
 

Kyle Cross

Member
Oct 25, 2017
8,431
I think I would go for the lower 1000nit value as it is closer to the true peak of your display.

If you use 4000nit data on a game which is internally capped at 1000 or 2000nits, then you will still have an issue with perhaps losing out a little.

1000/2000/4000 are the most common that I see as fixed peak caps.
Even games with a 4000nit cap may only go to 2500 in the game engine for very specific high intensity light sources.

The only time I would switch out for a 4000nit game is for some of the extremely bright titles (God of War, Metro Exodus both spring to mind)

Ultimately you want to preserve the intensity of brightness for the things you see most often.
Okay so to sum it up for most games:

In-game: 4000 nits
HDFury Linker: 1000 nits

Right?

I guess my last question is paper white. This is a setting I've never wrapped my head around. Is there an ideal setting of it for a B7?
 

SirMossyBloke

Member
Oct 26, 2017
5,855

I have the same TV but calibrated; it supports a max of 800 nits, which isn't bad, but that's what you want to set your games to. I've found most games you can set to 1000 no problem, but make sure your TV is calibrated as best as possible and you should still get some good black levels out of it. I haven't found a game that doesn't show a big difference yet, but remember these TVs are meant for darker rooms.
 

Robdraggoo

User requested ban
Banned
Oct 25, 2017
2,455
Lots of games treat 400-500 nits as a very similar level of dynamic range to the SDR mode, so if you are playing a game that allows you to drop it down (without causing any immediately noticeable problems) you may find that this will prevent the TV from pushing the backlight too hard (which will grey up the blacks), but you'll still get some of the benefit of a 10bit output (less banding) and little gains in colour gamut which you wouldn't have in SDR mode.
I have the same TV but calibrated; it supports a max of 800 nits, which isn't bad, but that's what you want to set your games to. I've found most games you can set to 1000 no problem, but make sure your TV is calibrated as best as possible and you should still get some good black levels out of it. I haven't found a game that doesn't show a big difference yet, but remember these TVs are meant for darker rooms.
Would you mind sharing your settings with me? Regular tv settings?
 
Dec 13, 2018
46
Well for example you just recommended 1000nits for Sea of Thieves, not 4000. So I'm still confused.

Also with a Linker should I be feeding 1000 nits (PC metadata) or 4000 nits (The Great Gatsby)?
As a side note, in the case of Great Gatsby metadata being sent to a B7/C7, the TV doesn't take the 4000nit max luminance into account. The B7/C7 actually looks at the MaxCLL value (603 nits) and applies little to no tone mapping (as these OLEDs have a peak brightness above 603 nits).

It has been quite confusing for me, trying to figure out how the 2017 OLEDs on the newest firmware tone map given the metadata being sent. Sometimes they look at max luminance (4000 or 1000 when injecting metadata), whereas other times, in the case of Great Gatsby, they look at MaxCLL. Which is why the Great Gatsby metadata will provide the brightest image, or close to the brightest image, when sent to your 2017 OLED in game mode using the HDFury Linker.
 

Kyle Cross

Member
Oct 25, 2017
8,431
As a side note, in the case of Great Gatsby metadata being sent to a B7/C7, the TV doesn't take the 4000nit max luminance into account. The B7/C7 actually looks at the MaxCLL value (603 nits) and applies little to no tone mapping (as these OLEDs have a peak brightness above 603 nits).

It has been quite confusing for me, trying to figure out how the 2017 OLEDs on the newest firmware tone map given the metadata being sent. Sometimes they look at max luminance (4000 or 1000 when injecting metadata), whereas other times, in the case of Great Gatsby, they look at MaxCLL. Which is why the Great Gatsby metadata will provide the brightest image, or close to the brightest image, when sent to your 2017 OLED in game mode using the HDFury Linker.
So Great Gatsby causes the B7's tonemapping to basically turn off, or am I misunderstanding?
 
Dec 13, 2018
46
So Great Gatsby causes the B7's tonemapping to basically turn off, or am I misunderstanding?
You got it. It "should" at least. I have noticed it is still a bit dimmer than, say, turning on active HDR in other modes outside of game. And much dimmer than the standard HDR mode. (I believe standard mode really disables all ton mapping, at least that has been what I've read.)

You can see where my confusion is coming from doing tests with the linker in game mode.
 
Oct 27, 2017
1,387
You got it. It "should" at least. I have noticed it is still a bit dimmer than, say, turning on active HDR in other modes outside of game. And much dimmer than the standard HDR mode. (I believe standard mode really disables all ton mapping, at least that has been what I've read.)

You can see where my confusion is coming from doing tests with the linker in game mode.
If this is true, the overall picture will be brighter, but you'll be clipping detail above 603 nits unless you are able to set the in-game nits to around ~600 nits.
 
Dec 13, 2018
46
So Great Gatsby causes the B7's tonemapping to basically turn off, or am I misunderstanding?
I would strongly recommend using the Great Gatsby metadata for the brightest picture, and something like Bourne Identity (1000 nit max luminance) to retain a bit more highlight detail in games that don't allow peak brightness calibration.

I have actually set it to Bourne Identity / Legacy as I just wanted to set it and forget it.
 
Dec 13, 2018
46
If this is true, the overall picture will be brighter, but you'll be clipping detail above 603 nits unless you are able to set the in-game nits to around ~600 nits.
Still looks to retain highlights up to 1000 nits using this mode, which added to my confusion even more.

The Bourne Identity / Legacy metadata retains highlights up to about 2000.

(When I say retain highlights, that is using an in-game calibrator, like the new RDR2 calibrator, to see when the R* logo disappears.)
 

kanuuna

Member
Oct 26, 2017
726
So SoT has a weird implementation of its nits slider, and most games should be set to 4000?

Does that mean I should inject Great Gatsby metadata (which I think is 4000 nits) into every game, or 1000 nits? PC is 1000 nits so I don't know which I should do.

The Great Gatsby metadata actually corresponds to a max luminance value of 603 (MaxCLL, which is the only value the 7 series are taking into account while tonemapping) being sent to the TV. While that nets you a fairly average brightness, it probably clips at times when your games are set to 1000 nits.
I'm not overly familiar with the PC metadata you're referring to, but if it's what I think it is, it's probably brighter than that (a more linear curve - brighter, but clips more).

Not to dissuade you from using either one, since what works or doesn't largely depends on the room you're in - ambient light and all that.
 

ZmillA

Member
Oct 27, 2017
2,163
No, dynamic tone mapping in game mode should solve all brightness issues that you hear 2017 OLED owners talking about. Also, in game mode on 8 series even without dyn tone mapping enabled, it provides a brighter picture. Vincent shows the different eotf curves in the c8 vs b7 comparison video, I believe.

The reason, at least for me, that a linker was purchased was to improve the dim tone mapping on my C7. Your B8 shouldn't have an issue and you should not need a linker.

Thanks
 

Kyle Cross

Member
Oct 25, 2017
8,431
The Great Gatsby metadata actually corresponds to a max luminance value of 603 (MaxCLL, which is the only value the 7 series are taking into account while tonemapping) being sent to the TV. While that nets you a fairly average brightness, it probably clips at times when your games are set to 1000 nits.
I'm not overly familiar with the PC metadata you're referring to, but if it's what I think it is, it's probably brighter than that (a more linear curve - brighter, but clips more).

Not to dissuade you from using either one, since what works or doesn't largely depends on the room you're in - ambient light and all that.
Actually the Great Gatsby metadata is significantly brighter than the PC metadata. In the past I've been recommended that 1000 PC metadata for bright HDR games, and Great Gatsby for dim HDR games.

It's the PC metadata that's provided in the Linker FAQ here and on AVForums.
 

Deleted member 35071

User requested account closure
Banned
Dec 1, 2017
1,656
My TV is old and crappy by this point: a Sony X800D. But I for the life of me can't find a good HDR setting for RDR2. I can make it look decent for daytime, but the game looks awful to me at night.

And the little logo boxes when calibrating don't even work for my TV, I guess. I can't see the white one at all, and I can't make the black one disappear.

I assumed that because my TV is a paltry 380 nits or something (I googled), leaving it at the default 500 is the way to go? But I don't know. And setting the paper white to anything below about 190 is too dark for me.
 

kanuuna

Member
Oct 26, 2017
726
Actually the Great Gatsby metadata is significantly brighter than the PC metadata. In the past I've been recommended that 1000 PC metadata for bright HDR games, and Great Gatsby for dim HDR games.

It's the PC metadata that's provided in the Linker FAQ here and on AVForums.

You're right. I must've recalled something else. That one looks to be 1000 nits.
 

Spacejaws

One Winged Slayer
Member
Oct 27, 2017
7,814
Scotland
I've given up on RDR2 on my TV. The closest I can get to it looking natural is game mode, paper white at 200-ish and peak brightness at 900-1000. Paper white is the real fuck up here: no matter what I do, white either looks dull/greyish or vivid, and all the black contrast is wrong. It's really noticeable on the weapon wheel. Starting to think my TV's implementation of HDR might be bust.

Put it on for Mortal Kombat and it looks kinda better? Hard to tell, definitely more colours, but again I feel like the contrast is off and that some of the black has reddish tints to it. It was a cheap 4K TV so I never really expected much out of it for HDR. I guess there's next time :/
 

impact

Banned
Oct 26, 2017
5,380
Tampa
My TV is old and crappy by this point: a Sony X800D. But I for the life of me can't find a good HDR setting for RDR2. I can make it look decent for daytime, but the game looks awful to me at night.

And the little logo boxes when calibrating don't even work for my TV, I guess. I can't see the white one at all, and I can't make the black one disappear.

I assumed that because my TV is a paltry 380 nits or something (I googled), leaving it at the default 500 is the way to go? But I don't know. And setting the paper white to anything below about 190 is too dark for me.
On that TV you're probably gonna get better picture quality from SDR. I used to own it but quickly realized my mistake since I wanted an HDR TV.
 

RedshirtRig

Member
Nov 14, 2017
958
Curious what settings you all have your TVs on for games. Got a Samsung 4K TV (sorry, exact model unknown). Set it on movie picture mode, which seems to be the mode people on YouTube who know more than me use.