LG 2017 OLEDs - Calibrated Settings for Xbox One X / webOS (SDR/HDR/DV)

SublimeAnarky

Member
Oct 27, 2017
384
P40L0 for someone who is technically ignorant, what's the difference between options 1 and 2?

Is there an image or performance difference between the two?
 
OP
P40L0

Member
Jun 12, 2018
820
Italy
Hi P40L0. I was wondering why it is preferable to set up the Xbox One X in 8-bit in Game Mode instead of 10-bit?
In Option 1 this basically lets the X1X just "pass" the native content's Chroma + Color Depth to the TV, without intermediate reconversions, so SDR will output 8-bit, HDR 10-bit and DV 12-bit (and the TV will then handle the signal for its 10-bit panel).

Input lag is higher that way in PC Mode + HDR Standard, so in Option 2, enabling 10-bit offloads that signal processing from the TV: the X1X upconverts or downconverts everything to 10-bit before passing the final signal to the TV. This seems to fix the higher input lag in PC Mode + HDR Standard.
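To make the two paths concrete, here is a toy sketch (function name and structure are mine, purely illustrative, not real console code) of what bit depth reaches the TV in each case:

```python
# Toy model of the two signal paths (illustrative only, not real console code).

def output_bit_depth(content: str, option: int) -> int:
    """Bit depth of the HDMI signal the TV receives for a given content type."""
    native = {"SDR": 8, "HDR10": 10, "DolbyVision": 12}
    if option == 1:
        # Option 1 (pass-through): the X1X forwards the content's native
        # depth; the TV then maps the signal onto its 10-bit panel.
        return native[content]
    if option == 2:
        # Option 2: the X1X converts everything to 10-bit first, so the
        # TV receives a uniform signal and skips that conversion work.
        return 10
    raise ValueError("option must be 1 or 2")
```

So under Option 1 the TV does the final conversion work; under Option 2 the console does it up front.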
 
Last edited:
OP
P40L0

Member
Jun 12, 2018
820
Italy
P40L0 for someone who is technically ignorant, what's the difference between options 1 and 2?

Is there an image or performance difference between the two?
Option 2 is more accurate in SDR and brighter, but slightly more clipped in HDR (some people also report higher input lag on PS4 Pro and on PC, while on X1X this can be fixed with the correct video settings as suggested).

Option 1 is less accurate in both SDR and HDR, but with less highlight clipping (and input lag is always consistently low in both, though).

This is basically why two preset alternatives exist, based on device and preference.
Personally, I currently prefer Option 2 on X1X. ;)
 
Last edited:
OP
P40L0

Member
Jun 12, 2018
820
Italy
I'm quite skeptical that it's really changed since you last looked at it. How would it have had to change to make it accurate?
Don't know, but you can just apply both calibrated settings, HDR Technicolor Expert (with Active HDR) and HDR Standard, swap between the two while watching a 4K/HDR movie, and see that accuracy is the same.

HDR Standard is just even brighter, and tends to clip higher-nit highlights more.
 
Last edited:

SublimeAnarky

Member
Oct 27, 2017
384
Option 2 is more accurate in SDR and brighter, but slightly more clipped in HDR (some people also report higher input lag on PS4 Pro and on PC, while on X1X this can be fixed with the correct video settings as suggested).

Option 1 is less accurate in both SDR and HDR, but with less highlight clipping (and input lag is always consistently low in both, though).

This is basically why two preset alternatives exist, based on device and preference.
Personally, I currently prefer Option 2 on X1X. ;)
Thanks. I'll give option 2 a spin.

When you say 'highlight clipping', what do you mean? Fidelity at high brightness?
 

Manac0r

Member
Oct 30, 2017
231
UK
Quite a lot lol, but I'll have a look at the C8 later and see how good I can get that, it'll be very similar I'm sure, in terms of whites/colour/gamma etc.
Thanks, that would be much appreciated. After playing around with Option 1 and injected metadata, and then Option 2 without any injection, the picture seems somewhat comparable by eye. With all the subtle tweaks it is not as easy as flicking between presets, so it might be placebo.

I think Option 2 is definitely a viable option, especially for those not using any injection methods.

As always, will look out for your posts and thoughts. Good looking out.

Edit:

In Option 1 with injection, RDR2 can go above 2,000 nits before clipping; however, with Option 2 clipping starts at 1,000 nits, which is in line with what P40L0 is saying.
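As a rough picture of what "clipping starts at X nits" means, here is a minimal hard-clip sketch (the function and the example numbers are mine, purely for illustration): everything above the clip point collapses to the same output level, so highlight detail there is lost.

```python
# Minimal illustration of highlight clipping (numbers are examples only).

def displayed_nits(scene_nits: float, clip_point: float) -> float:
    """Naive hard-clip: scene luminance above clip_point is flattened."""
    return min(scene_nits, clip_point)

# With a 1,000-nit clip point (as reported for Option 2), a 1,200-nit and
# an 1,800-nit highlight land on the same value, so their detail is gone.
# With a ~2,000-nit clip point (Option 1 + injection) they stay distinct.
```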
 
Last edited:

burgerdog

Member
Oct 27, 2017
2,128
I tried Option 2 in HDR last night and it's unplayable. The latency is way too high. Anyone who thinks it's fine needs to boot up Shadow of the Tomb Raider and try it out. It feels awful. Other games felt a little better, but it's not as responsive as HDR Game.
 

charly.be

Member
May 18, 2019
7
I tried Option 2 in HDR last night and it's unplayable. The latency is way too high. Anyone who thinks it's fine needs to boot up Shadow of the Tomb Raider and try it out. It feels awful. Other games felt a little better, but it's not as responsive as HDR Game.
Shadow Of The Tomb Raider already has a ton of input lag, so it doesn't take too much to make it unplayable...
 

charly.be

Member
May 18, 2019
7
In Option 1 this basically lets the X1X just "pass" the native content's Chroma + Color Depth to the TV, without intermediate reconversions, so SDR will output 8-bit, HDR 10-bit and DV 12-bit (and the TV will then handle the signal for its 10-bit panel).
So would it be better for me to check "10-bit" instead of 8 if I play in HDR with Option 1 (Game Mode) on Xbox One X?
 

DOTDASHDOT

Member
Oct 26, 2017
2,230
Thank you for that. I look forward to your report.
Thanks, that would be much appreciated. After playing around with Option 1 and injected metadata, and then Option 2 without any injection, the picture seems somewhat comparable by eye. With all the subtle tweaks it is not as easy as flicking between presets, so it might be placebo.

I think Option 2 is definitely a viable option, especially for those not using any injection methods.

As always, will look out for your posts and thoughts. Good looking out.

Edit:

In Option 1 with injection, RDR2 can go above 2,000 nits before clipping; however, with Option 2 clipping starts at 1,000 nits, which is in line with what P40L0 is saying.
Had a good go at it on my C8 with DTM off. Obviously it's really hard for me to say how much brighter it is than on a B7, as the C8 tracks a lot brighter without DTM vs the B7, but anyhow, PC Standard with a colour temp tweak to W40 and Colour 55 does look decent and comparable to HDR Game... BUT that lag :/
 
OP
P40L0

Member
Jun 12, 2018
820
Italy
Had a good go at it on my C8 with DTM off. Obviously it's really hard for me to say how much brighter it is than on a B7, as the C8 tracks a lot brighter without DTM vs the B7, but anyhow, PC Standard with a colour temp tweak to W40 and Colour 55 does look decent and comparable to HDR Game... BUT that lag :/
Did you try setting X1X Color Depth to 10-bit?
It seems pretty much identical to HDR Game response time after that to me (or the difference may be so subtle as to go unnoticed).
 

charly.be

Member
May 18, 2019
7
I just tried Option 2 in SOTR with 10-bit enabled and YCC disabled. PQ is great, but I can also feel a slight increase in input lag (and I'm playing in Performance Mode, which already has lower input lag than Resolution Mode).
 

charly.be

Member
May 18, 2019
7
Does anyone know if it's OK to change the Xbox video output parameters while a game is running in HDR, or should the game be restarted between tests?
 
Last edited:

charly.be

Member
May 18, 2019
7
Well, I have to thank you, P40L0, for finally making me able to enjoy HDR games on my C7, 1.5 years after buying it. I was so disappointed by Game Mode being so dim that I had abandoned HDR in games.
 
OP
P40L0

Member
Jun 12, 2018
820
Italy
Isn't the dimness problem of the 2017 OLEDs solved with dynamic tone mapping in the 2018 and newer OLEDs? I've got a B8 and I'm trying to figure out how applicable these settings are for me.
Apply the webOS HDR suggested settings to HDR Game, then disable Dynamic Contrast and enable Dynamic Tone Mapping, and you're good to go on the B8 as well.
 

Manac0r

Member
Oct 30, 2017
231
UK
Thank you. So, as I see it:

Option 1: great PQ for those sensitive to lag, and even better with injected metadata.

Option 2: the brightest HDR you can get with minimal visual compromise (clipped highlight peaks); however, lag can be an issue for those more susceptible.

Looks like this thread caters to all tastes. Going with Option 1 and injected metadata of 1,000 nits myself, but Option 2 is the best alternative I have seen.

This is an outstanding thread. Thank you!!
 
Last edited:

Manac0r

Member
Oct 30, 2017
231
UK
In true Columbo style... Just one more thing...

Why is the Wide gamut option recommended for HDR, when Auto has usually been the option for less saturated and more accurate PQ?
 
OP
P40L0

Member
Jun 12, 2018
820
Italy
In true Columbo style... Just one more thing...

Why is the Wide gamut option recommended for HDR, when Auto has usually been the option for less saturated and more accurate PQ?
Because Wide is forced when using PC Mode + HDR Standard, but adjusting the Color and Temperature values can get VERY close to the Auto results.

Thank you. So, as I see it:

Option 1: great PQ for those sensitive to lag, and even better with injected metadata.

Option 2: the brightest HDR you can get with minimal visual compromise (clipped highlight peaks); however, lag can be an issue for those more susceptible.

Looks like this thread caters to all tastes. Going with Option 1 and injected metadata of 1,000 nits myself, but Option 2 is the best alternative I have seen.

This is an outstanding thread. Thank you!!
Enjoy ;)
 

Art Vandelay

Member
Oct 27, 2017
882
USA
Does anyone else have an issue loading the page now? I'm on iPhone, if that matters. Have tried Chrome and Safari, and resetting cache and cookies.
 

Sky87

Member
Oct 27, 2017
894
New firmware available for the LG C7V:

1. Improvement
1) To Improve color issues at HDMI 2160p 50Hz input

Seems pretty irrelevant for gaming usage, but there may be undocumented changes.
 
OP
P40L0

Member
Jun 12, 2018
820
Italy
New firmware available for the LG C7V:

1. Improvement
1) To Improve color issues at HDMI 2160p 50Hz input

Seems pretty irrelevant for gaming usage, but there may be undocumented changes.
Yeah, I've been on it since pre-release, but it should be relevant only for EU/PAL TVs and devices.
All the OP settings are still compatible with it.
 

zzzyz36

Member
Oct 23, 2018
90
Anyone know the best in-game HDR settings for RDR2 since the patch? You can now tweak peak brightness and something called paper white.
 

PowerK

Member
Oct 31, 2017
41
CALIBRATED IN-GAME HDR SUGGESTED SETTINGS / EXAMPLES:


Option 1:
  • AC Origins: 4.000 nits HDR Luminance, 120 Paper White, Brightness 1 tick left compared to default;
  • AC Odyssey: 4.000 nits HDR Luminance, 120 Paper White, Brightness at middle (Default);
  • Battlefield 1: Brightness at 50% (Default), 4.000 nits HDR Luminance
  • Battlefield V: Brightness at 50% (Default), 2.000 nits HDR Luminance (Max)
  • Forza Horizon 3: Brightness 50 (Default), HDR Luminance slider to the Max
  • Forza Horizon 4: 4.000 nits HDR Luminance, Brightness 50 (Default)
  • Forza Motorsport 7: Brightness 50 (Default), HDR slider to the Max
  • Halo: MCC: all at Defaults (1.000 HDR Luminance, 150 Paper White, Contrast 5)
  • Hellblade: Gamma raised 1 tick right;
  • Gears of War 4: Brightness at default, HDR Luminance 8 ticks from left to right
  • Red Dead Redemption 2: HDR Style: Game; Luminance: 4.000; White point: 120 NEW
  • Resident Evil 7: HDR Luminance 2.000 nits (its Max), HDR Brightness 320
  • Rise of the Tomb Raider: Brightness to the Middle and HDR Luminance to the Max
  • Shadow of the Tomb Raider: Video Mode: Resolution (for Native 4K), HDR: On, Brightness: maxed to the right; HDR Luminance slider: maxed to the right
Let me get this straight.
Usually, game titles provide an in-game instruction picture for adjusting black level, contrast, peak HDR luminance, etc. (I wish The Division 2 provided one, too).
Are you suggesting ignoring it and adjusting the sliders based on these numbers?

I ask because, for example, in Forza Horizon 4, according to the onscreen setting picture, peak white starts to clip around 1,500 nits on the HDR Luminance setting. Is it still recommended to raise it to 4,000 nits?
I understand that there's a possibility of the in-game onscreen picture/reference being wrong and misleading.
 
OP
P40L0

Member
Jun 12, 2018
820
Italy
Let me get this straight.
Usually, game titles provide an in-game instruction picture for adjusting black level, contrast, peak HDR luminance, etc. (I wish The Division 2 provided one, too).
Are you suggesting ignoring it and adjusting the sliders based on these numbers?

I ask because, for example, in Forza Horizon 4, according to the onscreen setting picture, peak white starts to clip around 1,500 nits on the HDR Luminance setting. Is it still recommended to raise it to 4,000 nits?
I understand that there's a possibility of the in-game onscreen picture/reference being wrong and misleading.
If you're using the Option 1 suggested general calibration, you won't hide the FH4 logo with only 1.500 nits, as you can still see it (or part of it) up to 3.500 nits.

In the OP you can find different in-game HDR values depending on whether you have applied the Option 1 or Option 2 calibration on the TV/console. Be sure not to mix the two.
 

PowerK

Member
Oct 31, 2017
41
If you're using the Option 1 suggested general calibration, you won't hide the FH4 logo with only 1.500 nits, as you can still see it (or part of it) up to 3.500 nits.

In the OP you can find different in-game HDR values depending on whether you have applied the Option 1 or Option 2 calibration on the TV/console. Be sure not to mix the two.
Understood. Perhaps I misremembered the nit number.
So the suggestion is 4,000 nits for FH4, but the reference picture logo disappears around 3,500 nits, as you say.
My question was: should one ignore the in-game reference picture?
 
OP
P40L0

Member
Jun 12, 2018
820
Italy
Understood. Perhaps I misremembered the nit number.
So the suggestion is 4,000 nits for FH4, but the reference picture logo disappears around 3,500 nits, as you say.
My question was: should one ignore the in-game reference picture?
You should never ignore it: if you get very close to the screen, you will still see parts of the logo at 3.500 nits, while 4.000 nits makes it vanish completely.

This is to say that, most of the time, the in-game HDR calibration results and the general suggestions match ;)
 

RoninChaos

Member
Oct 26, 2017
2,917
Hey P40L0, on the 2019 sets the gamma isn't set up how you have it. The options are 1.9, 2.2, 2.4 and BT.1886. Which should I be using?

Also, Warm 2 still looks way too cold. Warm 3 is what looks "right" to me, but I may be doing something wrong. Anyone have any suggestions?
 
Last edited:

Kyle Cross

Member
Oct 25, 2017
3,965
Hey P40L0, on the 2019 sets the gamma isn't set up how you have it. The options are 1.9, 2.2, 2.4 and BT.1886. Which should I be using?

Also, Warm 2 still looks way too cold. Warm 3 is what looks "right" to me, but I may be doing something wrong. Anyone have any suggestions?
2.2 is "Medium" and should be what most people use. If you're in a very dark room, you can go BT.1886. Warm 2 is usually the most accurate temperature. If it seems too cold/warm, you usually just need to give your eyes time to adjust to it.
 

PowerK

Member
Oct 31, 2017
41
Hey P40L0, on the 2019 sets the gamma isn't set up how you have it. The options are 1.9, 2.2, 2.4 and BT.1886. Which should I be using?
On an OLED, which can do perfect black, BT.1886 is exactly the same curve as gamma 2.4.
On an LCD, which can't do perfect black, BT.1886 is a slightly different curve, but more or less the same as gamma 2.4.
These are both for "dark room" viewing. You should think of BT.1886 as equivalent to gamma 2.4 in what it aims to get out of the display and how quickly it "comes out of black".

Gamma 2.2, on the other hand, is used for "bright room" viewing: a lower gamma number means the curve "comes out of black" more rapidly, so you can see very dark shadow detail in bright room conditions. Some people prefer the look of 2.2 even if they can see everything with it set to 2.4.
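The claim that BT.1886 collapses to a pure 2.4 power law on a zero-black display can be checked with a small sketch of the BT.1886 EOTF (coefficients per ITU-R BT.1886; luminances normalized to 0..1, variable names are my own):

```python
# BT.1886 EOTF sketch: L = a * max(v + b, 0) ** 2.4, with a and b derived
# from the display's white and black luminance (normalized here to 0..1).

def bt1886(v: float, l_white: float = 1.0, l_black: float = 0.0) -> float:
    root_w = l_white ** (1 / 2.4)
    root_b = l_black ** (1 / 2.4)
    a = (root_w - root_b) ** 2.4
    b = root_b / (root_w - root_b)
    return a * max(v + b, 0.0) ** 2.4

def power_gamma(v: float, gamma: float) -> float:
    return v ** gamma

# On an OLED (l_black = 0): a = 1 and b = 0, so bt1886(v) is exactly v ** 2.4.
# Gamma 2.2 "comes out of black" faster, lifting shadow detail:
# power_gamma(0.1, 2.2) > power_gamma(0.1, 2.4).
```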
 
OP
P40L0

Member
Jun 12, 2018
820
Italy
Hey P40L0, on the 2019 sets the gamma isn't set up how you have it. The options are 1.9, 2.2, 2.4 and BT.1886. Which should I be using?

Also, Warm 2 still looks way too cold. Warm 3 is what looks "right" to me, but I may be doing something wrong. Anyone have any suggestions?
TL;DR: use Gamma 2.2 and Warm 2; your eyes will adjust pretty quickly in any room condition ;)
 

ForeverYung87

Member
Nov 8, 2017
278
Gotta say, Option 2 is great. I’m really enjoying playing Days Gone with these settings. Lag seems on par with Game Mode to me.

I will say DV for X1X looked a bit dim, but I raised OLED Light to counter that.
 
OP
P40L0

Member
Jun 12, 2018
820
Italy
Gotta say, Option 2 is great. I’m really enjoying playing Days Gone with these settings. Lag seems on par with Game Mode to me.

I will say DV for X1X looked a bit dim, but I raised OLED Light to counter that.
Yep, Option 2 also became my favorite setup for gaming.

For DV being too dim: are you using the DV Cinema settings or DV Game?
I found the DV Game profile pretty bright.
Anyway, no issue raising OLED Light a bit for DV, but I would not go over a value of 60 (compared to the default 50).