Just a quick question about Wide Color Gamut. I see mentions of Gears 4 supposedly supporting this, and I own an LG C7. Should I manually change my TV to Wide Gamut instead of using Auto when playing games that support it, or will Auto work as intended?
I know that if I force my TV into WCG the colors look a bit too overblown, so Auto is the way to go on the OLEDs from what I've heard.
Just a quick question.
Is the regular PS4 capable of true HDR? I was under the impression only the Pro could output HDR, but it seems I was wrong?
Should I bother to rearrange my connections so that my PS4 doesn't pass through the old-ish Marantz receiver?
Will it make any real difference on my not-very-bright (800 nits) TV anyway?
Seems to be the thread to ask this random question.
Anybody here play Ratchet & Clank in HDR and notice on the title screen that the sun coming over the horizon was blown out with lots of artefacting?
No games on Xbox (beyond the insects demo possibly) are actually WCG.
Both consoles always output WCG if HDR is enabled, I'm sure auto will be fine either way.
The "deeper black" thing regarding HDR is nonsense. RGB 0 (or RGB 16 in limited range) is black, and HDR doesn't enable some magical Spinal Tap "none more black" mode.
When someone in a store says HDR means deeper blacks, you know they don't know what HDR is about. HDR is about colour volume, WCG and highlights.
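For anyone curious, the PQ (SMPTE ST 2084) curve that HDR10 uses backs this up: code value 0 is still zero light, exactly like SDR black, and all of the extra range sits at the top end. Here's a rough Python sketch of the PQ EOTF (the constants come from the ST 2084 spec; the script itself is just my own illustration):

```python
# Rough sketch of the PQ (SMPTE ST 2084) EOTF: normalised code value -> nits.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code: float) -> float:
    """Convert a normalised PQ code value (0.0-1.0) to luminance in nits."""
    e = code ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

for v in (0.0, 0.5, 0.75, 1.0):
    print(f"code {v:.2f} -> {pq_eotf(v):8.1f} nits")
# code 0.00 ->      0.0 nits   (black is still just black, same as SDR)
# code 0.50 ->     ~92 nits
# code 0.75 ->    ~980 nits
# code 1.00 ->  10000.0 nits
```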
I have a question for you guys and girls - and the resident expert EvilBoris
Like many people here, I love HDR in games and, when done right, the difference between HDR and SDR is very clear. However, I notice the difference between the two almost exclusively in bright highlights and in a wider range of colors. Makes sense, of course. But every description of HDR mentions "deeper blacks", and I've never seen any deeper blacks or more shadow detail in HDR games compared to SDR titles. At first, I assumed my TV was to blame, but after upgrading to an OLED, I still don't see the difference in shadow detail. What am I missing?
Also, just saw this.... what do you mean they are not actually WCG? Just Xbox, or PS4 too? Because I can definitely see those reds and greens go nuts in HDR compared to SDR in God of War or Shadow of the Tomb Raider.
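On the "not actually WCG" point, the distinction is between the container and the content: an HDR10 signal is always flagged as a BT.2020 container, but the colours carried inside it can still sit entirely within the smaller Rec.709 gamut. As a rough illustration (my own sketch, using the standard BT.2087 709-to-2020 matrix), a pure Rec.709 red lands well inside the BT.2020 gamut rather than at its edge:

```python
# Rec.709 primaries expressed inside a BT.2020 container.
# Conversion matrix per ITU-R BT.2087 (linear light, rounded to 4 decimals).
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_bt2020(rgb):
    """Convert a linear Rec.709 RGB triple to BT.2020 RGB."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in M_709_TO_2020)

print(rec709_to_bt2020((1.0, 0.0, 0.0)))  # ~(0.6274, 0.0691, 0.0164)
# Pure Rec.709 red is nowhere near (1, 0, 0) in BT.2020 terms, so a game can
# output an HDR10 / BT.2020 signal without ever using truly wide-gamut colours.
```

So punchy reds and greens in HDR don't by themselves prove the assets are wide gamut; they may just be rendered with less clipping than the SDR grade.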
I was working on a big in-depth look at MCC, but my Mac totally fried itself last night, so I've had somewhat of a hitch. I've ordered a new Surface Book 2 today, which is quite a bit more capable, so I may even look at making a video instead.
Oh man, just read that now. Sorry to hear it, as I was really looking forward to your analysis of MCC.
I've seen mentions across the internet that on PC you shouldn't actually switch to YUV 4:2:2 10-bit for HDR, but instead leave it on RGB 8-bit, which will then apply dithering. The claim seems to be that the loss of color information going from RGB to YUV 4:2:2 is severe and that 8-bit with dithering is basically indistinguishable from 10-bit. What are your thoughts on this, EvilBoris?
I don't know enough about it to be honest; you are essentially disposing of information with either method, just in different ways.
There are a selection of PC games that do actually output 10bit and have the assets to match, so you definitely wouldn't want to use 8bit for that.
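For what it's worth, here's a toy sketch (my own illustration, not how any particular GPU driver actually implements it) of why dithering an 8-bit output can approximate a 10-bit level on average, trading banding for a little noise:

```python
import random

def quantise_to_8bit(value_10bit: float, dither: bool) -> int:
    """Map a 10-bit level (0-1023) to 8 bits, optionally adding dither noise."""
    noise = random.uniform(-0.5, 0.5) if dither else 0.0
    return max(0, min(255, round(value_10bit / 4 + noise)))

source = 511  # a 10-bit level that falls between two 8-bit codes
plain    = [quantise_to_8bit(source, dither=False) for _ in range(10000)]
dithered = [quantise_to_8bit(source, dither=True)  for _ in range(10000)]

print(sum(plain) / len(plain) * 4)        # ~512: every pixel snaps to the same 8-bit step
print(sum(dithered) / len(dithered) * 4)  # ~511: the average recovers the 10-bit level
```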
I just don't know which one to go with, I want to have the best experience. Ultimately, which "sacrifice" would you recommend?
LG B7 via HDMI.
You need 10-bit to even get HDR to activate most of the time. Not to mention 10-bit gives you four times as many levels per channel as 8-bit (1024 vs 256), not double, and 4:2:2 is only half the chroma but still full luma, so you are still getting about two thirds of the data. The cherry on top of all of this is that the B7 (I have this same TV) has really shitty dithering. Stick to 4:2:2 10-bit.
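Rough numbers behind that, as a back-of-the-envelope sketch (nothing console-specific, just the arithmetic):

```python
# Levels per channel: 8-bit vs 10-bit.
print(2 ** 8, 2 ** 10)  # 256 vs 1024 -> four times as many levels per channel

def bits_per_pixel(bit_depth: int, chroma: str) -> float:
    """Average bits per pixel for a given bit depth and chroma subsampling."""
    samples_per_pixel = {
        "4:4:4": 3.0,  # full-resolution luma plus two full-resolution chroma planes
        "4:2:2": 2.0,  # full luma, chroma halved horizontally
        "4:2:0": 1.5,  # full luma, chroma halved in both directions
    }[chroma]
    return bit_depth * samples_per_pixel

print(bits_per_pixel(8,  "4:4:4"))  # 24.0 -> RGB / 4:4:4 8-bit
print(bits_per_pixel(10, "4:2:2"))  # 20.0 -> 4:2:2 10-bit, i.e. two thirds of 4:4:4 10-bit (30)
print(bits_per_pixel(10, "4:2:0"))  # 15.0 -> 4:2:0 10-bit
```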
Any tips for Far Cry 5 HDR? I was hoping for an AC: Origins-like system, but there is only HDR Brightness, which elevates overall brightness and black level.
EvilBoris, I've been seeing some recommend 420 over 422 for HDR, stating that 420 has less banding and that it's what most games are mastered in anyway. Your thoughts?
EvilBoris, given that many people have differing opinions about RDR2's HDR, will you do an analysis of it?
Yes - it should be appearing on Digital Foundry in about 7 hours.
Wow, nice news man. Looking forward to it.
Glad to hear that it all came together. Are you an official member or a contributor? I hope official, as I want a lot more HDR coverage on DF, especially when developers listen to DF and I want HDR in gaming to get better.
On X1X, when enabling 4:2:2 (or forcing any Color Depth option other than 8-bit), what happens is that all SDR content gets internally converted from RGB to 4:2:0 (visibly losing clarity), while all HDR content is converted from RGB to 4:2:0 and then upconverted again from 4:2:0 to 4:2:2 (so the results are very similar to native 4:2:0).
420 at the same bit depth as 422 has less colour detail, so I don't know why you'd choose 420.
You are literally throwing away information you could otherwise use. I mean, it's probably absolutely minute and imperceptible, but if it's there, you might as well use it.
That's the theory; goodness knows how a display might treat those signals differently and produce different results.
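To put the 4:2:0 vs 4:2:2 difference in concrete terms, here's a small sketch (my own illustration) of how much chroma resolution each mode keeps at 4K; luma is full resolution in all of them:

```python
WIDTH, HEIGHT = 3840, 2160  # UHD frame

def chroma_resolution(mode: str) -> tuple[int, int]:
    """Return the (width, height) of each chroma plane for a subsampling mode."""
    if mode == "4:4:4":
        return WIDTH, HEIGHT            # chroma at full resolution
    if mode == "4:2:2":
        return WIDTH // 2, HEIGHT       # chroma halved horizontally only
    if mode == "4:2:0":
        return WIDTH // 2, HEIGHT // 2  # chroma halved in both directions
    raise ValueError(mode)

for mode in ("4:4:4", "4:2:2", "4:2:0"):
    w, h = chroma_resolution(mode)
    print(f"{mode}: {w} x {h} chroma samples per plane")
# 4:4:4: 3840 x 2160
# 4:2:2: 1920 x 2160  (twice the chroma samples of 4:2:0)
# 4:2:0: 1920 x 1080
```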
Woah, that's a surprise.
Just read your piece on DF and I have the following question: should I just turn off HDR when playing RDR2? Or does it come down to preference? What would you recommend?
Yup. Next expansion they'll fix it and do what people asked for. Destiny 2 Year 3 will have proper HDR next September. Believe.
stumbled across this video: https://www.youtube.com/watch?v=hJgXQ3qNg74
think you already saw it!?
the bullshit going on with fake 4k and hdr is pretty annoying..
Also, I re-played Spider-Man and the HDR presentation is weak in that game.
Man, can't wait to see how RDR2 looks in HDR.
What makes you think it is weak?
And man RDR in HDR lol oh how we waited to see something awesome.
To me it seems like a very subdued implementation. For example, the lampposts at night don't look very bright compared to Infamous SS. Other than Electro's powers and the sun at dusk, I don't really remember any moments where HDR added much to the image. Not bad, but certainly not among the best.
And don't even get me started on RDR2. The more I play the more it saddens me we didn't get a good HDR implementation.
What's up with Black Ops 4 HDR? As far as I can tell it's unusable on both of my TVs. Washed out.
Is it just a slider or something else that's missing?