It should
Great that they got the fix out only a few weeks after the problem popped up. I think some were worried that this would be pushed to the back burner because of other projects behind-the-scenes.
Me too. I already asked Dark1x on Twitter about looking at it. As I said in the older thread, I'd rather wait a while before downloading this patch, so that maybe Digital Foundry or someone else can test it first. I didn't download the prior patch that messed so much stuff up, so I'm in no hurry.
With "increases foliage draw distance" they actually mean re-increases foliage draw distance, right? It didn't receive any improvements over its pre-messed-up settings?
What would have been really great is if they had just never released a patch with this problem in the first place.
So I wonder what compromise they made this time. Probably gonna find out shadows don't load 10 feet outside the player.
Wasn't that more of a glitch? There were drawbacks but that video doesn't really show the normal experience.
DF made a detailed video about the downgrades
I'm still a bit shocked this didn't happen. CDPR is such a PC-focused company that it's strange to me the PC version gets left out of this feature. Who knows... now that the console versions are fully working (hopefully!), maybe PC will get some HDR love as well. Glad to see this issue was fixed for the PS4 Pro.
So fast? Really? And I wouldn't be surprised if it runs even worse after the HDR stuff. I don't remember the FPS struggling so much during the rain before. I remember it wasn't stable, but now I immediately notice even bigger FPS drops.
It isn't. EvilBoris analyzed it and sadly it is basically SDR.
The implementation was said to be "not great". Still better than no HDR. Here was EvilBoris's opinion:
So with the Witcher 3 it looks like either the lighting engine itself doesn't support some of the back-end stuff to let HDR work its magic (or perhaps it's simply not graded right?)
What I am often seeing is a very flat essentially SDR image, here this shot of the moon is all within SDR ranges. The glow from the campfire is also essentially SDR at this point.
But it's dark
We see almost the same thing here, except a tiny bit of 1000-nit data on the moon, but nothing else we could consider being within the HDR range; notably, the specular highlights in the water aren't reflecting anything even vaguely bright. You can also see a 1000-nit torch in the background.
Now if we start to look at some other conditions, here we can see that this fire is actually brighter than the sun, this is really causing a loss of impact and again we essentially have an SDR image.
And in daylight we are seeing weird things: facing the sun here, Geralt's face is hitting 1000 nits, as are a few rocks and items in the background. He actually looks pretty ghostly. But again, an SDR image with some misplaced bright spots.
And again, looking straight at the sun, a whole load of detail is missing where I would expect to see it, however even as an HDR screenshot it looks very similar....
One thing I noticed as I was putting together this post was that my tonemapped image (which is a really quick and dirty uncomplicated representation of the HDR image in SDR) looked much closer to the actual in game images than any other game I have looked at before.
Now what we are seeing is a very sharp curve from when something is SDR to when it is suddenly HDR, as if once a specific 8bit colour threshold is reached it's pumping it straight to 1000nits.
This would explain the overall low dynamic range image we are seeing, as well as these sharp brightness ramps which bypass almost everything between 150-1000 nits.
So to confirm this I tried to get the same shot within the game when running in SDR and HDR, I then took the SDR shot and did a quick and dirty image adjustment to move the SDR image into an HDR-like code value range.
I then ran my filter across the top of them...
Nearly identical.
So I believe the game is taking that 8bit SDR image and automatically kicking out 1000nits as soon as anything approaches 255 white in SDR.
So what I think we are looking at with the Witcher 3 is essentially a quick and dirty pseudo-HDR effect, almost in the form of a post processing filter.
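The behaviour EvilBoris infers can be sketched as a tiny transfer function. This is purely hypothetical: the function name, the 150-nit SDR reference white, and the exact 8-bit threshold are my assumptions for illustration, not values measured from the game.

```python
SDR_PEAK_NITS = 150   # assumed SDR reference-white level
HDR_PEAK_NITS = 1000
THRESHOLD = 250       # assumed 8-bit cutoff near 255 white

def pseudo_hdr_nits(code_value_8bit: int) -> float:
    """Map an 8-bit SDR code value to output nits, mimicking the
    inferred behaviour: SDR-range output below the threshold, then a
    hard jump to 1000 nits, bypassing everything in between."""
    if code_value_8bit >= THRESHOLD:
        return HDR_PEAK_NITS
    # Below the threshold the image stays in SDR range (linear here
    # for simplicity; a real transfer curve would be gamma-encoded).
    return SDR_PEAK_NITS * (code_value_8bit / 255)

print(pseudo_hdr_nits(128))  # midtones stay well inside SDR range, ~75 nits
print(pseudo_hdr_nits(254))  # near-white jumps straight to 1000 nits
```

Note the discontinuity: nothing between roughly 150 and 1000 nits is ever output, which would produce exactly the sharp brightness ramps described above.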
Is this only for 4K displays on the Pro, or is there super sampling for 1080P displays ?
I believe it's time I finish the starting area and actually play the game for once.
For the record, the game was done years ago. I hate when fans call games incomplete or broken just because of some technical hiccups.
Can confirm, game looks amazing and performance is better than the first Pro patch.
I don't think that's what they were getting at. CDPR really struggled to get The Witcher 3 working well on the PS4. They did get there, but each attempt to make the PS4 version better often broke as much as it fixed. It took a long time and many, many patches. It ended up in a great place, though... until the HDR patch.