It won't be, but remember this game was built on a completely custom engine made for a single piece of hardware (with Pro features too, sure), so there will surely be some performance loss in the transition to another platform.
Better shadows and AO, NVIDIA Reflex support.
And what CPU are they using here? That can determine a lot with regard to FPS.
It says on the picture itself that the image does not show FSR in use, so the picture is actually completely irrelevant.
The image is a compressed native shot; it's the performance that's the issue.
> It says on the picture that the image does not show FSR in use. So the picture is actually completely irrelevant.

This can't be happening...
I guess the one concern to take away from this when we don't have any other data or metrics to go on is that this card is the recommended spec for 4k/60fps. If it isn't matching that at a lower resolution (due to FSR enabled), this is a problem. Though even then we don't know the CPU used here.
> As is always the case with stuff like this, be considerate of engine optimisation, of settings running at "ultra" or whatever, and of what that actually means in the rendering engine. Superficial numbers and values without context don't give us an accurate read of actual performance.
>
> So, for example: while I think many of us are under the impression that God of War's PC enhancements aren't particularly incredible (at least on paper), and that a generation-old game should be scalable up to 4K pretty easily on cutting-edge hardware, we don't know how well the engine handles said PC enhancements or how they're scaled. E.g., adding higher-quality ambient occlusion that scales up to an "ultra" setting well past the point of reasonable performance-vs-image-quality returns could very easily mangle the framerate. Hardware and engines are ideally built for optimised image quality and the impression of detail while intelligently scaling down, but sometimes "ultra" settings in games are well past the point of reasonable performance cost, with negligible image enhancements.
>
> We'll have to wait and see.

Basically this. Hopefully it's just silly new settings.
> Better shadows and AO, NVIDIA Reflex support.

Better SSR reflections as well.
But I mean, their engine is custom-built for the PS4, so it's not surprising if it isn't super optimized elsewhere.
> Not sure they should be advertising that. We know it has DLSS, which is likely going to create way better numbers.

From what I've seen, FSR Ultra Quality and DLSS Quality produce comparable framerates, though DLSS Quality is doing it from a lower internal resolution.
> From my experience FSR is a better performer than DLSS, both at the Ultra preset of course.

Well, those are just presets. I've seen games where "Performance" DLSS looks better than "Ultra" FSR, so you can get away with a much lower preset for DLSS.
> Not sure they should be advertising that. We know it has DLSS, which is likely going to create way better numbers.

After watching an extensive amount of footage comparing both in the same games, the performance seems somewhat similar, but DLSS gives better IQ.
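Since the preset names being compared above hide very different internal resolutions, here is a rough sketch of what each preset actually renders at, using the per-axis scale factors AMD and NVIDIA publish for FSR 1.0 and DLSS 2.x (an illustrative sketch, not data from this thread):

```python
# Per-axis upscaling factors published for FSR 1.0 and DLSS 2.x presets.
FSR_SCALES = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}
DLSS_SCALES = {"Quality": 1.5, "Balanced": 1.72, "Performance": 2.0, "Ultra Performance": 3.0}

def render_resolution(out_w, out_h, scale):
    """Internal render target for a given output size and per-axis scale."""
    return round(out_w / scale), round(out_h / scale)

# At 4K output, FSR "Ultra Quality" renders ~2954x1662 while DLSS "Quality"
# renders 2560x1440 -- DLSS's top preset starts from noticeably fewer pixels.
for name, scale in FSR_SCALES.items():
    print("FSR", name, render_resolution(3840, 2160, scale))
for name, scale in DLSS_SCALES.items():
    print("DLSS", name, render_resolution(3840, 2160, scale))
```

This is why "comparable framerates between FSR Ultra Quality and DLSS Quality" still favours DLSS: it hits that framerate from roughly a quarter fewer input pixels at 4K.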
no, checkerboarded
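For anyone unfamiliar with the term (the comment presumably refers to the console version's 4K mode): checkerboard rendering shades only half of the pixel grid each frame in an alternating pattern and reconstructs the missing pixels from the previous frame. A minimal Python sketch of the per-frame shading mask, with names of my own invention:

```python
def checkerboard_mask(width, height, frame_parity):
    """True where this frame actually shades a pixel; the pattern flips
    each frame so two consecutive frames together cover the full grid."""
    return [[(x + y + frame_parity) % 2 == 0 for x in range(width)]
            for y in range(height)]

even = checkerboard_mask(8, 4, 0)
odd = checkerboard_mask(8, 4, 1)
shaded_per_frame = sum(row.count(True) for row in even)
print(shaded_per_frame, "of", 8 * 4, "pixels shaded per frame")  # 16 of 32
```

Half the shading work per frame is the whole appeal; the cost is the temporal reconstruction artifacts people associate with checkerboarded "4K".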
I don't understand people's obsession with always using Ultra settings. A number of settings offer almost 0 improvement and tank frame rates. If I decide to replay the game on PC, I'll use DLSS Quality mode plus turn down shadows and volumetric lighting to very high or whatever instead of ultra.
> TFW you seem to be the only one bothered by all the DLSS artifacting.

But it resolved a couple of details TAA didn't, or something.