All games with PS4 Pro enhancements

Lelo Silva

Member
Nov 18, 2018
6
The last time I got screenshots, they appeared identical to pre-patch screens. This is why the game isn't listed.

4K screens are the only way to be certain about res, but sometimes 1080p shots can suggest an answer. If you're willing, you could post PNGs of the same scene, one with system supersampling on and one with it off. From very early in the game is best, because there'll be more chance of shots from standard PS4 already online to compare. Thanks!
Dunno if this is of any help... but here it goes... (captured through the "Share" function [PNG // 2160], even though the console is connected to a 1080p display)

EDIT1: Replaced the imgur links below with the respective abload PNG links:

SS OFF // Improved Shadows OFF

SS OFF // Improved Shadows ON

SS ON // Improved Shadows OFF

SS ON // Improved Shadows ON
 

Lelo Silva

Member
Nov 18, 2018
6
Noticed just now that imgur converts the PNGs to JPG files... where can I upload the PNG ones? I don't use these image services often...

EDIT1: Used abload as recommended previously in the discussion; the links were inserted in the post above...
 

Fastidioso

Banned
Nov 3, 2017
2,827
I'm not a pixel counter, but I have Darksiders II and the resolution doesn't seem to go beyond 1080p on Pro. The jaggies are too wide for it to be a higher res. Not sure what the Pro patch enhancements really do.
 

Fastidioso

Banned
Nov 3, 2017
2,827
Sorry, I haven't been keeping up with news. Do you mean "low latency mode was delayed, but now it's out and here it is", or do you mean "it was delayed so here's shots from standard mode"? I count these shots as 2560x1440, which is within what could be expected given previous counts of dynamic 1800c. (I don't see evidence of CBR artifacts, but depending how still you were when taking shots that might not be obvious.)

If there are two modes available now, might it be possible to get some shots as similar as possible in the other mode? Differences between settings are often hard to work out without direct comparisons. Either way, thanks for what you've already posted!
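For anyone curious how counts like "2560x1440" are derived: the usual approach is to measure the stair-steps on a near-vertical edge and divide the output resolution by the average step length. A minimal sketch of that arithmetic (the numbers here are illustrative, not taken from these shots):

```python
def native_resolution(output_pixels: int, avg_step_length: float) -> int:
    """Estimate native resolution along one axis from the average
    stair-step length of an aliased edge in the scaled output."""
    return round(output_pixels / avg_step_length)

# If jaggies in a 2160-line shot repeat every 1.5 output pixels,
# the native framebuffer was about 2160 / 1.5 = 1440 lines tall.
print(native_resolution(2160, 1.5))  # -> 1440
```

In practice counters check many edges and both axes, since dynamic resolution and reconstruction can make any single measurement misleading.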


Thanks! Guess I should have trusted my initial counts which showed Pro and One X both at 1440p. But there definitely seemed to be softer edges than that in the shots from this thread. Perhaps this was due to motion blur? Or maybe I just did it badly.


I'm pretty sure this doesn't actually work. Digital Foundry said in a thread on Era that resolution changes don't actually take effect until you quit and restart the game. So in the above case, you're just getting 1080p the whole time.

In addition, 4K mode and 1080p with forced system supersampling should not run differently at all. They are the exact same settings; the only difference is whether the game or the OS selects them.


That's this video, and it's pretty confusing. You have to keep in mind it was created before system-driven supersampling was available. (Also, Richard Leadbetter doesn't always explain technical concepts properly.) So all it boils down to is that games which don't automatically supersample can be forced to run better by lowering their resolution (and keeping system-wide supersampling off). The bit about getting your machine to upscale is pretty meaningless--display scaling is usually about the same. But even if you feel it's different it is most definitely NOT the same as turning supersampling on, whether by OS or in game.

The part that's right is that you can generally get framerate benefits by forcing the Pro down to 1080p, but only in games which have separate modes by resolution setting. These titles are indicated in this thread's lists by markers, explained in the OP. Please let me know if you have any further questions about this.


Firstly, checkerboard rendering (CBR) isn't a type of upscaling. It's not inherently a blending operation, and it can produce both sharp and soft results, depending on exactly how it's used. For Red Dead Redemption II, the CBR isn't really what's making the game "less clean". That's the very aggressive temporal AA pass the game has, no matter what resolution it's at. The apparent difference between the game's two modes is because Rockstar added a post-process sharpening filter when it's run at 1080p (and that filter is set to max by default). If you turn the filter down, the 1080p mode becomes just as soft as the 4K mode--but with less detail onscreen, due to the lower resolution.

You may still prefer the way it looks. But if so, you might want to try running in 4K, but upping the sharpness setting on your TV. That may give you what you want, without giving up the resolution benefits.
To be fair, VG Tech on YouTube (who is a reliable source for this stuff most of the time) has reported that CBR doesn't work on Pro most of the time, and that RDR2 runs at 1920x2160+TAA almost constantly; he only rarely counted an "effective" 2160c, and then only in isolated spots. In his opinion, there is definitely something broken about Rockstar's solution.
 

Lelo Silva

Member
Nov 18, 2018
6
I'm not a pixel counter, but I have Darksiders II and the resolution doesn't seem to go beyond 1080p on Pro. The jaggies are too wide for it to be a higher res. Not sure what the Pro patch enhancements really do.
I know the game has at least an option to improve shadows on Pro... but I can't say anything about resolution improvements...
 

Lelo Silva

Member
Nov 18, 2018
6
Yes, the system-wide supersampling implemented by Sony is terrible. All the first games with native downsampling looked much better than these recent ones.
And maybe it's only a placebo effect, but I'm noticing very slight IQ losses in other games too (they're blurrier), like UC4 or AC: Origins: the games with no options and only native SS by default. I think the system-wide supersampling option overrides the native method, and that's not a good thing.

This is something that should be investigated by DF. At the firmware release I remember they compared The Last of Us and claimed the game was identical. But that was only one game, and one with options for the user. Maybe in the games with a SS/framerate user choice (God of War, Horizon, The Last of Us, etc.), the native method implemented by the devs has priority over the system-wide one.

I think the games with no options and only native SS should be the object of further investigation after the firmware release with system-wide downsampling.
The system-wide supersampling mode is not exactly "terrible"... it's just that the way the system works is kind of convoluted, because when developing you basically have to define how the software will work according to the resolution available on the display...

When developing on PS4 Pro, your software must "read" the output used by the TV/monitor and then pick a specific profile for the game to work with... so for instance, when you boot The Evil Within 2:

TEW2 with system Super Sampling OFF (default)
-if connected to a 1080p display or lower:
profile 1 // game displays native 1080p (no downscale from higher resolutions) and uses the extra GPU performance to improve frame rate (kind of a "performance mode")
-if connected to a 4K display:
profile 2 // game displays "4K" (basic upscale from a 1260p buffer), targeting 30fps

What the "system-wide supersampling mode" does is basically force-feed the game the information that the console is hooked up to a 4K display (when it's not)... the final image is then downscaled to the lower output resolution at the end. So if used in the example above, the game will assume "profile 2"... and then downscale the frame to 1080p at the end, so:

TEW2 with Super Sampling ON
-if connected to a 1080p display or lower:
profile 2 + downscale // game processes a 4K frame (basic upscale from a 1260p buffer), targeting 30fps, and then downscales to 1080p output at the end
-if connected to a 4K display:
profile 2 // game displays "4K" (basic upscale from a 1260p buffer), targeting 30fps (NO CHANGE)

In that situation you are basically losing the "performance mode" when using SS...

Let's now use Uncharted 4, for instance:

UC4 (single player) with Super Sampling OFF (default)
-if connected to a 1080p display or lower:
profile 1 // game displays a 1080p frame, downscaled by the software/game from a 1440p buffer (basically SSAA from a 1440p buffer)
-if connected to a 4K display:
profile 2 // game displays "4K", basic-upscaled by the software/game from a 1440p buffer

So when using system-forced supersampling:

UC4 (single player) with Super Sampling ON
-if connected to a 1080p display or lower:
profile 2 + downscale // game processes a 4K frame (upscaled from a 1440p buffer) and then downscales to 1080p output at the end
-if connected to a 4K display:
profile 2 // game displays "4K", upscaled from a 1440p buffer (NO CHANGE)

So instead of downscaling directly from the 1440p buffer, "forced" supersampling downscales a 4K image that was previously upscaled from a 1440p buffer... (that's why TLOU showed the same results in the DF analysis: its base profile runs the game at 2160p/30fps in the buffer unless changed in settings)
That's why the supersampling mode is turned off by default... because it's not the way the devs originally intended the software to work... and because this process of downscaling a 4K image (that was, in most cases, originally created from a lower-res buffer) can create "noise" and lower the IQ instead of improving it...
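The profile logic described above can be condensed into a toy model (the function and profile names here are purely illustrative, not real SDK calls):

```python
def effective_pipeline(display_is_4k: bool, system_ss_on: bool) -> str:
    """Toy model of display-based profile selection under system SS."""
    # System supersampling reports a 4K display to the game,
    # even when the console is hooked up to a 1080p set.
    game_sees_4k = display_is_4k or system_ss_on
    profile = "4K profile" if game_sees_4k else "1080p profile"
    # The OS only adds its own downscale when the game rendered its
    # 4K-profile output but the real display is 1080p.
    if game_sees_4k and not display_is_4k:
        return profile + " + system downscale to 1080p"
    return profile

print(effective_pipeline(display_is_4k=False, system_ss_on=True))
# -> 4K profile + system downscale to 1080p
```

which captures why a "performance mode" profile is lost once system SS is enabled on a 1080p display: the game can no longer tell it's on a 1080p set.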

Compared to the XB1X, the difference is that developers there can't know what kind of display the console is connected to... so the system always automatically downscales (or upscales) the buffer to the connected display... devs are therefore basically forced to develop at the highest resolution possible in the buffer while working in "enhanced mode" (the XB1X mode), because in the end the system will always downscale by itself... or to implement separate "performance modes" (forcing lower-resolution frames in the buffer) directly in the software (more or less like the latest Tomb Raider games did)

It's a more streamlined way to work... because it gives developers fewer options to deal with (hardware-wise)... the problem, though, is that devs can't customize the experience according to the connected display like the PS4 Pro allows (for instance, TEW2 does not have a "performance profile" on X like it does on Pro)...

But when we look at what developers in general are doing with the PS4 Pro... (like Rockstar, which in RDR2 left Pro users on 1080p displays with the same base slim/fat PS4 quality profile, considerably lowering IQ on 1080p displays, and couldn't even implement checkerboarding properly on 4K displays) I believe that, even though having more options is usually good, those options are not well implemented by developers in general... unfortunately... (that's why we see lots of games running at 1440p/30fps instead of using the doubled GPU performance and the ID buffer to implement CBR, GR, or TR to run at 4K/30 or close to it, as the hardware's use was originally intended...)
 
OP

Liabe Brave

Member
Oct 27, 2017
886
Dunno if this is of any help... but here it goes... (captured through the "Share" function [PNG // 2160], even though the console is connected to a 1080p display)
Thanks for the screens! From them I can say that "Improved Shadows" is indeed making them better...but they're still not very good. As for resolution, as previously stated it can't be directly counted because the image is scaled from 1080p in all cases (due to display attached). However, the resolving of edges and even surface detail does differ between system SS off and on. This suggests one of two things is true:

1. Darksiders II doesn't automatically downsample, and may be running slightly higher res when in 4K mode.
2. Darksiders II runs at 1080p always, and the difference is due to a messy situation of multiple scaling operations.

Unfortunately, I can't decide between these options without screens from a 4K display. I'll try and track some down so I can add the game to the list. Thanks once again for your help!
 

Velikost

Member
Oct 28, 2017
634
Liabe Brave Thank you for maintaining this list, it's an invaluable resource. Just one note, it looks like you listed Shenmue I+II as 60fps, but I think it runs at 30 on everything.
 
OP

Liabe Brave

Member
Oct 27, 2017
886
Here you go Liabe ;-) Shot them myself.

Darksiders II Deathinitive Edition PS4 Pro - 4K - PNG
Thanks so much! The game is definitely 1080p here. Since standard PS4 is also a pretty solid 30fps, so far the only advantage Pro gives is the improved shadows. I'll look for more, but I'll add the game either way at the next update.

Liabe Brave Thank you for maintaining this list, it's an invaluable resource. Just one note, it looks like you listed Shenmue I+II as 60fps, but I think it runs at 30 on everything.
It sure does, this is a cut-and-paste error from Shaq Fu. I'll fix it when next I update the list. I really appreciate the correction!
 

Ametroid

Member
Oct 27, 2017
5,708
Black Ops 4 is 1080p in splitscreen on base PS4? Really?

Whenever I play splitscreen on my Slim, the PS4 looks so low-res and text etc. is unreadable, and I hate that the screen ratio is like 4:8... Infinite Warfare's splitscreen is way better and readable, and uses the whole screen
 

Lakeside

Member
Oct 25, 2017
4,736
I realize it's just a Lego game, but is Lego DC Villains really Xbox One X enhanced, but not PS4 Pro enhanced?
 
OP

Liabe Brave

Member
Oct 27, 2017
886
Black Ops 4 is 1080p in splitscreen on base PS4? Really?

Whenever I play splitscreen on my Slim, the PS4 looks so low-res and text etc. is unreadable etc...
The entry is correct, but this is only confirmed for the Blackout battle royale. It's possible other splitscreen modes (if there are any) may be different; I'll update the entry to avoid confusion. If you're talking about Blackout mode, remember that the whole screen is 1920x1080. That means your individual view has only half the vertical resolution of fullscreen play. A lesser point: there's also a slight horizontal scaling that will improve shimmer, but might make text coarse (if it's not on an unscaled UI layer).

I realize it's just a Lego game, but is Lego DC Villains really Xbox One X enhanced, but not PS4 Pro enhanced?
It is enhanced on Pro, via CBR. Please keep in mind that I only analyze games after launch, to make sure I'm not looking at earlier builds. But it can take a couple weeks to find and go through source material. LEGO DC Super-villains is one of the games I'll be adding to the list when I update next.
 

Fastidioso

Banned
Nov 3, 2017
2,827
Very curious what's up with RDR2, whether it's just a less aggressive TAA or something in the resolution, because it's definitely sharper in the details.
 

Tyaren

Member
Oct 25, 2017
6,494
We need a new pixel count for Red Dead Redemption 2, after the 1.03 patch. There are divergent opinions on the IQ.

Some pictures in this thread:
https://www.resetera.com/threads/red-dead-redemption-2-df-analysis-read-op.76899/page-107
Here are direct comparisons that I made from the images provided in that thread:

The IQ did improve on at least these images. It is sharper and more detailed. You can't really compare anything in the foreground, because the camera angles are too different, but you can compare the backgrounds:

old:


new:


The trees have more clarity; their trunks are visible as fine lines instead of thick blurry ones.

old:


new:


The mountain outline is sharper, and the snow fields on the mountain are clearer and less blurry as well.

If the changes aren't readily apparent to you, overlay the two images and switch back and forth; then a clear improvement is visible.
 

chandoog

Member
Oct 27, 2017
11,743
Anyone got any Darksiders III PS4 Pro screens? Curious about that game; according to reviews, the PS4 (Pro) version has pretty bad performance drops.
 

SixelAlexiS

Member
Oct 27, 2017
1,966
Italy
As a new PS4 Pro user with a 1080p screen, seeing the lack of choices for in-game resolution (and frame cap/vsync) in some games is absurd.

Take Mafia III: it actually runs worse than on base PS4 because of the higher resolution, and there is nothing you can do about it... that's some bulls*it.

I haven't tried Spider-Man yet, but I see the same lack of choice in the game list. I hope the game manages to eliminate the slowdowns I had on base PS4 instead of running even worse because of the high resolution, which for me is useless.

Every game should give the option to choose the resolution and frame cap like Infamous Second Son, or at least let me stick to 1080p with a locked 30fps if I want to.
 

Fafalada

Member
Oct 27, 2017
1,367
The system-wide supersampling mode is not exactly "terrible"...
To put it more simply: it works the same way driver supersampling on PCs does.

so instead of downscaling directly from the 1440p buffer, "forced" supersampling downscales a 4K image that was previously upscaled from a 1440p buffer...
I've explained this in another thread, but that's not what 'normally' happens. System downscaling/upscaling works on system-supported resolutions (and there are many of those between 1080p and 4K); it does not work just out of 4K buffers.
 

Planet

Member
Oct 25, 2017
975
Many games upscale to a 4K buffer themselves, to render UI elements at native res. Those games at least do scale up and back down again under system-wide supersampling.
 
OP

Liabe Brave

Member
Oct 27, 2017
886
Very curious what's up on RDR2. If it's just a less aggressive TAA or something in the resolution, because it's definitely sharper in the details.
It's mostly just TAA, which has been dialed back notably, making the image sharper. The CBR still mostly counts as 1920x2160. However, I did see some apparent reduction to artifacting in the grass (though not elsewhere in the image). It's possible there's been a mipmap change there, or it might just be due to say, wind changing the motion between screens.

As a new PS4 Pro user with a 1080p screen, seeing the lack of choices for in-game resolution (and frame cap/vsync) in some games is absurd.

Take Mafia III: it actually runs worse than on base PS4 because of the higher resolution, and there is nothing you can do about it... that's some bulls*it.
That's not good, but you've chosen to focus on one of the rare games for which this is true. There are only 4 total, out of over 400 titles.

Many games upscale to a 4K buffer themselves, to render UI elements in native res. Those games at least scale up and down again with system wide supersampling.
I think he's trying to say that the scaling pass can take place separately for the game layer and UI layer.
 

Planet

Member
Oct 25, 2017
975
I think he's trying to say that the scaling pass can take place separately for the game layer and UI layer.
I don't think it can be. The function is at system level and can't interfere with what the game does, only take over from the very last step. That final backbuffer can have sub 4K resolution, but it will contain everything that is to be displayed.
 

Fastidioso

Banned
Nov 3, 2017
2,827
It's mostly just TAA, which has been dialed back notably, making the image sharper. The CBR still mostly resolves to 1920x2160. However, I did see some apparent reduction to artifacting in the grass (though not elsewhere in the image). It's possible there's been a mipmap change there, or it might just be due to say, wind changing the motion between screens.


That's not good, but you've chosen to focus on one of the rare games for which this is true. There are only 4 total, out of over 400 titles.


I think he's trying to say that the scaling pass can take place separately for the game layer and UI layer.
Wait a second, so it's 1920x2160 CBR? Not even a native 1920x2160? Because that sounds awful.
 
OP

Liabe Brave

Member
Oct 27, 2017
886
I don't think it can be. The function is at system level and can't interfere with what the game does, only take over from the very last step. That final backbuffer can have sub 4K resolution, but it will contain everything that is to be displayed.
That might not be true for games made after system-wide supersampling was created, though. The development environment might allow (or require) hooks so that this setting affects the render layer prior to UI compositing. We don't know if that's the case, of course, but it theoretically could be.

So it's still a broken CBR. I suspect TAA is the main problem, and CBR can't fully resolve the whole res because the temporal AA breaks it.
Yes, CBR is mostly not providing any benefit for this game. It takes place prior to TAA, though--immediately after a camera cut you can see the CBR result without any TAA--so the problem doesn't appear to be interference. (It's certainly not inherent, as other titles do both CBR and TAA with better results.)
 

Fastidioso

Banned
Nov 3, 2017
2,827
That might not be true for games made after system-wide supersampling was created, though. The development environment might allow (or require) hooks so that this setting affects the render layer prior to UI compositing. We don't know if that's the case, of course, but it theoretically could be.


Yes, CBR is mostly not providing any benefit for this game. It takes place prior to TAA, though--immediately after a camera cut you can see the CBR result without any TAA--so the problem doesn't appear to be interference. (It's certainly not inherent, as other titles do both CBR and TAA with better results.)
So I don't understand: if TAA doesn't interfere, why doesn't CBR provide a full 4K result? I'm confused.
 
OP

Liabe Brave

Member
Oct 27, 2017
886
So I don't understand: if TAA doesn't interfere, why doesn't CBR provide a full 4K result? I'm confused.
I'm not sure, but we've seen something similar happen with World of Final Fantasy and L.A. Noire before. Rockstar's effect is definitely not exactly the same, or at least has further effects layered on top, but it could be of a piece. It's possible these games are using not CBR but some derivation of TIR--the method developed by Guerrilla for Killzone: Shadow Fall's multiplayer. I'd think that unlikely, since the checker pattern shouldn't be any harder to implement than a columnar one, and should give better results. But maybe there's a savings somewhere? That's the best guess I've got (though it could easily be very wrong).

Note that even at its least effective, RDR 2's implementation gives more detail than just a standard 1920x2160 render. I think that's become more apparent now that the aggressive blur has been toned down.
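To make the "1920x2160 CBR" figure concrete: checkerboarding shades half of the 4K pixel grid each frame, arranged in alternating 2x2-pixel quads, and relies on the previous frame (plus reprojection data) to fill the other half. A small sketch of just the sample layout (a simplified model that ignores reconstruction entirely):

```python
import numpy as np

def cbr_mask(width: int, height: int, frame_parity: int) -> np.ndarray:
    """Which pixels a checkerboard renderer shades this frame:
    2x2-pixel quads arranged in an alternating checker pattern."""
    qx = np.arange(width) // 2   # quad column index
    qy = np.arange(height) // 2  # quad row index
    checker = (qx[None, :] + qy[:, None]) % 2
    return checker == frame_parity

even = cbr_mask(3840, 2160, 0)
odd = cbr_mask(3840, 2160, 1)
print(even.sum() == 1920 * 2160)  # half the 4K grid: 1920x2160 new samples
print((even | odd).all())         # two frames together cover every pixel
```

When reconstruction of the other half fails (bad motion vectors, interference from heavy post-processing), only the current frame's samples are usable, which is why a broken implementation can degrade toward a plain 1920x2160 image.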
 

SixelAlexiS

Member
Oct 27, 2017
1,966
Italy
That's not good, but you've chosen to focus on one of the rare games for which this is true. There are only 4 total, out of over 400 titles.
I mean, it's one of the few titles I was interested in trying, but I didn't, since I remember it was quite a disaster on PS4, so I waited for the Pro.
Then yesterday I watched the DF video and saw that the frame pacing was solved, but on Pro it had a worse framerate -_-"

Even God of War doesn't have an option to play at 1080p with a locked 30fps.
I bet that in resolution mode the game still has slowdowns with Valkyries and in other demanding situations.
If you want a stable framerate you can't get one, since the other mode has an unlocked framerate, which is nice (it's 40-50ish fps) but isn't locked, so it's stuttery.

And this is a major exclusive that received a lot of patches, and it still lacks the choices to get a locked, stable framerate.

BTW, since I appreciate your topic, I want to say that your guide is very forgiving on base PS4 framerate.
Like I said, God of War has severe slowdowns with Valkyries (the Kara fight is probably the worst in this sense, it really runs like crap) and when there are a lot of enemies and particle effects, and it still only says "30fps".
I doubt GoW has a perfect 30fps in those scenes on Pro in resolution mode; I have to try it.

The same happens with Horizon Zero Dawn: the slowdowns aren't that "rare", though they're not as severe as GoW's. It has slowdowns in every encounter with big enemies and/or a decent number of them.

Then there is The Frozen Wilds, which is basically another game (performance-wise).
I played the game on base PS4 a couple of months ago, and a stable 30fps in the DLC zone is a mirage.
The game is almost constantly under 30fps, and in many encounters it actually freezes the whole game/console for a second.

Example 1
Example 2 (spoiler on a new type of enemy)

I mean, things like that should be noted in the guide, since the DF videos of both GoW and the Horizon DLC don't do any serious stress test on either base PS4 or PS4 Pro, which is extremely misleading and disappointing.

I haven't tried the Horizon DLC on Pro yet, but I suspect you can still have those issues in high-resolution mode, while in performance mode, hopefully, not.

Even Spider-Man, on base PS4, has a lot of slowdowns, both in combat and in the city (especially at night or in rain, where you almost never see a locked 30fps).
I still haven't tried it on PS4 Pro, but in this game too you don't have any option to play at 1080p with a locked 30fps.

I'll repeat myself: every PS4 Pro game needs to have at least these two options (like Infamous Second Son):

Resolution: 1080p - "4K"
Framerate: 30FPS - Uncapped

A vsync ON/OFF toggle would be nice too... but at least those two options would help a lot; otherwise you're stuck with bad framerate options in both cases, or absurdly, with a worse framerate on a more powerful console...
 
OP

Liabe Brave

Member
Oct 27, 2017
886
BTW, since I appreciate your topic, I want to say that your guide is very forgiving on base PS4 framerate.
Like I said, God of War has severe slowdowns with Valkyries (the Kara fight is probably the worst in this sense, it really runs like crap) and when there are a lot of enemies and particle effects, and it still only says "30fps".
Partially this is a restriction of resources. I can't go through each game and determine where the performance is worst, and then calculate framerate. I have to rely on analysis of whatever material is readily available (though I'm open to suggestions of clips to examine). More importantly, worst performance is not what the framerates are meant to report. Even when I provide ranges, those never include the absolute worst drops (or transient high spikes) that can be found. I'm trying to give an impression of how the game runs over its entire length, and naturally those average figures aren't going to include extremes.

Also, you seem to have missed that "30fps" or "60fps" isn't the highest performance ranking I tag in the list. That's "locked 30fps" or "locked 60fps". The straight number isn't intended to indicate that the game stays at that level all the time, or essentially all the time. Note that, per the averaging factor I mentioned above, even "locked" titles may drop frames here and there. But these will be rare and inconsequential to most users.

Then there is The Frozen Wilds, which is basically another game (performance-wise).
I played the game on base PS4 a couple of months ago, and a stable 30fps in the DLC zone is a mirage.
The game is almost constantly under 30fps, and in many encounters it actually freezes the whole game/console for a second.

Example 1
And here we have the final factor in your disconnect from the framerates I report. You are simply very sensitive to framerate instability. In this example video, the point you characterize as freezing the whole game for a second actually runs at 18fps. This is indeed quite a low drop, but not anything like a freeze (the longest frametime is 133ms). It's also transient, lasting only a single second. Expanding to a 3-second window puts the framerate at 21fps. And the other 35 seconds of that video run above 28fps.
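For readers wondering how a "1-second 18fps dip" squares with a 133ms worst frametime, the arithmetic can be sketched like this (the frametime numbers are illustrative, chosen to match the figures above, not the actual trace):

```python
def fps_over_window(frametimes_ms, window_s: float) -> float:
    """Average framerate over roughly the first window_s seconds
    of a frametime trace (milliseconds per frame)."""
    elapsed, frames = 0.0, 0
    for ft in frametimes_ms:
        if elapsed >= window_s * 1000:
            break
        elapsed += ft
        frames += 1
    return frames / (elapsed / 1000)

# One 133ms spike plus seventeen 51ms frames = 1000ms for 18 frames:
# an 18fps second, but nothing resembling a one-second freeze.
trace = [133] + [51] * 17
print(round(fps_over_window(trace, 1.0)))  # -> 18
```

The same function over a wider window smooths the spike out, which is why a 3-second average reads higher than the worst single second.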

Players like yourself for whom framerate is a very powerful concern are always going to detect issues that other players won't notice. I'm afraid I simply can't provide the precision and specificity you're looking for. If you don't already, I'd definitely suggest checking out VG Tech 's Youtube channel. He does an excellent job breaking down performance statistics categorically.

...but at least those two options would help a lot; otherwise you're stuck with bad framerate options in both cases, or absurdly, with a worse framerate on a more powerful console...
Again, an obligatory worse framerate on Pro is very rare when looking at games with official support. It's incredibly rare across all games, due to the existence of Boost Mode. If framerate drops on standard PS4 are infuriating you, Pro is guaranteed to improve the situation across the board. Almost every single game will run better...but this is not the same thing as every game running great.
 

SixelAlexiS

Member
Oct 27, 2017
1,966
Italy
Also, you seem to have missed that "30fps" or "60fps" isn't the highest performance ranking I tag in the list. That's "locked 30fps" or "locked 60fps". The straight number isn't intended to indicate that the game stays at that level all the time, or essentially all the time. Note that, per the averaging factor I mentioned above, even "locked" titles may drop frames here and there. But these will be rare and inconsequential to most users.
I didn't miss that; I said it because you use "30fps w/ drops" in a lot of cases, and both GoW and Horizon really need that tag on base PS4, because the framerate can be really rough in both games.

This is the Kara encounter (GoW) on base PS4.

And here we have the final factor in your disconnect from the framerates I report. You are simply very sensitive to framerate instability. In this example video, the point you characterize as freezing the whole game for a second actually runs at 18fps. This is indeed quite a low drop, but not anything like a freeze (the longest frametime is 133ms). It's also transient, lasting only a single second. Expanding to a 3-second window puts the framerate at 21fps. And the other 35 seconds of that video run above 28fps.
Yes, in the first video you can see a severe slowdown, but the second video seems even worse.
BTW, how can you analyze the framerate of a YouTube video? I'm very interested in doing the same; I'd appreciate any hints, thanks :)

Players like yourself for whom framerate is a very powerful concern are always going to detect issues that other players won't notice. I'm afraid I simply can't provide the precision and specificity you're looking for. If you don't already, I'd definitely suggest checking out VG Tech 's Youtube channel. He does an excellent job breaking down performance statistics categorically.
Thank you for the VG Tech suggestion, I'll take a look :)
Again, I don't expect you to analyze every single game, but at least use the "w/ drops" tag more, so we can have a better idea of how a game runs.
Or maybe just eliminate that tag, if a plain "30fps" means the game will have drops anyway.
It's just a suggestion to make the whole description clearer and more reliable.


Again, obligate worse framerate on Pro is very rare when looking at games with official support. It's incredibly rare across all games, due to the existence of Boost Mode. If framerate drops on standard PS4 are infuriating you, Pro is guaranteed to improve the situation across the board. Almost every single game will run better...but this is not the same thing as every game running great.
Yeah, worse framerate then base PS4 can be rare, but still the lack of locked 30fps options on Pro aren't, at all.
Like I said, every game should act like Infamous (separate framerate and resolution options with the ability of combine them), but even the choice to have 1080p/locked 30fps like Horizon or RDR2 will be great, so you can play the game at 1080p at locked 30fps if you don't care about higher resolution = frame drops.

They can give you 2K/4K with a stable but not locked 30fps, and for me this is just a waste, since the PS4 Pro can do 1080p/locked 30fps, but the games don't let players choose that.

I really hope future games will give us multiple choices in this regard.
 
OP
Liabe Brave

Member
Oct 27, 2017
886
I didn't miss that; I said it because you use "30fps w/ drops" in a lot of cases, and both GoW and Horizon really need it on base PS4, because the framerate can be really rough in both games.

This is the Kara encounter (GoW) on base PS4.
What I put in the list for framerate is what I believe the player can expect to see during the majority of their gameplay time. "Locked 30fps" doesn't mean no frames are missed, and "30fps" doesn't mean there are no drops in framerate. When I explicitly put "w/ drops" and such, I mean that you'll be seeing them repeatedly, as a common occurrence. I add "few" or "rare" as the number and length of slowdowns decreases, and once they only happen every so often then I don't put anything at all. When they're very close to absent, I add "locked". (Just to be complete: if the game doesn't have intermittent big drops, but instead a tendency to consistently run under its target framerate, I add a tilde "~30fps".)
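As a rough illustration of those labelling rules, they could be sketched as a small decision function. The numeric thresholds below are illustrative guesses, not the OP's actual cutoffs:

```python
def framerate_tag(target_fps, avg_fps, drops_per_hour):
    """Map measured performance onto the list's framerate tags.
    The thresholds are invented for illustration only."""
    if avg_fps < target_fps * 0.95:
        return f"~{target_fps}fps"           # consistently under target
    if drops_per_hour == 0:
        return f"locked {target_fps}fps"     # virtually no missed frames
    if drops_per_hour <= 2:
        return f"{target_fps}fps"            # slowdowns too sporadic to list
    if drops_per_hour <= 10:
        return f"{target_fps}fps w/ rare drops"
    return f"{target_fps}fps w/ drops"       # repeated, common slowdowns
```

For example, a game averaging 27fps against a 30fps target would be tagged "~30fps" regardless of how often it has discrete drops, matching the tilde convention described above.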

And to repeat, this means that the absolute extremes of performance will not be listed at all. You yourself characterize that Kara fight as being the worst example, and yet the footage you posted doesn't seem to dip below 24fps, and usually appears to be around 28fps. There are even (short) sequences where it's at full 30fps. If this 6 minutes is the worst a 20-hour game gets, I believe it's more misleading to include the lowest results as part of a general description, rather than to leave them out.

BTW, how do you analyze the framerate of a YouTube video? I'm very interested in doing the same; I'd appreciate any hints, thanks :)
Just count the number of frames.

Yeah, worse framerate than base PS4 may be rare, but the lack of locked 30fps options on Pro isn't, at all.
I won't argue against your preferences. I agree it would be great if I could include robust metrics for all the games, so as to better serve everyone. But at some point I'm just not equipped to provide this, both because of my intended methodology and because of the resources I can put into this effort. You're also arguing that I should be downgrading results for standard PS4, which is probably the least important aspect of the info I'm trying to provide. This thread is meant as a resource for potential or current Pro buyers, not standard owners. If standard PS4 sometimes runs worse than I've noticed, then the advantages of Pro are even greater than I list.

In the end, if you're interested in better framerates then Pro gives you those, in basically every game it can. If your reaction is "yeah, but they're not improved enough" then I'm sorry. Unfortunately, developers can't satisfy everyone's technical desires, and that means sometimes your personal preferences won't be supported. (Though sometimes they are. I count that 71 out of the 432 enhanced games offer player choice of performance versus render prioritization, through in-game or OS setting.)
 

SixelAlexiS

Member
Oct 27, 2017
1,966
Italy
What I put in the list for framerate is what I believe the player can expect to see during the majority of their gameplay time. "Locked 30fps" doesn't mean no frames are missed, and "30fps" doesn't mean there are no drops in framerate. When I explicitly put "w/ drops" and such, I mean that you'll be seeing them repeatedly, as a common occurrence. I add "few" or "rare" as the number and length of slowdowns decreases, and once they only happen every so often then I don't put anything at all. When they're very close to absent, I add "locked". (Just to be complete: if the game doesn't have intermittent big drops, but instead a tendency to consistently run under its target framerate, I add a tilde "~30fps".)

And to repeat, this means that the absolute extremes of performance will not be listed at all. You yourself characterize that Kara fight as being the worst example, and yet the footage you posted doesn't seem to dip below 24fps, and usually appears to be around 28fps. There are even (short) sequences where it's at full 30fps. If this 6 minutes is the worst a 20-hour game gets, I believe it's more misleading to include the lowest results as part of a general description, rather than to leave them out.
Yes, it's the worst dip I can remember (and 24fps is quite embarrassing to me), but most of the fights in GoW don't sustain a full 30fps. The game runs at a full 30 when you walk around, and during most of the trial fights on the lava mountain, but when particle effects appear it's almost guaranteed to go under 30.
Btw, I remember a point where the game gets even worse than in the Kara video, but I didn't upload it, so it wouldn't be helpful anyway.
But I get your point and the method of the topic; I'm not going to contest it further.

Just count the number of frames.
This isn't really helpful, but thanks; I'll do some Google searching :)

I won't argue against your preferences. I agree it would be great if I could include robust metrics for all the games, so as to better serve everyone. But at some point I'm just not equipped to provide this, both because of my intended methodology and because of the resources I can put into this effort. You're also arguing that I should be downgrading results for standard PS4, which is probably the least important aspect of the info I'm trying to provide. This thread is meant as a resource for potential or current Pro buyers, not standard owners. If standard PS4 sometimes runs worse than I've noticed, then the advantages of Pro are even greater than I list.
Not only base PS4, but PS4 Pro too, since GoW in "4K" has the same framerate issues as base PS4. They're listed with the same tag anyway, so that's correct; my idea was to adjust the tag case by case, but it was just a suggestion. As I said, I get your point, and I'll end here.

In the end, if you're interested in better framerates then Pro gives you those, in basically every game it can. If your reaction is "yeah, but they're not improved enough" then I'm sorry. Unfortunately, developers can't satisfy everyone's technical desires, and that means sometimes your personal preferences won't be supported. (Though sometimes they are. I count that 71 out of the 432 enhanced games offer player choice of performance versus render prioritization, through in-game or OS setting.)
Yeah, if developers don't care, it would be nice to force resolution through the OS and then let the system do its job.
If they want to go the "PC route", they need to give players the ability to choose their preferences; otherwise it's just a waste of power and money.

I really hope that PS5, at least, moves in that direction, instead of forcing native 4K with a crappy framerate in all games.
 
OP
Liabe Brave

Member
Oct 27, 2017
886
This isn't really helpful, but thanks; I'll do some Google searching :)
My apologies, I wasn't trying to be flippant. I just meant that I don't use any application, I literally step through the frames one at a time and count them. Divide total unique frames by number of seconds and there's your framerate.
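That manual process can be sketched in Python. The "frames" here are placeholder strings standing in for captured video images; the real work is stepping through footage one frame at a time:

```python
def estimate_framerate(frames, seconds):
    """Estimate a game's framerate from a higher-rate capture by counting
    unique consecutive frames (a duplicate means the game missed its update)."""
    unique = 0
    prev = None
    for frame in frames:
        if frame != prev:
            unique += 1
        prev = frame
    return unique / seconds

# A 60fps capture of a game running at 30fps shows every image twice.
capture = ["A", "A", "B", "B", "C", "C", "D", "D"]
rate = estimate_framerate(capture, len(capture) / 60)
```

Here 8 captured frames contain only 4 unique images over 8/60 of a second, so the estimate comes out to 30fps, exactly the "divide unique frames by seconds" method described above.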

Thanks for these! It appears the game is running at 1440p, like the first Vermintide. How's the performance been: many drops from 60fps? Does the beta support HDR? Your help is much appreciated.

I've added games to the list, but a couple require caveats. First, I've seen 4K screenshots of The Spectrum Retreat that were supposed to be Pro, but the game has a PC version so there's a possibility they were mislabeled. Second, I've seen both 4K screens and 60fps footage of Ride 3, but not both at the same time. I've listed these as two separate modes, simply because that's more conservative. But in theory the game might boost framerate and resolution in a single mode (dynamic res, etc. can't be ruled out).

I continue to search for good material to analyze for several other games just released, or due soon. Darksiders III is said to have dynamic resolution to maintain its 30fps rate. So far I've mostly seen 1080p counts, with one possible 1440p in the mix, so it seems to be aggressive with this. The same is true for Just Cause 4, which appears to be 1080p in the recent IGN footage. Much of that was set during heavy chaos, so several scenarios are possible. Perhaps that game's also got dynamic res, in which case we're seeing low res due to onscreen action; or, it might have been in a specific "prioritize framerate" mode; or, it runs at a static 1080p precisely to keep framerate up. I'll keep checking as I get the opportunity.

As always, input in the form of screens, video, or links is appreciated!
 

Fastidioso

Banned
Nov 3, 2017
2,827
I know you already said that RDR2 on Pro with the last patch solely decreased the TAA, BUT just to understand:
Before


After

Why is the cable less aliased if the TAA is reduced?
 
OP
Liabe Brave

Member
Oct 27, 2017
886
I know you already said that RDR2 on Pro with the last patch solely decreased the TAA, BUT just to understand:
Before
After
Why is the cable less aliased if the TAA is reduced?
The short answer is, temporal methods aren't pure benefit. They can increase accuracy without a big performance hit, but also can generate problems rather than solve them. In this case, reducing TAA reduces the problem.

THE LONG ANSWER
The reason higher resolutions give more detail is that more pixels can be used to depict something the same size on screen. At some point the image you're generating has exactly as many pixels as the screen you're displaying it on (native resolution), and you can't keep adding pixels to increase detail. But even on a 4K display, the pixels have some actual width to them. Is there any way to show details smaller than a pixel?

The display screen can be conceived as a grid of pixels placed in front of the "world" of objects the game is rendering.




Normally, each pixel's color is determined by sampling the color of the world behind that pixel's center (shown as a black dot). The entire pixel is then colored to match, because pixels must be a single value. So very small details like the green bit of debris here are completely ignored.




But though we can't add more pixels to the screen, the game can sample multiple times per pixel, at different points. This is supersampling. Now very small detail can contribute to the overall pixel value, tinging the brown with green.




While this example may seem like a big change, keep in mind that a single pixel like this is vanishingly small at normal viewing distances. The overall effect is to subtly blend in the contributions of very tiny detail, even when each of the screen's fixed pixels is bigger than that detail.
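The difference between center sampling and supersampling can be sketched with a toy scene. The coordinates and colors here are invented for illustration; a real renderer evaluates textures and shaders rather than an analytic function:

```python
def scene_color(x, y):
    """Toy 'world behind the pixel': brown ground with one tiny green
    speck of debris (positions and colors are made up)."""
    if 0.7 <= x < 0.95 and 0.7 <= y < 0.95:
        return (0.0, 0.8, 0.0)   # green debris, smaller than a pixel
    return (0.6, 0.4, 0.2)       # brown ground

def shade_pixel(px, py, sample_offsets):
    """Average the scene color at several points inside pixel (px, py)."""
    r = g = b = 0.0
    for ox, oy in sample_offsets:
        cr, cg, cb = scene_color(px + ox, py + oy)
        r, g, b = r + cr, g + cg, b + cb
    n = len(sample_offsets)
    return (r / n, g / n, b / n)

# One sample at the pixel's center misses the speck entirely:
center_only = shade_pixel(0, 0, [(0.5, 0.5)])
# Four sub-pixel samples let the debris tinge the brown with green:
supersampled = shade_pixel(0, 0, [(0.25, 0.25), (0.75, 0.25),
                                  (0.25, 0.75), (0.75, 0.75)])
```

With center sampling the pixel comes out pure brown; with four samples, one lands on the speck and the pixel's green channel rises, which is the "tinging the brown with green" effect described above.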

But supersampling, though effective at solidifying detail, is also computationally expensive. You're having to calculate color multiple times per pixel, and that requires accounting for textures, lighting, shaders, etc. There's only milliseconds until the next frame has to be drawn, so there might not be time to complete these extra samples.

Now enters the cleverness of temporal accumulation. When that next frame is drawn, you sample all the pixels again. But you can keep around the values sampled in past frame(s). Then instead of combining multiple samples across screen space, combine them across time: past info plus current. Since you're sampling the same number of times you normally would, the extra work is just the (much less complicated) blending step. Over a short time a more detailed version of the rendered scene is built up, especially if you shift your sampling locations around a tiny bit frame to frame.
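A minimal sketch of that blending step, assuming a simple exponential history. Real TAA also reprojects the history with motion vectors and clamps it, which this omits:

```python
def temporal_blend(history, current, alpha=0.1):
    """Fold this frame's single sample into the accumulated history.
    A low alpha keeps more old data (smoother image, more ghosting);
    a high alpha trusts the new sample (noisier, but echoes fade fast)."""
    return tuple(h * (1.0 - alpha) + c * alpha
                 for h, c in zip(history, current))

# A red object was behind this pixel last frame but has moved away; its
# stale red lingers in the history as a fading 'echo'.
soft_taa = hard_taa = (1.0, 0.0, 0.0)
for _ in range(10):
    soft_taa = temporal_blend(soft_taa, (0.0, 0.0, 0.0), alpha=0.1)
    hard_taa = temporal_blend(hard_taa, (0.0, 0.0, 0.0), alpha=0.5)
```

After ten frames, gentle blending still retains roughly a third of the stale red, while aggressive blending has almost erased it. Reducing TAA aggressiveness, as the RDR2 patch discussed below appears to do, is loosely equivalent to raising alpha: echoes fade faster at the cost of less smoothing.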

The drawback is that games almost never sit still. The camera moves, characters move, the scenery sways in the wind or shakes from explosions, etc. Now the blending step has a problem, because what was behind a particular pixel last frame may not be the same thing behind it this frame. For example, here's a ship's sail from AC: Odyssey. Zoomed in, you can see where TAA has caused a faint echo of the sail's edge from last frame to still appear in this one, even though the sail itself has moved away. (Odyssey isn't too bad with this, it was just the quickest example I had to hand; artifacting can get much worse in other games.)




What's happening in the Red Dead Redemption 2 shot is similar, I believe. Except that the strung wires aren't moving as much frame to frame. So instead of the echo being far apart from the object, it's almost right on top of the current wire. If you flip between the shots rapidly, you'll see that the "after" wires are skinnier overall, because they're not doubled up. Note that, while this is most obvious in the high contrast of wires against sky, this blurring of edges is visible all over the "before" shot. Here's a zoom on the fence. Note how everything feels sharper in the "after" shot, but the detail seen is still somewhat blocky. It's just that the horizontal smearing has been reduced.



So as I said in the short answer, temporal methods can create artifacts if old data is kept around too long, especially when a scene is in motion. By reducing the TAA's aggressiveness, Rockstar has removed a lot of the blending. The image revealed is pretty noisy and blocky, but only if you get up very close. From normal viewing distance, the impression is of a sharper IQ because pixels aren't being smoothed with their previous values.
 

Gbraga

Member
Oct 25, 2017
9,544
I started playing Horizon last night. The game is beautiful, the environments are just breathtaking, the only thing that bothered me about it visually were the typical SSR artifacts. They're not as bad as in FFXV, but it's still noticeable. Do the "enhanced reflections" of performance mode mean better coverage and less artifacts? I might go with that mode if so.

It's also the first game that made my PS4 Pro make some noise, other than God of War's map screen. It could be due to it being so damn hot the past few days, though.
 
Last edited:
OP
Liabe Brave

Member
Oct 27, 2017
886
Thanks for this! My counts seem to show that this is 1080p, but I'll say that heavy JPEG compression is making that uncertain. In general, if you can host large screenshots somewhere other than imgur it'd be helpful. They don't necessarily need to be PNGs--JPEG compression isn't inherently so damaging to IQ. But imgur particularly recompresses everything over 1MB, and heavily. I use abload.de, but www.lensdump.com and other options are basically all better. But either way, I really appreciate the input!

I've added several games to the list. Please note that Darksiders III is listed as possibly dynamic, because I've seen a screen that was definitely 1440p. However, most of the material I've analyzed has been 1080p. It's possible that rather than being dynamic, it may just be straight "full HD", and the higher res was a promo shot, prerelease build, etc.

Let me know if you have any comments or questions.
 

bitcloudrzr

Member
May 31, 2018
95
Thanks for this! My counts seem to show that this is 1080p, but I'll say that heavy JPEG compression is making that uncertain. In general, if you can host large screenshots somewhere other than imgur it'd be helpful. They don't necessarily need to be PNGs--JPEG compression isn't inherently so damaging to IQ. But imgur particularly recompresses everything over 1MB, and heavily. I use abload.de, but www.lensdump.com and other options are basically all better. But either way, I really appreciate the input!
I reuploaded with Lensdump. This might be a situation where the main menu isn't as high res though or they haven't enabled full Pro mode for the alpha. I'll get a few screenshots up when it's online tomorrow.
 
Last edited:
OP
Liabe Brave

Member
Oct 27, 2017
886
I reuploaded with Lensdump. This might be a situation where the main menu isn't as high res though or they haven't enabled full Pro mode for the alpha. I'll get a few screenshots up when it's online tomorrow.
Thanks for the rehost! I can now see that the screen is full 4K. There's a lot of chromatic aberration present, and the soft shadow lighting is being calculated at a lower resolution. These things interacting with the compression is why my earlier count was much blockier.

There's aggressive TAA present, so it's very possible the overall resolution is being achieved through temporal reconstruction. (With a shot like this where most things are still, the results should be nearly indistinguishable from native rendering.)