Eh, I just bought a new video card for my PC and kept my stock PS4. I don't own a 4K TV, I refuse to pay for online, and when I read that downsampling was going to be iffy, it made my decision pretty easy. Hope they get the downsampling situation taken care of.
So it's up to the dev to actually manually implement downsampling?

No it's not. It's up to the dev whether they will offer separate 1080p and 4K profiles based on the console video-output selector (i.e. no downsampling).
Not exactly the same but similar: Elder Scrolls Online won't even give you the 1080p enhanced mode features (reflective water and such) without hooking it up to a 4K TV. The only workaround is to connect to your PS4 via Remote Play, for some reason.

This is the first time I've heard this. You're saying that if the Pro is hooked to a 1080p display, both modes are greyed out?

At least for me (and many others in the ESO community; not sure if it's universal), it defaults to regular 1080p, OG PS4 settings, and doesn't even show the option for 1080p enhanced or 4K in the menu.
The main explanation is that Sony did a poor job at the system level: it's up to each individual piece of software to recognize the resolution of the display it's outputting to and make adjustments based on that. This essentially means that every game that wants to do supersampling has to implement its own method of rendering at a higher resolution and scaling down to 1080p. Sony had the same issue with the PS3, where the software had to be aware of what kind of screen it was outputting to (480p/720p/1080p), making some games run differently based on the output mode.

Microsoft did the smartest thing for everyone with the Xbox One X: all Xbox One X-aware titles always output to a 3840x2160 framebuffer, regardless of whether the actual content is in 4K or it's outputting to a 4K monitor. That makes it super easy for developers to let their games benefit from supersampling, since the scaler in the console handles the final step.

While I haven't developed for Xbox One X or PlayStation 4, I have developed a rudimentary supersampling system for Unity. A naive implementation isn't hard, but a good downscale of the supersampled content takes considerably more work than just adjusting some config files that let the software use the extra resources and leave the rest to the console's internal scaler.
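None of the poster's Unity code is shown, so purely to illustrate that last point, here's a minimal Python/NumPy sketch of my own (the kernel choice, sizes, and function names are all mine, nothing from Unity or either console's SDK). The naive downscale is a one-line block average; a "good" downscale already means hand-rolling filter taps, padding, and weights, which is the extra work being described:

```python
import numpy as np

def box_downscale(img, f):
    """Naive downscale: average each f x f block into one output pixel."""
    h, w = img.shape
    return img.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def tent_downscale_2x(img):
    """A slightly better 2x downscale: tent-weighted taps that also blend
    across block borders -- the kind of extra work a good downscale needs."""
    padded = np.pad(img, 1, mode="edge")  # let each output pixel see its neighbours
    h, w = img.shape
    kernel = np.outer([1, 3, 3, 1], [1, 3, 3, 1]) / 64.0  # separable tent, sums to 1
    out = np.empty((h // 2, w // 2))
    for j in range(h // 2):
        for i in range(w // 2):
            out[j, i] = np.sum(padded[2 * j:2 * j + 4, 2 * i:2 * i + 4] * kernel)
    return out

# Toy stand-in for a supersampled frame being brought down to output size.
hi = np.random.rand(216, 384)
naive = box_downscale(hi, 2)     # the easy version
better = tent_downscale_2x(hi)   # more work, smoother reconstruction
```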
All PS4 games output at 4K when using a Pro hooked up to a 4K panel, so I'm not seeing how that's different from what MS are doing.
I still firmly believe it's because there's nothing in the SDK yet to allow devs to render at any native resolution and have the PS4 Pro upscale/downscale as appropriate.

I imagine MS's approach will allow for <4K upscaled to 4K and then down to 1080p. I'm not convinced this is optimal, though.
The difference here is that the One X, from what I understand, always runs game software at 3840x2160 internally, whereas the PS4 seems to have an internal resolution of either 1920x1080 or 3840x2160.

This is incorrect. Both consoles run software at any resolution internally, across a very wide range. Both can output either 1080p or 2160p, which they can scale to if the game is running something else. (Xbox One X will receive an update in the future that will allow 1440p output as well.)
Downsampling and supersampling are (for this context) the same result, looked at from two directions. To start, games keep an internal, entirely mathematical description of their scenes. When it's time to make those scenes show up on a screen, they have to split up the immensely precise detail into the same number of pixels the screen has. This is done by sampling the scene: asking "at this point, what color is supposed to be there?" Each sample determines the color of a pixel. (In actuality, there are tons of subtleties and different techniques I'm ignoring, but that's the gist.)
If a game is rendering higher resolution than the screen it will be shown on, the image has to be shrunk. Downsampling is the process of turning many pixels into fewer. You sample, say, four adjacent pixels, and average them out to get one result, the single pixel that you're shrinking to. This creates very smooth transitions from one color to another for all the detail onscreen. That results in less shimmer and fewer jaggies for the smaller size, while still keeping maximum detail.
In reality, a game may not be rendering big and then shrinking. Rather, it might only create the small image, but instead of sampling once for each, it supersamples more--say, four times, averaging the values. The end result is the same as downsampling: smoother blends and fewer harsh edges.
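To make the "same result from two directions" point concrete, here's a small Python/NumPy sketch (the toy scene function and resolutions are my own inventions, purely for illustration). Route 1 renders double-size and box-downsamples; route 2 renders at target size with four sub-pixel samples per pixel. With matching sample positions, the two outputs are identical:

```python
import numpy as np

def sample_scene(x, y):
    """Toy answer to 'what colour is at this point?': a white disc on black."""
    return 1.0 if (x - 0.5) ** 2 + (y - 0.5) ** 2 < 0.16 else 0.0

W, H = 320, 180   # toy target resolution

# Route 1: render big, then shrink (downsampling).
big = np.empty((H * 2, W * 2))
for j in range(H * 2):
    for i in range(W * 2):
        big[j, i] = sample_scene((i + 0.5) / (W * 2), (j + 0.5) / (H * 2))
shrunk = big.reshape(H, 2, W, 2).mean(axis=(1, 3))   # average each 2x2 block

# Route 2: render small, but take four sub-pixel samples per pixel (supersampling).
small = np.empty((H, W))
offsets = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
for j in range(H):
    for i in range(W):
        small[j, i] = np.mean([sample_scene((i + dx) / W, (j + dy) / H)
                               for dx, dy in offsets])

print(np.allclose(shrunk, small))   # True -- both routes sample the same points
```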
Liabe, in your opinion, would something like 1440p > 2160p > 1080p result in a cleaner image than a native 1080p image with decent AA?

Depends what you mean by "cleaner", and what else the game is doing. Every scaling operation introduces blur, and upscaling adds more blur than downscaling. Blur is the same thing as AA, so adding more does make jaggies and shimmering go away. It also washes out small details and makes edges less distinct.
Each person tends to evaluate a different mix of smoothness and sharpness as the "right" or "most impressive" approach. So I can say that, all other things being equal, 1440p > 2160p > 1080p would result in a smoother, softer image than 1440p > 1080p. Whether that's "cleaner" is up to each viewer. Some--especially if the game is using other blur techniques, like TAA, CA, motion blur, etc.--might find it not "cleaner" but "muddier".
One thing that complicates this is that at higher display resolutions, games can achieve the same impression (whatever that is) with less built-in blur. That's because at typical viewing distances, pixels are very small and the viewer's eyes are blending them together more. On a 4K display a native 4K game with no AA at all can be an acceptable level of sharpness to folks who'd find a 1080p no AA game on a 1080p display bad-looking and jaggy.
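As a toy way to see the extra blur from the longer chain, here's a sketch of my own using Pillow (the test pattern, the bilinear filter choice, and the crude sharpness metric are all assumptions for illustration, not anything from either console's scaler):

```python
import numpy as np
from PIL import Image

# Test card: one-pixel black grid lines on white, where blur is easy to measure.
W, H = 2560, 1440
card = np.full((H, W), 255, dtype=np.uint8)
card[::8, :] = 0
card[:, ::8] = 0
src = Image.fromarray(card)

def sharpness(img):
    """Crude metric: mean absolute difference between horizontal neighbours.
    Blur smears edges, so a softer image scores lower."""
    a = np.asarray(img, dtype=float)
    return np.abs(np.diff(a, axis=1)).mean()

# Chain A: 1440p -> 2160p -> 1080p (upscale, then downscale).
a = src.resize((3840, 2160), Image.BILINEAR).resize((1920, 1080), Image.BILINEAR)
# Chain B: 1440p -> 1080p (one direct downscale).
b = src.resize((1920, 1080), Image.BILINEAR)

print(f"chain A (via 2160p): {sharpness(a):.2f}")   # expect the lower score here:
print(f"chain B (direct):    {sharpness(b):.2f}")   # the extra upscale adds blur
```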
Thanks for your take. I think this supports the position Sony might have taken: an OS-level downsample option might not always produce optimal results for 1080p displays, given the varying native resolutions in use.

Could someone post some 4K and 1080p comparison shots of Smite? It's by the same studio; maybe they've changed it as well.
"I still firmly believe it's because there's nothing in the SDK yet to allow devs to render at any native resolution and have the PS4 Pro upscale/downscale as appropriate."

As mentioned before, that's not the case.

"I imagine MS's approach will allow for <4K upscaled to 4K and then down to 1080p."

And neither is this; the only exception would be games that implement their own 4K upscale.
So you're saying Sony's Pro SDK enables devs to render at any native resolution and then have the PS4 automatically upscale/downscale as appropriate? The evidence so far suggests all upscale/downscale is implemented by devs.
IIRC the functionality is not tied to the Pro. And as for the evidence, I'd ask "what evidence": to date I've mostly seen people's own head-canon (in either direction).
In a response to the article, the studio also expands on the PS4 Pro situation at 1080p. Regular 1080p users aren't left empty-handed on the Pro: as a trade-off for the absence of downsampling, you can expect improved 8x anisotropic filtering, broader draw distances, and doubled shadow resolution. The follow-up states this has "a better impact on the final image than 4K downsampling" overall. An extra advantage of this mode is that there's no dynamic resolution, fixing the pixel count at 1080p.
Sony just needs to take control of it and make it a system-level thing, like the Xbox One X.

That's the only correct course of action to take.
The Outlast 2 info in the OP also needs to be updated; 1080p owners aren't left empty-handed:
http://www.eurogamer.net/articles/digitalfoundry-2017-outlast-2-console-tech-analysis

Updated :)
Does the XboneX have downsampling for every game, or is it the same situation as the Pro?
Every game that has an enhancement patch for the Xbox One X will do auto supersampling on a 1080p screen. This is done at the hardware level and requires no further interaction from the devs.

Also, non-patched games automatically get 16x AF applied, which makes textures look a lot better and makes the LOD seem higher. And any game with a varying resolution or framerate will hit the highest peak possible for an unpatched game; this happens automatically on both 4K and 1080p screens.

Many automatic benefits for 1080p owners on the Xbox One X.
This makes Sony look super pathetic, to be honest. How long have they had now to implement a similar system? But no, let's roll the dice and see if 1080p players get screwed.

For one, Sony does have a similar system (minus the 16x AF). Boosting of dynamic resolution, framerate, and other effects that scale with available resources happens on Pro. And downsampling is the default behavior.
Updated OP with Shadow of the Colossus. Can't watch the video right now; does it have 1080p benefits?
Did this too if people want to join me: