
It is time for developers to finally provide "PS4/Xbox" graphical presets in PC versions of their games

BriareosGAF

Member
Oct 28, 2017
1,021
I don't have time or patience to tweak settings until I find the perfect one, I just want the game to look good enough as the devs intended. If I had a "closest possible to the PS4 version" option
What is perfect here? 30Hz with some dips to 20Hz? Solid 30Hz? Solid 60Hz? Most auto-configuration mechanisms attempt to configure your PC game settings to maximize visual fidelity while ensuring the baseline performance that the title is intended for, e.g. 60 Hz on COD. Would you want the "PS4" settings on your PC game to result in a lower framerate in order to match the visual settings of the console? Seems like a weird request.
 

Minsc

Member
Oct 28, 2017
1,193
Pretty embarrassing display here from PC gamers, who game on a platform that embraces options - adding more options wouldn't take anything away from anyone.

This is a porting issue. There are plenty of below-spec PCs out there. Putting a guard rail on gaming is hardly optimal. Not to mention console games don't adhere to a single standard spec, meaning different loads - once again, not specific to PC hardware.
IMO if a PC isn't capable of playing the base PS4 version of a game (which would be similar to low/medium), they probably can't play the game at all right? And it's just an option. If their PC can't run it at PS4 settings, then what's preventing them from lowering the settings further to the absolute minimum? Or upgrading their CPU/GPU/replacing their PC? The option to use a console level preset is not harming anything on PC gaming.
 
Oct 27, 2017
4,541
Spain
What is perfect here? 30Hz with some dips to 20Hz? Solid 30Hz? Solid 60Hz? Most auto-configuration mechanisms attempt to configure your PC game settings to maximize visual fidelity while ensuring the baseline performance that the title is intended for, e.g. 60 Hz on COD. Would you want the "PS4" settings on your PC game to result in a lower framerate in order to match the visual settings of the console? Seems like a weird request.
That is obviously not what OP is asking for. OP is asking for settings that match the draw distance/geometry quality/shading quality/etc of the console versions, as that's what the developers (who are more informed than most or all users) considered to be the most optimal combination of visual flair and performance for mid-range hardware.
 

FluffyQuack

Member
Nov 27, 2017
474
I already assume presets like "medium" and "high" represent different balances between graphics fidelity and performance, so I don't see much point in having a preset which simulates how the game looks on consoles. I suppose it could be novel knowing exactly what the console version looks like (and I imagine Digital Foundry would find it very convenient), but I wouldn't have any practical use for it.

What I would freaking love to see is console games getting more options. I would be okay with sacrificing resolution and overall graphics quality in every game in order to maintain a higher framerate.
 

flyinj

Member
Oct 25, 2017
2,847
Such a setting would expose many devs regarding the quality of their PC ports. It won't happen.
This is pretty much it.

Even if they put a lot of time and effort into the port, you are probably going to get worse performance at console settings on hardware similar to that console, given the differences in architecture between the two platforms.

Developers would just be setting themselves up for people calling them incompetent.
 

Premium

Member
Oct 27, 2017
655
NC
So you want default settings at a lower fidelity to match console output as a preset? And you want this on higher-end hardware?

This request makes no sense when you can just play on console.
 

dgrdsv

Member
Oct 25, 2017
2,986
Msk / SPb, Russia
It can't be that difficult to provide such a preset, can it?
It can be fairly difficult actually because there is quite a lot of features available in console APIs (PS specifically) which don't have a 1:1 analogue on PC.

It's also possible that some things can be done way more efficiently on PC while producing a slightly different quality result - these would have to be excluded, despite the fact that you may lose a lot of performance with no apparent quality gain.
 

SweetBellic

Member
Oct 28, 2017
1,504
When I tinker with graphics settings, my goal is to attain a smooth 60fps experience at maximum fidelity. I don't see how console-parity presets would be relevant. At best, they would be another arbitrary starting point among the other presets.
 

Fitts

Member
Oct 25, 2017
4,696
As mentioned, there are already presets. There are also automatic optimizers built into some games, or stuff like GeForce Experience if that’s your thing.

Edit: same as the above poster. I just crank everything up besides turning off motion blur, CA, and depth of field/AA if it’s a poor or aggressive implementation. If it’s not staying at a stable 60fps, I typically just have to sacrifice shadow detail and I’m done.
 

capitalCORN

Member
Oct 26, 2017
8,189
IMO if a PC isn't capable of playing the base PS4 version of a game (which would be similar to low/medium), they probably can't play the game at all right? And it's just an option. If their PC can't run it at PS4 settings, then what's preventing them from lowering the settings further to the absolute minimum? Or upgrading their CPU/GPU/replacing their PC? The option to use a console level preset is not harming anything on PC gaming.
Because intrinsically PC gaming and consoles are two different worlds (radically so in parts of the world), matched with hardware and markets for different applications.
Because it muddies what counts as an acceptable port: PC already holds its own baseline across a whole console cycle, which the normal presets already solve, and adding a 'console' spec just confuses that. And if a user uses console settings and gets garbage results, they won't be any more educated.
Because someone who bought a PC for Battlefield is probably already comfortable with PCs.
 

Smartlord

Member
Oct 27, 2017
65
The answers here show why so many people are afraid of PC gaming.

Not everyone playing games on PC is doing it because they like tweaking settings. I play games on PC because I don't have money to get both a gaming PC and a console. I got a PC because I can also use it for work and most games I care about are on PC, not to mention they're cheaper on PC thanks to digital stores.

I don't have time or patience to tweak settings until I find the perfect one, I just want the game to look good enough as the devs intended. If I had a "closest possible to the PS4 version" option I would definitely use it, at least as a base. Low, Medium and High are usually too vague and so many games are lacking auto-detect these days.
This response just shows ignorance on the subject. GeForce Experience is an example of a program that has existed for literally over half a decade at this point that will automatically attempt to tweak your settings for you, with some ability to even choose your target performance. If you don't want to tweak then you don't have to.

Also, if you want most multiplatform games to look "as the devs intended" then the consoles already aren't for you. Most AAA games by EA, Ubisoft, Bethesda, and Activision-Blizzard are already running at significantly reduced visuals compared to their PC counterparts. That's not even mentioning how many console releases have questionable performance. I'm sure the devs at Infinity Ward didn't want the Xbox One version of their game (MW2019) to drop below 50 fps constantly. That's not their intention for a series that always targets 60.

People keep asking in this thread for "the option." Console visual settings are partly arbitrary and don't represent some kind of best possible balance between visuals and performance. An easy example is anisotropic filtering, which has a literally negligible performance cost yet significantly improves image quality - and literally dozens of major console releases, both multiplatform and exclusive, neglect this setting. I can understand the instinct to treat these consoles as some sort of baseline, but their settings don't even make sense for a lot of PC hardware. The current consoles are very underpowered in the CPU department and may need optimizations for that, while it's extremely common on PC to have a significantly better CPU than the consoles, making some of those optimizations far less sensible on PC. Another quick example: PCs tend to be more VRAM-limited (I'm obviously speaking for the mid range mostly here) but have significantly more general-purpose RAM.

If you don't want to tweak, that's why games often support performance presets that just hide all the deeper settings by default. If High is running too badly for you, drop to Medium. The names don't mean anything. For some games, the closest equivalent to console settings is max settings, because the devs didn't want to upgrade the PC version. For others it's Medium, since they wanted some upgrades but also scaling down for weaker PCs. For many it's Low, and for some, the console settings would even require going below the lowest default options.
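For what it's worth, under the hood a preset is usually just a named bundle of individual settings, so a "console" entry would be one more row in the same table. A minimal sketch in Python (all setting names and values here are invented for illustration, not taken from any real game):

```python
# Hypothetical preset table: a "console" preset is just another named
# bundle of the same individual settings the other presets already use.
PRESETS = {
    "low":     {"shadows": "low",    "lod": 0.5, "textures": "low",    "af": 2},
    "medium":  {"shadows": "medium", "lod": 0.7, "textures": "medium", "af": 4},
    "console": {"shadows": "medium", "lod": 0.6, "textures": "high",   "af": 2},
    "high":    {"shadows": "high",   "lod": 1.0, "textures": "high",   "af": 16},
}

def apply_preset(name, overrides=None):
    """Start from a named preset, then let the user tweak from there."""
    settings = dict(PRESETS[name])    # copy so the preset table stays pristine
    settings.update(overrides or {})  # user tweaks win over the preset values
    return settings
```

Calling `apply_preset("console", {"af": 16})` would give the console-like baseline with anisotropic filtering bumped up, which is exactly the "start from the console look, then scale up" workflow being asked for.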
 

Bjones

Member
Oct 30, 2017
4,518
Wouldn’t work: most people wouldn’t understand why they get worse performance on low-end hardware, which would be horrible PR for hardware manufacturers and console makers alike.
 
Last edited:

Orayn

Member
Oct 25, 2017
1,352
It would be a nice baseline to have, especially if the PC version's ultra/max settings are forward-looking and not meant for most current hardware.
 

Menx64

Member
Oct 30, 2017
3,081
I don't think Sony and MS would be happy when people find their pro consoles are sitting somewhere between medium and high.
 

SoaringDive

Member
Feb 3, 2019
390
They should just get rid of the high and ultra settings in PC games. At least from reading PC-gaming subreddits, it seems only a few people in those communities get that graphics will keep improving and their old 1060 can't just hit ultra in every game forever.

Hyperbole aside, I agree, OP. If users had a frame of reference for what the developers actually considered the "ideal base" settings the game was meant to be seen at, there would be a lot less confusion. Typically consoles run a mix of settings - certain ones on high, some on low, etc. The current preset system just isn't helpful beyond how I use it: set everything to medium, then adjust the few settings up to high/max that I know I should (like texture quality and AF).
 
OP
Paul

Member
Oct 27, 2017
2,482
Digital Foundry dropped a new RDR2 video comparing the game running at an "as close as possible to console" preset, and they make the same common-sense argument as me: this should just be included by default as an optional baseline, to make the expected level of performance much easier for people to understand:


Because people are used to just using "high" or whatever and then bitching about bad optimisation, when they do not know what that preset entails.
 

Majukun

Member
Oct 27, 2017
4,033
Just don't see the point of it.
The entire point of graphic options is to tinker with them, why limit yourself in some areas just because you find the console version standards adequate.
 

Stayfone

Member
Oct 28, 2017
209
I would very much like this, though with consoles being a 'closed' system whereas a PC is a compilation of third-party products, the results will vary too much. My PC has slightly better specs than my Pro, yet my Pro runs third-party titles much smoother than I can ever get my PC to. So even though I would very much like it, I don't think it's possible. Games like Uncharted and Horizon are able to squeeze out every inch of performance since the devs know exactly what hardware they are working with. Having a setting on PC that would at least mimic this is still welcome, however.
 

skeezx

Member
Oct 27, 2017
6,837
would be nice, i have a laptop that's essentially a vanilla ps4 1.5 but a lot of games run considerably worse (i could get them there but i'm not a tweak master)

but as mentioned, optimizing for a baseline isn't as simple as flipping a switch, unless you assume everybody is rocking the same setup
 
OP
Paul

Member
Oct 27, 2017
2,482
Just don't see the point of it.
The entire point of graphic options is to tinker with them, why limit yourself in some areas just because you find the console version standards adequate.
...what are you talking about? What "limit yourself"? How would an optional console preset limit anything in any way? It would just make it easier to tweak the game from a standard console-like baseline. Neither I nor Digital Foundry is asking for it to limit anything.
 
OP
Paul

Member
Oct 27, 2017
2,482
but as mentioned, optimizing for a baseline isn't as simple as flipping a switch, unless you assume everybody is rocking the same setup
What do you mean? Why would everybody need the same setup for an optional console-like preset? I feel like I am in a bizarro universe where people do not understand how words work.
 

skeezx

Member
Oct 27, 2017
6,837
What do you mean? Why should everybody have the same setup for an optional console-like preset? I feel like I am in a bizzaro universe where people do not understand how words work.
bespoking a simple 'console option' amongst a permutation of CPU and GPU variations wouldn't complicate things a bit? i mean maybe not i'm not a software engineer but ... yeah
 

aevanhoe

Member
Aug 28, 2018
1,285
For what? 99% of PC users don't give a f how a game looks on consoles.
It’s a setting that is “the default”, the one the developers probably fine-tuned their game for the most. I have a mid-range PC and I’m constantly going back and forth trying to decide if I want to sacrifice performance or quality, and I don’t like that experience. I would love a setting where the developers decided this for me (leaving the granular settings for those who want them, ofc)
 

RvinP

Member
Oct 28, 2017
380
Having graphics presets from non-pc platforms would be a hassle; imagine additional options below 'auto detect' which would say
- Switch handheld graphics
- Switch docked graphics
- Xbox One base console graphics
- Xbox One X console graphics (resolution)
- Xbox One X console graphics (performance)
- PS4 base console graphics
- PS4 Pro console graphics (resolution)
- PS4 Pro console graphics (performance)

No please, that number of options would actually be more than the default graphics options in some games.
 
Last edited:

Majukun

Member
Oct 27, 2017
4,033
...what are you talking about? What "limit yourself"? How would an optional console preset be limiting anything in any way? It would just make it easier to tweak the game from a standard console-like baseline. I am not, and Digital Foundry is not, asking for it to be limiting of anything in any way.
your pc doesn't completely reproduce console hardware, nor would it handle the game exactly like a console even if it had the exact same hardware inside it. It means that if you go for a preset based on console hardware, you are going to overshoot or undershoot on the settings that your rig handles better or worse than the consoles do.

if you want a baseline, you already get that with the usual low-mid-high-very high presets present in most games.

a console preset isn't a thing because it doesn't actually benefit anyone other than people that want to go for comparisons.

Of course developers are free to offer the option if they want, depending on how much work it actually entails, but i still think it kind of misses the point of the entire existence of graphical options in a pc game, which is to customize your experience.
 

leng jai

Banned
Nov 2, 2017
9,817
The console "settings" are just what the developer could come up with based on what hardware they're working with, what frame rate they're targeting and what resolution they want to run at. It's not what their vision of what the game should look like or what they feel is the ideal bunch of settings for a balanced experience. They're literally designed based on the exact hardware they console is running on which is completely irrelevant to PC.

Telling a PC to run at "Xbox One X" settings makes zero sense because the developers gimped all the settings so that they could hit an arbitrary 4K target. Do you ever see people on PC capping themselves to 30fps and putting half the settings on low just so they can run at native 4K? That's literally what the X is doing.

So sure, you could make some console preset that is probably designed for 30fps (while PC is generally 60), but that would be a poor use of whatever custom hardware you're running. You're much better off just running at low/medium/high, which are the kinds of presets a lot of games already have.
 

tokkun

Member
Oct 27, 2017
1,620
It’s a setting that is “the default”, the one the developers probably fine-tuned their game for the most. I have a mid-range PC and I’m constantly going back and forth trying to decide if I want to sacrifice performance or quality, and I don’t like that experience. I would love a setting where the developers decided this for me (leaving the granular settings for those who want them, ofc)
On PC there are tools like Geforce Experience that will automatically recommend settings based on your actual hardware. This will give you a much better result than trying to use console-like settings that have been "fine-tuned" for running at 30 fps on a different set of hardware.

The "Low, Medium, High, Ultra" presets that are included with many games are also likely to be better tuned for running on PCs than a console preset would be.
 

ILikeFeet

Member
Oct 25, 2017
24,351
Low, "Console", Medium, High

IO Interactive knows what to turn on and off to reach console-like graphical quality, so just make that a preset in addition to the usual. people thinking console-level toggles are "limiting" are just complicating things for the sake of dick waving
 

Tora

Member
Jun 17, 2018
1,099
It's time for them to make decent options on consoles, too. If I want to play your game in 60fps with lowered graphics, let me do it already.
After seeing these 60fps switch game mods on overclocked units (DQ11, Witcher 3), I actually agree lol.

It takes extra resources to maintain a frame-rate mode, though; even if you're advertising a variable frame rate, you've got to ensure it's stable, that nothing gets broken, etc.
 

shuno

Member
Oct 28, 2017
433
I am talking about settings like shadows/LoD etc. Obviously framerate/res is variable on PC.
But shadows and LOD are variable too?! You can't take settings from a version optimized for 1080p, switch the resolution to 4K, and expect the same image quality. Maybe they used 1k shadow maps or low-res AO and it will look terrible at 4K. Same with the LOD and other settings.
 
OP
Paul

Member
Oct 27, 2017
2,482
bespoking a simple 'console option' amongst a permutation of CPU and GPU variations wouldn't complicate things a bit? i mean maybe not i'm not a software engineer but ... yeah
No, it would not. Permutations of CPU and GPU are completely irrelevant. I am simply asking them to include an optional preset that has similar shadow/LoD/etc. settings to the leading console. It has zero to do with anyone's PC hardware.

Having graphics presets from non-pc platforms would be a hassle; imagine additional options below 'auto detect' which would say
- Switch handheld graphics
- Switch docked graphics
- Xbox One base console graphics
- Xbox One X console graphics (resolution)
- Xbox One X console graphics (performance)
- PS4 base console graphics
- PS4 Pro console graphics (resolution)
- PS4 Pro console graphics (performance)

No please, that number of options would actually be more than the default graphics options in some games.
Yes, let's take an idea, blow it into insane hyperbolic proportions and pretend it is impractical.

FFS. Obviously I am not asking developers to include every single console permutation. Just one from, say, the leading console would suffice. Or one of their choice.

your pc doesn't completely reproduce console hardware, nor would it handle the game exactly like a console even if it had the exact same hardware inside it. It means that if you go for a preset based on console hardware, you are going to overshoot or undershoot on the settings that your rig handles better or worse than the consoles do.

if you want a baseline, you already get that with the usual low-mid-high-very high presets present in most games.

a console preset isn't a thing because it doesn't actually benefit anyone other than people that want to go for comparisons.

Of course developers are free to offer the option if they want, depending on how much work it actually entails, but i still think it kind of misses the point of the entire existence of graphical options in a pc game, which is to customize your experience.
The preset does not have to be 100% identical, although that wouldn't be all that difficult. But just as there are already existing presets, there could easily be one most closely matching the console, built from the already existing settings. Like Digital Foundry is asking.

You claiming it would not benefit anyone is just your personal incorrect opinion. At the very least it would benefit me, since it would make tweaking easier.

On PC there are tools like Geforce Experience that will automatically recommend settings based on your actual hardware. This will give you a much better result than trying to use console-like settings that have been "fine-tuned" for running at 30 fps on a different set of hardware.

The "Low, Medium, High, Ultra" presets that are included with many games are also likely to be better tuned for running on PCs than a console preset would be.
In my experience GeForce Experience is useless (it always undershoots or overshoots), and the high/normal etc. presets are nice and all, but give no information about what they actually are, so I still have to go and tweak them.
If I had a, say, PS4 preset available, I would simply choose that, see how the game runs, and then increase my resolution or detail accordingly.

Low, "Console", Medium, High

IO Interactive knows what to turn on and off to reach console-like graphical quality, so just make that a preset in addition to the usual. people thinking console-level toggles are "limiting" are just complicating things for the sake dick waving
Yep. It is bizarre.

But shadows and LOD are variable too?! You can't expect to take settings from a 1080p optimized version and switch the resolution to 4K while expecting the same image quality. Maybe they used 1k shadow maps or low res AO and it will look terrible on 4K. Same with the LOD and other settings.
Good point - but that's why it would just be optional baseline and you can go and tweak to your hearts content from that.
 

tokkun

Member
Oct 27, 2017
1,620
In my experience GeForce Experience is useless (it always undershoots or overshoots), and the high/normal etc. presets are nice and all, but give no information about what they actually are, so I still have to go and tweak them.
If I had a, say, PS4 preset available, I would simply choose that, see how the game runs, and then increase my resolution or detail accordingly.
A PS4 preset may be horribly optimized for running on PC hardware and at your target resolution / framerate, though.

The thing to understand from a technical perspective is that certain settings may cause huge performance cliffs on some hardware - for instance, if they require more VRAM than you have, or take advantage of special functional units that aren't present on your GPU. Even though console hardware is generally less powerful than current PCs, there are still areas of the microarchitecture that will perform better. For instance, there is a cache-coherent bus between the PS4's CPU and GPU, but you don't have that on a PC with a discrete CPU and GPU. This also applies to the software side; there may be different bottlenecks in the GPU driver and graphics libraries. The performance cost of settings does not scale linearly either, so a mixture of settings that may be well balanced for running at 30 fps may be poorly balanced for running at 60 fps.
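The VRAM cliff in particular is easy to picture with a toy model: frame time stays roughly flat until the texture working set exceeds physical VRAM, then jumps sharply because the driver has to page textures over the bus. All numbers below are invented purely for illustration:

```python
def frame_time_ms(texture_pool_gb, vram_gb, base_ms=16.0, paging_penalty_ms=25.0):
    """Toy model of a VRAM performance cliff: frame time is flat while the
    texture pool fits in VRAM, then grows sharply with the overflow because
    textures must be paged in over the bus mid-frame."""
    if texture_pool_gb <= vram_gb:
        return base_ms
    overflow = texture_pool_gb - vram_gb  # how far past physical VRAM we are
    return base_ms + paging_penalty_ms * overflow
```

The point of the sketch: a console preset tuned for, say, a unified 8 GB pool can land on either side of this cliff depending on the PC GPU it runs on, which is why the same settings bundle is not a reliable baseline across hardware.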

This is why your argument that GFE is "useless" because it overshoots / undershoots, while you would "simply" choose a PS4 preset, seems bonkers to me. The PS4 preset will have much worse overshoot / undershoot and require more tuning. It would be more difficult to get right, not simpler.

I will say, I agree with you that the presets giving you no information is a real problem, but I think what you are proposing is not the right solution. I think a better solution would be to have some kind of live preview function built into the settings menu, so you can actually see what happens to the visuals / performance as you toggle the presets or individual settings.
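That live-preview idea boils down to keeping a rolling window of recent frame times and showing the average while the player toggles settings. A minimal sketch (class name and window size are my own invention, not from any engine):

```python
from collections import deque

class FrameTimePreview:
    """Rolling average over the last N frame times, for a live settings
    preview: toggle a setting, watch the average settle, then decide."""

    def __init__(self, window=120):
        # deque with maxlen automatically drops the oldest sample
        self.samples = deque(maxlen=window)

    def record(self, frame_ms):
        self.samples.append(frame_ms)

    def average_fps(self):
        if not self.samples:
            return 0.0
        avg_ms = sum(self.samples) / len(self.samples)
        return 1000.0 / avg_ms
```

An engine would call `record()` once per frame and redraw the FPS readout in the settings menu, so the cost of each preset or individual setting is visible immediately instead of being a label like "High".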
 

capitalCORN

Member
Oct 26, 2017
8,189
isn't that the point a console preset is trying to solve? it gives a baseline a pc player can scale up or down from.
There's nothing uniform about 'console' settings, not to the point that you could make a baseline for the whole dev cycle of a console generation. Not to mention the redundancy with pc exclusives, which are basically the reason that pc gaming is pc gaming.
 

ILikeFeet

Member
Oct 25, 2017
24,351
There's nothing uniform about 'console' settings, not to the point that you could make a baseline for whatever the dev cycle of a console generation maybe. Not to mention the redundancy with pc exclusives that are basically the reason that pc gaming is pc gaming.
not like there's uniformity in the current setup to begin with. one game's high is another's medium is another's low, especially over time as games age. like with what we're seeing with Red Dead now
 

tokkun

Member
Oct 27, 2017
1,620
how would that be any different from choosing the "high" setting and getting low performance?
Please continue on to read the next paragraph of the post you quoted. I will reproduce it below for your convenience.

The thing to understand from a technical perspective is that certain settings may cause huge performance cliffs on some hardware - for instance, if they require more VRAM than you have, or take advantage of special functional units that aren't present on your GPU. Even though console hardware is generally less powerful than current PCs, there are still areas of the microarchitecture that will perform better. For instance, there is a cache-coherent bus between the PS4's CPU and GPU, but you don't have that on a PC with a discrete CPU and GPU. This also applies to the software side; there may be different bottlenecks in the GPU driver and graphics libraries. The performance cost of settings does not scale linearly either, so a mixture of settings that may be well balanced for running at 30 fps may be poorly balanced for running at 60 fps.
The "high" preset on a PC game is probably tuned decently for running a game using a high-end discrete PC GPU with DirectX (or whatever graphics API the game is actually using in its PC build) at 60 fps. If you pick High and get bad performance, try Medium or Low. One of them is likely to be a good starting point, and you can tweak individual settings further if desired.

This makes a lot more sense than a preset that is based around a console's hardware and API bottlenecks at 30 fps. Its settings are unlikely to be well balanced for any PC and are not a good starting point for tweaking settings further.
 

ILikeFeet

Member
Oct 25, 2017
24,351
Please continue on to read the next paragraph of the post you quoted. I will reproduce it below for your convenience.



The "high" preset on a PC game is probably tuned decently for running a game using a high-end discrete PC GPU with DirectX (or whatever graphics API the game is actually using in its PC build) at 60 fps. If you pick High and get bad performance, try Medium or Low. One of them is likely to be a good starting point, and you can tweak individual settings further if desired.

This makes a lot more sense than a preset that is based around a console's hardware and API bottlenecks at 30 fps. Its settings are unlikely to be well balanced for any PC and are not a good starting point for tweaking settings further.
that doesn't change my answer
 

capitalCORN

Member
Oct 26, 2017
8,189
not like there's uniformity in the current setup to begin with. one game's high is another's medium is another's low, especially over time as games age. like with what we're seeing with Red Dead now
What I'm saying is pc already has a scale. There's a whole publishing industry of benchmarks and reviews freely available out there. I see no overlap in the venn diagram between pc gamers and console settings.
 

floridaguy954

Member
Oct 29, 2017
2,542
What's the point in including a settings preset for a different hardware configuration? Why would you want to use, say, PS4's LOD setting which is meant to reduce the load on the weak CPU if you have a much more powerful processor at your disposal?

The settings used on PS4 and XB1 aren't this magical perfect middle ground between image quality and performance. They are the best compromise that the developers could come up with for decent visuals at the desired ballpark framerate on this particular hardware.
Facts.

Also, I disagree with those who think automatic hardware-based settings don't work well on PC. Many modern games automatically pick the right settings for many hardware configurations.

Furthermore, we have apps like GeForce Experience which dig into a game's ini file and further tweak its graphical settings to get acceptable performance for the hardware.

Honestly, what OP is proposing already exists, but these options are tuned for someone's personal PC hardware; console hardware is completely irrelevant.