It is time for developers to finally provide "PS4/Xbox" graphical presets in PC versions of their games

Paul

Member
Oct 27, 2017
2,509
It would make tweaking and comparing so much easier if this baseline preset was included in multiplatform games.

For example, I am perfectly happy with how RDR2 looks on my PS4 Pro details-wise, so I would be fine using my RTX2080Ti to simply power those exact settings, just with higher resolution and framerate. But since Rockstar does not provide this preset, nor any information whatsoever about what levels consoles run at, it is all just a guessing game and waiting for Digital Foundry to analyze everything the hard way.

Update: Digital Foundry released a new RDR2 video comparing the game running at an "as close as possible to console" preset, and they make the same common-sense argument as me: this should just be included by default as an optional baseline, making it much easier for people to understand the level of performance:


 

Chivalry

Member
Nov 22, 2018
1,849
It's time for them to make decent options on consoles, too. If I want to play your game in 60fps with lowered graphics, let me do it already.
 

Hella

Member
Oct 27, 2017
14,646
Console-PC performance doesn't translate 1:1 for a variety of reasons, so it's not really a useful point of comparison. PC gets its own graphics presets for this reason.
 

Nabs

Member
Oct 26, 2017
5,298
Console optimization usually goes a bit further than your average quality setting. If you simply want 30fps locked, with low-medium settings, that's usually just a flip of a button.
 

FondsNL

Member
Oct 29, 2017
691
But consoles have the wizardry switch...
I mean, how these ancient bricks are producing the kind of visual fidelity we're seeing is just black magic.

I don't think PCs have black magic... they just have brute force.
 

RvinP

Member
Oct 28, 2017
406
Games usually auto-detect your CPU, RAM, and GPU before assigning a predefined display profile for graphics settings.

Don't they?
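The auto-detect flow described above could be sketched roughly like this; the tier thresholds, preset names, and the `pick_preset` function are all invented for illustration, not taken from any real engine:

```python
def pick_preset(vram_gb: float, cpu_cores: int, ram_gb: float) -> str:
    """Assign a predefined display profile from coarse hardware stats.

    Thresholds are made-up examples of the kind of tiering a game
    might do after probing the GPU, CPU, and system memory.
    """
    if vram_gb >= 8 and cpu_cores >= 8 and ram_gb >= 16:
        return "ultra"
    if vram_gb >= 6 and cpu_cores >= 6:
        return "high"
    if vram_gb >= 4:
        return "medium"
    return "low"

# An RTX 2080 Ti-class machine would land in the top tier:
print(pick_preset(vram_gb=11, cpu_cores=8, ram_gb=32))  # → ultra
```

In practice detection is flakier than this (driver quirks, unknown GPUs), which is probably why so many games skip it.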
 

Raoh

Member
Oct 27, 2017
2,105
No, but more developers should include performance options on console games, like Nioh does.
 

Bluelote

Member
Oct 27, 2017
1,072
that would be a nice option to have;
but I suppose that would make consoles look bad at times.
 

DrWong

Member
Oct 29, 2017
681
Tech Era is delivering lately. I mean OP, "to compare" with what?
Put a console in your pc: problem solved thanks.
 

leng jai

Member
Nov 2, 2017
10,010
Yeah we definitely need this to appease the phantom console gamers that claim they tried playing on PC but gave up because they had to tinker 24/7 for every second game.

What you're suggesting isn't as straightforward as you think either.
 

packy17

Member
Oct 27, 2017
1,729
Yeah, no. There's usually no need to compromise, especially this late into a generation. A 2080ti should give you vastly better visuals and significantly higher frames than what consoles deliver. RDR2 is (currently) an unfortunate case of bad optimization.
 

daninthemix

Member
Nov 2, 2017
1,856
This setting would offend as many PC gamers as it pleased. Perhaps more.

And this is assuming the developers are even able to ascertain what like-for-like settings are, via the settings menu.
 

Alexandros

Member
Oct 26, 2017
7,752
What's the point in including a settings preset for a different hardware configuration? Why would you want to use, say, PS4's LOD setting which is meant to reduce the load on the weak CPU if you have a much more powerful processor at your disposal?

The settings used on PS4 and XB1 aren't this magical perfect middle ground between image quality and performance. They are the best compromise that the developers could come up with for decent visuals at the desired ballpark framerate on this particular hardware.
 

Honolulu Blue

Member
Oct 27, 2017
2,623
For what? 99% of PC users don't give a f how a game looks on consoles.
Yeah, it kinda starts and ends here.

It's just pointless. The console setting isn't some arbitrary standard, it's just what works for the two ultimately very similar platforms. If you're choosing to game on PC, then whatever the console setting is is a total irrelevance. You're going to run the game according to whatever you're capable of running.
 
OP

Paul

Member
Oct 27, 2017
2,509
Well that wouldn't be those exact settings would it?
I am talking about settings like shadows/LoD etc. Obviously framerate/res is variable on PC.

Just buy a console if you want console presets.

This is a foolish request tbh
I have a console. And precisely because of that I also want that preset to be available on the PC version. You did not provide any argument for why that should be a foolish request.

Tech Era is delivering lately. I mean OP, "to compare" with what?
Put a console in your pc: problem solved thanks.
Putting aside your asshole attitude, what do you mean by "Put a console in your pc: problem solved thanks." ?

Yeah we definitely need this to appease the phantom console gamers that claim they tried playing on PC but gave up because they had to tinker 24/7 for every second game.

What you're suggesting isn't as straight forward as you think either.
Huh? I do not care about any "phantom console gamers" or whatever. I have a PC and a PS4 Pro. I would like to see a console preset available AS AN OPTION on PC, so I could use it as a baseline and tweak from there. What is so hard to understand about that?

And what is not straightforward about setting shadows/LoD/texture quality etc. to be the same across platforms? 99% of games already allow this; it just needs tweaking and finding out the hard way which settings correspond to what. Digital Foundry does this laborious and needless work regularly.

Yeah, it kinda starts and ends here.

It's just pointless. The console setting isn't some arbitrary standard, it's just what works for the two ultimately very similar platforms. If you're choosing to game on PC, then whatever the console setting is is a total irrelevance. You're going to run the game according to whatever you're capable of running.
Again, it would be useful to have it as a baseline from which to tweak further. And it would be optional. What is it with the unnecessary resistance?

What's the point in including a settings preset for a different hardware configuration? Why would you want to use, say, PS4's LOD setting which is meant to reduce the load on the weak CPU if you have a much more powerful processor at your disposal?

The settings used on PS4 and XB1 aren't this magical perfect middle ground between image quality and performance. They are the best compromise that the developers could come up with for decent visuals at the desired ballpark framerate on this particular hardware.
This is a fair argument and a logical post - thanks for that - I am, however, arguing for including it as a simple optional thing that would make for a decent baseline from which I could tweak further. If it was just one of the presets, what's the harm?

I would like to simply set RDR2 (or any other game) to PS4 Pro preset and see how my PC runs it, and then tweak further according to my desired framerate.
 
Apr 12, 2018
1,394
Digital Foundry has done a few comparisons between console settings and PC settings on games, and several times they've found instances where things like shadow, texture, and lighting quality don't match any settings on PC, suggesting that it was tailor-made specifically for the console hardware.

So it may not be as simple as you're hoping for.
 

DrWong

Member
Oct 29, 2017
681
Yeah, sorry for the tone of my post, I guess it's a reaction to all the other "PC/consoles" threads. And I say that as a console/PC player.

What I meant is that you don't need to compare against console settings to hit your desired performance targets: all the tools are (or should be) there in your options already, and there's more on the net if needed. You set your fps target to 60 and tweak from the lowest graphical settings up to the highest without losing that 60fps target. It wouldn't be any harder or slower than starting from a console preset.
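The tuning loop described above amounts to a greedy search: raise each setting one level at a time and keep the change only while the fps target holds. A minimal sketch, where `measure_fps`, the setting names, and the cost model standing in for a real benchmark are all hypothetical:

```python
TARGET_FPS = 60
LEVELS = ["low", "medium", "high", "ultra"]

def tune(settings: dict, measure_fps) -> dict:
    """Greedily raise each setting one level while fps stays on target."""
    for name in settings:
        # Try every level above the current one, lowest first.
        for level in LEVELS[LEVELS.index(settings[name]) + 1:]:
            trial = {**settings, name: level}
            if measure_fps(trial) >= TARGET_FPS:
                settings = trial  # target still holds: keep the upgrade
            else:
                break             # this setting is maxed out; move on
    return settings

# Fake benchmark: each level step above "low" costs 3 fps off a 70 fps base.
cost = {"low": 0, "medium": 1, "high": 2, "ultra": 3}
fake_fps = lambda s: 70 - 3 * sum(cost[v] for v in s.values())

print(tune({"shadows": "low", "textures": "low"}, fake_fps))
# → {'shadows': 'ultra', 'textures': 'low'}
```

The order you visit settings in matters with a greedy pass like this, which is one reason hand-tuning (or a console baseline) can still beat it.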
 

Corralx

Member
Aug 23, 2018
178
London, UK
Doesn't really make much sense.
Consoles use custom code paths and features that are often not even available on PC. Even if a specific effect looks really similar, it might not be exactly the same (and because of that it may have wildly different performance on some specific PC hardware configs).
Offering the exact settings would not be possible in most cases.
Not sure why people think that developers fix some settings and then console optimisation just means "picking the appropriate ones for each console", but that's hardly the case (for big AAA productions at least).
 

Gemüsepizza

Member
Oct 26, 2017
1,183
Some of those replies, wow. Imagine getting defensive about something as harmless as graphics presets.

The logic of his proposal is perfectly sound:

1.) Console graphics quality is the main focus of developers, which means it receives the most attention.

2.) The performance cost of additional PC quality levels varies wildly and is mostly opaque, which is an unpleasant experience for some.

3.) Some people just want to fucking play a game at PS4 Pro fidelity and 100 fps.

What's wrong with having more options?
 

Flash

Member
Oct 27, 2017
236
No, developers do not need to add yet another useless option in the graphical settings. If you can't be bothered to tinker with your own settings to find the perfect balance between visuals and frame rate then just buy a console.
 

karnage10

Member
Oct 27, 2017
1,320
Portugal
There's already low/medium/high/ultra presets. Medium is usually PS4/One level and high ps4pro/OneX level
Console optimization usually goes a bit further than your average quality setting. If you simply want 30fps locked, with low-medium settings, that's usually just a flip of a button.
Does this really happen now? I don't have any recent console, but I have a PS3 and Xbox 360, and many games on both console and PC. During that generation console graphics were generally low settings with a few medium ones sprinkled in, and some settings at none or very low. AA was also very low on consoles; I don't think I ever saw something like MSAA.
Games I tried: Oblivion, Skyrim, FIFAs, BioShock, Bayonetta, Assassin's Creed 2, Crysis 2, Mass Effect, Dragon Age, One Piece Warriors.


Do recent game consoles really run at medium settings?
 

Nzyme32

Member
Oct 28, 2017
2,149
No - it's absolutely meaningless, and would only be there for people in these redundant dick waving contests.

As long as the game has lots of malleability through different settings, labelled appropriately against what the game can output - and if possible a rudimentary detection for suitable defaults - that's enough.

Anyone crying about being unable to "ultra" or "max" without the context of the game deserves their own idiocy.
 

alexbull_uk

Member
Oct 27, 2017
1,150
UK
An “optimised” preset does make a lot of sense imo.

Developers spend a lot of time tuning their games to look as good as possible on console hardware. I’d say offering that preset as an optional starting point would be a good idea, so you can go up from there, if your system allows it.
 

elyetis

Member
Oct 26, 2017
2,078
Why not? It makes sense that some people would want their game to look as good as what they saw a friend or a streamer play on console; then they are free to decide if the performance is good enough at that setting and tweak it further to their needs.

An "optimized" texture setting might be interesting too. I get the feeling (I could be wrong though) that many games on console are probably using something more complex than just the low/medium/high settings we find on PC, with, say, the main character using a high texture setting while other aspects of the game use medium settings.
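The per-asset idea speculated about above could look something like this; the asset categories, tier table, and `resolution_for` helper are all invented for illustration:

```python
# Hypothetical per-asset texture budget: hero assets get a higher tier
# than the rest, instead of one global "texture quality" knob.
TEXTURE_TIERS = {
    "hero_character": "high",
    "npc": "medium",
    "environment": "medium",
    "foliage": "low",
}

def resolution_for(asset: str, base: int = 4096) -> int:
    """Map an asset's tier to a texture resolution, halving per tier down."""
    halvings = {"high": 0, "medium": 1, "low": 2}
    tier = TEXTURE_TIERS.get(asset, "low")  # unknown assets default to low
    return base >> halvings[tier]

print(resolution_for("hero_character"))  # → 4096
print(resolution_for("foliage"))         # → 1024
```

A blanket PC "medium" setting can't express this kind of mix, which would explain why Digital Foundry sometimes finds console quality sitting between PC presets.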
 

Corralx

Member
Aug 23, 2018
178
London, UK
Some of those replies, wow. Imagine getting defensive about something as harmless as graphics presets.

The logic of his proposal is perfectly sound:

1.) Console graphics quality is the main focus of developers, which means it receives the most attention.

2.) The performance cost of additional PC quality levels varies wildly and is mostly opaque, which is an unpleasant experience for some.

3.) Some people just want to fucking play a game at PS4 Pro fidelity and 100 fps.

What's wrong with having more options?
So what happens when an equivalent setting doesn't exist on PC?
Should developers put whatever is closest to consoles in terms of visual fidelity? Or in terms of performance on similar hardware?
I can see YouTube videos claiming a certain game is faking console presets to help (or damage) PC vs. consoles.
Not sure we need another argument to fuel the war.

And which console should they use as a comparison? All of them? That's calling for more console war again.

Such a setting would expose many devs regarding the quality of their PC ports. It won't happen.
That doesn't make any sense. What are you even implying?
 

TubaZef

Member
Oct 28, 2017
1,636
Brazil
I would like that, not every PC player has the time/patience to tweak the settings to find the best one, I just want the game to look good and stable enough.
 
Jan 21, 2019
962
No, developers do not need to add yet another useless option in the graphical settings. If you can't be bothered to tinker with your own settings to find the perfect balance between visuals and frame rate then just buy a console.
PC gaming is about options, unless I don't like a specific option, then I will act arrogantly and dismiss other people's wishes.

Do you wake up with this attitude?
 

tr1b0re

Member
Oct 17, 2018
571
Trinidad and Tobago
Put most of your settings to Medium-High and you will usually have console-level settings for most games, the graphically intensive ones anyway.

But that being said, some games use their own custom 'in between' settings on console as well, so it's unlikely it'll be exact.
 

Jonnax

Member
Oct 26, 2017
1,098
This is bad PR for Sony/Microsoft. Why would a game developer shit on their partners?
 
Jan 21, 2019
962
What if the PC is above or below spec for the game? Since it's never going to be exact. Will you settle for a substandard experience either way?
Didn't you read the OP? They want the console setting as a springboard: having it as a "minimum" to go from and cranking things up accordingly, because they feel the game looks fine on console or it reflects the developers' general vision. And yeah, some settings might be different, but it would be a nice option even if it is only close and not a perfect replica.
 

Adulfzen

Member
Oct 29, 2017
487
a toggle between quality and performance settings for every game would be a start or even allowing people to toggle between locked and unlocked framerate if the game can't maintain 60 fps.
 

capitalCORN

Member
Oct 26, 2017
8,268
Didn't you read the OP? They want the console setting as a springboard: having it as a "minimum" to go from and cranking things up accordingly, because they feel the game looks fine on console or it reflects the developers' general vision. And yeah, some settings might be different, but it would be a nice option even if it is only close and not a perfect replica.
This is a porting issue. There are plenty of below-spec PCs out there. Putting a guard rail on gaming is hardly optimal. Not to mention console games don't adhere to standard specs, meaning different loads, which once again are not specific to PC hardware.
 

TubaZef

Member
Oct 28, 2017
1,636
Brazil
The answers here show why so many people are afraid of PC gaming.

Not everyone playing games on PC is doing it because they like tweaking settings. I play games on PC because I don't have the money to get both a gaming PC and a console. I got a PC because I can also use it for work and most games I care about are on PC, not to mention they're cheaper thanks to digital stores.

I don't have the time or patience to tweak settings until I find the perfect ones, I just want the game to look good enough, as the devs intended. If I had a "closest possible to the PS4 version" option I would definitely use it, at least as a base. Low, Medium and High are usually too vague, and so many games are lacking auto-detect these days.