
Mezati99

Banned
Feb 6, 2019
969
Planet Earth
Looking at all these elitists with their $1000+ setups crying over fps digits while I'm here more than satisfied rocking a GT 1030, everything on low at 720p/30fps, living my best life

 

Madjoki

Member
Oct 25, 2017
7,230
Confession time: I'm a fake PC gamer who uses default settings 99% of the time. And it's been a fine experience.

So you list one game with supposedly solid performance on consoles (on one of them, and at 30fps), then you use games that run like dogshit on console to discredit the PC versions?

I love how people still use the "shitty port" of Dark Souls as an example of how consoles are better, when its problem was that it was a 1:1 port from consoles.

The difference is that one of these can be fixed without paying 70€ for a remaster.
 

Solaris

Member
Oct 27, 2017
5,285
The hyperbole is so strong in this thread. I personally do not remember the last time I spent more than 5 minutes tweaking graphics settings in a game.
And people tweak settings because they want to play the game at the settings that best fit them, but you do it once.

People really love to jump in and explain in detail how traumatic their experiences are with PC gaming, how many driver errors and crashes they've had, how much fiddling they had to do and how they are so much happier on their Comfy Couch™
 

Sandersson

Banned
Feb 5, 2018
2,535
PC has been my main platform since 2009 and I STILL don't know what those are. I know they're AA methods, but I couldn't tell you how they differ. Same with AO, screen-space reflections, subsurface scattering, and a bunch of other nonsense terms. All I know is that if I want more performance, I just start turning stuff off.
This is the dumbest take I have ever seen. xD
 

c0Zm1c

Member
Oct 25, 2017
3,206
That's why I linked Guru3D's benchmark; they have a whole page dedicated to the options they used. Intense options like HairWorks are disabled and others, like AO, are toned down.
The game was demanding and there are no two ways about that. The same is true now with RDR2, and the same will be true with Cyberpunk.

I've skimmed through the article and, correct me if I'm wrong, I don't see any benchmarks for lowest/medium settings? That's what I mean about these benchmarks not telling you the whole story. Of course it's going to perform poorly if you run it at "ultra" on a GTX 760!

Also, as the article points out, VRAM usage is really low even at 4K. That's impressive compared with some other VRAM guzzlers I could mention.

This made me curious about how well The Witcher 3 runs on older or lower-end hardware at the bottom end of the settings scale. So I've just installed it on such a system - an i7-4770K (not overclocked), 16GB DDR3 RAM, GTX 780 - a system from 2013, two years old when The Witcher 3 released. At 1920x1080 and low settings, running around the city of Novigrad I'm getting framerates between the 70s and 90s. Textures look awful on low, but popping them up to ultra doesn't hurt the framerate too much.

You can't really claim a game is demanding without also testing how it performs on the lower end, and it still doesn't seem that demanding to me. I would expect Cyberpunk 2077 to follow suit, in that, while hardware is going to break a sweat on the highest or near-highest settings, there will be plenty of opportunity to get it running well with lower settings.
 

pswii60

Member
Oct 27, 2017
26,673
The Milky Way
The game is a locked 30fps at Native 4k on the X...

I feel the same as the OP between Dark Souls, AC Unity, Ryse, Arkham Knight, Watch Dogs 2, Mafia 3 and now RDR 2. DS will be the same. AAA developers don't care about PC.
I can run all those games (except RDR2 - haven't tried it yet) at 4K/60fps on my PC. I can't say the same about any console.

It's like the Nier Automata port. Sure it was a shitty port, but not because it was worse than console - rather because it didn't really improve on the console experience, with the extremely aggressive pop-in and low-res textures that are present in the console versions. But (in part thanks to the FAR mod) Nier Automata runs at a locked 60fps and native 4K on my PC and is an incredible experience compared to both the Pro and X versions with their lower resolutions and constant framerate dives. The Pro version only runs at 1080p!

And a game like Nioh? I can play at full native 4k and 60fps, I don't have to choose. Sekiro? 4k, locked 60fps, ultra settings. No framepacing issues. In fact, I can play 99% of my library at native 4k and locked 60 (or higher) and at Ultra settings - it's only RTX that takes its toll. Just because there is an occasional outlier like RDR2, doesn't change that fact.

Of course I have an i9-9900K/RTX 2080 Ti, so I'm not your average PC gamer. But in my view, if the reason you're playing on PC is for a better visual and performance experience over console, then it's either go big or go home. And I shall be upgrading to a 3080 (Ti) when that arrives. Am I spending a lot of money? Yes. But it's no different from spending a lot of money on your TV or whatever if the best possible experience is important to you. I'm glad I have the option to do that, and that the vast majority of games receive PC ports these days.

It was Sekiro's performance on consoles that actually pushed me over the edge this year to make the move and invest in a gaming PC. It's an amazing experience on PC at a locked 60fps in native 4K and a huge improvement over the console versions. I was sick to death of console games with frame pacing issues, framerate drops, shitty resolutions (like RAGE 2), etc. I love that with PC, I'm in control of all that.
 
Last edited:

My Name is John Marston

Alt account
Banned
Oct 27, 2017
111
I feel the same as the OP between Dark Souls, AC Unity, Ryse, Arkham Knight, Watch Dogs 2, Mafia 3 and now RDR 2. DS will be the same. AAA developers don't care about PC.

Explain how the RDR2 devs don't care about PC. Rockstar is offering PC players more options and scalability than most games out there. They didn't restrict the game to current hardware; they went all the way with graphics options and gave us options that take advantage of future hardware too. How's that not caring about PC? Explain.

The problem is with PC players who want to run everything on ultra just because it's called "ultra". If Rockstar had called medium "ultra", people would praise it, but they didn't. They gave us more than a normal game's "ultra" settings.
 

leng jai

Member
Nov 2, 2017
15,118
The game is a locked 30fps at Native 4k on the X...

I feel the same as the OP between Dark Souls, AC Unity, Ryse, Arkham Knight, Watch Dogs 2, Mafia 3 and now RDR 2. DS will be the same. AAA developers don't care about PC.

Big Rion in another PC thread spreading nonsense, shocker.
 

Deleted member 17207

user requested account closure
Banned
Oct 27, 2017
7,208
It's also gotten a lot better than it used to be, in my experience. I felt the same way you did a few years ago, but I've been primarily PC gaming over the last two years as the experience has just been that much more enjoyable to me than consoles.

Playing RDR on my 1X last year was very unenjoyable.
Really?

I'm doing a bit of an RTTP on RDR2, as I didn't love it the first time around, but all the PC advertisements got me hyped to play it again, so I am. I'm enjoying it a lot more this time, but yeah - I'm playing on my launch PS4 and it's fine. I guess I haven't experienced the alternative though.
 

Griffith

Banned
Oct 27, 2017
5,585
You have to do it once. There are guides available for the most taxing settings. I recently tried out a game that didn't run at 60fps, so I lowered it from the Ultra to the High preset. I could probably squeeze out a bit more performance, but honestly the difference isn't large enough to be bothered about. The fan noise difference is more bothersome than the graphical difference.

There are even programs that can automatically give you recommended settings for your setup. I don't use them because I don't find them that useful, but you can, if you wish, avoid what you consider the worst aspect of PC gaming with a couple of mouse clicks, so I find it hard to sympathize with this issue. It's not as if console games are getting easier to play and you don't have to faff around with them either. Some are getting graphical settings, just like PC; others require long and tedious installation times, not to mention the mandatory patches or sign-ins that some require. Console gaming is moving closer to PC gaming in terms of complexity, whereas PC gaming has never been as accessible as it is now.

With that said, I can't say more other than I completely, but respectfully, disagree.
 

Minsc

Member
Oct 28, 2017
4,123
This thread makes me wonder if it'd be a good idea if PC games started including Xbox/PS4/PS4 Pro/Switch presets in the graphics options.

Then you just pick the PS4 preset and feel good instead of feeling like garbage for needing to set everything to low? I dunno, seems like a slightly nicer way to put it lol.
 

Wetalo

Member
Feb 9, 2018
724
Why is this always the first thing that gets fixated on, even if it's not actually the real problem? Many are having problems even with everything at medium settings. I'm also rereading the OP, and I don't see the word "ultra" anywhere in it.
I can promise you I have friends who have done exactly this: complain that a game runs like trash on their PC when they don't bother lowering the graphics settings. The beauty of PC gaming is that the settings are fully scalable.
 

BasilZero

Banned
Oct 25, 2017
36,343
Omni
Cranking up to the max is the reason for that.


Also, it's to be expected that every game that comes out, regardless of whether it's on PC or console, is broken or needs a patch.


Which is why I never buy games day 1 lol.
 

Bosch

Banned
May 15, 2019
3,680
The game is a locked 30fps at Native 4k on the X...

I feel the same as the OP between Dark Souls, AC Unity, Ryse, Arkham Knight, Watch Dogs 2, Mafia 3 and now RDR 2. DS will be the same. AAA developers don't care about PC.
First of all, we need to stop the BS and misinformation in this thread.

The game is NOT locked at 30fps on Xbox One X... it has a lot of moments at 20fps...

Second, the game runs on a mix of low and medium settings on Xbox One X.

People are trying HIGH and ULTRA on PC - settings that don't exist on console and that consoles can't run.

Third, people are targeting 60fps on PC, not SUB-30fps.

There are a lot of responses here with misinformation mixing things up...
 

Minsc

Member
Oct 28, 2017
4,123
You have to do it once. There are guides available for the most taxing settings. I recently tried out a game that didn't run at 60fps, so I lowered it from the Ultra to the High preset. I could probably squeeze out a bit more performance, but honestly the difference isn't large enough to be bothered about. The fan noise difference is more bothersome than the graphical difference.

There are even programs that can automatically give you recommended settings for your setup. I don't use them because I don't find them that useful, but you can, if you wish, avoid what you consider the worst aspect of PC gaming with a couple of mouse clicks, so I find it hard to sympathize with this issue. It's not as if console games are getting easier to play and you don't have to faff around with them either. Some are getting graphical settings, just like PC; others require long and tedious installation times, not to mention the mandatory patches or sign-ins that some require. Console gaming is moving closer to PC gaming in terms of complexity, whereas PC gaming has never been as accessible as it is now.

With that said, I can't say more other than I completely, but respectfully, disagree.

I think with some bigger games they can work wonders. I played around with the graphics settings in The Witcher 3 for a while to get the best-looking settings I could manage at an acceptable frame rate for me, then one day I tried the GeForce Experience automatic settings and they actually gave me an even better-looking image at a better framerate. So I definitely think programs like that, when configured well, can be worth checking out, obviously way more so if you're not into tweaking graphics in the first place.
 

Launchpad

Member
Oct 26, 2017
5,161
The year is 2019 and people are still using Arkham Knight as an example to prove that PC gaming is bad. I don't know what's more annoying - that or that stupid MW2 boycott picture.
 

Wetalo

Member
Feb 9, 2018
724
Though I would argue that if pushing settings beyond consoles does not give you a noticeable visual difference (one that you can see without zooming way in), then it is a questionable choice to include such settings. Pushing hardware for the sake of pushing hardware is not needed... you can cripple any machine with unnecessary precision without any visual gain.
I disagree with this. One thing I love about PC gaming is booting up a 10-year-old game I didn't play on release, cranking up all the settings, and experiencing it with modern hardware. Including settings that are too taxing for current hardware is forward-thinking. I recall The Witcher 2 putting big red warnings on the specific graphics settings that are more taxing, but today, 8 years later, they're a breeze for current hardware. It's future-proofing: The Witcher 2 looks better today thanks to those post-process effects that nobody was using at launch.
 

My Name is John Marston

Alt account
Banned
Oct 27, 2017
111
If a game can't deliver console performance/settings on roughly similar PC hardware then it's a bad port; otherwise people should just lower their damn settings. That is, if there are no other issues like random crashing, frame pacing or streaming problems, etc.

Though I would argue that if pushing settings beyond consoles does not give you a noticeable visual difference (one that you can see without zooming way in), then it is a questionable choice to include such settings. Pushing hardware for the sake of pushing hardware is not needed... you can cripple any machine with unnecessary precision without any visual gain.

Who said the game can't deliver console performance on similar PC hardware? No one here knows what the console settings are, and most people are targeting 60fps while it's 30fps and drops to the 20s on consoles. Why are you comparing it to console performance again?

Graphics settings are optional. If they are "not needed", just ignore them and play on low settings instead. Consoles most likely run on low settings and some on medium.

I don't understand why you want the devs to deliberately restrict the graphics options to current hardware's maximum capabilities. Why shouldn't they offer more scalability for future hardware?
 
Oct 25, 2017
22,378
The game is a locked 30fps at Native 4k on the X...

I feel the same as the OP between Dark Souls, AC Unity, Ryse, Arkham Knight, Watch Dogs 2, Mafia 3 and now RDR 2. DS will be the same. AAA developers don't care about PC.
What a weird post.
It's like listing 5 badly optimized games this gen and saying that's evidence that developers don't care about consoles.
Are we back in 2005?
 

Landford

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
4,678
What is the next thread going to be? "PC gaming is the reason why DRM exists, since most PC gamers pirate"?

What an embarrassment of a thread.
 

swnny

Member
Oct 27, 2017
270
This made me curious about how well The Witcher 3 runs on older or lower-end hardware at the bottom end of the settings scale. So I've just installed it on such a system - an i7-4770K (not overclocked), 16GB DDR3 RAM, GTX 780 - a system from 2013, two years old when The Witcher 3 released. At 1920x1080 and low settings, running around the city of Novigrad I'm getting framerates between the 70s and 90s. Textures look awful on low, but popping them up to ultra doesn't hurt the framerate too much.

You can't really claim a game is demanding without also testing how it performs on the lower end, and it still doesn't seem that demanding to me. I would expect Cyberpunk 2077 to follow suit, in that, while hardware is going to break a sweat on the highest or near-highest settings, there will be plenty of opportunity to get it running well with lower settings.

Thank you for taking the time to set this up! It only proves my point with that comparison, though. The GTX 780 has ~20% better performance than the GTX 770 recommended for The Witcher 3, so let's say a 770 in your test would net FPS in the 60-80s. The same goes for RDR2 - the recommended 1060 gets 60-70 FPS at 1080p on the lowest preset (at least going by the performance thread and some YouTube videos; I don't have the chance to test it myself).
How does that prove the games (W3 and RDR2) aren't demanding the more you crank up the visual goods? And even more, what makes you think Cyberpunk won't be even more demanding (not counting the ray tracing implementation), and what would stop people going crazy again over this stuff, the same as here now?
Occasionally there are games that push the top of the current hardware to the limit, be it for overtaxing graphical goodies (tessellation, MSAA, ray tracing), future-proof options (think ubersampling in The Witcher 2; I believe they even added a note saying this option is for the hardware of the future), etc. That's not a reason to hate the game, the dev, or PC gaming as a whole.
 

scrambledeggs

Member
Apr 25, 2018
486
Well, I can't even play it right now because it keeps quitting unexpectedly. :( I was able to put in a couple of hours yesterday and was looking forward to putting in another hour before work too!
 

Pargon

Member
Oct 27, 2017
12,017
You are reacting with your gut and not your mind, to be honest; think of low as high and there you have it.
It is not a bad port - rather, the audience is 'bad' here and is honestly not even thinking.
You have to wonder if a lot of this could be solved by developers having console-equivalent presets and labeling them as such; even just "console low/console high" without naming names.
I also think that a lot of developers make mistakes with the way that they set up their presets. Presets should be largely focused on features that affect performance.
Overrides for chromatic aberration, depth of field, vignetting, and motion blur should be independent of those performance presets, even if they are technically going to have a minor impact on performance. The quality of those effects could still be included in the presets if they are enabled though.
While there are some exceptions—Dishonored 2 being a notable example—in the majority of games texture resolution tends to have a minimal impact on performance so long as you do not exceed the VRAM limits of your GPU. It would be best if that was automatically configured based on the available amount of VRAM and not affected by the performance presets. I think a lot of people would be more comfortable using lower quality presets if it did not affect the texture resolution, since that is often what is going to stand out the most.
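Something like the sketch below is all I have in mind. It's just an illustration, not how any particular engine does it; the thresholds and the helper name are made up, and a real game would query the driver for the actual amount of VRAM.

```c
/* Illustrative sketch only: pick texture quality from available VRAM,
 * independently of the overall performance preset. Thresholds are invented. */
#include <stdio.h>

typedef enum { TEX_LOW, TEX_MEDIUM, TEX_HIGH, TEX_ULTRA } TextureQuality;

static TextureQuality texture_quality_for_vram(unsigned vram_mb) {
    if (vram_mb >= 8192) return TEX_ULTRA;   /* 8 GB or more */
    if (vram_mb >= 6144) return TEX_HIGH;    /* 6 GB */
    if (vram_mb >= 4096) return TEX_MEDIUM;  /* 4 GB */
    return TEX_LOW;                          /* under 4 GB */
}

int main(void) {
    static const char *names[] = { "low", "medium", "high", "ultra" };
    unsigned detected_vram_mb = 6144;  /* stand-in value; would come from the GPU driver */
    printf("Texture quality: %s\n", names[texture_quality_for_vram(detected_vram_mb)]);
    return 0;
}
```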

The game is not a mess. It's just not targeting to be limited by current hardware. It has a different graphics scale. Nobody wants to play at "low" settings but they don't realize that those "low" settings are equivalent to medium-high in other AAA games. Everyone wants to play at high and ultra. Those settings equal "extreme" settings in other games.
This is what the game looks like on the lowest possible settings (except textures on ultra but anisotropic filtering is off so they look muddy):
[Screenshot: RDR2 running on the lowest possible settings]


Are you kidding me? These are the lowest graphics settings. Draw distance, shadows, volumetric clouds and lighting, etc. are equivalent to high settings in other games. Look at the clouds and the trees in the distance casting shadows! Which game has those on "LOW" settings?
That in itself is a problem - and why it doesn't seem to be scaling well to older systems. But you're right that the main issue here is many people's insistence on running games at "ultra" no matter what.

I have no trouble hitting 60fps at lower resolutions and the CPU utilization is low too.
CPU utilization does not have to be anywhere close to 100% to bottleneck the GPU. A single thread, if it's important enough, can bottleneck the entire rendering pipeline.
If GPU utilization is dropping as resolution increases, one possibility is that they're scaling LODs automatically with resolution, increasing the number of draw calls.
I'm not trying to excuse anything about this port or its poor performance on many systems; just explaining that low GPU utilization is typically a sign of being bottlenecked by the CPU. The Evil Within 2 is another game that comes to mind which does not appear to have high CPU utilization but always has low GPU utilization.
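Here's a back-of-the-envelope illustration of what I mean; the numbers are invented, not measured from RDR2 or The Evil Within 2:

```c
/* Toy numbers, not measurements: one saturated render thread caps the frame
 * rate long before total CPU utilization looks high. */
#include <stdio.h>

int main(void) {
    double main_thread_ms = 25.0;  /* assumed time the critical CPU thread needs per frame */
    double gpu_ms         = 10.0;  /* assumed time the GPU needs per frame */
    int    logical_cores  = 16;    /* e.g. an 8-core/16-thread CPU */

    /* The pipeline can only run as fast as its slowest stage. */
    double frame_ms = (main_thread_ms > gpu_ms) ? main_thread_ms : gpu_ms;

    printf("Frame rate cap:        %.0f fps\n", 1000.0 / frame_ms);        /* 40 fps */
    printf("GPU utilization:       ~%.0f%%\n", 100.0 * gpu_ms / frame_ms); /* ~40%   */
    /* The one busy thread accounts for only 1/16th of the CPU's total capacity. */
    printf("Total CPU utilization: ~%.0f%%\n", 100.0 / logical_cores);     /* ~6%    */
    return 0;
}
```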

It's worse when developers release crappily optimised versions of their games and NEVER patch them.
I'm looking at you, Evil Within 2. It has awful CPU and GPU utilization and stutters like a bastard.
Did they fix it? Did they fuck.
That's arguably the worst PC port, in terms of performance, released this entire generation. Still looks and runs better than it does on console though.
The original game shipped in an awful state too, but they did fix it with a performance patch after release (so that it used more than two CPU cores!).

Managed to play 10 hours or so, but the game kept crashing randomly and now crashes at a specific mission. I'm really tired of jumping through hoops just to get my game to play properly. It's even worse than having to jerry-rig older games just to get them playing on modern systems.
I'll just point out here that GTA IV was literally unbeatable on PlayStation 3 for me because the final mission of the game would break in the same place every time (the jump onto a helicopter).
Rockstar games have a tendency to break in awful ways no matter where you're playing them.

Isn't this normal for Rockstar games? I didn't bother with GTA V, but I remember GTA IV running like dogshit for no reason.
GTA IV ran terribly if you maxed-out the draw distance sliders. It still runs poorly on most of today's systems because it really only takes advantage of three CPU cores, if I recall correctly.
If you don't max out the draw distance it runs a lot better, but people assume it's an old game so it should run fine maxed-out, even though it's a problem with the engine/optimization that is still not able to be brute-forced by most PCs; just like the original Crysis only barely stays above 60 FPS on the fastest CPUs available today.

On the other hand, PC gaming can be outright bad if you're unlucky (even with a beefy PC). I wanted to play The Outer Worlds on my PC (9900K/2080 Ti) but thanks to my ultrawide G-Sync monitor I was fucked. The ultrawide resolutions are zoomed in; sadly the developers didn't take the time to do them right (Gears 5 and many other games work perfectly). Even with ini changes it didn't really get much better.
The thing with ultrawide displays is that the worst-case scenario is having to play a game in 16:9… which is how you would have to play them if you didn't have an ultrawide monitor anyway.
I do of course find it disappointing when games lack proper ultrawide support, but I find that the majority either work or can be modded to support it in a way that feels native with minimal effort; but you might be frustrated if you're pre-ordering games and playing them on release date rather than giving modders a week to figure things out.

So I tried to play on my big-ass 4K HDTV, but (again) the developers couldn't get the settings right for release. In other games (Gears 5, etc.) you can switch the display so that you don't have to switch cables to avoid playing on your main monitor. Because of that, the game refused to let me play at more than 1080p on my 4K TV. The game still thought I was playing on my ultrawide monitor, which can't display 4K.
WIN+P is a shortcut you should learn, as it brings up this menu (and switches after a moment, if the screen you're in front of is currently disabled).
[Screenshot: the Windows "Project" (WIN+P) display switching menu]


Note: duplicating (mirroring) by its nature must use the lowest common value that both displays can support. If you have a 3440x1440p120 display and a 4K60 TV connected, duplicating will likely pick 2560x1440p60 since that's the first common resolution both should support. I would recommend that you switch between primary/secondary displays only, rather than duplicating or extending.
If you spend ~$8 in the next Steam sale, DisplayFusion is a program that supports more advanced profile switching which can enable/disable multiple displays, switch resolutions, and change audio devices via a single keyboard shortcut (which can be assigned to the controller via Steam Input) or by triggering when certain applications are launched.

I tried PC gaming years ago, and during the installation of a game I got the blue screen of death. I had to call support and got connected to someone in Eastern Europe. After 20 minutes I finally got the game to run, but had to restore a bunch of shit.
I went back to consoles and never looked back. To me, the best rig and setup doesn't compare with the ease of plug and play. Sure, consoles can't technically compete with PCs in terms of settings, but the hassle-free experience of consoles more than makes up for it.
BSODs are generally a sign of a hardware fault. User-mode software, like a game, should not be capable of taking down the entire OS. A certain game may be a consistent trigger for a hardware issue, but that does not mean the game itself is the problem.
Would you abandon console gaming forever and never look back if you had an Xbox 360 that red-ringed or a PS3 with YLOD?
It's not a unique thing to PC gaming either; there are lots of reports about particularly demanding games "bricking" their console. The game itself is not killing consoles, the console happened to die when playing that game.

Yeah... BIOS settings. Kind of an integral part of maintaining your PC. I never realized mentioning the BIOS could be misinterpreted as sarcasm, but I guess I shouldn't underestimate the power of denialism. Here's a fun exercise for you: go to the PC performance thread and count how many times the BIOS is mentioned.
I have literally never entered the BIOS on my system after initially setting it up, and it's extremely rare that updating the BIOS is necessary unless you are adding new hardware to the system like a newer generation of CPU or GPU.
There is one specific exception, which is Ryzen 3000 series processors - and from glancing through the PC performance topic for RDR2, those seem to be the ones affected by this.
Ryzen 3000 processors shipped with a bug where a specific CPU feature, RdRand, did not work correctly - and was preventing Destiny 2 from launching. I would not be surprised at all if it is the same issue here with Red Dead Redemption 2, and people affected by this did not update with the original fix back when it was released.
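If you want to check whether your chip is affected, a quick probe looks something like the sketch below. This is my own illustration, not anything from Rockstar or AMD; as I understand it, the buggy firmware made RDRAND report success while always returning all-ones, and the BIOS/AGESA update fixed that.

```c
/* Quick-and-dirty RDRAND sanity check. Build with: gcc -mrdrnd rdrand_check.c */
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    unsigned long long value;
    int looks_broken = 1;

    for (int i = 0; i < 8; i++) {
        if (!_rdrand64_step(&value)) {  /* 0 means the instruction returned no value */
            puts("RDRAND did not return a value at all");
            return 1;
        }
        if (value != ~0ULL)             /* any non-all-ones output looks healthy */
            looks_broken = 0;
    }

    puts(looks_broken ? "RDRAND always returns all-ones: likely the unpatched firmware"
                      : "RDRAND output looks sane");
    return 0;
}
```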
This is not even a PC-specific problem. There are often console hardware revisions that are unable to play a handful of games, and many games that require a specific system update before they will run.
 
Last edited:

My Name is John Marston

Alt account
Banned
Oct 27, 2017
111
That in itself is a problem - and why it doesn't seem to be scaling well to older systems. But you're right that the main issue here is many people's insistence on running games at "ultra" no matter what.

Yes, I think they should've added even lower settings and changed the labels, or even added a warning on the high settings that they're too demanding for current hardware.
 

ShinUltramanJ

Member
Oct 27, 2017
12,950
Yeah... BIOS settings. Kind of an integral part of maintaining your PC. I never realized mentioning the BIOS could be misinterpreted as sarcasm, but I guess I shouldn't underestimate the power of denialism. Here's a fun exercise for you: go to the PC performance thread and count how many times the BIOS is mentioned.

I set up my BIOS when I installed my motherboard/CPU. The only reason I've ever played around with it is to overclock. Otherwise it's left alone for the duration of the time I have that hardware.

I agree that PC ownership requires some tinkering, but I wouldn't even consider setting up a BIOS as part of that tinkering.
 

dmix90

Member
Oct 25, 2017
1,885
Who said the game can't deliver console performance on similar PC hardware? No one here knows what the console settings are, and most people are targeting 60fps while it's 30fps and drops to the 20s on consoles. Why are you comparing it to console performance again?

Graphics settings are optional. If they are "not needed", just ignore them and play on low settings instead. Consoles most likely run on low settings and some on medium.

I don't understand why you want the devs to deliberately restrict the graphics options to current hardware's maximum capabilities. Why shouldn't they offer more scalability for future hardware?
I did not say that this game is a bad port; I was talking in general. I am waiting for DF comparisons since I do not have the PC version.

Like I said in a later post, if there is no visual gain from "future-proofing" your game, then what's the point? Only confusion in the community and wasted hardware horsepower/electricity.

You can make an untextured 3D cube run at 1fps on a 2080 Ti if you want... it won't make it look better though. I am not saying that the RDR2 PC version does that, but from what I have seen so far there is not a big difference. It would be great to get good comparisons and analysis.
 

Deleted member 18161

user requested account closure
Banned
Oct 27, 2017
4,805
What a weird post.
It's like listing 5 badly optimized games this gen and saying that's evidence that developers don't care about consoles.
Are we back in 2005?

They were five games that I was really interested in and decided to buy, then later upgraded my PC to play them at 1080p/60fps.

They were incredibly poorly optimised, lazy ports of big AAA games, which has been well documented. I'm sorry if that upsets you.
 

My Name is John Marston

Alt account
Banned
Oct 27, 2017
111
I did not say that this game is a bad port; I was talking in general. I am waiting for DF comparisons since I do not have the PC version.

Like I said in a later post, if there is no visual gain from "future-proofing" your game, then what's the point? Only confusion in the community and wasted hardware horsepower/electricity.

You can make an untextured 3D cube run at 1fps on a 2080 Ti if you want... it won't make it look better though. I am not saying that the RDR2 PC version does that, but from what I have seen so far there is not a big difference. It would be great to get good comparisons and analysis.

There is visual gain from the high settings. And what do you mean, what's the point? The high settings are optional. That's the point. You can ignore the high settings if you want. That's the beauty of PC gaming. If you think it's a waste, ignore it and lower your settings, and it still looks better than consoles.
 

elyetis

Member
Oct 26, 2017
4,556
I set up my BIOS when I installed my motherboard/CPU. The only reason I've ever played around with it is to overclock. Otherwise it's left alone for the duration of the time I have that hardware.

I agree that PC ownership requires some tinkering, but I wouldn't even consider setting up a BIOS as part of that tinkering.
Same; the only reason I ever went into my BIOS is that I specifically made the choice of getting a 2500K, and now an 8700K, that I wanted to overclock. And that was exclusively because of how important core speed is/was for emulation; I never cared about the performance gain it would bring to PC games.

I don't know about AMD, but with Intel the only time I can think of someone choosing to go into the BIOS for something performance-related would be to turn hyperthreading on or off for that 1 out of 1000 games where it's not simply best to always keep it on. I know I haven't touched it since I got my 8700K two years ago.
 

c0Zm1c

Member
Oct 25, 2017
3,206
Thank you for taking the time to set this up! It only proves my point with that comparison, though. The GTX 780 has ~20% better performance than the GTX 770 recommended for The Witcher 3, so let's say a 770 in your test would net FPS in the 60-80s. The same goes for RDR2 - the recommended 1060 gets 60-70 FPS at 1080p on the lowest preset (at least going by the performance thread and some YouTube videos; I don't have the chance to test it myself).
How does that prove the games (W3 and RDR2) aren't demanding the more you crank up the visual goods? And even more, what makes you think Cyberpunk won't be even more demanding (not counting the ray tracing implementation), and what would stop people going crazy again over this stuff, the same as here now?
Occasionally there are games that push the top of the current hardware to the limit, be it for overtaxing graphical goodies (tessellation, MSAA, ray tracing), future-proof options (think ubersampling in The Witcher 2; I believe they even added a note saying this option is for the hardware of the future), etc. That's not a reason to hate the game, the dev, or PC gaming as a whole.
Where did I say or even imply that? In fact, in the post you quoted I did say that the higher the settings, the more the hardware is going to break a sweat.

What I'm trying to say is that the game (The Witcher 3) can't be that demanding when it scales well across different hardware, low and high end, old and new. The benchmark you linked showed the game running poorly on, for example, a GTX 760 because they were using ultra settings. That's not proof that the game is demanding, only that the settings chosen don't fit the hardware.

I know that they can't benchmark everything, it would be a mammoth task, and that's why I said it doesn't give you the full picture.

Edit: I found this YouTube video showing it running at various settings on the minimum GPU (GTX 660). At the lowest settings the quality/performance ratio is not too shabby at all, in my opinion.
 
Last edited:

Deleted member 18161

user requested account closure
Banned
Oct 27, 2017
4,805
I am shaking right now, that's how upset I am.

It doesn't bother me what you're doing, pal, but well done for ignoring the rest of my last post.

It is in no way as easy as people on forums make out to get a locked 60fps in big-name AAA ports on PC, especially at high settings and especially at 1440p, 1800p or 4K (which are the resolutions now offered by the mid-gen consoles), without much of the hassle PC gaming brings.

On top of the farmed-out / low-budget AAA ports, the rise of exclusive deals and multiple digital stores, the death of cheaper keys, and the gargantuan leap in CPU performance of the next-gen consoles arriving in just over a year were the final nails in the coffin of PC gaming for me personally.

Anyone know how much I'd get for a boxed RTX 2070 on the U.K. second-hand market?
 

BeaconofTruth

Member
Dec 30, 2017
3,427
The game is a locked 30fps at Native 4k on the X...

I feel the same as the OP between Dark Souls, AC Unity, Ryse, Arkham Knight, Watch Dogs 2, Mafia 3 and now RDR 2. DS will be the same. AAA developers don't care about PC.
Yeah and I'd call locked 30 awful performance. Shit is sluggish as hell.
 

BeaconofTruth

Member
Dec 30, 2017
3,427
I am not speaking to the PC performance or the 30fps difference.
You're missing the point of the conversation then, because whatever hassle I deal with in a PC game, at the least PC players can find workarounds and get more ideal performance.

With a console you're often stuck with the wack performance, and with however the dev left the game, with maybe a few tweaks to trash like motion blur and, God willing, a FOV slider.
 

Deleted member 21709

User requested account closure
Banned
Oct 28, 2017
23,310
You're missing the point of the conversation then, because whatever hassle I deal with in a PC game, at the least PC players can find workarounds and get more ideal performance.

With a console you're often stuck with the wack performance, and with however the dev left the game, with maybe a few tweaks to trash like motion blur and, God willing, a FOV slider.

So to you this is just the usual console/PC war BS? Grow up.
 
Mar 29, 2018
7,078
In my experience, how things look on Low or Medium on PC is usually on par with, or better than, how they look on consoles.

I generally just let the game autodetect, begin playing, and if any one thing is performing badly I tweak everything down; if any one thing could be better, I tweak that one thing up.

I tend to find a happy middle ground very quickly and I'm laughing.
 

DeaDPooL_jlp

Banned
Oct 31, 2017
2,518
I have to admit that between Modern Warfare and RDR2 I'm really starting to question my PC-first approach to new games. I couldn't play Modern Warfare the night it was unlocked because the PC version was screwed, and I can't play RDR2 because the launcher is broken.
I got lucky with Gears 5 and was able to finish it before all the servers went to shit.

As someone whose primary focus is single-player campaigns, the PC launches this year have been buggy as can be. A year from now I don't know if my PC-first approach will still be intact.

Those issues were not exclusive to PC though, I played both on console at launch and experienced similar issues as well. This is a developer issue, not a platform one.
 

Deleted member 26394

user requested account closure
Banned
Oct 30, 2017
231
It's not that simple. It also depends on your fans. If vapor chamber cooling made that much sense no matter what, it'd be featured in every high-end GPU. Yet a lot of high-end cooling systems prefer heat pipes. Why?
They both do the exact same thing in the exact same way, only in different packaging. Ease of manufacturing is probably the bigger factor in deciding between the two.
 

BeaconofTruth

Member
Dec 30, 2017
3,427
So to you this is just the usual console/PC war BS? Grow up.
What? Mate, what? You're the one taking it as a slight, whereas I am just discussing what is as it is. PCs generally require more work from the player, for any number of reasons that may come up. But in the good ol' age of Google search, one can figure out some workarounds and customize the game to their needs, which is the main perk of spending all the money someone does on a good rig.

Consoles are more plug-and-play and more unified in their approach, but it often comes at the expense of wack performance (which, yeah, 30 frames is pretty wack when you game mostly at 60), and little to no visual options to clean up the game a bit.


That just is what it is.
 

spam musubi

Member
Oct 25, 2017
9,380
Hundreds of games release just fine on PC and when one game is a slightly questionable port it's the fault of PC gaming
 

Badcoo

Member
May 9, 2018
1,607
I feel you, OP. I love PC gaming, but sometimes it's just simpler to play on consoles. Anytime I'm playing on PC and get stutters, framerate drops, or anything else, I start spending hours upon hours tweaking and googling to find a solution, thinking it's my rig.

With consoles, if it doesn't work properly, you know it's not your system's fault - and that there's not much you can do to fix it. So you just deal with it, which isn't ideal.

Hundreds of games release just fine on PC and when one game is a slightly questionable port it's the fault of PC gaming

Find me a PC performance thread where people aren't having a problem with a game. I think Gears 5 was the best PC port in a while, and even that had server problems at launch.
 

KKRT

Member
Oct 27, 2017
1,544
"Like every 10 minutes". Is that normal on PC? That's a lot of minutes of not playing, imo. It's fine some people don't mind tweaking a little, but I don't want to try every setting in order to "discover" what cause certain problems. That's not hyperbole, just like my Dutch forum comment is not hyperbole.
2 minutes of setting a game within first hour is 'a lot of minutes'? And when you set it up correctly, you do not touch it again
And thats for a game with a lot of settings, many games have like maybe 8 settings, that you can basically setup within first 30 second.
 
Oct 25, 2017
22,378
It doesn't bother me what you're doing, pal, but well done for ignoring the rest of my last post.

It is in no way as easy as people on forums make out to get a locked 60fps in big-name AAA ports on PC, especially at high settings and especially at 1440p, 1800p or 4K (which are the resolutions now offered by the mid-gen consoles), without much of the hassle PC gaming brings.

On top of the farmed-out / low-budget AAA ports, the rise of exclusive deals and multiple digital stores, the death of cheaper keys, and the gargantuan leap in CPU performance of the next-gen consoles arriving in just over a year were the final nails in the coffin of PC gaming for me personally.

Anyone know how much I'd get for a boxed RTX 2070 on the U.K. second-hand market?

What do you want me to say? You personally wanted to play these specific games at 60 FPS and couldn't. There's not really much I can add to that now, is there?
Glad you found solace in the next-gen consoles and the hassle-free 60fps 4K games they'll bring with them.