
Piggus

Member
Oct 27, 2017
4,700
Oregon
I keep trying to tell PC gamers exactly this. Like, I love Xbox, but my co-worker has a beefy-ass PC and sold his Xbox. He's trying to convince me to join the master race since we both make pretty good money at this job, and he says I'd save money in the long run, which isn't wrong. But recently he got Red Dead 2 (the game that made me go out and buy a 4K TV and an Xbox One X, LITERALLY for that one game... that I have yet to beat lol) and he's having similar problems hitting the settings he was hoping for. That and some game called Bloodstained he's had problems with. When he mentioned it I was like, bro, this is EXACTLY why a lot of us hardcore console gamers don't game on PC.

I've heard people say "well, you still have to download patches and installs," and the simple answer to that is nobody cares. At the end of the day I press a button, it downloads, then I'm playing the game. Nothing more, nothing less. Honestly I wish Steam Machines would've taken off because I've always wanted to mod and stuff... plus I thought the controller was pretty rad lol.

Here's my take as someone who games on console AND PC. It's great and all that console games are sometimes less of a hassle to get working properly, but most people still game on PC because the overall benefits significantly outweigh the drawbacks or temporary issues that sometimes come with new releases. The prospect of mods in particular for a game like RDR2 is huge if you know anything about the existing GTA modding community and the use of trainers. So yeah, it's not the smoothest launch, but it's just something people are willing to put up with in exchange for a better experience, especially down the line. As soon as the first videos come out of players using trainers to play as any NPC or animal in the game, this will be the version everyone wants, convenience be damned.
 

pswii60

Member
Oct 27, 2017
26,673
The Milky Way
This is lengthy, but heh, I can't find a shorter way to go through it all:

- While waiting for Digital Foundry to come up with some reliable data on the PC/console comparison, I've compared side-by-side, frame-by-frame footage between a supposedly maxed-quality 4K PC and a PS4 Pro myself. I'm not seeing the difference people claim to see. The PC footage is very clearly much more defined and higher resolution, but what's on screen is almost identical to the PS4 footage. The same distance ranges, the same geometry, trees in the distance, grass draw distance. The exact same distance where shadows pop in. It's close to a 1:1 reproduction. The PS4 seems to have a stronger "haze" and much less clarity because of the resolution, but it looks like just the same screen rescaled. So I'm still doubting that the console version is a mix of low and medium, unless low and medium are nearly identical to max.

This is the footage I've used:
https://www.youtube.com/watch?v=wZpgt6L89hY (PS4 Pro)
https://www.youtube.com/watch?v=hNutWJ7Xw2Q (PC)

I've compared 2:05 onward from the first video vs. 1:42:30 of the second one, going on for the next 10 minutes. Since they are scripted sequences in the open, they are quite easy to compare side by side.

- The second point is about the topic itself. The advantage of playing on a console isn't simply that the game "just works", but also that it was built FOR the hardware. On PC, when you have to juggle settings to find a decent compromise between performance and quality, you can spend three months taking screenshots, comparing every setting, and so on. There are always settings that tank performance while being visually negligible, so you just don't know what the best compromise is unless you really spend hours researching, and even then it's always a rough estimation.
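The settings-juggling described here can be made a little more systematic. A hypothetical sketch in Python of the underlying idea: benchmark with everything maxed, then re-benchmark with each setting lowered one at a time, and rank settings by measured fps cost so the expensive-but-invisible ones get lowered first. The function name and every number below are invented for illustration.

```python
# Hypothetical sketch: rank graphics settings by their measured fps cost,
# so the cheapest visual wins can stay maxed. All numbers are made up.

def rank_settings_by_cost(baseline_fps, fps_with_setting_lowered):
    """baseline_fps: average fps with everything maxed.
    fps_with_setting_lowered: {setting name: average fps measured with
    only that one setting turned down}. Returns settings sorted by how
    many fps each one costs when maxed (most expensive first)."""
    costs = {name: fps_off - baseline_fps
             for name, fps_off in fps_with_setting_lowered.items()}
    return sorted(costs.items(), key=lambda kv: kv[1], reverse=True)

# Invented benchmark numbers for a hypothetical game:
measurements = {
    "volumetric_fog": 52.0,   # avg fps with this setting lowered
    "shadow_quality": 48.0,
    "water_physics": 58.0,
    "texture_quality": 43.5,
}
ranked = rank_settings_by_cost(baseline_fps=42.0,
                               fps_with_setting_lowered=measurements)
# Settings at the top of the ranking cost the most fps; per the argument
# above, they are often also the least noticeable, so lower those first.
```

This is exactly the per-setting screenshot-and-benchmark grind the post describes, just written down; it doesn't remove the hours of measuring, only the guesswork afterwards.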

This is on top of emergent technical issues. For example, I remember with the first Titanfall the developers explained they spent a lot of time reorganizing the texture pool on PC, so that the big, important textures that take priority on screen retained very high quality, while lower-resolution textures were used for stuff that was more hidden and less noticeable. The result was that in a screenshot comparison there was almost no perceivable distinction between the different texture settings, even though the texture memory requirements went up dramatically from one setting to the next.
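The texture-pool idea described above can be sketched as a toy budget allocator: every texture starts at its low-resolution version, and the most visible (highest-priority) textures get upgraded to high resolution while the memory budget allows. This is a hypothetical illustration of the principle, not Respawn's actual code; all names, priorities, and sizes are invented.

```python
# Hypothetical sketch of priority-driven texture budgeting: spend a fixed
# memory budget on the textures the player actually looks at.

def allocate_texture_pool(textures, budget_mb):
    """textures: list of (name, screen_priority, hi_mb, lo_mb) tuples.
    Everything starts at its low-res version; then textures are upgraded
    to high-res greedily in descending priority order while the total
    stays within budget_mb. Returns (plan dict, MB spent)."""
    plan = {name: "low" for name, _, _, _ in textures}
    spent = sum(lo for _, _, _, lo in textures)  # cost of all-low baseline
    for name, _, hi, lo in sorted(textures, key=lambda t: t[1], reverse=True):
        if spent - lo + hi <= budget_mb:   # can we afford the upgrade?
            plan[name] = "high"
            spent += hi - lo
    return plan, spent

# Invented pool: the hero weapon is always on screen, the rock rarely is.
pool = [
    ("hero_weapon",  1.0, 64, 16),
    ("wall_trim",    0.3, 32,  8),
    ("distant_rock", 0.1, 32,  8),
]
plan, used = allocate_texture_pool(pool, budget_mb=96)
# The always-visible texture gets the high-res mip; incidental props
# stay low-res, which is hard to notice in a screenshot comparison.
```

The point of the sketch is the asymmetry the post describes: two "texture quality" presets can look nearly identical on screen while having very different memory footprints, because the budget is spent where the eye goes.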

Compare what I just said to Assassin's Creed Unity. If you tried to match the PS4's texture quality in the PC version, you got a version that looked HORRENDOUS. This is because the PS4 had a carefully handpicked mix of medium and high textures that was impossible to achieve on PC. You either selected "high", or medium would already be lower than PS4 quality even though some trivial textures would look better, because the texture pools on PC weren't well done and the developers only focused on making the high settings look good.

Here we come to a conclusion about one aspect I've never seen expressed: games on consoles, because of the single hardware target, are VERY FINELY TUNED FOR ART DIRECTION. There are devs whose whole job is to match the best performance concessions to the best possible look; professionals who spend days on this fine-tuning of details, cutting the corners that matter least. On PC you can replicate some of this through manual settings, but it's a long way from tuning the code directly, and you can never match the time and care spent by devs paid for the job.

This directly leads, on PC, to the impulse of pushing everything to the max. It's natural.
On my PC (2080 Ti) I can run 99% of games at native 4K, 60fps+ and ultra settings, with no fiddling with settings. Some of your points might be valid for low-to-mid-range cards, but those who are playing on PC for the graphics and performance advantages are going to be on high-end hardware. Not everyone is playing on PC for better graphics or performance though; many prefer the controls, mods, or the Steam ecosystem, or want to play on their gaming laptop.

Also, your entire focus seems to be on RDR2, which does seem to be a dodgy port, while forgetting the thousands of other games which have night-and-day improvements on PC, whether via performance or graphics. There are dodgy ports on console too, of course, all the time.

And this idea that games are artistically perfected for console hardware, lol. Then why do games like Monster Hunter World and Sekiro run like utter shite on consoles compared to PC? MHW suffers from abysmal LoD pop-in and frame-pacing issues on consoles. Games like Ace Combat 7, Rage 2 and Nier: Automata are stuck at a lowly 1080p on the Pro. And I must have imagined the ray tracing in Control and Metro Exodus. I could go on, but as soon as I boot any game up on the 2080 Ti it's literally jaw-dropping. Everything looks and runs so good that it makes going back to console a really miserable experience. Roll on next gen and I'll likely spend more time on consoles again, at least initially, but the 3080 Ti might be out by then!
 

leng jai

Member
Nov 2, 2017
15,119
Honestly if I said what I really thought about some of the mental gymnastics and misinformation parroted around here about PC gaming I would be banned within the hour.
 

Deleted member 36186

User requested account closure
Banned
Dec 14, 2017
395
I agree with OP. Whenever I think of upgrading my old rig, I remember these performance threads and how much shit you sometimes have to go through to play a game in a stable way on PC. It's getting better, but there are still many shitty ports.
This was the major reason I went with console gaming some years ago, and while I do miss high resolutions and framerates, I certainly don't miss all the goddamn troubleshooting, crashes, stuttering issues and so forth that are so common with PC ports.
 

TaterTots

Member
Oct 27, 2017
12,966
at the end of the day I press a button, it downloads, then I'm playing the game.

What do you think people on PC do? If I turn on my PC and a new driver is available, I get a notification with a "ding" sound and I just press a button. Games come out with poor performance on all platforms. However, I'm starting to suspect performance for this game might not be all bad. The 5700 XT gets about 30 fps on the highest settings at 4K, and it's not an "ULTRA 4K 60 FPS" card.
 

Sanctuary

Member
Oct 27, 2017
14,229
On my PC (2080 Ti) I can run 99% of games at native 4K, 60fps+ and ultra settings, with no fiddling with settings. Some of your points might be valid for low-to-mid-range cards, but those who are playing on PC for the graphics and performance advantages are going to be on high-end hardware. Not everyone is playing on PC for better graphics or performance though; many prefer the controls, mods, or the Steam ecosystem, or want to play on their gaming laptop.

I don't even care that a card can hit 60fps; to me that's completely irrelevant. I don't want it dropping below 60fps outside of cutscenes and crap where I don't have any control anyway. The 2080 Ti can't do that with most modern titles at ultra settings. The RTX tax on it is ridiculous too. I would have paid up to $999 for one if it could keep a consistent 4K/60, but it can't even do that without lowering a bunch of settings to a mix of medium and high. May as well just stick with 1440p and higher quality settings instead.
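The distinction being drawn here, a consistent 60 versus a 60 average, is really about frame times rather than fps. A small illustrative sketch (the frame-time trace below is invented): a run can average above 60 fps while still dropping frames, which is exactly why "hits 60fps" and "locked 60fps" are different claims.

```python
# Hypothetical sketch: average fps vs "locked 60". A run only counts as
# locked if (almost) no individual frame blows the ~16.67 ms budget.

FRAME_BUDGET_MS = 1000.0 / 60.0  # per-frame budget at 60 fps

def analyze_trace(frame_times_ms, tolerance=0.001):
    """Return (average fps, fraction of frames over the 60 fps budget)."""
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    dropped = sum(1 for t in frame_times_ms
                  if t > FRAME_BUDGET_MS * (1 + tolerance))
    return avg_fps, dropped / len(frame_times_ms)

# Mostly 15 ms frames with a few 40 ms spikes: the average still reads
# "60+ fps", but the spikes are visible hitches, so it is not locked.
trace = [15.0] * 97 + [40.0] * 3
avg, over = analyze_trace(trace)
```

On this invented trace the average lands above 60 fps even though 3% of frames miss the budget, which is the gap between a benchmark headline and what the poster is asking for.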
 

Deleted member 2533

User requested account closure
Banned
Oct 25, 2017
8,325
With all due respect, that sort of leans towards my point about general confusion from the consumer's point of view.

Certainly, but I just mean that you can't actually "standardise" settings in a meaningful way. You'll never get a video card that says "medium guaranteed 60fps on every game." But performance and graphical fidelity vary wildly from game to game on console too.

Many console games run sub-60 fps at sub-native resolutions, and console players don't seem to mind. If you load up a PC game and don't touch any settings, you can get sub-60fps and sub-native resolutions as well, and you don't have to mind it if you don't want to either.

Now I haven't played RDR2 on PC, and some games flat out run worse on PC, but some run better when they take advantage of a PC if it happens to be running on superior hardware.

Most modern games I play do a very good job of autodetecting optimal settings, and I don't actually have to go in and tinker.
 

nsilvias

Member
Oct 25, 2017
23,790
Most of the games that have problems on PC are from companies that don't really care about PC to begin with.
 

rodrigolfp

Banned
Oct 30, 2017
1,235
This is lengthy, but heh, I can't find a shorter way to go through it all:

- While waiting for Digital Foundry to come up with some reliable data on the PC/console comparison, I've compared side-by-side, frame-by-frame footage between a supposedly maxed-quality 4K PC and a PS4 Pro myself. I'm not seeing the difference people claim to see. The PC footage is very clearly much more defined and higher resolution, but what's on screen is almost identical to the PS4 footage. The same distance ranges, the same geometry, trees in the distance, grass draw distance. The exact same distance where shadows pop in. It's close to a 1:1 reproduction. The PS4 seems to have a stronger "haze" and much less clarity because of the resolution, but it looks like just the same screen rescaled. So I'm still doubting that the console version is a mix of low and medium, unless low and medium are nearly identical to max.

This is the footage I've used:
https://www.youtube.com/watch?v=wZpgt6L89hY (PS4 Pro)
https://www.youtube.com/watch?v=hNutWJ7Xw2Q (PC)

I've compared 2:05 onward from the first video vs. 1:42:30 of the second one, going on for the next 10 minutes. Since they are scripted sequences in the open, they are quite easy to compare side by side.

- The second point is about the topic itself. The advantage of playing on a console isn't simply that the game "just works", but also that it was built FOR the hardware. On PC, when you have to juggle settings to find a decent compromise between performance and quality, you can spend three months taking screenshots, comparing every setting, and so on. There are always settings that tank performance while being visually negligible, so you just don't know what the best compromise is unless you really spend hours researching, and even then it's always a rough estimation.

This is on top of emergent technical issues. For example, I remember with the first Titanfall the developers explained they spent a lot of time reorganizing the texture pool on PC, so that the big, important textures that take priority on screen retained very high quality, while lower-resolution textures were used for stuff that was more hidden and less noticeable. The result was that in a screenshot comparison there was almost no perceivable distinction between the different texture settings, even though the texture memory requirements went up dramatically from one setting to the next.

Compare what I just said to Assassin's Creed Unity. If you tried to match the PS4's texture quality in the PC version, you got a version that looked HORRENDOUS. This is because the PS4 had a carefully handpicked mix of medium and high textures that was impossible to achieve on PC. You either selected "high", or medium would already be lower than PS4 quality even though some trivial textures would look better, because the texture pools on PC weren't well done and the developers only focused on making the high settings look good. The result was that on PC you either maxed textures, or the game would look worse than on a PS4. No middle ground.

Here we come to a conclusion about one aspect I've never seen expressed: games on consoles, because of the single hardware target, are VERY FINELY TUNED FOR ART DIRECTION. There are devs whose whole job is to match the best performance concessions to the best possible look; professionals who spend days on this fine-tuning of details, cutting the corners that matter least. On PC you can replicate some of this through manual settings, but it's a long way from tuning the code directly, and you can never match the time and care spent by devs paid for the job.

This directly leads, on PC, to the impulse of pushing everything to the max. It's natural.

And now maybe a personal thing: when I look at PC footage compared to console footage, the PC footage gives me a feeling that things aren't quite "right". It was hard to pinpoint why, but I eventually realized it's because of the animations. At 60 fps, the added smoothness has the incidental effect of making the same animations look more robotic and stiff. The same happens with textures: when you increase the resolution to 4K, the much higher definition simply brings out flaws you wouldn't otherwise notice. Basically the PC, with its added clarity, enhances the problems too, because the game wasn't originally made for this definition. And ultimately it looks "off", weird.

The exact same animation that looks perfectly fine at 30 fps becomes extremely unnatural and robotic at 60, as if it sticks out from the rest of the environment like a sore thumb. Same for facial expressions or texture quality. When you see the console footage, the game looks like a marvel because it all blends together naturally. When you see the PC footage, you have this surgical sight that suddenly emphasizes all the things that aren't quite right, and it all appears more glitchy and rough.

Are you an alt account of that poster of "the Switcher 3 is better than any other version and more immersive"? Because this is pure gold, like that thread.
 

Deleted member 2533

User requested account closure
Banned
Oct 25, 2017
8,325
I don't even care that a card can hit 60fps; to me that's completely irrelevant. I don't want it dropping below 60fps outside of cutscenes and crap where I don't have any control anyway. The 2080 Ti can't do that with most modern titles at ultra settings. The RTX tax on it is ridiculous too. I would have paid up to $999 for one if it could keep a consistent 4K/60, but it can't even do that.

PC devs have no reason to kneecap their graphics options to today's top hardware, especially for GaaS. Look at the longevity of GTA V. Why would R* care how well the game runs maxed out on top-flight hardware at release, when the game is still massively popular and now runs better on today's mid-range specs?

Early RTX cards may fail at 4K/60 in Control today, but someone five years from now might find themselves playing Control at 8K/144, and they won't even need a remaster; it will just work.

(Caveat: "can it run Crysis" will eternally be "no", because Crysis was designed around high-clock single-threaded performance, which turned out to be a technological dead end, so modern systems actually run Crysis a little worse. Oops. But nevertheless!)
 

ShinUltramanJ

Member
Oct 27, 2017
12,950
Posts like this are exactly why PC players get stereotyped bud.

"Consoles are simpler, I just get to put the game in and play" is by no means a bad or reach-y defense on why some prefer console gaming.


I said the lengths console players go to downplay PC.
Literally nothing was mentioned about console users preferring consoles.
 

Sanctuary

Member
Oct 27, 2017
14,229
PC devs have no reason to kneecap their graphics options to today's top hardware, especially for GaaS. Look at the longevity of GTA V. Why would R* care how well the game runs maxed out on top-flight hardware at release, when the game is still massively popular and now runs better on today's mid-range specs?

Early RTX cards may fail at 4K/60 in Control today, but someone five years from now might find themselves playing Control at 8K/144, and they won't even need a remaster; it will just work.

I'm not sure why this is your response to my post, since it had nothing to do with what I was talking about. I was replying to the claim that the 2080 Ti can run "99% of the games" at 4K/60 without lowering the settings. That's only partially true if you're just focusing on highs and not lows.
Thanks for trying to educate me on some pretty common knowledge (to those who have been gaming on PC for a while now) though I guess?

Although that Control example isn't remotely realistic. Maybe that's possible in ten years with the way things have been going lately.
 

ShinUltramanJ

Member
Oct 27, 2017
12,950
What do you think people on PC do? If I turn on my PC and a new driver is available I get a notification with a "ding" sound and I just press a button.

Lucky. I usually have to download the update to a floppy disk, then unzip the file and enter some text with Notepad.

Then I just go into my BIOS and update my motherboard in preparation for the file. Usually I can finish everything in about 45 minutes if I don't crash my OS.
 

leng jai

Member
Nov 2, 2017
15,119
I'm not sure why this is your response to my post, since it had nothing to do with what I was talking about. I was replying to the claim that the 2080 Ti can run "99% of the games" at 4K/60 without lowering the settings. That's only partially true if you're just focusing on highs and not lows.
Thanks for trying to educate me on some pretty common knowledge (to those who have been gaming on PC for a while now) though I guess?

Although that Control example isn't remotely realistic. Maybe that's possible in ten years with the way things have been going lately.

It definitely can't, not at a locked 60, that's for sure. Dialing down the one or two ultra settings that alone sap 10+ fps would probably do the trick though. Some games drop frames in certain spots no matter what, due to poor optimisation.

I'm a stickler for a "locked 60fps", and when people say 60 they rarely mean locked. I still see people claim they can run games easily at 1440p/60fps on their 1070, and that sure as shit isn't locked on any game released in the last 2 years.

It's the same for consoles too, or even worse. All these 60fps performance modes on the Pro consoles are running at 40-50fps whenever they're under heavy load.

There is no need to be upset.

Being "upset" is usually the normal response when you have to constantly read strawmen and disingenuous shit, mate.
 

TaterTots

Member
Oct 27, 2017
12,966
Lucky. I usually have to download the update to a floppy disk, then unzip the file and enter some text with Notepad.

Then I just go into my BIOS and update my motherboard in preparation for the file. Usually I can finish everything in about 45 minutes if I don't crash my OS.

Don't forget you have to disassemble your entire PC and put it back together before it works.
 

AustinJ

Member
Jul 18, 2018
938
And this idea that games are artistically perfected for console hardware lol.

The following sounds stupid, but this is how my dumb brain works.

XBOX: "This game runs like shit. Welp, nothing I can do about it." *plays game*

PC: "This game runs like shit. I must fix it." *spends time trying to get it perfect, but can't. Ultimately lose interest out of frustration*

It could be the exact same game. Knowing that there's nothing I can do (consoles) makes me ultimately settle. Knowing I could probably fix something in the settings (PC) just makes me frustrated when I can't figure it out.
 

riotous

Member
Oct 25, 2017
11,341
Seattle
On PC when you have to juggle with settings to find a decent compromise of performance and quality, you either spend three months taking screenshots, compare every setting and so on. There are always settings that tank the performance while being negligible visually, so you just don't know what's the best compromise unless you really spend hours researching, and even then it's always a rough estimation.

Or do like me and spend 3 seconds launching GeForce experience and let it set what it thinks are the optimal settings for my PC.

I've rarely done anything beyond that settings wise with any game.
 

Deleted member 2533

User requested account closure
Banned
Oct 25, 2017
8,325
I'm not sure why this is your response to my post, since it had nothing to do with what I was talking about. I was replying to the claim that the 2080 Ti can run "99% of the games" at 4K/60 without lowering the settings. That's only partially true if you're just focusing on highs and not lows.
Thanks for trying to educate me on some pretty common knowledge (to those who have been gaming on PC for a while now) though I guess?

Although that Control example isn't remotely realistic. Maybe that's possible in ten years with the way things have been going lately.

You just seem annoyed that a 2080 Ti can't play all current games at max 4K/60, which has never been a thing. Settings are subjective and forward-looking, and people have different resolutions and refresh rates. No dev is saying, "2080 Ti, we need to hit 4K/60."
 

maximumzero

Member
Oct 25, 2017
22,927
New Orleans, LA
The following sounds stupid, but this is how my dumb brain works.

XBOX: "This game runs like shit. Welp, nothing I can do about it." *plays game*

PC: "This game runs like shit. I must fix it." *spends time trying to get it perfect, but can't. Ultimately lose interest out of frustration*

It could be the exact same game. Knowing that there's nothing I can do (consoles) makes me ultimately settle. Knowing I could probably fix something in the settings (PC) just makes me frustrated when I can't figure it out.

I haven't done the PC Gaming thing in over a decade, but I remember being frustrated that I couldn't put all those sliders to high/ultra without performance taking a nosedive in the process. Either I could move the sliders down to medium and accept that I'm getting a "lesser" experience or I could dump more money into a GPU or other component(s).

And of course any given console is going to be a "lesser" experience than a powerful PC; I'm not dumb. But at the same time there isn't any in-game evidence saying "hey, this could look better, but too bad," and thus my brain doesn't flip out as much.

PC Gaming is the hobbyist's option, that's all. Some folks want to drop thousands of dollars into their gaming hardware and get the optimal experience, some folks want to spend a fraction of that and just have things easy. You can apply it to pretty much any industry. Some folks are fine with point & shoot cameras, some folks want to spend big bucks on a DSLR and lenses. Some folks are okay with MP3s from iTunes, some folks feel the need to go all-in on Vinyl or other "high-def" sound sources.

Heck, DVDs are still kicking despite Blu-Ray being a clearly better visual experience. But DVDs are cheap, readily available, and good enough, so folks are okay with it.
 

Monster Zero

Member
Nov 5, 2017
5,612
Southern California
I'll take the issues in exchange for being able to tune games any way I want. Red Dead didn't even launch for me, so I got a refund and started playing Gears 5 instead. I'm playing Gears 5 at 1440p 120Hz, a setting that's not available to console players.
 
Nov 11, 2017
1,583
Software
This is lengthy, but heh, I can't find a shorter way to go through it all:

- While waiting for Digital Foundry to come up with some reliable data on the PC/console comparison, I've compared side-by-side, frame-by-frame footage between a supposedly maxed-quality 4K PC and a PS4 Pro myself. I'm not seeing the difference people claim to see. The PC footage is very clearly much more defined and higher resolution, but what's on screen is almost identical to the PS4 footage. The same distance ranges, the same geometry, trees in the distance, grass draw distance. The exact same distance where shadows pop in. It's close to a 1:1 reproduction. The PS4 seems to have a stronger "haze" and much less clarity because of the resolution, but it looks like just the same screen rescaled. So I'm still doubting that the console version is a mix of low and medium, unless low and medium are nearly identical to max.

This is the footage I've used:
https://www.youtube.com/watch?v=wZpgt6L89hY (PS4 Pro)
https://www.youtube.com/watch?v=hNutWJ7Xw2Q (PC)

I've compared 2:05 onward from the first video vs. 1:42:30 of the second one, going on for the next 10 minutes. Since they are scripted sequences in the open, they are quite easy to compare side by side.

- The second point is about the topic itself. The advantage of playing on a console isn't simply that the game "just works", but also that it was built FOR the hardware. On PC, when you have to juggle settings to find a decent compromise between performance and quality, you can spend three months taking screenshots, comparing every setting, and so on. There are always settings that tank performance while being visually negligible, so you just don't know what the best compromise is unless you really spend hours researching, and even then it's always a rough estimation.

This is on top of emergent technical issues. For example, I remember with the first Titanfall the developers explained they spent a lot of time reorganizing the texture pool on PC, so that the big, important textures that take priority on screen retained very high quality, while lower-resolution textures were used for stuff that was more hidden and less noticeable. The result was that in a screenshot comparison there was almost no perceivable distinction between the different texture settings, even though the texture memory requirements went up dramatically from one setting to the next.

Compare what I just said to Assassin's Creed Unity. If you tried to match the PS4's texture quality in the PC version, you got a version that looked HORRENDOUS. This is because the PS4 had a carefully handpicked mix of medium and high textures that was impossible to achieve on PC. You either selected "high", or medium would already be lower than PS4 quality even though some trivial textures would look better, because the texture pools on PC weren't well done and the developers only focused on making the high settings look good. The result was that on PC you either maxed textures, or the game would look worse than on a PS4. No middle ground.

Here we come to a conclusion about one aspect I've never seen expressed: games on consoles, because of the single hardware target, are VERY FINELY TUNED FOR ART DIRECTION. There are devs whose whole job is to match the best performance concessions to the best possible look; professionals who spend days on this fine-tuning of details, cutting the corners that matter least. On PC you can replicate some of this through manual settings, but it's a long way from tuning the code directly, and you can never match the time and care spent by devs paid for the job.

This directly leads, on PC, to the impulse of pushing everything to the max. It's natural.

And now maybe a personal thing: when I look at PC footage compared to console footage, the PC footage gives me a feeling that things aren't quite "right". It was hard to pinpoint why, but I eventually realized it's because of the animations. At 60 fps, the added smoothness has the incidental effect of making the same animations look more robotic and stiff. The same happens with textures: when you increase the resolution to 4K, the much higher definition simply brings out flaws you wouldn't otherwise notice. Basically the PC, with its added clarity, enhances the problems too, because the game wasn't originally made for this definition. And ultimately it looks "off", weird.

The exact same animation that looks perfectly fine at 30 fps becomes extremely unnatural and robotic at 60, as if it sticks out from the rest of the environment like a sore thumb. Same for facial expressions or texture quality. When you see the console footage, the game looks like a marvel because it all blends together naturally. When you see the PC footage, you have this surgical sight that suddenly emphasizes all the things that aren't quite right, and it all appears more glitchy and rough.
Oh yeah, comparing compressed videos is a great idea! :) And that 30fps vs 60fps rant tho..
 
Dec 15, 2017
1,590
I just love this forum. It's by far the strongest console defense force on the internet. People seem to be waiting for a dodgy PC game to start with their doom and gloom about PC gaming. Console games are better optimized; I believe that's true. But that doesn't mean a console game will run twice as well as a PC game on equivalent specs. PC issues are blown way out of proportion here.

I've had an FX 6300 and a GTX 760 2 GB since late 2013 and have played 95% of games at equal or better settings/performance than base consoles, as it should be. There are some outliers of course, but that's the developers' fault, plus Kepler aging like milk. A 7950/R9 280 aged much better and cost the same. My bad.

PC gamers have played on consoles and still play them. Most console gamers have never played on anything other than a PlayStation, but still spew myths about how PCs are for TurboTax only, plus the comfy couch, viruses, tales of gamers spending weeks installing drivers, and more.
 

Nzyme32

Member
Oct 28, 2017
5,245
Uh, no. No need to baby up PC versions because people get mad they can't ULTRA max everything.

Same. Ideally, if the game can put me in the right ballpark, I can easily prioritise what I want and knock down what I don't.
If devs want to add some extra descriptions, or note how demanding a change will be, it's more than welcome, but there's certainly no need to dumb things down or leave out intensive options.
 

rodrigolfp

Banned
Oct 30, 2017
1,235
Or do like me and spend 3 seconds launching GeForce experience and let it set what it thinks are the optimal settings for my PC.

I've rarely done anything beyond that settings wise with any game.

Do you expect people that have little to no idea about pc gaming to know about GeForce Experience auto optimization? xD
 

AustinJ

Member
Jul 18, 2018
938
PC gamers have played on consoles and still do. Most console gamers have never played on anything other than PlayStation, but still spew myths about how PCs are for TurboTax only, the comfy couch, viruses, tales of gamers spending weeks installing drivers, and more.

This is a bizarre generalization of console gamers. I have a pretty beefy PC, but I still prefer my One X. How do you know what "most console gamers" are like?
 
Oct 27, 2017
125
This thread is sort of ironic for me because I never got to play Red Dead Redemption 1 on my old PS3 because it would overheat.

The solution was far harder than anything on PC: disassembling the whole console and applying fresh thermal paste. Which I did, and it still didn't work.

And I'm currently unable to play RDR2 on PC because Rockstar made an awful launcher that is broken. It's almost like games have issues on an individual basis and none of this is platform-specific.
 

AustinJ

Member
Jul 18, 2018
938
Why would you prefer something with inferior performance and visuals?

My post a few posts above is one example. Game Pass being much better on Xbox than on PC is another. Just preference ultimately *shrug*. I'm satisfied with how games look on the X (for the most part), so I'm not hankering for more power.
 
Oct 27, 2017
6,891
Do you expect people that have little to no idea about pc gaming to know about GeForce Experience auto optimization? xD

If you're "juggling around" with in-game settings and taking screenshots for hours on end to compare graphical settings (I don't even do this), then yes - you should know about GeForce Experience auto optimization (if you have an Nvidia card). Also, if you're already deep into messing around with in-game settings on PC, you definitely aren't in the category of "little to no idea about PC gaming".
 

Spark

Member
Dec 6, 2017
2,540
Bad ports are not exclusive to PC; see Control on PS4 and Halo MCC on Xbox. There is a definite bias on this forum when you look at the thread title and the tone of the replies here. Can you imagine if the thread for Control's console issues was "OPINION: Control on Consoles is exactly the reason why some people still HATE gaming on Consoles"? How do you think that would go down on this forum?

Control's thread title was more like "this port has issues and it must be fixed", not "console gaming is a pain" like whenever something like that occurs on PC.
 

maximumzero

Member
Oct 25, 2017
22,927
New Orleans, LA
Hard to believe, but it does happen. Even on enthusiast sites like this one. In a couple of weeks we should be seeing a thread about someone who bought a new PC and had to basically enter the Matrix to change the resolution.

There are outliers, of course, on "both sides" of this "debate": both the folks who spout the term "PC master race" unironically and the folks who think PC gaming requires a master's degree. But you're talking about a fraction of a percent. I think you'll find most customers know the pros and cons of either choice and have their own preferences.
 

Sanctuary

Member
Oct 27, 2017
14,229
You just seem annoyed that a 2080ti can't play all current games at max 4k/60

Oh, I am, but that's also one of the reasons I didn't buy one. The main reason we did not get a true 4K/60 card is the horribly underutilized ray tracing and the early adopter tax that came with it. Next year, I fully expect to see the first true successors to the 1080 Ti that will meet the demands of modern resolutions. Honestly though, I don't really care about "Ultra" settings anyway if there's no discernible difference, or if the difference is only noticeable in screenshots. I barely use AA anymore either. Resolution scaling is also always a welcome feature, because it lets you raise the settings by slightly lowering the resolution for a few more frames, and you might not even be able to tell the difference.

which has never been a thing.

Heh, I never personally said that it was, and I have never believed those claiming it is. Also, I never said, nor have I ever thought, that developers should only target current hardware. They need to make sure their games run well on current hardware, and at least look decent, but if they want to add in a future supersampling mode, then I'm not going to complain.

I'm a stickler for a "locked 60 fps", and when people say 60 they rarely mean locked. I still see people claim they can easily run games at 1440p/60 fps on their 1070, and that sure as shit isn't locked in any game released in the last 2 years.

Same.
 
Last edited:

Pargon

Member
Oct 27, 2017
12,020
"Consoles are simpler, I just get to put the game in and play" is by no means a bad or reach-y defense on why some prefer console gaming.
Here's a video of me launching and configuring The Witcher 3 after a fresh install earlier today. You can see from Steam that it hasn't been launched since 2018 so this was not set up in advance.
Watch my confusion as I say to myself "hey, this doesn't look right" after loading up a save and then spend hours poring over the graphics options before I can play the game.



Welcome to the start of my video series detailing how difficult it is to "just play the game" on PC with all the production values that zero planning and 60 seconds of editing gets you, as I wait for another project to render out in the background.
I'm only half-joking. It's a project I've actually thought about doing for a while now thanks to this forum blowing things way out of proportion.
 

reKon

Member
Oct 25, 2017
13,739
You wouldn't have time to post it anyway because you're too busy fiddling with settings and taking comparison screenshots.
 

rodrigolfp

Banned
Oct 30, 2017
1,235
Here's a video of me launching and configuring The Witcher 3 after a fresh install earlier today. You can see from Steam that it hasn't been launched since 2018 so this was not set up in advance.
Watch my confusion as I say to myself "hey, this doesn't look right" after loading up a save and then spend hours poring over the graphics options before I can play the game.



Welcome to the start of my video series detailing how difficult it is to "just play the game" on PC with all the production values that zero planning and 60 seconds of editing gets you, as I wait for another project to render out in the background.
I'm only half-joking. It's a project I've actually thought about doing for a while now thanks to this forum blowing things way out of proportion.

Meanwhile PS4 is still loading the main menu.
 

Deleted member 1589

User requested account closure
Banned
Oct 25, 2017
8,576
I've seen some screenshots and videos of the game at Ultra settings and it doesn't look impressive... but I guess that's fine, since it's usually minute differences between Very High and Ultra that I can't appreciate on a 24-inch screen anyway.

Although I agree with the OP's headline, because I've heard from friends how terrible the Rockstar client is. I'm also not a fan of needing so many different clients to play games.