
PennyStonks

Banned
May 17, 2018
4,401
I disagree. I review PC games for a site I partly own and I play a lot of PC games not day one but even before that, at the beta stage. I have a very modest PC too, an i5 6500 and a 1050ti. Out of the many games I've reviewed in the last three years only one was broken, Just Cause 4.
I do play a decent bit of early access so that 5% is probably very inflated. My "It just works" would be games that didn't require anything changed outside of in game settings.
 
Oct 28, 2017
5,050
If you don't want a customized optimal experience, then why even bother with PC?

Don't hate on games that future-proof
 

MXG

Member
Oct 29, 2018
307
This is lengthy, but heh, I can't find a different way to go through it all:

- While waiting for Digital Foundry to come up with some reliable data on a PC/console comparison, I've compared side-by-side, frame-by-frame footage myself between a supposedly maxed-quality 4K PC and a PS4 Pro. I'm not seeing the difference people claim there to be. The PC footage is very clearly much more defined and higher resolution, but what's on screen is almost identical to the PS4 footage. The same distance ranges, the same geometry, trees in the distance, grass draw distance. The exact same distance where shadows pop in. It's close to a 1:1 reproduction. The PS4 seems to have a stronger "haze" and has much less clarity because of the resolution, but it looks like the same screen rescaled. So I'm still doubting that the console version is a mix of low and medium, unless low and medium are nearly identical to the max.

This is the footage I've used:
https://www.youtube.com/watch?v=wZpgt6L89hY (PS4 Pro)
https://www.youtube.com/watch?v=hNutWJ7Xw2Q (PC)

I've compared 2:05 onward from the first video vs. 1:42:30 onward from the second, going on for the next 10 minutes. Since they are scripted sequences in the open, they are quite easy to compare side by side.

- The second point is about the topic itself. The advantage of playing on a console isn't simply that the game "just works", but also that it was built FOR the hardware. On PC, when you have to juggle settings to find a decent compromise between performance and quality, you can end up spending three months taking screenshots and comparing every setting. There are always settings that tank performance while being visually negligible, so you just don't know the best compromise unless you really spend hours researching, and even then it's always a rough estimate.

This is on top of emergent technical issues. For example, I remember the developers of the first Titanfall explaining that they spent a lot of time reorganizing the texture pool on PC, so that all the big, important textures that take priority on screen retained very high quality, while lower-resolution textures were used for things that were more hidden and less noticeable. The result was that in a screenshot comparison there was almost no perceivable distinction between texture settings, even though the texture requirements went up dramatically from one setting to the next.

Compare what I just said to Assassin's Creed Unity. If you tried to match the PS4's texture quality on the PC version, you got a version that looked HORRENDOUS. This is because the PS4 had a carefully handpicked mix of medium and high textures that was impossible to achieve on PC. You either selected "high", or "medium" would already be lower than PS4 quality, even though some trivial textures would look better. On PC the texture pools weren't well done, and the developers only focused on making the high setting look good. The result was that on PC you either maxed out textures or the game looked worse than on a PS4. No middle ground.

Here we come to the conclusion, about one aspect that I've never seen expressed: games on consoles, because of the single hardware, are VERY FINELY TUNED FOR ART DIRECTION. It means there are devs whose whole job is to match the best performance concessions to the best possible look. It means there are professionals who spend days doing this fine tuning of details, cutting the corners that matter least. On PC you can replicate some of this through manual settings, but it's a very long way from tuning the code directly, and you can never match the time and care spent by devs paid for the job.

This directly leads, on PC, to the impulse to push everything to the max. It's natural.

And now maybe a personal thing: when I look at PC footage compared to console footage, the PC footage gives me a feeling that things aren't quite "right". It was hard to pinpoint why. I eventually realized that it's because of the animations. Moving to 60 fps footage, the added smoothness has the incidental effect of making the same animations look more robotic and stiff. The same happens with textures: when you increase the resolution to 4K, the much higher definition simply brings out flaws that you otherwise wouldn't notice. Basically the PC, with its added clarity, enhances the problems too, because the game wasn't originally made for this definition. And ultimately it looks "off", weird.

The exact same animation that looks perfectly fine at 30 fps becomes extremely unnatural and robotic at 60, as if it sticks out from the rest of the environment like a sore thumb. Same for facial expressions or texture quality. When you see the console footage, the game looks like a marvel because it all blends together naturally. When you see the PC footage, you have this surgical sight that suddenly emphasizes all the things that aren't quite right, and it all appears more glitchy and rough.
Comparing graphical fidelity using compressed YouTube 4K footage is about the least effective way to do it.

I wouldn't even comment on the 60 vs. 30 fps point; this is certainly meme material.

Higher-resolution textures bring out flaws???!!!! Can this be more comedic?
 
Last edited:
Oct 28, 2017
1,951
....
The exact same animation that looks perfectly fine at 30 fps becomes extremely unnatural and robotic at 60, as if it sticks out from the rest of the environment like a sore thumb. Same for facial expressions or texture quality. When you see the console footage, the game looks like a marvel because it all blends together naturally. When you see the PC footage, you have this surgical sight that suddenly emphasizes all the things that aren't quite right, and it all appears more glitchy and rough.

If we go by that explanation, then playing games at a lower resolution would remove quite a lot of glitches from the screen, with the facial animations being the first to go, so we should all be playing games at lower resolutions?
 

Alexandros

Member
Oct 26, 2017
17,795
I do play a decent bit of early access so that 5% is probably very inflated. My "It just works" would be games that didn't require anything changed outside of in game settings.

Early Access has helped a lot in ensuring smooth launches for many games, it was a great idea, but it goes without saying that playing in-development versions of games might cause issues.
 

Ryugarr

Member
Dec 6, 2018
217
What did you think would happen? You only have a 1060...? Consoles don't have ultra graphics settings and a lot of other settings that make the PC version look so much better. Of course it's going to run that way...
 

Kadath

Member
Oct 25, 2017
621
rTao8x0.gif


To explain better what I mean by higher resolutions emphasizing problems, you can see a similar effect there. Look for example at the nose and the neck area; the image overall looks a lot more natural at its original resolution than rescaled. The one on the right is a lo-fi image that still looks nice today. The one on the left is... weird.

The same as people on this forum commenting how some assets like the trees that weren't updated for PC now are more noticeable since they don't quite match the detail of everything else.

They are probably small things you only notice if you pay attention to details, but they are real.

For the textures, it's not that the higher resolution brings out problems in the textures themselves, but in their relation to the model. A super-high-res texture applied to a simpler model looks weird. In the console version, the lower resolution gives a more natural, smooth overall look. On PC, the higher clarity makes the model itself not quite match the texture quality. You can notice that detail is missing.

The same as when you rescale a picture UP from its natural resolution: it looks bad. If you rescale it down, it looks great. The bigger the display and the higher the resolution, the higher the graphical detail needs to be too (Apple built the "retina" concept on this). Smaller screens/resolutions get away with looking good with much less.
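The up-versus-down rescaling intuition can be sketched in a few lines of pure Python (a hypothetical illustration with a toy 2D "image", not anything from an actual game):

```python
# Illustrative sketch: nearest-neighbour rescaling of a tiny "image"
# (a 2D list of pixel values). Upscaling past the source resolution adds
# no new detail -- pixels just get repeated, which is why blown-up
# low-res art looks blocky -- while downscaling discards pixels and
# smooths over per-pixel flaws.

def rescale(img, new_w, new_h):
    """Nearest-neighbour resample: each target pixel copies the closest source pixel."""
    src_h, src_w = len(img), len(img[0])
    return [
        [img[y * src_h // new_h][x * src_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

# A 2x2 source "texture": upscaled to 4x4 it only repeats pixels.
src = [[1, 2],
       [3, 4]]
up = rescale(src, 4, 4)    # blocky: each source pixel becomes a 2x2 block
down = rescale(up, 2, 2)   # downscaling recovers the original, nothing gained
```

The asymmetry is the point: scaling down never invents detail it has to fake, scaling up always does.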

Even the fact that in retrogaming a CRT screen looks immensely better than a super sharp LCD with optimal precision somewhat falls in the same category.

Or even the concept of LOD in all games: if something is up close it requires a high level of detail to look decent; if it's far away it can use a much simpler model, because it's a lot less noticeable. But if you then greatly push up the resolution and screen size, that simpler LOD starts to stick out and will look bad. Etc...
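The LOD idea above boils down to picking a model by camera distance. A minimal sketch (the thresholds and triangle counts are made-up illustration values, not from any real engine):

```python
# Hypothetical distance-based LOD selection: nearer objects get a denser
# model, distant ones a simpler one.

LOD_THRESHOLDS = [10.0, 50.0, 200.0]      # metres; beyond the last -> coarsest LOD
LOD_TRIANGLES = [20000, 5000, 800, 100]   # illustrative detail per LOD index

def select_lod(distance):
    """Return the LOD index for an object at `distance` from the camera."""
    for lod, limit in enumerate(LOD_THRESHOLDS):
        if distance < limit:
            return lod
    return len(LOD_THRESHOLDS)  # farthest bucket

# Raising resolution/screen size doesn't change the model data at all --
# it only lets you inspect a coarse LOD more closely, so thresholds tuned
# for a console display can look wrong on a sharper one.
```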

Last example:

YxQ0ZfI.jpg


g9saDsD.jpg



The first looks fine, right? Well the second one is the exact same image. There the higher resolution "brings out the flaws", and it really looks bad.
 
Last edited:

Alexandros

Member
Oct 26, 2017
17,795
rTao8x0.gif

To explain better what I mean with higher resolutions emphasizing problems you can see a similar effect there. [...]

Is anyone forcing you to use a higher resolution if you don't like it?
 

c0Zm1c

Member
Oct 25, 2017
3,199
It was said earlier in the thread - with Ubisoft used as an example - but I'll reiterate that it would really help if more developers gave details on what quality/framerate the minimum/recommended hardware is supposed to roughly target. The 1060 is not a bad graphics card, but if the recommendation targets just, say, "high" at 30fps, then that's why you're not hitting 60fps.

Thinking about that, assuming a playable framerate is the target, I wonder what postage stamp resolution Hello Games expects people to play No Man's Sky at with their stated GTX 480 minimum. :\

There are plenty of benchmarks around on various websites, but they often only test games on "ultra" settings, which isn't helpful for people with older or lower-end hardware, and I think is often a culprit in people crying "unoptimised!". Looking up YouTube benchmarks for your specific hardware and, if they're popular enough, the games you intend to play can be much more helpful.
 
Dec 15, 2017
1,590
How is this card holding up for the latest games?

I'd say it still holds up pretty well considering it's a 6-year-old mid-range card that Nvidia basically sent out to die once Maxwell released. Your safest bet is 900p/30fps on the medium preset with low textures in the most demanding AAA games from 2018/19.

id Tech 6 games like Doom, Wolfenstein II and Rage 2 are basically unplayable on this card. You can run Doom at 1080p/30 on the medium/high preset, but Doom is meant to be played at 60 fps or more.

I had issues with Nier: Automata as well, but your mileage may vary with that game. It always crashes on my PC for some reason.
 

Deleted member 2172

Account closed at user request
Banned
Oct 25, 2017
4,577
This is lengthy, but heh, I can't find a different way to go through it all: [...]
This might be one of the worst posts I've ever seen on this site lol, unbelievable.
 

Launchpad

Member
Oct 26, 2017
5,154
I never really understand why, when a game is a technical mess on consoles, it's the developer's fault and they should fix it, but when games have similar issues on PC it's somehow the platform's fault and "this is why people hate PC gaming". I hate playing games at 30fps and with slow loading times, but I don't make threads saying "This is why some people still HATE gaming on consoles".
 

Phantom88

Banned
Jan 7, 2018
726
This might be one of the worst posts I've ever seen on this site lol, unbelievable.


Every PC-related thread on ResetEra is the most grotesque thing on the web. It simply boggles the mind how removed from PC gaming this community is. It's the most casual, lightweight gaming forum you could possibly find anywhere. When you read people referring to this forum in the third person and calling it a hardcore, enthusiast forum, you just want to turn off the screen and go outside. You could not get more casual than this forum if your life depended on it.
 

ShinUltramanJ

Member
Oct 27, 2017
12,949
rTao8x0.gif



The first looks fine, right? Well the second one is the exact same image. There the higher resolution "brings out the flaws", and it really looks bad.

On the top, the image on the left looks better than the image on the right. Nobody on consoles ever argued that Bleemcast games looked worse on Dreamcast than on the PlayStation, even though it did exactly what your top image shows by cleaning up the graphics. Funny how that works? Only when PC is brought into the equation do console users claim to prefer low resolutions riddled with jaggies.

When it's about PC, some people start talking out of the side of their neck.
 

Kadath

Member
Oct 25, 2017
621
Funny how that works? Only when PC is brought into the equation do console users claim to prefer low resolutions riddled with jaggies.

I think I've explained myself clearly now and I'll leave it at that.

But I'll address this since it came up more than once: if you want to take it personally, then your inferences are wrong. In my life I had a Commodore 64, then a PC. The only console I've ever personally owned is an original PS1, bought late in its cycle; that, and more recently a 2DS XL. All my gaming history is on PC, so I'm not defending consoles out of some convoluted personal interest.
 

Csr

Member
Nov 6, 2017
2,028
Kadath, it looks like the old game in your example was created with the notion that it would be displayed at a low resolution, and that is why the model makes more sense there.
Your RDR2 example is a zoomed-in picture, not a higher-resolution one; it actually has far fewer pixels than the first. Show the complete first picture at a lower resolution (or a higher one) and the conclusion will be obvious.

Edit: not sure if the whole picture the zoomed part comes from is higher res, but the zoom-in is what makes the second image worse.
 
Last edited:

My Name is John Marston

Alt account
Banned
Oct 27, 2017
111
rTao8x0.gif

To explain better what I mean with higher resolutions emphasizing problems you can see a similar effect there. [...]

Or even the concept of LOD in all games: if something is up close it requires a high level of detail to look decent, if it's far away it can use a much simpler model, because it's a lot less noticeable. But if you then greatly push up the resolution and screen size, then that simpler LOD starts to stick out and will look bad. Etc...

Except you can increase draw distance and far details...
 

BlueManifest

One Winged Slayer
Member
Oct 25, 2017
15,301
I gotta be honest with you, I was kind of excited that RDR2 was finally hitting PC after a year. In fact I do own a PS4, but I decided to wait so I could hopefully run it at 1080p@60fps, since most games I've tried run perfectly fine on my rig:

Ryzen 5 1600 @ 3.20 GHz
16GB of RAM
GTX 1060 6GB

I checked the recommended specs and found this:

Recommended Specifications:
  • OS: Windows 10 - April 2018 Update (v1803)
  • Processor: Intel Core i7-4770K / AMD Ryzen 5 1500X
  • Memory: 12GB
  • Graphics Card: Nvidia GeForce GTX 1060 6GB / AMD Radeon RX 480 4GB
  • HDD Space: 150GB
  • Sound Card: DirectX compatible

Truth be told, even guys with better graphics cards have been struggling to get 60 fps consistently; just read this thread and you'll see it's a total mess:
www.resetera.com

Red Dead Redemption II PC performance thread



Don't get me wrong, I love PC gaming, but I hate wasting my time trying different settings. Now I understand why some people refuse to leave consoles behind.
Sometimes I go as far as thinking that developers deliberately don't optimize their games so companies like NVIDIA can sell their new cards at ridiculous prices.

It's really discouraging.
There's a solution for this (if your internet is up to it): it's called Stadia
 

dmix90

Member
Oct 25, 2017
1,883
This is lengthy, but heh, I can't find a different way to go through it all: [...]

rTao8x0.gif

To explain better what I mean with higher resolutions emphasizing problems you can see a similar effect there. [...]
Preach! Lol

A lot of people will laugh, but I agree with most of what you're saying.
 

Nzyme32

Member
Oct 28, 2017
5,238
Jesus, this thread. We're at the point of trying to justify that anything the game provides beyond the console version, including anything well beyond 60fps, is "unnatural"... haha

OPINION: This thread on ERA is exactly the reason why some people still HATE "gamers" who have their entire ego lodged in their console's company
 

Guaraná

Banned
Oct 25, 2017
9,987
brazil, unfortunately
PC has been my main platform since 2009 and I STILL don't know what those are. I know they're AA methods, but I couldn't tell you how they differ. Same with AO, screen-space reflections, subsurface scattering, and a bunch of other nonsense terms. All I know is if I want more performance, I just start turning stuff off.
The problem here is you, not the technology.
There's a lot of information available to explain how these things work.
 

Dog of Bork

Member
Oct 25, 2017
5,988
Texas
Did someone really just zoom in on an image to claim that higher resolutions look worse, when zooming in on an image literally makes it lower-res and proves the exact opposite point?

Because that is what it looks like lmao.

Keep on digging, I guess.
 

Md Ray

Member
Oct 29, 2017
750
Chennai, India
PS4 equivalent settings in this video:



Tip: If you struggle to hit consistent 60fps, change Refresh Rate to "50", then turn Vsync "On".
Hope this helps people struggling to get smooth performance on their rig.
 

Alexandros

Member
Oct 26, 2017
17,795
I think I've explained myself clearly now and I'll leave it at that.

But I'll address this since it came up more than once: if you want to take it personally, your inferences are wrong. In my life I had a Commodore 64, then a PC. I bought a PSX late in its cycle, and that original PS1 is the only console I've ever owned personally. That, and more recently a 2DS XL. All my gaming history is on PC, so I'm not defending consoles out of some convoluted personal interest.

I'd appreciate an answer to my previous question: If you like the image on the right more than the image on the left, isn't it the case that you can get that on PC too?
 

ss_lemonade

Member
Oct 27, 2017
6,641
Then be clear with what you're actually objecting to.

Here's a 1060 getting 57fps at 1080p low. You can extrapolate what pushing 4X the pixels will do to that number pretty easily.
So the 1060 being the "recommended" GPU according to Rockstar just means hitting 1080p/60 at low settings? I wonder if that's part of the reason people are complaining about performance
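The extrapolation mentioned in the quote can be made concrete with a rough GPU-bound estimate. Real games rarely scale perfectly linearly with pixel count (CPU limits, memory bandwidth, and fixed costs all interfere), so treat this as a ballpark, not a benchmark:

```python
# Rough GPU-bound estimate: framerate scales roughly inversely with the
# number of pixels rendered. This is a ballpark model, not a measurement.
def estimated_fps(base_fps, base_pixels, target_pixels):
    return base_fps * base_pixels / target_pixels

p1080 = 1920 * 1080
p4k = 3840 * 2160  # exactly 4x the pixels of 1080p

# 57 fps at 1080p low extrapolates to roughly 14 fps at 4K low.
print(estimated_fps(57, p1080, p4k))
```

Which is why a card that is "recommended" for 1080p/60 has no realistic path to 4K, regardless of the settings labels.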
 

Quinton

Specialist at TheGamer / Reviewer at RPG Site
Member
Oct 25, 2017
17,244
Midgar, With Love
There are a bunch of reasons I don't play games on PC (I don't even own a PC) but one of them is certainly the fact that I take one glance at all the numbers and chatter involving system specs and optimal settings and yes, I zone out and retreat to the extreme simplicity of my PS4 and Switch.

I have no reason to dislike PC gaming on the whole; that would be as silly as a PC gamer disliking the very existence of consoles, and I trust we're all above that.

It's simply not for me, yeah.
 

My Name is John Marston

Alt account
Banned
Oct 27, 2017
111
So the 1060 being the "recommended" GPU according to Rockstar just means hitting 1080p/60 at low settings? I wonder if that's part of the reason people are complaining about performance

Yeah, the 1060 is recommended for 1080p/60fps. People just don't want to see the label "low" in their settings menu. The console version is most likely running most settings at low. We're waiting for the Digital Foundry video to confirm.
 

ZSJ

Alt-Account
Banned
Jul 21, 2019
607
Sure, but what does that effort entail? What is the typical effort required to run a game on PC?
I just got New Vegas after being massively disappointed with Outer Worlds and I'm greeted to this PCGW essential improvements list
[image: Untitled.jpg]


I said fuck it, I'll wait till the weekend. That's a lot of shit to do.

Obviously there's some stupid ass ignorant console fanboy shit being slung ITT but as someone who loves PC, let's not act like there isn't anything fucking tedious about it. It is the best platform but that doesn't mean it's all rose smelling farts.
 
Mar 29, 2018
7,078
Sure, but what does that effort entail? What is the typical effort required to run a game on PC?
I mean, PC as an entire platform has a ton more effort involved at every step. And yes, if you're lucky, games will work right "out of the box" and be perfect. But on PC there's a decent chance something won't run right or look right, and you'll have to tweak.

More generally, PC has:

- more research involved before you even buy a machine. What do all these letters and numbers mean?
- a ton more setup time. A console is literally 15 seconds; a PC is at least an hour or two, often more. Yes, even a straight-up prebuild. Getting the software up, sorting out antivirus, checking everything is working, from browsers to the OS
- the actual installation of software delivery platforms, like Steam etc.
- then ultimately, when you boot any game it might have problematic performance or flat out not work, which by and large doesn't happen on console, and it takes a ton of time and effort to troubleshoot that stuff. Especially if it's a rare issue. Just the RISK of having to fiddle with settings AT ALL is off-putting to loads of people

It's getting better but console is still infinitely easier. Whenever I play on my PS4 it's a simple joy. Whenever I play on PC it's many minutes of fiddling for many games.

I say this all as someone who played only console from 2005-2017 then built my own machine from scratch. I since bought a gaming laptop.

PC gaming is incredible, but to say it requires less time/effort than console is laughable. It is POSSIBLE for PC gaming to be as fast and convenient as console gaming, and it often is. Like, some games might run perfectly, or your prebuild happens to work and didn't need any fiddling. But it's also FAR MORE LIKELY to demand more time and effort than console gaming ever will.
 

Launchpad

Member
Oct 26, 2017
5,154
I just got New Vegas after being massively disappointed with Outer Worlds and I'm greeted to this PCGW essential improvements list
[image: Untitled.jpg]


I said fuck it, I'll wait till the weekend. That's a lot of shit to do.

Obviously there's some stupid ass ignorant console fanboy shit being slung ITT but as someone who loves PC, let's not act like there isn't anything fucking tedious about it. It is the best platform but that doesn't mean it's all rose smelling farts.
Ah yes. Fallout: New Vegas. The game very well known for being incredibly optimised and bug-free on consoles.

New Vegas is absolutely fucked on every platform, but on the PC you can at least fix it.

Edit: I should say that obviously PC is more effort than consoles, but painting New Vegas as an example of what your average PC game takes to run is extremely disingenuous.
 

ZSJ

Alt-Account
Banned
Jul 21, 2019
607
User Warned: Hostility
Ah yes. Fallout: New Vegas. The game very well known for being incredibly optimised and bug-free on consoles.

New Vegas is absolutely fucked on every platform, but on the PC you can at least fix it.
Yea I didn't say it wasn't fucked elsewhere but keep going with your snarky bullshit. I answered a question.

PC gaming can involve a hassle of shit to do before you start a game. It's always in service of a better game, but that doesn't mean it's not fucking annoying to do. If you don't agree, you're likely as big of a fanboy as those you loathe so much. Or maybe you LIKE downloading 6GB+ Witcher 3 mods from Nexus's slow-as-shit 100kb/s download site.

I love how the second you say something on PC isn't literally fucking heaven you get these posts. It's like the Switch in that regard.
 

laxu

Member
Nov 26, 2017
2,782
I just got New Vegas after being massively disappointed with Outer Worlds and I'm greeted to this PCGW essential improvements list
[image: Untitled.jpg]


I said fuck it, I'll wait till the weekend. That's a lot of shit to do.

Obviously there's some stupid ass ignorant console fanboy shit being slung ITT but as someone who loves PC, let's not act like there isn't anything fucking tedious about it. It is the best platform but that doesn't mean it's all rose smelling farts.

None of this is something you have to do. These are all quality-of-life improvements and community fixes because the developers didn't bother patching their clunky shit. The fact that you can fix these on your own is one of the great things about PC. In a similar way, people are trying to find workarounds for RDR2 problems because the devs aren't doing their part.

On the console front, I have seen the "where's the performance fix patch?!!" cries many, many times, and you just can't do anything about it. That's one of the prices you pay for convenience: you leave a lot more in the hands of the developers, for better or worse.
 

bic

Member
Oct 28, 2017
432
Messing around with settings to get as close to 1080p/60 doesn't bother me. The thing I really dread about PC gaming are the issues some have been having with RDR2 and the launcher crashing. I wasn't affected this time but having had similar experiences in the past, I know what a pain PC gaming can sometimes be.
 

Deleted member 54292

User requested account closure
Banned
Feb 27, 2019
2,636
Or take the concept of LOD in games: something up close requires a high level of detail to look decent, while something far away can use a much simpler model, because it's a lot less noticeable. But if you then greatly push up the resolution and screen size, that simpler LOD starts to stick out and look bad. Etc...

Last example:

[image: YxQ0ZfI.jpg]


[image: g9saDsD.jpg]



The first looks fine, right? Well, the second one is the exact same image. There, the higher resolution "brings out the flaws", and it really looks bad.
Where did you get that Red Dead 2 pic? Cause the LOD on Xbox was NEVER that bad. Something seems off about that shot.
 

aerozombie

Banned
Oct 25, 2017
1,075
Apparently my freaking 2070 can't run this at high/ultra at 4K. I'm grateful I haven't bought it yet, I was so hyped to see it, but I'll hold off for now. Don't usually care for GTA/RD games, so I'm not diving in for graphics that won't run decently
 

Alexandros

Member
Oct 26, 2017
17,795
I just got New Vegas after being massively disappointed with Outer Worlds and I'm greeted to this PCGW essential improvements list
[image: Untitled.jpg]


I said fuck it, I'll wait till the weekend. That's a lot of shit to do.

Obviously there's some stupid ass ignorant console fanboy shit being slung ITT but as someone who loves PC, let's not act like there isn't anything fucking tedious about it. It is the best platform but that doesn't mean it's all rose smelling farts.

You don't actually have to do any of the things you listed. You can, but you don't have to. This choice is the core element of the platform and this thread shows that people still can't wrap their heads around that concept. You can spend hours tweaking all sorts of stuff or you can download the game from Steam, hit "Play" and, well, play.

I mean, PC as an entire platform has a ton more effort involved at every step. And yes, if you're lucky, games will work right "out of the box" and be perfect. But on PC there's a decent chance something won't run right or look right, and you'll have to tweak.

More generally, PC has:

- more research involved before you even buy a machine. What do all these letters and numbers mean?
- a ton more setup time. A console is literally 15 seconds; a PC is at least an hour or two, often more. Yes, even a straight-up prebuild. Getting the software up, sorting out antivirus, checking everything is working, from browsers to the OS
- the actual installation of software delivery platforms, like Steam etc.
- then ultimately, when you boot any game it might have problematic performance or flat out not work, which by and large doesn't happen on console, and it takes a ton of time and effort to troubleshoot that stuff. Especially if it's a rare issue. Just the RISK of having to fiddle with settings AT ALL is off-putting to loads of people

It's getting better but console is still infinitely easier. Whenever I play on my PS4 it's a simple joy. Whenever I play on PC it's many minutes of fiddling for many games.

I say this all as someone who played only console from 2005-2017 then built my own machine from scratch. I since bought a gaming laptop.

PC gaming is incredible, but to say it requires less time/effort than console is laughable. It is POSSIBLE for PC gaming to be as fast and convenient as console gaming, and it often is. Like, some games might run perfectly, or your prebuild happens to work and didn't need any fiddling. But it's also FAR MORE LIKELY to demand more time and effort than console gaming ever will.

Again we come back to the matter of choice. You don't actually have to do any of the stuff that you mentioned. You can build a custom PC from scratch but you don't have to. You can install Windows and antivirus yourself but you don't have to. You can tweak your games but you don't have to. You can, but you don't have to. It's as simple as that.

And your claim that problematic performance by and large doesn't happen on console can be disproven in a couple of minutes by linking tons of Youtube videos from Digital Foundry and others showing consoles struggling to properly run games. This sort of myth could be perpetuated back when there weren't gigabytes of footage proving otherwise. Nowadays that claim is simply ridiculous.
 

My Name is John Marston

Alt account
Banned
Oct 27, 2017
111
Apparently my freaking 2070 can't run this at high/ultra at 4K. I'm grateful I haven't bought it yet, I was so hyped to see it, but I'll hold off for now. Don't usually care for GTA/RD games, so I'm not diving in for graphics that won't run decently

But why do you need to run it at high/ultra at 4K? Inform yourself of what those settings do and how they look instead of getting hung up on labels. It runs decently. Just lower the settings and it will still look incredibly good.
 

exodus

Member
Oct 25, 2017
9,936
Apparently my freaking 2070 can't run this at high/ultra at 4K. I'm grateful I haven't bought it yet, I was so hyped to see it, but I'll hold off for now. Don't usually care for GTA/RD games, so I'm not diving in for graphics that won't run decently

The 1080 Ti was never adequate for a consistent 4K/60. I don't know why you'd expect a 2070 to fare better.
 

Rizific

Member
Oct 27, 2017
5,946
The problem here is you, not the technology.
There's a lot of information available to explain how these things work.
Absolutely, no doubt about that. But my reply was in response to someone who only uses the graphics presets to set up and play their games. The point I was trying to make was that you can get by in PC gaming without knowing what these terms are and without "fiddling".