You must not play fast-paced or competitive multiplayer games. Having a higher frame rate in a game like Gears of War is far more important than 4K.
I'd even say the same thing applies to something like Devil May Cry. The extra fluidity helps with executing combos.
I did not realize 60fps was considered a barely playable slideshow.
PS. Best version of RE4 is GameCube one in original resolution on CRT or original res + msaa in Dolphin emulator on LCD.
Nah...
Is it not the same people playing on a GTX 1050 at 60Hz vs an RTX 20xx at 240Hz?
Nope - it's data in the wild, so it's telling you "the median player with a GTX 1080 has a better K/D than the median player with a 1050", but they're different people - nobody is giving different video cards to the same group of players to see how their performance changes across rigs.
I just don't see myself dropping down image quality for the sake of higher framerates. I scratch my head at people with high end PCs still running games at 1080p.
Just have to play older games. Come on Intel/Nvidia, shrink some dies, crank some yields, do some architectural magic and get me some 4K 320Hz.
Hopefully console only people can get a taste of 120fps. It's crazy how fluid it can be.
I once got an email from Nvidia telling me if I upgraded my 1070ti to a 2000 series card I would see an improvement in fps on Apex Legends and thus would get more kills.
Be happy with what you're happy with. The alternative is to spend a bunch more money just to become habituated to a higher standard and reduce the range of options you can enjoy.
Higher performance matters when you play competitively.
I just don't see myself dropping down image quality for the sake of higher framerates. I scratch my head at people with high end PCs still running games at 1080p.
The resolution boost ruining assets argument never holds any weight for me, since people happily buy "remasters" in the millions that don't touch texture resolution - in fact, very few of them do. I really don't see that as an issue except when texture seams become noticeable.
Consistency and what developers designed their game for is what matters.
Gameplay might become easier or harder if the input method used by the player is different from what the game was primarily designed for (Resident Evil 4 on Wii, or how easy it is to aim and destroy AI with a mouse in primarily-console shooters, for example), or if the framerate is lower or higher than intended.
Same for graphics, tbh... you might see what you were not supposed to, or, if the assets can't keep up with the resolution boost, the game might look worse at a high resolution than at its original resolution.
Oh yeah, all that is applicable to curated SP experiences. No one cares about nuances like that for competitive MP.
PS. Best version of RE4 is GameCube one in original resolution on CRT or original res + msaa in Dolphin emulator on LCD.
Most games it just works, like resolutions. Some games will be capped and can't be unlocked, and a few will need an ini edit or mod.
So assuming you have the hardware to do it, how much fiddling around is there to be done to get games to run at those FPS? I'm sure it depends on the game, but roughly speaking, how many games work out of the box versus having to fiddle with patchers, ini edits, mods, etc.?
That's the main detractor for me. I have an ultrawide monitor but just kind of gave up on playing games in 21:9, as I don't have the time to look for mods to force games to run like that.
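For illustration, the kind of ini edit mentioned above usually amounts to changing one value in a settings file. This is a minimal sketch; the section name ("Engine") and key ("FrameRateLimit") are invented for the example, not taken from any real game.

```python
# Hypothetical sketch of lifting a framerate cap via an ini edit.
# The file layout, section, and key names are made up for illustration.
import configparser
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "GameSettings.ini")
with open(path, "w") as f:
    f.write("[Engine]\nFrameRateLimit = 60\n")

cfg = configparser.ConfigParser()
cfg.optionxform = str          # preserve key capitalization when writing back
cfg.read(path)
cfg["Engine"]["FrameRateLimit"] = "144"   # raise the cap from 60 to 144
with open(path, "w") as f:
    cfg.write(f)
```

In practice the fiddly part is finding which file and key a given game actually uses, which is exactly where community mods and patchers come in.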
On most games a higher framerate will only make the game more fluid, not faster. As you can see in the Digital Foundry Counter-Strike comparison, the gameplay speed is synced regardless of the framerate difference.
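The reason gameplay speed stays synced is that modern engines scale movement by the elapsed time per frame (delta time), so more frames just means finer steps. A minimal sketch of the idea:

```python
# Frame-rate-independent movement: position advances by speed * dt each
# frame, so total distance over one simulated second is the same whether
# the game renders 60 or 240 frames in that second.
def simulate(fps, speed=100.0, seconds=1.0):
    dt = 1.0 / fps          # time step per frame
    pos = 0.0
    for _ in range(int(seconds * fps)):
        pos += speed * dt   # scale movement by elapsed time
    return pos

print(simulate(60))   # ~100.0 units after one second at 60fps
print(simulate(240))  # ~100.0 units after one second at 240fps
```

Games that instead tie logic directly to the frame counter (common in older console titles) do speed up when uncapped, which is why some of them need patches to run above their original framerate.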
Super well said and I agree 110%.
The gaming audience is large enough that many different interests and preferences fit. Some of us will never care about reaction speeds and 'winning', but rather about immersion.
It's really hard to go back to 60 after 144. Feels like mud. It could definitely affect your gameplay.
30 is blehhhhh.
How do you like the C9? I think that's my next big update.
Consider me a recent 120fps convert.
Finally got my gaming PC this year (after being console only for almost two decades) along with my C9 OLED. And basically, holy shit.
At 120Hz there is ZERO motion blur (like the old CRT days, but without the flicker) and it feels so damn responsive it's ridiculous.
2D platformers are a night and day improvement at 120fps over 60fps. Firstly, because there is no motion blur, the beautiful art of Hollow Knight or The Messenger remains totally pristine in motion. Switching back to 60fps, everything looks so muddy in comparison; it's amazing how quickly you get used to 120. And secondly, the input latency is so much lower it makes twitch platformers absolute bliss to play.
Unfortunately the C9 currently only supports a maximum of 1440p at 120Hz, though there is a firmware update later this year to HDMI 2.1 which will provide 120Hz at full native 4K on the glorious OLED screen.
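The latency point above can be made concrete with simple arithmetic: each frame takes 1000/fps milliseconds, which bounds how soon the screen can show a response to your input.

```python
# Frame time in milliseconds at common refresh rates: the higher the
# framerate, the sooner each new frame (and the response to input) appears.
def frame_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 120, 144, 240):
    print(f"{fps:>3} fps -> {frame_ms(fps):.1f} ms per frame")
```

Going from 60fps to 120fps halves the frame time from about 16.7ms to about 8.3ms, which is one reason 120Hz feels so much more responsive even before accounting for the rest of the input pipeline.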
It's really hard to go back to 60 after 144. Feels like mud. It could definitely affect your gameplay.
30 is blehhhhh.
Everything below 300 is a slideshow to my cyborg eyes. At 30 fps, I write my novel between frames.