Solid Shake

Member
Oct 28, 2017
2,254
Exactly the same things that would happen if the game was 60FPS, I would expect.

It's pretty noticeable.

Play a game on PC like Destiny 2 in 60fps then immediately hop over to Destiny on console and the difference is night and day.

If you play a ton of 30fps games your eyes adjust to seeing 30fps just fine, but once you go from 60 back to 30, it's insanely noticeable.

Some games do 30fps just fine though; platformers, or anything in third person that's not a shooter, generally look fine at 30fps.

FPS or TPS games look and feel way better to play at 60 though. I'm pretty sure you played Metal Gear Online 2 and 3, right? MGO2 ran at like 20-30fps and generally felt pretty slow even though it was the better game by a mile. MGO3 though felt BUTTERY smooth compared to MGO2; could you honestly not feel that difference?

If anyone feels like this and wants to test it out, I recommend playing Rainbow Six Siege on console. Play one multiplayer match, then switch to Terrorist Hunt with V-sync enabled. The gameplay feels INSANELY slow in Terrorist Hunt because with V-sync enabled it lowers your FPS to 30.
 

catswaller

Banned
Oct 27, 2017
1,797
You should totally be able to see a difference; a 4K picture is totally distinct from a 1080p picture. There's not really a significant difference in quality -- absolutely nothing like the kind of graphical or performance improvements you can get from a game developed for 1080p on the same hardware -- but there should be a significant increase in sharpness and detail.
 

EdibleKnife

Member
Oct 29, 2017
7,723
Wouldn't actually playing something make it easier to see the difference between framerates?
I've honestly never tracked the framerate of anything I've played. I play games as is and never have a moment of "this is too framey" short of the game literally freezing. I've played games that I'm sure people complained about not being 60 FPS and had nothing trigger a sense that something was different or wrong. Everything I play I just play. Even with gameplay videos where the player says "last video was 30fps but finally it's in 60, so sorry about the discrepancy", I simply have to shrug because I literally cannot tell what they're seeing.
 

Xiaomi

Member
Oct 25, 2017
7,237
Even if you can't see the difference between 30 and 60 just watching a screen, it should be pretty easy to feel the difference in terms of responsiveness. Like, I can't really see the difference between 90 and 144 on my monitor unless I look at things like mouse movement, but I can certainly feel it in games like Overwatch.
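For a rough sense of why higher framerates are felt more than seen, the per-frame time budget tells most of the story; a quick back-of-envelope sketch in Python:

for fps in (30, 60, 90, 144):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")

# 30 -> 33.33 ms, 60 -> 16.67 ms, 90 -> 11.11 ms, 144 -> 6.94 ms:
# going from 30 to 60 frees up ~16.7 ms per frame, while 90 to 144
# only frees ~4.2 ms, so the gain shows up in input feel long before
# it shows up visually.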
 

ss_lemonade

Member
Oct 27, 2017
6,648
What normal gameplay is this?
[GIF: vxZAKRm.gif, 30fps (left) vs 60fps (right)]


I think this one depends on where you are viewing this image. For example, on my phone with Chrome, the framerate seems to be fine (30 left, 60 right). But viewing this from the imgur app on my phone seems to show the framerate cut in half.

I've honestly never tracked the framerate of anything I've played. I play games as is and never have a moment of "this is too framey" short of the game literally freezing. I've played games that I'm sure people complained about not being 60 FPS and had nothing trigger a sense that something was different or wrong. Everything I play I just play. Even with gameplay videos where the player says "last video was 30fps but finally it's in 60, so sorry about the discrepancy", I simply have to shrug because I literally cannot tell what they're seeing.
What I meant is controlling a game in 60 fps should feel different/better than a game in 30 fps. More so when playing on a PC with a keyboard and mouse, where snappy movements are more possible and higher framerates even past 60 make things much more responsive.
 

EdibleKnife

Member
Oct 29, 2017
7,723
[GIF: vxZAKRm.gif, 30fps (left) vs 60fps (right)]


I think this one depends on where you are viewing this image. For example, on my phone with Chrome, the framerate seems to be fine (30 left, 60 right). But viewing this from the imgur app on my phone seems to show the framerate cut in half.
Even this I completely understand. But you could give me a version that was only 30fps and I'd play it like normal and come away saying nothing was wrong. Literally the only way I think I'd notice is if you jostled a running game from 60 to 30 to 60 over and over again.
 

Lucreto

Member
Oct 25, 2017
6,631
I am the same and can't tell the difference between 30fps and 60fps. Frankly unless I am told it was 60fps I wouldn't notice. I apparently played some 60fps games, moved to 30fps games and didn't notice a difference. The game really has to judder before I notice.

I've honestly never tracked the framerate of anything I've played. I play games as is and never have a moment of "this is too framey" short of the game literally freezing. I've played games that I'm sure people complained about not being 60 FPS and had nothing trigger a sense that something was different or wrong. Everything I play I just play. Even with gameplay videos where the player says "last video was 30fps but finally it's in 60, so sorry about the discrepancy", I simply have to shrug because I literally cannot tell what they're seeing.

Precisely how I feel.
 

BadAss2961

Banned
Oct 25, 2017
3,069
Viewing distance and screen size have a lot to do with it.

4K is kinda overkill under a certain size.
 

EdibleKnife

Member
Oct 29, 2017
7,723
What I meant is controlling a game in 60 fps should feel different/better than a game in 30 fps. More so when playing on a PC with a keyboard and mouse, where snappy movements are more possible and higher framerates even past 60 make things much more responsive.

It should, and if you did a Pepsi Challenge with me on which was better to control, I'd say 60. But it's something I barely think about. If you gave me an FPS that was locked at 30, I'd more than likely beat it and say "that was fine". I just never internalize the fact that it's a variable that could be better. With a 30 FPS game, if I can move and shoot, I'd just say that's the way the game feels, without wondering if my shots could be smoother and quicker.
 

Pachinko

Member
Oct 25, 2017
954
Canada
I honestly don't notice it on a big TV either, because so many movies (even UHD releases) are upscaled 1080p; the bitrate is higher, so the picture will look better, but not DVD-to-Blu-ray better. For gaming, I really notice it on my PC, but I also sit barely 2 feet from my screen; anything less than native 4K gets blurry, and 1080p especially looks pretty rough. So it's not so insane to barely notice a difference, I don't think. Especially when so many Xbox One X and PS4 Pro enhanced titles are closer to 1440p than any other resolution. (Both do have titles that are native 4K though; I imagine those will look cleaner even from a few feet away.)
 

LuckyLinus

Member
Jun 1, 2018
1,935
People that don't see the difference between 30-60 fps or 4K must have serious issues with their eyes. It's night and day to me.
 

Yasuke

Banned
Oct 25, 2017
19,817
Wouldn't actually playing something make it easier to see the difference between framerates?

Nope.

Hell, I'm convinced I can only see the difference in the gif because they're side by side. I can pick up on different resolutions without needing a constant comparison, but I can't for FPS.

And my eyes are fine, thank you very much.
 
OP
Rhaknar

Member
Oct 26, 2017
42,452
so funny story (not funny at all actually), just did some quickie eye tests at the glasses store, and not only did the dude refer me to an actual doctor, he was pretty much like "yeah dawg your right eye is FUCKED"

So yeah, my eyes ARE broken :p
 

Branson

Member
Oct 27, 2017
2,770
I went from a maxed-out Odyssey at 60fps on PC to Lost Legacy on PS4 and got a headache. So much motion blur, and the 30fps felt sluggish as hell.
 

Niosai

One Winged Slayer
Member
Oct 28, 2017
4,919
I have 20/20 vision and I can say that in my experience, there's really not much of a difference.
 

Deleted member 4072

User requested account closure
Banned
Oct 25, 2017
880
[GIF: vxZAKRm.gif, 30fps (left) vs 60fps (right)]


I think this one depends on where you are viewing this image. For example, on my phone with Chrome, the framerate seems to be fine (30 left, 60 right). But viewing this from the imgur app on my phone seems to show the framerate cut in half.


What I meant is controlling a game in 60 fps should feel different/better than a game in 30 fps. More so when playing on a PC with a keyboard and mouse, where snappy movements are more possible and higher framerates even past 60 make things much more responsive.
There's crazy people out there that prefer the left and think it looks better lol. Crazy, crazy people. Right looks better every day of the week.
 

Echo

Banned
Oct 29, 2017
6,482
Mt. Whatever
How many console games are shipping with 4K assets? Cuz yeah, without the textures to match, 4K does seem a bit less impressive.

But on PC with 4K texture packs in say R6 Siege and FFXV the difference is quite apparent.

It's really quite unfortunate that console gamers keep making topics like this about 4K. Without 4K textures and effects and shadows and all the other sacrifices consoles have to make... These are poor judgements at best.
 

Lukemia SL

Member
Jan 30, 2018
9,384
for the record, I can notice 720 to 1080 btw. In fact, one of the reasons I don't like using the Switch on the big screen is because of how soft/blurry it is in some cases.

edit: also I'm not sure if stuff like Shadow of the TR or Hellblade are even dropping down to 1080 in performance mode, maybe 1440, and that's why I can't tell the difference? I mean there would still be a difference tho.

But yeah I need glasses lol.

It's easier to see the difference between a lower and an even lower resolution than the other way around. But the clarity of 4K up from 1080p is undeniable. At least to me; I can't speak for everyone.
 
Oct 25, 2017
1,575
I can tell the difference between 1080 and 4k up to about 4-5 feet on my 50 inch tv, but from where I usually sit it's not super apparent most of the time (8-10 feet away), so I tend to opt for performance modes when they're available in games.
 

Deleted member 14313

User requested account closure
Banned
Oct 27, 2017
1,622
Wait wait wait...wait. Wait. What?
Same, and yeah, I'm very short-sighted, but my glasses correct to normal, so...

I've even done blind tests for framerate where a 30fps video and equivalent 60fps one were randomly put on the left or right and I couldn't tell the difference. ~50% success rate. I can tell the difference between 30fps and 15fps and I can also tell when the framerate fluctuates (i.e. significant fluctuation around 60fps is worse for me than consistent 30fps) but consistent 30 vs consistent 60 I honestly can't tell the difference.
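For anyone who wants to run the same kind of blind test, here's a minimal sketch of the protocol in Python. It only does the randomization and scoring; the play_clips hook is hypothetical, standing in for however you actually display the two videos:

import random

def blind_test(n_trials: int = 20) -> None:
    # Each trial randomly assigns the 60fps clip to the left or right,
    # asks for a guess, and tallies the score against chance (50%).
    correct = 0
    for i in range(1, n_trials + 1):
        side_with_60 = random.choice(["left", "right"])
        # play_clips(side_with_60)  # hypothetical: show the 30/60fps pair
        guess = input(f"Trial {i}: which side is 60fps? (left/right) ").strip().lower()
        correct += (guess == side_with_60)
    print(f"{correct}/{n_trials} correct (~{100 * correct / n_trials:.0f}%; 50% is pure chance)")

A score near 50% over enough trials, like the one described above, means the two clips really are indistinguishable to you.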

4K vs 1080 I can tell the difference if I specifically look for it, but with just normal viewing/playing I don't notice anything different.
 

DeuceGamer

Member
Oct 27, 2017
3,476
Wait wait wait...wait. Wait. What?
That's definitely not normal, lol.


I think this is more common than a lot of people realize. This is why we have had debates in the past about what certain games are running at. If I recall, we had issues with certain people claiming a few Switch games were 60 FPS, even games journalists who you would think could clearly tell the difference.

I think the percentage of average gamers that can't tell a stable 30FPS from 60FPS is probably higher than a lot of us on Era realize.
 

DixieDean82

Banned
Oct 27, 2017
11,837
I can't speak to 4K, but I'm the same when it comes to HDR. I've just upgraded to an LG OLED and I honestly can hardly notice it. I feel like I'm taking crazy pills when everyone says what a difference it makes.
 
OP
Rhaknar

Member
Oct 26, 2017
42,452
It's really quite unfortunate that console gamers keep making topics like this about 4K. Without 4K textures and effects and shadows and all the other sacrifices consoles have to make... These are poor judgements at best.

those evil console gamers trying to disparage 4K clearly as part of some sort of anti-PC cabal, of course.
 

Echo

Banned
Oct 29, 2017
6,482
Mt. Whatever
those evil console gamers trying to disparage 4K clearly as part of some sort of anti-PC cabal, of course.

An interesting, if presumptuous, take. This has nothing to do with platform warz though.

Fact of the matter is, if all you're doing is testing 4K on consoles, you're not getting all 4K has to offer. Again, try it on PC, where you can actually get shadows, textures, AA solutions that don't blur, no checkerboard bullshit, and effects to actually match the resolution.

I mean, realistically, what were you expecting? To me, the difference between my PS4 Pro and my PC was enough that I haven't even turned my Pro on since building my PC. Currently you're judging 4K based on extremely weak presentations.

Maybe next-gen, when the consoles have more RAM, they'll aim to ship higher-res textures. Currently, the unified RAM pool in consoles is hurting that effort, as the amount required for 4K textures is simply too much when a console also has to load everything else into its shared RAM pool.
 

leng jai

Member
Nov 2, 2017
15,117
I think this is more common than a lot of people realize. This is why we have had debates in the past about what certain games are running at. If I recall, we had issues with certain people claiming a few Switch games were 60 FPS, even games journalists who you would think could clearly tell the difference.

I think the percentage of average gamers that can't tell a stable 30FPS from 60FPS is probably higher than a lot of us on Era realize.

Game journalists are the last people I would trust when it comes to technical issues.
 

ss_lemonade

Member
Oct 27, 2017
6,648
I can't speak to 4K but, I'm the same when it comes to HDR. I've just upgraded to a LG oled and I honestly can hardly notice it. I feel like I'm taking crazy pills when everyone says what a difference it makes.
I think that's more to do with people not really knowing what good HDR looks like. To some, HDR just looks different compared to SDR.

Then there are other issues, like some games just not implementing it well, or people being doubly confused by other TVs offering a faux HDR mode for SDR content (like Samsung's HDR+ setting).
 

astro

Member
Oct 25, 2017
56,887
Even this I completely understand. But you could give me a version that was only 30fps and I'd play it like normal and come away saying nothing was wrong. Literally the only way I think I'd notice is if you jostled a running game from 60 to 30 to 60 over and over again.
If you played an entire game that you spent a good deal of time with at 60fps, then went straight to a 30fps game, you'd be able to see and feel the difference.

At a glance it might not be noticeable for some people, but there's a huge difference in clarity in motion and in responsiveness.
 

ThreepQuest64

Avenger
Oct 29, 2017
5,735
Germany
After a certain distance away from the screen, depending on the size of the TV, the detail benefits of a higher resolution are lost.

[chart: resolution benefit by screen size and viewing distance]


I don't know how accurate this chart is. There's a site that lets you calculate optimized viewing distance before detail is lost, but I can't remember it for the life of me. Whether your eyes are bad or not, this is an element to consider.

By this, if you're sitting ~7.5 feet away from a 55" 4K set and you have 20/20 vision, it will more or less look like 1080p.
This.

Viewing distance matters. I mean, just try an extreme example and watch a 240p or 1080p video on your smartphone from the other side of the room. You most probably won't see a difference. Pixel density only matters when your eyes are able to distinguish it, and with light being emitted from a screen, this is really a question of physics. The two headlights of a car in the distance ultimately become one light if it's far enough away (which you can even calculate roughly).
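That rough calculation is simple enough to sketch. A minimal version, assuming a 16:9 panel and the conventional ~1 arcminute of resolving power for 20/20 vision (the function name and constants are just for illustration):

import math

def max_useful_distance_m(diagonal_in: float, horizontal_px: int,
                          arcmin_per_px: float = 1.0) -> float:
    # Distance beyond which adjacent pixels blur together for a viewer
    # who resolves arcmin_per_px arcminutes (~1 for 20/20 vision).
    width_m = (diagonal_in * 0.0254) * 16 / math.hypot(16, 9)  # 16:9 width
    pixel_pitch_m = width_m / horizontal_px
    angle_rad = math.radians(arcmin_per_px / 60)
    return pixel_pitch_m / angle_rad  # small-angle approximation

for label, px in (("1080p", 1920), ("4K", 3840)):
    d = max_useful_distance_m(55, px)
    print(f'55" {label}: pixel detail lost beyond ~{d:.1f} m ({d / 0.3048:.1f} ft)')

For a 55" set this comes out to roughly 1.1 m (~3.6 ft) for 4K and ~2.2 m (~7.2 ft) for 1080p, which lines up with the chart's ~7.5 ft figure above.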
 

Lukas Taves

Banned
Oct 28, 2017
5,713
Brazil
My X arrived a few weeks before my 4K TV. Going from a 1080p set to 4K, on the same games I was playing at the time, made an absurd difference on the same console.

And for the games I tested with 60fps options on the X, the better visuals at 4K are always super noticeable on the same TV as well, so I dunno what to say.
720p to 1080p seemed like a bigger jump to me. 4k feels like a half step despite how many more pixels there are.
I can get this feeling if you're on a PS4. Despite being 1080p it has very good image quality in most games, so despite the sheer resolution, the games already had good IQ. But coming from the Xbone, where bad IQ was a lot more common, it's day and night.
 

Akauser

Member
Oct 28, 2017
833
London
Struggled to spot differences too. Then I went from gaming on a 65 inch down to a 32, and sometimes a 22 inch monitor, and I notice the difference there because you're sat a lot closer. I also first noticed a big difference playing Tomb Raider.
 

BlueManifest

One Winged Slayer
Member
Oct 25, 2017
15,315
I see a difference, but not big enough to matter

It's not like it was going from 480p to 1080p

I notice it more in movies than games though because movies are more detailed
 

Jawmuncher

Crisis Dino
Moderator
Oct 25, 2017
38,394
Ibis Island
I've definitely not seen any reason to make the jump to 4K. I can tell the difference (watching the regular Blu-ray of Infinity War vs the 4K was night and day on a buddy's 4K setup). But I just don't see the jump as that big of a deal for me personally, especially not when it comes to gaming and the performance. As of right now, 1080p & 60FPS is perfect for me.
 

Stoze

Member
Oct 26, 2017
2,588
I can tell the difference between 1080 and 4k up to about 4-5 feet on my 50 inch tv, but from where I usually sit it's not super apparent most of the time (8-10 feet away), so I tend to opt for performance modes when they're available in games.
That's probably how it is for most people, and why I don't plan on getting a 4K TV anytime soon. TVs are for sitting back and relaxing, and usually that means sitting back far enough not to notice any quality difference above 1080p.

On the other side of the spectrum, 1080p on my 1440p monitor is a massive drop in quality for me, and it's almost guaranteed to be the last setting I'll turn down in a game if I need to.

60Hz feels pretty jarring/bad to me now too, sadly.
 

headspawn

Member
Oct 27, 2017
14,605
Contrary to popular opinion, regular eye exams will not kill you.

I thought the same myself, but here I am, still alive.
 

PrimeBeef

Banned
Oct 27, 2017
5,840
You might want to have an eye exam. That's really all I can say >.<

Like, I have eye problems, but even to me it was pretty noticeable immediately.
I recently got glasses to correct my vision, .25 and .5 in the left and right eyes respectively. Before, going from 1440p to 4K, I didn't notice much. Afterwards, I still don't notice much difference. Maybe at 8K I will. I don't see much going from 60Hz to 120Hz, but 144Hz is a huge improvement.
 

RagdollRhino

Banned
Oct 10, 2018
950
I recently got glasses to correct my vision, .25 and .5 in the left and right eyes respectively. Before, going from 1440p to 4K, I didn't notice much. Afterwards, I still don't notice much difference. Maybe at 8K I will. I don't see much going from 60Hz to 120Hz, but 144Hz is a huge improvement.


Yep, 20/20 for me and I get yearly exams. I didn't see a big jump to 4K either.
 

Bradford

terminus est
Member
Aug 12, 2018
5,423
I can tell the difference and vastly prefer 4k when watching movies and TV. I have a 4K OLED home theater setup for the express purpose of watching high quality movies.

But for games? I can see the quality difference but truly do not feel the performance hit is worth it. I recently made the jump from 1080p 120hz to 1440p 144hz and the improvement to detail while maintaining high frame rates (usually hovering between 120-144) was a far preferable balance to my experience with 4k.

Don't get me wrong, 4K does look amazing in-game, specifically if you can push really high texture resolution. It is beautiful, but I'd much rather have smoother framerates when so many games I play don't even have high-res assets that would take advantage of the higher pixel count. Even at 1440p low res assets stand out like a sore thumb. That may contribute to why it doesn't feel like a huge upgrade in some games.
 
Oct 29, 2017
13,478
1080p is pretty decent with good AA. If you got used to playing at 1440p with good AA (or better yet, downsampling), then the jump wouldn't have been that big when you went 4K.

Dynamic resolution wouldn't be a thing if we couldn't hide it, and the fact that we have means to hide it suggests that, in practice and thanks to software, there has been a fairly gradual increase in IQ as we went from 1080p to 1440p, and from 1440p to 2160p.

You notice the difference, but between 1440p and 4K there are a bunch of intermediate variations of 1440p, to say nothing of the non-native 4K solutions between those and true 4K.
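For the curious, the hiding usually works by scaling the internal render resolution against the frame-time budget; a minimal sketch of that kind of heuristic (the thresholds and step sizes here are invented for illustration, and real engines differ):

def adjust_render_scale(scale: float, frame_ms: float,
                        target_ms: float = 16.7,
                        lo: float = 0.7, hi: float = 1.0) -> float:
    # Drop the internal resolution quickly when over budget, raise it
    # slowly when there is headroom, clamped between lo and hi.
    if frame_ms > target_ms * 1.05:
        scale -= 0.05
    elif frame_ms < target_ms * 0.90:
        scale += 0.02
    return max(lo, min(hi, scale))

# e.g. a 4K output (3840 px wide) right after a 20 ms frame:
scale = adjust_render_scale(1.0, frame_ms=20.0)
print(f"new render width: {int(3840 * scale)} px")  # 3648 px, upscaled to 4K

The upscale back to the native output resolution is what makes the drop hard to notice, especially when combined with temporal AA.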
 