Digital Foundry: Beyond 60FPS: How Running Games at 144FPS/240FPS Can Improve The Gameplay Experience

blumen_dieb

Member
Mar 18, 2019
311
Hypothetical question here. If you had a 2060S and were planning to upgrade your monitor, would you go with 144Hz at 1080p or at 1440p?

A future GPU upgrade is not on the horizon for me, and I'm wondering what the longevity of the 2060S will be at 1440p. It's hard to accept the 440 Euro price tag of 144Hz/1440p monitors. Maybe it's better to stick with 1080p and high frame rates? Hardest upgrade decision I've had to make.
 

blumen_dieb

Member
Mar 18, 2019
311
I'm new to this, but one thing that's confusing me is how to properly enable FreeSync.

VSync off? Enhanced Sync off? Radeon Chill? RivaTuner? Limit FPS via in-game settings?


I own a Ryzen 3700X and Radeon 5700XT, btw. Advice around Reddit seems a bit conflicting.
Check your monitor's settings and turn it on there. Then check the Radeon settings and turn it on there. Once it's on, you don't need VSync.
 

Zonal Hertz

Member
Jun 13, 2018
485
That’s pure marketing bla bla, like Super Retina displays, where you need a magnifying glass to see the difference. Ridiculous to slow down the video to show the difference when the human eye is not even capable of doing 60 FPS.
I can't tell if this is sarcastic or not?

But it's very easy to tell the difference between 60Hz and 144Hz when you're playing a game, especially anything on mouse and keyboard that's remotely responsive.
 

MCD

Member
Oct 27, 2017
1,707
Check your monitor's settings and turn it on there. Then check the Radeon settings and turn it on there. Once it's on, you don't need VSync.
Doing that made my FPS jump about 10 above my monitor's refresh rate (monitor is 144Hz; it jumped to 15x or so). I read online that's not good or something.

In the end I limited the framerate via Overwatch's in-game settings with VSync/Enhanced Sync off. I have yet to try RivaTuner, and I wanted to try Radeon Chill, but my Radeon drivers aren't detecting Overwatch automatically and adding it manually seems to do nothing.
 

blumen_dieb

Member
Mar 18, 2019
311
Doing that made my FPS jump about 10 above my monitor's refresh rate (monitor is 144Hz; it jumped to 15x or so). I read online that's not good or something.

In the end I limited the framerate via Overwatch's in-game settings with VSync/Enhanced Sync off. I have yet to try RivaTuner, and I wanted to try Radeon Chill, but my Radeon drivers aren't detecting Overwatch automatically and adding it manually seems to do nothing.
I don't think having frames over the capacity of your monitor has any effect. You just don't see them. You can turn Radeon Chill on universally for all games and cap everything to 144 if you are worried. Not sure what you mean about the drivers detecting Overwatch.
 

MCD

Member
Oct 27, 2017
1,707
I don't think having frames over the capacity of your monitor has any effect. You just don't see them. You can turn Radeon Chill on universally for all games and cap everything to 144 if you are worried. Not sure what you mean about the drivers detecting Overwatch.
When you open Radeon Settings and go to the Games tab, there's an auto-detect button there, or you can add an exe manually. It can't auto-detect OW, and if I add OW manually and enable Radeon Chill and Anti-Lag, I don't really see anything changing with the FPS.

I'm not home where my PC is and can't test anything out for at least two weeks, but any tips are appreciated.
 

dgrdsv

Member
Oct 25, 2017
2,690
Msk / SPb, Russia
Hypothetical question here. If you had a 2060S and were planning to upgrade your monitor, would you go with 144Hz at 1080p or at 1440p?
Depends on what monitor you have right now. If it's <=1080p/60 then either option is a good upgrade. If it's above 1080p or 60Hz, then going with less will feel like a downgrade. The same goes for its size.
But personally I would aim for 1440p/120-144 with adaptive sync. It's fine to hold off the purchase for a few months to save up the money for it.
The 2060S isn't an issue, as you'll almost certainly upgrade the GPU before you upgrade your monitor again, and it'll be fine for 1440p for the next couple of years.
 

blumen_dieb

Member
Mar 18, 2019
311
When you open Radeon Settings and go to the Games tab, there's an auto-detect button there, or you can add an exe manually. It can't auto-detect OW, and if I add OW manually and enable Radeon Chill and Anti-Lag, I don't really see anything changing with the FPS.

I'm not home where my PC is and can't test anything out for at least two weeks, but any tips are appreciated.
I don't have an AMD card anymore, but when I did (last month), I would just open the Radeon settings, go to the general settings tab, turn on Chill, and then adjust the min and max frame rates right below it. I didn't bother to check whether a specific game was detected; it was just an overall setting for all games. I did this because the card was running really hot and louder than I wanted while pushing frame rates above 75Hz, which was unnecessary.
 

blumen_dieb

Member
Mar 18, 2019
311
Depends on what monitor you have right now. If it's <=1080p/60 then either option is a good upgrade. If it's above 1080p or 60Hz, then going with less will feel like a downgrade. The same goes for its size.
But personally I would aim for 1440p/120-144 with adaptive sync. It's fine to hold off the purchase for a few months to save up the money for it.
The 2060S isn't an issue, as you'll almost certainly upgrade the GPU before you upgrade your monitor again, and it'll be fine for 1440p for the next couple of years.
I have a 75Hz 1080p screen. I lost FreeSync when upgrading the GPU, and that's what is making me antsy to upgrade more than anything else. But you're right. I'll save up for a few months and pick something at 1440p that works with G-Sync. Monitor naming/numbering schemes make my head ache.
 

MCD

Member
Oct 27, 2017
1,707
I don't have an AMD card anymore, but when I did (last month), I would just open the Radeon settings, go to the general settings tab, turn on Chill, and then adjust the min and max frame rates right below it. I didn't bother to check whether a specific game was detected; it was just an overall setting for all games. I did this because the card was running really hot and louder than I wanted while pushing frame rates above 75Hz, which was unnecessary.
Ahhh. OK OK. I guess I don't need to add OW manually then.

Thank you.
 

TheMadTitan

Member
Oct 27, 2017
5,263
I'm not going above 60 until TVs with higher refresh rates release at affordable prices. Wanting to move from monitor gaming to TV gaming is why I've been holding off on 1440p monitors.
 

Ash735

Member
Sep 4, 2018
609
One thing I've always wondered about recently is WHY 144Hz? Why is that the popular one? It feels like it was settled on by NVIDIA to promote G-Sync. 120Hz should have been the next logical step as a fixed (non-variable) refresh rate, as it'd be the most compatible with current standards across the board:

24fps [Movies/TV Shows] = each frame repeated x5 to fit 120Hz
30fps [TV/Recorded Console Games/Internet Videos] = each frame repeated x4 to fit 120Hz
60fps [Recorded Console Games/Internet Video/Majority of capped PC Games] = each frame repeated x2 to fit 120Hz

So you could set the refresh rate ONCE and have it comfortably display most formats you'd be viewing. I suppose 240Hz solves this minor problem, but even though 120Hz and 144Hz are close, the latter is pushed more.
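The divisibility argument above is easy to check yourself. A quick sketch (a hypothetical helper, not anything from the thread) shows which common content frame rates fit into a refresh rate with a whole number of repeated frames:

```python
# For a given display refresh rate, report how many times each content
# frame rate's frames would repeat, or None if it doesn't divide evenly
# (uneven repeats are what cause judder on a fixed-refresh display).
def even_multiples(refresh_hz, content_fps=(24, 30, 60)):
    """Return {fps: repeat_count or None} for a given refresh rate."""
    return {fps: refresh_hz // fps if refresh_hz % fps == 0 else None
            for fps in content_fps}

print(even_multiples(120))  # {24: 5, 30: 4, 60: 2} -- everything fits evenly
print(even_multiples(144))  # {24: 6, 30: None, 60: None} -- 30/60fps judder
print(even_multiples(240))  # {24: 10, 30: 8, 60: 4} -- also fits evenly
```

So 120Hz and 240Hz accommodate 24/30/60fps content cleanly, while 144Hz only divides evenly by 24.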
 

Omnistalgic

Member
Oct 27, 2017
2,713
NJ
Enjoying your games despite the hyperbole in here, hopefully.
Lol, yeah, I guess so. But I already feel like I'm countering and pressing the guard button in time in DMCV, and I still get bopped. The timing isn't that strict, and I'm sure sometimes it's the input delay of the TV and wireless controller. We need to get up to monitor standards first before we start thinking about above 60fps as the console standard.

But I truly do hope that with the CPU upgrade, we finally get more 60fps AAA titles.
 

Syril

Member
Oct 26, 2017
4,138
Lol yeah I guess so. But I already feel like I’m countering and pressing the guard button in time in DMCV, and I get bopped. The timing isn’t that strict and I’m sure sometimes it’s the input delay of the TV and wireless controller. We need to get up to monitor standards first before we start thinking about above 60fps as console standard.

But I truly do hope with the upgrade to CPU, we finally have more 60fps AAA titles.
The TV can be a huge factor, especially the ones that have motion smoothing and other such effects turned on by default.