I was under the same impression but yeah you need to enable it.
I suggest checking this guide and also this video:
Hold on, so it's actually better not to have V-Sync enabled at the driver level, but rather inside the games?
In my experience the opposite is true. Far more games suffer from issues with their own V-Sync implementation than having it enabled in the NVIDIA Control Panel.
The most important thing is that you do have V-Sync on, though, whether that's in the control panel or in the game. However:
In my experience it is extremely rare for driver-level V-Sync to cause problems, and far more common that I want to disable a game's own V-Sync because of some issue its implementation causes.
- Some games do not provide a V-Sync option.
- The V-Sync implementation in some games is bad, and causes it to stutter (even with G-Sync).
- Some games silently apply a frame rate limiter when V-Sync is enabled or disabled, so you may need to toggle the in-game option as a way of controlling that.
So it's a lot less effort to enable V-Sync globally and make per-game exceptions where required than to set it on a per-game basis.
Similarly, NULL (NVIDIA Ultra Low Latency) causes more problems than it fixes in my experience, so I would disable it globally (set low latency to "On", not "Ultra") and only enable it on a per-game basis where required; e.g. in games that have no frame rate limiter of their own and where external tools like RTSS don't work.
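For what it's worth, the basic idea behind a frame rate limiter like the ones mentioned above is simple: pad every frame out to a fixed interval so the game never renders faster than the cap. This is just a minimal sketch of that idea in Python, not how RTSS or any driver limiter is actually implemented:

```python
import time

def limited_loop(render_one_frame, frames, target_fps):
    """Render `frames` frames, padding each one out to the target
    interval with a sleep, the way a basic frame rate limiter does."""
    interval = 1.0 / target_fps
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render_one_frame()
        next_deadline += interval                 # fixed cadence, no drift
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:                         # only sleep if ahead of schedule
            time.sleep(sleep_for)

# Example: cap a trivial "game" at 100 FPS for 10 frames (~0.1 s total).
start = time.perf_counter()
limited_loop(lambda: None, frames=10, target_fps=100)
elapsed = time.perf_counter() - start
```

Capping a few FPS below the monitor's maximum refresh is what keeps G-Sync active instead of falling back to regular V-Sync behaviour at the top of the range.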
Okay, thanks for the clarification. Yeah, that's where I was standing too, mostly, so I was surprised to see the video saying the opposite.
I think it's more that Chris likes to tweak everything so he personally leaves it up to the game, and would probably enable it on a game profile if the game's own implementation has issues.
I'd rather use a solution that works for most games by default, with minimal intervention required, than one which requires every game to be configured individually.
Correct, the video stated that V-Sync in some games might trigger other engine optimizations that make it work better, but it can also have the opposite effect depending on the engine/developer (bad frame pacing, etc.), so it seems it might be better to enable V-Sync in the NVCP globally just to be on the safe side.
Got my first freesync monitor a few days ago. Biggest surprise is the classic games like Unreal and Duke 3D, which suddenly run much smoother. They have terrible hitching with vsync & 60Hz, but they're smooth like they used to be many years ago with freesync and higher refresh rates! Did not expect that.
It also really does improve latency for a lot of games - low spec and high.
However, RetroArch, even properly tweaked, didn't really feel much different. My old monitor was very low latency for 60Hz, and of course I was tweaking the hell out of latency settings. But I don't perceive any difference from what was probably ~20ms latency to ~5ms. I'd like to compare it to a CRT on real hardware to see how much more latency a real SNES has. I bet I would feel the jump from ~5ms latency to ~40ms on a real SNES and CRT.
Does G-Sync make 40 FPS feel the same as 60 FPS? What's the game-changing thing about this compared to triple buffering?
It doesn't do that, but it makes things feel much smoother when the framerate fluctuates. When I'm playing a game that can't consistently hit 60, it definitely doesn't feel like a consistent 60, but it feels like a consistent something, like the game isn't chugging or speeding up intermittently.
No, 40 FPS feels like 40 FPS.
Are you using the rollback option in RetroArch? I found that it helps latency a whole lot. In general, going from my 60Hz TV to my 144Hz monitor for RetroArch feels way better, but maybe the input lag on my other TV was higher to begin with.
I think runahead has done so much to improve latency in emulation that those benefits are less noticeable now - at least in games where there's a couple of frames you can shave off.
That said, you can still combine G-Sync with runahead to reduce latency further.
G-Sync removes latency from the display end of the equation, while runahead removes latency from the game.
But make sure that you have RetroArch configured correctly: it has a "sync to exact content frame rate" option you should enable.
This is required for it to work correctly with VRR displays, and can make a big difference with arcade games such as R-Type which run at 55Hz rather than the typical 60 (curiously my monitor reports a fluctuating 52-55 FPS when it's running, but the image is perfectly smooth).
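For anyone setting this up, the relevant retroarch.cfg entries look something like the following. I'm going from memory on the exact key names, so double-check them against your own config; the run-ahead frame count is just an example and depends on the game:

```ini
# "Sync to Exact Content Framerate": lets a VRR display track the
# core's native rate (e.g. 55Hz arcade games) instead of forcing 60.
vrr_runloop_enable = "true"
video_vsync = "true"

# Run-ahead: emulate ahead and roll back to shave frames of input latency.
run_ahead_enabled = "true"
run_ahead_frames = "1"
run_ahead_secondary_instance = "true"
```

These can also be toggled in the UI under Settings > Frame Throttle (sync to content) and Settings > Latency (run-ahead).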
G-Sync monitors are so absurdly expensive here :/
Are GTX 970 cards even compatible with FreeSync? I keep getting mixed messages from Google searches.
From what I've read, it's just 10-series cards and up, afaik.
Nope. But it does smooth out a lot of the jittering when framerates are bouncing around.
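To make that "smooths out jittering" point concrete, here's a toy model (my own illustration, not from the thread): a game taking ~22ms per frame (about 45 FPS) on a fixed 60Hz display gets its frames snapped to vblank boundaries, producing an uneven 16.7/33.3ms cadence, while a VRR display can show each frame the moment it's done. It assumes rendering never blocks (a triple-buffering-like simplification):

```python
import math

def present_times(render_ms, vrr, refresh_hz=60.0):
    """Model when each frame appears on screen.

    With VRR, the display refreshes the moment a frame is ready.
    With fixed refresh, a finished frame waits for the next vblank.
    """
    scanout = 1000.0 / refresh_hz
    t, shown = 0.0, []
    for r in render_ms:
        t += r                                        # frame finishes here
        shown.append(t if vrr else math.ceil(t / scanout) * scanout)
    return shown

def deltas(times):
    """Frame-to-frame gaps in ms, i.e. how long each frame is on screen."""
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

renders = [22.2] * 8                                  # steady ~45 FPS game

fixed_deltas = deltas(present_times(renders, vrr=False))
vrr_deltas = deltas(present_times(renders, vrr=True))

print(fixed_deltas)  # mix of ~16.7 and ~33.3 ms steps: visible judder
print(vrr_deltas)    # steady 22.2 ms steps: smooth
```

Even though the average frame rate is identical in both cases, the fixed-refresh cadence keeps alternating between one and two refresh intervals, which is exactly the jitter G-Sync/FreeSync removes.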
Do FreeSync monitors do both FreeSync and G-Sync? I just bought an NVIDIA gaming laptop but also use my X on it. I'll be using next-gen AMD consoles on it as well.
Your nVidia laptop should support FreeSync over HDMI with the latest drivers, assuming the monitor supports HDMI 2.0b or later. Get a FreeSync display with that and you should be good to go.
Does it? I thought NVIDIA still only supported G-Sync and VESA Adaptive-Sync via DisplayPort (which AMD brands as "FreeSync") and HDMI-VRR.
FreeSync-over-HDMI is a different standard from HDMI-VRR. The latter was introduced as part of the HDMI 2.1 spec and back-ported to 2.0b, while the former is a proprietary AMD extension.
The Acer Predator 27: 165Hz IPS Gsync 1440p monitor is at an all time low $379!!!
https://www.amazon.com/Acer-Predator-XB271HU-bmiprz-2560x1440/dp/B06ZXZ3QBD (not a referral link)
Damn, that's a REALLY good deal. I have this monitor and it's great.
If you're considering a G-Sync monitor, this is one of the best you can get at 1440p. Absolutely get it if you can.
If I wasn't holding out for the 4K version, I'd be all over this again.