
Klean

Banned
Nov 3, 2017
641
The thing is that if freesync was working properly it would completely bypass whatever sync method the developers implemented.

I don't have a freesync/gsync monitor, but does it work that way on PC if you turn on in game vsync or adaptive vsync?

Sometimes when I'd use RTSS to cap fps to 60 and use adaptive vsync, I'd get tearing even when I wasn't dropping frames.
 

Kuro

Member
Oct 25, 2017
20,764
This tech does nothing for games under 40fps, and there's still a lot of stutter if 5+ frames are dropped at once. I've noticed GSYNC only stays stable when drops are under 5 frames, around 2-4.
 

Kage Maru

Member
Oct 27, 2017
3,804
Thanks for the video, I've been meaning to test this out on my freesync monitor.

This feature would be perfect for monster hunter since they refuse to lock down the frame rate.
 

TSM

Member
Oct 27, 2017
5,830
I don't have a freesync/gsync monitor, but does it work that way on PC if you turn on in game vsync or adaptive vsync?

Sometimes when I'd use RTSS to cap fps to 60 and use adaptive vsync, I'd get tearing even when I wasn't dropping frames.

When freesync or gsync are in their functional ranges they bypass the game's own sync. When you stray outside the range or hit the maximum refresh rate of the monitor, freesync and gsync disengage and the game uses its own sync method. This is why it's important to cap your game's fps several frames below your freesync/gsync monitor's maximum refresh rate if the game can reach that level.
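As a rough sketch of that rule (illustrative only, using the 40-60hz window of the monitor in the video; this is not the actual driver logic):

```python
def vrr_active(fps, vrr_min=40, vrr_max=60):
    """Illustrative: VRR handles presentation only while the frame rate
    sits inside the monitor's window; at or above the maximum it
    disengages and the game's own sync method takes over."""
    return vrr_min <= fps < vrr_max

# An uncapped 60fps game sits right on the ceiling and falls out of VRR,
# which is why capping a few frames below the maximum keeps it engaged.
assert not vrr_active(60)
assert vrr_active(57)
assert not vrr_active(35)  # below the window VRR also disengages
```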

If you don't have a gsync/freesync monitor then the game just uses whatever sync method you have selected in the game, or the method you have selected if you override the game using your drivers.
 

Deleted member 3058

User requested account closure
Member
Oct 25, 2017
6,728
Oh, I've been plenty of times but it's always fun. Was there for EGX Rezzed and another top secret DF Retro project. ;)
Ooooh!

It's so cool that you, an O.G. in these tech-in-games streets, got noticed and hired by the fine fine folk at DF. Your output has always been great, btw.


Anyway, back to the video.. can't wait for some TVs to adopt this tech. I'd love to see it more widely adopted by the other console manufacturers in the future.
 

D65

Member
Oct 26, 2017
6,862
This WAS more like a livestream than a concisely edited video. That's exactly what it is. Just a hands on impression piece designed to be quickly edited by Rich.

We had been away at a show all weekend and I had a huge filming project to do the next day - we did this off the cuff after just receiving the monitor.

I don't understand why this is a problem since the video didn't set out to do anything more than that.

Look, the fact is, we can't do high production values on every video. It's just not possible and smaller things like this are a good way to do it.

I can't say much more other than to stress that I'm an avid viewer of DF content -- I'm not going to like every video, and I really didn't like this one. It's a huge topic, and for a 22-minute video there was more expectation. That's all it is, really. I hope that comes off as constructive criticism and not entitlement: the video is titled "Tested", and yet it was more like a preview compared to other in-depth analyses. Perhaps name it "quick look" or something.

Anyway...

The thing is that if freesync was working properly it would completely bypass whatever sync method the developers implemented. Freesync and g-sync replace the in-game sync method with their own. If the game is still dictating its own sync method then freesync is not working properly, and who knows what's really going on with it enabled in those games.

I think the bigger issue is the 40-60hz range of the monitor. Given that freesync and gsync both disable at a monitor's maximum refresh rate, the actual freesync range is 40-59hz. It's most likely far too easy to fall outside that range without the developer putting a lot of effort into making sure their game has consistent frame times with freesync enabled. Ideally the range would be from 20hz to 70+hz. This should be enough to cover all the cases the vast majority of console games would need.

I'm not so sure here. If in the NVCP I enable Gsync but keep the Vsync setting to be controlled by the application (my preference since I can set the fps cap per game), the game can choose the Vsync method -- or rather, what will be engaged when the fps creeps out of range.

And sadly, even if you get a monitor with a higher than 60hz upper limit, it won't make a single difference. The console is outputting 60hz, and that becomes the upper limit of the VRR method.

For example, if I play Lost Planet and set the refresh rate to 60hz, any fps higher than 60 causes tearing despite having a 120hz monitor.
 

ganaconda

Member
Jan 24, 2018
114
This WAS more like a livestream than a concisely edited video. That's exactly what it is. Just a hands on impression piece designed to be quickly edited by Rich.

We had been away at a show all weekend and I had a huge filming project to do the next day - we did this off the cuff after just receiving the monitor.

I don't understand why this is a problem since the video didn't set out to do anything more than that.

Look, the fact is, we can't do high production values on every video. It's just not possible and smaller things like this are a good way to do it.

Did you guys contact Microsoft about the issues you saw with their FreeSync implementation? Curious if they are aware of them and are working on fixing them.
 

bod

Member
Nov 1, 2017
158
I can't say much more other than to stress that I'm an avid viewer of DF content -- I'm not going to like every video, and I really didn't like this one. It's a huge topic, and for a 22-minute video there was more expectation. That's all it is, really. I hope that comes off as constructive criticism and not entitlement: the video is titled "Tested", and yet it was more like a preview compared to other in-depth analyses. Perhaps name it "quick look" or something.

Coming in and stating their video was a waste of time is not constructive criticism. You came across as a massive ass.
 

TSM

Member
Oct 27, 2017
5,830
I'm not so sure here. If in the NVCP I enable Gsync but keep the Vsync setting to be controlled by the application (my preference since I can set the fps cap per game), the game can choose the Vsync method -- or rather, what will be engaged when the fps creeps out of range.

And sadly, even if you get a monitor with a higher than 60hz upper limit, it won't make a single difference. The console is outputting 60hz, and that becomes the upper limit of the VRR method.

For example, if I play Lost Planet and set the refresh rate to 60hz, any fps higher than 60 causes tearing despite having a 120hz monitor.

If your monitor supports more than 60hz then you should always be in freesync if the game caps out at 60fps, as this is less than the maximum refresh rate of the monitor. The reason 60fps is bad in the case shown here is that the freesync monitor disengages at 60hz because that's its maximum refresh rate limit. I wouldn't expect games to render at greater than 60fps, but this would solve at least one of the problems. Freesync is a completely software-based solution. It should be easy for them to make sure that higher refresh rate monitors stay in freesync mode at 60hz.
 

Deleted member 3058

User requested account closure
Member
Oct 25, 2017
6,728
Should you turn vsync off in games when gsync is enabled?
Quoting from an old thread:
Code:
NVIDIA Control Panel --> Manage 3D settings -->
    Vertical Sync: On
    Monitor Technology: G-SYNC

Rivatuner Statistics Server -->
    Framerate limit --> (monitor refresh rate - 3 (e.g. 144Hz becomes 141))

In Game -->
    Full Screen
    Disable VSync

Any issues -->
    Visit pcgamingwiki

Thanks to D65 and LowParry for the info.

The rivatuner step is optional but has good results.

Do all of these for the best results in every game.
 

TSM

Member
Oct 27, 2017
5,830
Should you turn vsync off in games when gsync is enabled?

It doesn't really matter that much. With gsync your vsync selection only comes into play when you hit the maximum refresh rate of your monitor. So if you turn vsync off you will get no tearing until you hit a frame rate that matches or exceeds your display (60/120/144/240/etc.). With vsync on, when you hit the maximum you just get a little more lag. If a game is running consistently at your monitor's maximum refresh rate, you should either cap the frame rate several frames lower than the max, or decide whether you'd rather have slight tearing (no vsync) or slight input lag (vsync).
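That trade-off can be written out as a tiny decision sketch (the function name and return values are mine, purely illustrative; this is not an actual driver API):

```python
def presentation_mode(fps, max_hz, vsync_on):
    """Illustrative: what ends up handling the frame with gsync enabled."""
    if fps < max_hz:
        return "gsync"  # inside the VRR window: the vsync setting is moot
    # at or above the monitor's ceiling, gsync disengages
    return "vsync (extra lag)" if vsync_on else "no sync (tearing)"

assert presentation_mode(100, 144, vsync_on=False) == "gsync"
assert presentation_mode(144, 144, vsync_on=False) == "no sync (tearing)"
assert presentation_mode(144, 144, vsync_on=True) == "vsync (extra lag)"
```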
 
Oct 27, 2017
9,435
Can someone explain why they thought that hooking up a freesync monitor would impact frame pacing if it's CPU limited? Were they expecting the X's CPU to power through it in the Dark Souls example? I feel like I'm missing something since the game was capped at 30fps.
 
Oct 27, 2017
9,435
It doesn't matter. With gsync your refresh rate selection only comes into play when you hit the maximum refresh rate of your monitor. So if you turn vsync off you will get no tearing until you hit a frame rate that matches or exceeds your display (60/120/144/240/etc.). With vsync on, when you hit the maximum you just get a little more lag.

What? No. You don't want vsync on; by doing so you're pretty much losing the whole point of having gsync.
 

TSM

Member
Oct 27, 2017
5,830
What? No. You don't want vsync on; by doing so you're pretty much losing the whole point of having gsync.

Yeah, I made an edit, but it doesn't really matter which one they select as it only comes into play when gsync is at the maximum refresh rate of the monitor. As long as they are under the maximum refresh rate of their monitor the setting does nothing while gsync is enabled. Them hitting the maximum refresh rate of their monitor is a separate issue.
 

DavidDesu

Banned
Oct 29, 2017
5,718
Glasgow, Scotland
Wonder how much reach this will have in the short term when games still need to be optimised for the vast majority of TVs that won't have this feature. I really do hope it's a feature in every TV within 5 years though.
 
Oct 27, 2017
9,435
Yeah, I made an edit, but it doesn't really matter which one they select as it only comes into play when gsync is at the maximum refresh rate of the monitor. If you are under the maximum refresh rate of your monitor the setting does nothing. Them hitting the maximum refresh rate of their monitor is a separate issue.

But it does. If you have a 120hz monitor, let's say, and the game is floating around 100-110fps, the screen is only going to display 60fps with in-game vsync on. Whereas with it off, you will be getting matching frames up to 120fps. So let's say that during one second it renders 110 frames: you will see those 110 frames as they are rendered.
 

TSM

Member
Oct 27, 2017
5,830
But it does. If you have a 120hz monitor, let's say, and the game is floating around 100-110fps, the screen is only going to display 60fps with in-game vsync on. Whereas with it off, you will be getting matching frames up to 120fps. So let's say that during one second it renders 110 frames: you will see those 110 frames as they are rendered.

You are misunderstanding how gsync/freesync work. They are a driver level override for the game's own vsync implementation. While you are within the functional range of your monitor the drivers completely ignore the software's vsync setting and handle the buffer themselves.
 

D65

Member
Oct 26, 2017
6,862
If your monitor supports more than 60hz then you should always be in freesync if the game caps out at 60fps, as this is less than the maximum refresh rate of the monitor. The reason 60fps is bad in the case shown here is that the freesync monitor disengages at 60hz because that's its maximum refresh rate limit. I wouldn't expect games to render at greater than 60fps, but this would solve at least one of the problems. Freesync is a completely software-based solution. It should be easy for them to make sure that higher refresh rate monitors stay in freesync mode at 60hz.

No, you misunderstand: if the OS (or more relevantly, the GPU) is sending out a 60hz signal, that is all you're going to get. The monitor will be in a 60hz mode, which means anything close to 60fps and freesync will be disengaged, regardless of whether your monitor supports 120hz or not.
 

D65

Member
Oct 26, 2017
6,862
It doesn't really matter that much. With gsync your vsync selection only comes into play when you hit the maximum refresh rate of your monitor. So if you turn vsync off you will get no tearing until you hit a frame rate that matches or exceed your display (60/120/144/240/etc.) With vsync on when you hit the maximum you just get a little more lag. If a game is running consistently at your monitor's maximum refresh rate you should either cap the frame rate several frames lower than the max or decide if you'd rather have slight tearing (no vsync) or slight input lag (vsync).

"little more lag"

No, you get DOUBLE the input latency when vsync kicks in; this is why an fps limit is so important.

When you cap 3fps below the maximum (for Gsync; lower for Freesync) you prevent Gsync from disengaging.
 
Oct 27, 2017
9,435
You are misunderstanding how gsync/freesync work. They are a driver level override for the game's own vsync implementation. While you are within the functional range of your monitor the drivers completely ignore the software's vsync setting and handle the buffer themselves.

I did a little reading and I see what you mean. I was incorrect.

No, you misunderstand: if the OS (or more relevantly, the GPU) is sending out a 60hz signal, that is all you're going to get. The monitor will be in a 60hz mode, which means anything close to 60fps and freesync will be disengaged, regardless of whether your monitor supports 120hz or not.

I was talking about Gsync though; either way, what I was saying wasn't right.
 

TSM

Member
Oct 27, 2017
5,830
No, you misunderstand: if the OS (or more relevantly, the GPU) is sending out a 60hz signal, that is all you're going to get. The monitor will be in a 60hz mode, which means anything close to 60fps and freesync will be disengaged, regardless of whether your monitor supports 120hz or not.

It's a software solution. It should be trivial for AMD to work with Microsoft and keep higher refresh rate freesync monitors active in freesync at 60hz. This could be as simple as capping games at 58 or 59fps when in freesync mode.
 

D65

Member
Oct 26, 2017
6,862
What? No. You don't want vsync on; by doing so you're pretty much losing the whole point of having gsync.

You want Vsync on, but limit the fps to 3 below the refresh rate for Gsync.

Guys I posted the battlenonsense video in this thread for a reason.

Sometimes I like to benchmark in new games (I know Gsync affects maximum fps) a bit so I keep Vsync on application controlled, but generally speaking you should have it forced on in the NVCP and then put on the limit.

But it's not a big deal to not have it on: at -3fps you will very rarely go over the limit, and you may (may) get a single tear within minutes of gameplay. At the very least -3 is a very safe amount.
 

AntiMacro

Member
Oct 27, 2017
3,149
Alberta
Oh my god, this video is a waste of time.

--

I'll drop this here just in case people wanted to know about FreeSync and how it compares to Gsync https://www.youtube.com/watch?v=mVNRNOcLUuA -- which is probably something they thought this 22 minute video would have at least touched on.
I think if you went into a video titled 'Xbox One X FreeSync/ Adaptive Sync Tested: Smoother, Faster, Better?', from a Eurogamer article titled 'FreeSync display support tested on Xbox One X', linked to from a post titled 'Digital Foundry tests FreeSync support on Xbox One X', expecting to find coverage of Gsync, the disappointment at it being simply early coverage of FreeSync on Xbox One X's current beta implementation is probably on you.
 

severianb

Banned
Nov 9, 2017
957
I feel like when they release actual TVs with this it's gonna cost an arm and a leg

Nope. It's part of the HDMI 2.1 standard, which will be on pretty much everything starting next year. Samsung will be updating a lot of this year's TVs with the VRR feature, even the modestly priced NU8000 series.

Well this is disappointing

How? It's Beta software using a cheap monitor. I think it's the start of a nice feature that will be widely available on consoles and TVs in the next year or two.
 

Atisha

Banned
Nov 28, 2017
1,331
Sounds like it's early days but I'm very excited for adaptive-sync to take hold in the Console and TV space.
Hopefully we'll see a nice 30-120hz HDMI 2.1 TV by next-gen in 2020 (and not the limited 40-60hz monitor they're testing with here).
Exactly.

Will this give 30fps games a 60fps look?

Yes.

A game running 30fps on a 60hz monitor (half refresh) will smear / interpolate / frame-double whenever you move the camera past a certain (microscopic) speed threshold, which will make the changing game imagery blurry.

If the variable refresh monitor is capable of dropping down to 30hz and running in sync with the 30fps game, interpolation / artifacting will not occur, and the image of 'the game in motion', for instance when you pivot the camera, will remain sharp and clear.
 

D65

Member
Oct 26, 2017
6,862
It's a software solution. It should be trivial for AMD to work with Microsoft and keep higher refresh rate freesync monitors active in freesync at 60hz. This could be as simple as capping games at 58 or 59fps when in freesync mode.

No this is my point.

AFAIK the Xbox does not output at higher than 60hz. I don't know if the OS will even allow it. We would be getting HFR on Xbox before we have this functionality.

It's not exactly trivial, since AMD hasn't fixed this issue with desktop freesync. As mentioned in a previous post, to stop freesync from disengaging, Battle(non)sense needed to cap the fps at 130fps (14 below 144) to ensure that the fps counter never went over the limit.

If TVs are made to disengage at 60hz because we're sending a 60hz signal, we might struggle to get VRR working properly on Xbox without a big overhaul.
 

D65

Member
Oct 26, 2017
6,862
I think if you went into a video titled 'Xbox One X FreeSync/ Adaptive Sync Tested: Smoother, Faster, Better?', from a Eurogamer article titled 'FreeSync display support tested on Xbox One X', linked to from a post titled 'Digital Foundry tests FreeSync support on Xbox One X', expecting to find coverage of Gsync, the disappointment at it being simply early coverage of FreeSync on Xbox One X's current beta implementation is probably on you.

I'll address this once more. That video does far more than just compare the two, but it describes the differences.

Regardless, the discussion in this thread is full of questions that very video answers about how it works on PC. The point is, DF had barely any information about why things might be wrong, didn't do any proper testing, and didn't properly define for the layman what it does anyway. They barely "tested" this; it's misleading and a definite waste of 22 minutes.

Hopefully, MS understands that they can't just throw VRR support in and hope for the best. Freesync has some quirks that cause issues.

I also would like to know if the setting is increasing input latency. From prior PC testing... in theory it should be, but by a small amount.
 

Lumination

Member
Oct 26, 2017
12,506
So you're telling me that now I need a 4k OLED low input lag Freesync TV. Dear god.

But good on MS. I'm very eager to see their next gen console offering at the pace they're going now.
 

Railgun

Member
Oct 27, 2017
3,148
Australia
Nope. It's part of the HDMI 2.1 standard which will be on pretty much everything starting next year. Samsung will be updating a lot of this years TVs with the VRR feature, even the modestly priced NU8000 series.



How? It's Beta software using a cheap monitor. I think it's the start of a nice feature that will be widely available on consoles and TVs in the next year or two.
It's beta software that is going public next week; I don't see how this isn't disappointing when it sometimes works like it does on PC but other times doesn't. The video wasn't very positive either.
 

Pargon

Member
Oct 27, 2017
12,048
I could be wrong - perhaps you have been supplied with information from Microsoft about what they're doing - but I think you made some assumptions here that are not correct.
I should point out that I don't have any FreeSync hardware, but do have experience with G-Sync - and a lot of the issues that you have reported appear to behave exactly the same as they do on a G-Sync monitor.

It doesn't look like there is any kind of low framerate compensation active here at all.
For LFC to operate, the maximum refresh rate must be at least double the minimum refresh rate - and FreeSync requires 2.5x because games are highly variable.
With a 40-60Hz range, it could not possibly support LFC because 39x2 = 78Hz, and for FreeSync 39x2.5 = 97.5Hz.

So with a 40Hz minimum refresh rate, the monitor would have to be running at 100Hz - not just supporting it.
I don't believe the Xbox One will output anything higher than 60Hz, which means that a display would have to support a range of 24-60Hz for LFC to be active, if the Xbox even supports LFC. Nothing currently supports a range that low.

What you are seeing when the monitor reports 60Hz on the counter for games like Dark Souls 3 is FreeSync being deactivated, not some form of LFC.
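Those numbers check out as simple arithmetic (the 2x and 2.5x multipliers are the figures quoted above; treat this as a back-of-the-envelope check, not a spec):

```python
def lfc_possible(min_hz, max_hz, factor=2.5):
    """LFC needs the max refresh to be at least `factor` times the min,
    so frames that dip below the window can be doubled back into it."""
    return max_hz >= factor * min_hz

assert not lfc_possible(40, 60)            # the 40-60hz monitor in the video
assert not lfc_possible(40, 60, factor=2)  # fails even at the bare 2x rule
assert lfc_possible(40, 100)               # a 40hz minimum needs a 100hz panel
assert lfc_possible(24, 60)                # the 24-60hz range a 60hz output would need
```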


Now, as for tearing: VRR can only guarantee a tear-free experience inside the active range.
If you are outside the active range and V-Sync is disabled, it will tear.
If you go outside the active range with V-Sync enabled, it transitions to standard V-Sync behavior.

With an LFC-capable setup, the active range is effectively 0 FPS up to the display's maximum refresh rate.
But with G-Sync, if you disable V-Sync you will find that you can still get tearing when the framerate is pushing against the upper limit of the display's range.

This is why games that otherwise run well, but set the display to 60Hz automatically, often have people reporting screen tearing problems on G-Sync monitors.
Bayonetta is one example where I saw that complaint a lot. It runs at 60 FPS on a modest gaming PC, but since it sets the refresh rate to 60Hz, that will cause tearing if V-Sync is disabled - despite G-Sync also being enabled.
With my ASUS monitor, there is a "Turbo" button that will force the refresh rate to its maximum while the game is running to prevent that, and for other monitors there are other potential solutions like running in borderless mode or using software that lets you set the refresh rate via a keyboard hotkey.

My understanding is that this is caused by frame-time spikes that most software won't pick up.
With a 100Hz G-Sync display, I have found that under extreme circumstances, you can push the display into tearing in some games as much as 20% below the maximum refresh rate if you're doing things to cause highly variable performance in the game, if V-Sync is disabled.

But that's why G-Sync originally didn't give you the option to disable V-Sync at all.
With G-Sync and V-Sync enabled, it should not be possible for a game to tear at any framerate. What can happen is that if your framerate hits the maximum refresh rate, it will transition over to standard V-Sync behavior and start to feel quite laggy - particularly if you are using a mouse.
This is why people with G-Sync displays use a framerate limiter (RTSS is recommended) to cap the framerate 3 FPS below the maximum refresh rate. That 3 FPS cap prevents the display transitioning out of G-Sync into V-Sync behavior, and you still benefit from having V-Sync enabled to prevent sudden frame-time changes from tearing (a framerate limiter alone cannot do this) without fully transitioning over to V-Sync behavior and adding latency.
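The capping rule itself is trivial to state (helper name is mine; 3 FPS is the margin recommended above):

```python
def gsync_fps_cap(max_refresh_hz, margin=3):
    """Cap a few frames under the ceiling so frame-time spikes can't
    push the display out of G-Sync into plain V-Sync behavior."""
    return max_refresh_hz - margin

assert gsync_fps_cap(144) == 141  # the usual 144Hz -> 141 FPS recommendation
assert gsync_fps_cap(100) == 97
```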

That's likely why games such as F1 2017 are still tearing even if the framerate is reporting 60 or is in the very high 50s.


As for Final Fantasy XV stuttering inside the VRR range, well I don't have that specific game on PC and the situation may be different on the Xbox anyway.
I have found several games on PC which have stuttering issues that G-Sync cannot solve however. As far as I can tell, it's a frame presentation issue with the game itself.
SOMA and Dishonored 2 both look like they have very bad "microstutter" on a G-Sync display if you disable their internal 60 FPS framerate limiters.
Dishonored 2 running at 90 FPS via G-Sync on my 100Hz display - whether it's using the internal framerate limiter or an external one - looks like it's running at 45 FPS. Use the internal 60 FPS cap, and it runs buttery smooth (so long as you have the latest patch).
So in some situations it appears to be the engine at fault.


It will be interesting to see what Samsung do with their 2018 TVs.
The problem with limited VRR ranges is that most LCD panels do not like refreshing at less than 40Hz or so. LCDs cannot hold an image indefinitely; the image will start to fade to white the longer it is held, so they need to refresh in order to sustain it and prevent flickering. I don't believe that is the case for OLED though.
So I don't expect them to natively support <40Hz despite being a television with "24Hz" capabilities.

But their televisions are already using 4K 120Hz native panels.
You cannot send them a 4K120 signal due to the current limitations of HDMI, but they update at 120Hz for things like motion interpolation and displaying 24p content without judder.
In theory - and this is pure speculation on my part - a television with a 40-120Hz capable panel could report that it supports a 1-60Hz range for input, and perform LFC inside the display using a range of 48-120Hz since the panel itself can do it. But I'm being very hopeful there - nothing currently does this.
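That hypothetical in-display LFC would amount to multiplier selection: repeat each frame enough times that the refresh lands inside the panel's native window. A sketch using the speculative 48-120Hz panel range from the paragraph above:

```python
def lfc_refresh(content_fps, panel_min=48, panel_max=120):
    """Hypothetical: smallest integer multiple of the content rate that
    lands inside the panel's native VRR window; None if none fits."""
    for n in range(1, 10):
        if panel_min <= content_fps * n <= panel_max:
            return content_fps * n, n
    return None

assert lfc_refresh(24) == (48, 2)  # 24p doubled into the window
assert lfc_refresh(30) == (60, 2)
assert lfc_refresh(55) == (55, 1)  # already inside, no doubling needed
```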
Samsung's higher end TVs from 2018 are supposed to get a software update with support for 120hz and Freesync later this year.
60Hz VRR.
120Hz will need full bandwidth HDMI 2.1 support - which will hopefully be the 2019 displays.
 

Prine

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
15,724
X continues to prove it's the best gaming device ever made. The forethought the engineers have put into this machine is an example and benchmark for any machine that comes after it; MS has one hell of a team for hardware.
 

UltraDSA

Member
Oct 28, 2017
16
The video states that the freesync feature is simply software enabled; does that also apply to the monitor? What I'm asking is: why do some monitors that aren't branded freesync work with freesync, and some don't? I have an old 120hz Samsung monitor, and if/when Sony decides to implement freesync, it would be neat to try it out before I do any kind of TV upgrading when the time comes around.