
Pargon

Member
Oct 27, 2017
12,126
Oh interesting, that explains a lot actually - thanks for clearing that up for me! So what I was fighting the whole time was just foolishness and misunderstanding on my part; the more you know. I can spot whether FSO is working correctly if the game appears to be running borderless but actually minimizes when I ALT+TAB, right? Is there any way to whitelist a DX11 game manually?
I would normally confirm it by having the game set to the "exclusive" or "fullscreen" option and then hit WIN+G to bring up the Game Bar.
The screen will flash twice if it's still using "real" full-screen exclusive mode, or the Game Bar will be displayed if FSO has upgraded it to flip-mode.
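
If you'd rather not eyeball it with the Game Bar, Intel's PresentMon tool can log the presentation mode directly. A quick sketch in Python, assuming a 1.x build of PresentMon on your PATH (flag names can differ between versions, and the process name is a placeholder):

    import csv
    import subprocess

    GAME = "game.exe"          # placeholder - use your game's process name
    LOG = "presentmon_log.csv"

    # Capture five seconds of presentation data for the game.
    subprocess.run(["PresentMon.exe", "-process_name", GAME,
                    "-output_file", LOG, "-timed", "5",
                    "-terminate_after_timed"], check=True)

    # The PresentMode column tells you how frames reach the screen:
    # "Hardware: Legacy Flip"      -> real full-screen exclusive
    # "Hardware: Independent Flip" -> FSO has upgraded it to flip-model
    # "Composed: ..."              -> going through DWM composition
    with open(LOG, newline="") as f:
        print({row["PresentMode"] for row in csv.DictReader(f)})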

I don't think there's any way to get Windows 10 to apply FSO to other games - though I wish it could.
I hope Microsoft doesn't abandon its development after seeing so many people do their best to disable full-screen optimizations because they don't understand it - like they did with Game Mode.
The 1709 release of Windows 10 even applied FSO to borderless games, not just "exclusive" ones, but that seemed to cause problems for some people and was disabled in the next update.

And another thing I want to bring up: I have the hardest time with G-Sync and DX12 games. I can trigger it semi-reliably in Forza Horizon 3 and 4 by pressing ALT+ENTER over and over, so the game constantly shifts from windowed back to FSO fullscreen until, at some point, G-Sync finally picks it up. I confirmed it with the Nvidia G-Sync indicator and also via my monitor's OSD. But back when I tried the Modern Warfare beta, which only ran on DX12, I couldn't for the life of me get G-Sync to trigger at all. The only time I could get it working was in windowed mode with borders; even if I manually removed the borders afterwards with Borderless Gaming, G-Sync would just shut off. Is there any "trick" to DX12 games, perhaps?
I don't know about the Modern Warfare beta, but the Forza games enable VRR as soon as I disable V-Sync in them - though it's not applied during cutscenes. Having to disable V-Sync in these games to get VRR is an example of why you want to keep V-Sync enabled on the global profile in the driver.
There's also a Variable Refresh Rate setting in the Windows 10 display settings, if your OS and video drivers are up-to-date. It's under Settings > Display > Graphics Settings. I believe that's for older UWP games like Forza 6 Apex, which were released before there was an option to disable V-Sync (enabling VRR in UWP).

So, is Freesync basically Gsync with another name now? Is it 100% identical in terms of features/performance? If Freesync is just as fast and reliable as Gsync, I don't understand how Gsync monitors continue to be so expensive.
G-Sync is more expensive because it provides a guaranteed experience, and uses FPGA hardware to drive the display.
Now that VRR is a part of the HDMI spec, and display scalers are improving, there's arguably less need for this now - except in the high-end G-Sync displays.
 

catpurrcat

Member
Oct 27, 2017
7,802
So, is Freesync basically Gsync with another name now? Is it 100% identical in terms of features/performance? If Freesync is just as fast and reliable as Gsync, I don't understand how Gsync monitors continue to be so expensive.

Seems that way, doesn't it? I am thinking of going the Freesync route even though I have an Nvidia card, simply because of open standards and better resale down the road.

Also what SomeoneSomewhere said above:

FreeSync monitors are cheap as hell. You only overpay for Gsync monitors.
 

laxu

Member
Nov 26, 2017
2,785
So, is Freesync basically Gsync with another name now? Is it 100% identical in terms of features/performance? If Freesync is just as fast and reliable as Gsync, I don't understand how Gsync monitors continue to be so expensive.

With G-Sync you are pretty much guaranteed a good experience in terms of adaptive sync performance, because the monitors use an Nvidia-developed FPGA chip. With Freesync you are at the mercy of display manufacturers to implement it right, which has led to widely varying performance from Freesync displays. Freesync 2 displays are a safer bet, because they impose somewhat stricter requirements on displays advertised as such. See https://www.techspot.com/article/1630-freesync-2-explained/

Of course, even G-Sync displays need good panels with good overdrive implementations etc., so they are not all built equal either. For example, with LG's 34" ultrawides there are both Freesync and G-Sync models that are quite similar, but the Freesync one ended up being better in everything but response time. See https://www.tftcentral.co.uk/reviews/lg_34gk950g.htm#f_g_comparison

I recently moved from a 2014 1440p, 144 Hz G-Sync display to a 2018 5120x1440, 120 Hz Freesync 2 HDR display, and personally cannot see a difference in the experience using an Nvidia GPU. My Samsung CRG9 is not on the Nvidia G-Sync Compatible certified list, either.
 

Skyfireblaze

Member
Oct 25, 2017
11,257
I would normally confirm it by having the game set to the "exclusive" or "fullscreen" option and then hit WIN+G to bring up the Game Bar.
The screen will flash twice if it's still using "real" full-screen exclusive mode, or the Game Bar will be displayed if FSO has upgraded it to flip-mode.

I don't think there's any way to get Windows 10 to apply FSO to other games - though I wish it could.
I hope Microsoft doesn't abandon its development after seeing so many people do their best to disable full-screen optimizations because they don't understand it - like they did with Game Mode.
The 1709 release of Windows 10 even applied FSO to borderless games, not just "exclusive" ones, but that seemed to cause problems for some people and was disabled in the next update.


I don't know about the Modern Warfare beta, but the Forza games enable VRR as soon as I disable V-Sync in them - though it's not applied during cutscenes. Having to disable V-Sync in these games to get VRR is an example of why you want to keep V-Sync enabled on the global profile in the driver.
There's also a Variable Refresh Rate setting in the Windows 10 display settings, if your OS and video drivers are up-to-date. It's under Settings > Display > Graphics Settings. I believe that's for older UWP games like Forza 6 Apex, which were released before there was an option to disable V-Sync (enabling VRR in UWP).


G-Sync is more expensive because it provides a guaranteed experience, and uses FPGA hardware to drive the display.
Now that VRR is a part of the HDMI spec, and display scalers are improving, there's arguably less need for this now - except in the high-end G-Sync displays.

Ah yeah, the Game Bar can be useful like this, I guess. I have it disabled because I already do everything it offers through other means, but I might enable it just for this - thanks for the tip! And yeah, now that I know what it actually is, I don't want it abandoned either, though I can't really blame people: I consider myself fairly tech-savvy, but as you noticed I've been totally misinformed about it as well, despite having looked into it in the past. Microsoft should really try to clarify this on a larger scale. And yeah, I heard about the 1709 issues - I think that's where my whole misinformation originated from, even.

Hmm I see, thanks for the tip. Toggling the V-Sync option did help at times, but not always, so I chalked it up to the same randomness as switching modes over and over. And I do have this VRR setting enabled - good to finally know what it might actually be for. Oh, and something else that came to mind: now that Ultra Low Latency can be used with G-Sync in the new drivers, I guess the only reason not to enable it globally is some games not handling the lack of future-frame rendering well, right?
 

Pargon

Member
Oct 27, 2017
12,126
Oh, and something else that came to mind: now that Ultra Low Latency can be used with G-Sync in the new drivers, I guess the only reason not to enable it globally is some games not handling the lack of future-frame rendering well, right?
Setting low latency mode to "on" should be fine globally, but I'm not sure about the "ultra" setting. It would be safer to set that per-game, but there's no harm in trying it.
 

dadjumper

Member
Oct 27, 2017
1,932
New Zealand
G-Sync is more expensive because it provides a guaranteed experience, and uses FPGA hardware to drive the display.
Now that VRR is a part of the HDMI spec, and display scalers are improving, there's arguably less need for this now - except in the high-end G-Sync displays.
So basically the verdict is if you go for a properly tested/reviewed Freesync display then you're good?
 

Skyfireblaze

Member
Oct 25, 2017
11,257
Pargon I just had a quick test of Rocket League with the new drivers - fullscreen in-game, V-Sync off in-game, V-Sync set globally, and Ultra Low Latency. My god, it might be somewhat placebo, but I felt like I just experienced G-Sync for the first time again; it was smoother than an ice cube sliding across a frozen lake o.o
 

TaySan

SayTan
Member
Dec 10, 2018
31,707
Tulsa, Oklahoma
Looks like I'm back on the G-Sync wagon. Bought an LG C9 and I'm hyped as hell for next gen, lol.
Has anyone with a Pascal GPU tried it yet?
 

Pargon

Member
Oct 27, 2017
12,126
Pargon I just had a quick test of Rocket League with the new drivers - fullscreen in-game, V-Sync off in-game, V-Sync set globally, and Ultra Low Latency. My god, it might be somewhat placebo, but I felt like I just experienced G-Sync for the first time again; it was smoother than an ice cube sliding across a frozen lake o.o
If you've been playing in borderless mode this whole time, it doesn't surprise me - since that doesn't give you the real G-Sync smoothness (1709 being the exception).
It's probably because I've been trying to play The Outer Worlds recently, which is constantly stuttering, but trying out some other games with the new G-Sync + Ultra-Low Latency Mode did feel really smooth - so maybe there is something to it.

So basically the verdict is if you go for a properly tested/reviewed Freesync display then you're good?
Yeah, if it's well-reviewed there doesn't seem to be any reason to avoid a FreeSync display.
The really high-end G-Sync Ultimate monitors with hundreds of LED dimming zones seem to justify the extra cost, but now with OLED TVs being supported, there may not be much reason to buy one unless you specifically need a small display or want native 4K120 right now (since no GPU has HDMI 2.1 yet, they are currently limited to 1440p120).
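
That 1440p120 limit is just bandwidth arithmetic, by the way. A rough sketch - raw pixel data only, so the real requirement is higher still once blanking overhead is included:

    # Raw pixel data rate only; real links also carry blanking overhead,
    # which pushes the true requirement higher.
    def gbps(width, height, refresh_hz, bits_per_pixel=24):
        return width * height * refresh_hz * bits_per_pixel / 1e9

    HDMI_2_0_LIMIT = 18.0  # Gbps (TMDS character rate; payload is lower)

    for name, w, h in [("4K120", 3840, 2160), ("1440p120", 2560, 1440)]:
        rate = gbps(w, h, 120)
        verdict = "needs HDMI 2.1" if rate > HDMI_2_0_LIMIT else "fits in HDMI 2.0"
        print(f"{name}: {rate:.1f} Gbps - {verdict}")

4K120 comes out around 24 Gbps before overhead, well past what HDMI 2.0 can carry, while 1440p120 is only around 11 Gbps.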
 

dadjumper

Member
Oct 27, 2017
1,932
New Zealand
Yeah, if it's well-reviewed there doesn't seem to be any reason to avoid a FreeSync display.
The really high-end G-Sync Ultimate monitors with hundreds of LED dimming zones seem to justify the extra cost, but now with OLED TVs being supported, there may not be much reason to buy one unless you specifically need a small display or want native 4K120 right now (since no GPU has HDMI 2.1 yet, they are currently limited to 1440p120).
Word. I'd totally go the TV route, but they don't make them small enough. I have a 40" 1080p TV and it's frankly a bit big for my apartment. 55" being the minimum on the LG is a dealbreaker.
Will look out for Freesync monitor deals during the Black Friday sales! Cheers
 

Skyfireblaze

Member
Oct 25, 2017
11,257
If you've been playing in borderless mode this whole time, it doesn't surprise me - since that doesn't give you the real G-Sync smoothness (1709 being the exception).
It's probably because I've been trying to play The Outer Worlds recently, which is constantly stuttering, but trying out some other games with the new G-Sync + Ultra-Low Latency Mode did feel really smooth - so maybe there is something to it.


Yeah, if it's well-reviewed there doesn't seem to be any reason to avoid a FreeSync display.
The really high-end G-Sync Ultimate monitors with hundreds of LED dimming zones seem to justify the extra cost, but now with OLED TVs being supported, there may not be much reason to buy one unless you specifically need a small display or want native 4K120 right now (since no GPU has HDMI 2.1 yet, they are currently limited to 1440p120).

Well, it's not that my experience before was unsmooth, but right now it's extra smooth, so really, thanks for educating me on all my misunderstandings. Interesting that you made similar observations with Ultra Low Latency; I'll keep it enabled for now and see how I fare.
 

XR.

Member
Nov 22, 2018
6,633
I would normally confirm it by having the game set to the "exclusive" or "fullscreen" option and then hit WIN+G to bring up the Game Bar.
The screen will flash twice if it's still using "real" full-screen exclusive mode, or the Game Bar will be displayed if FSO has upgraded it to flip-mode.

I don't think there's any way to get Windows 10 to apply FSO to other games - though I wish it could.
I hope Microsoft doesn't abandon its development after seeing so many people do their best to disable full-screen optimizations because they don't understand it - like they did with Game Mode.
The 1709 release of Windows 10 even applied FSO to borderless games, not just "exclusive" ones, but that seemed to cause problems for some people and was disabled in the next update.


I don't know about the Modern Warfare beta, but the Forza games enable VRR as soon as I disable V-Sync in them - though it's not applied during cutscenes. Having to disable V-Sync in these games to get VRR is an example of why you want to keep V-Sync enabled on the global profile in the driver.
There's also a Variable Refresh Rate setting in the Windows 10 display settings, if your OS and video drivers are up-to-date. It's under Settings > Display > Graphics Settings. I believe that's for older UWP games like Forza 6 Apex, which were released before there was an option to disable V-Sync (enabling VRR in UWP).


G-Sync is more expensive because it provides a guaranteed experience, and uses FPGA hardware to drive the display.
Now that VRR is a part of the HDMI spec, and display scalers are improving, there's arguably less need for this now - except in the high-end G-Sync displays.
I haven't followed the topic fully but is there a scenario where you would recommend disabling fullscreen optimizations?

Also, since you mentioned it, is it a good idea to leave Game Mode enabled? I can't test it at the moment, but last I tried, it made videos/streams stutter pretty badly on secondary screens whenever I played a fullscreen game. I'd gladly take multitasking over any minuscule performance gain, if that's the case.
 

GhostTrick

Member
Oct 25, 2017
11,471
I switched from a 75Hz monitor to a 144Hz Freesync monitor with LFC (G-Sync Compatible).
The move from 60 to 75Hz was already noticeable. But damn, 144Hz is truly a game changer.
I've seen a lot of people claiming that "30fps feels cinematic". BS. 144fps feels cinematic. Playing DMC5 or Ace Combat 7 at such a high refresh rate really gives the whole thing an entirely new look.
 

laxu

Member
Nov 26, 2017
2,785
Yeah, if it's well-reviewed there doesn't seem to be any reason to avoid a FreeSync display.
The really high-end G-Sync Ultimate monitors with hundreds of LED dimming zones seem to justify the extra cost, but now with OLED TVs being supported, there may not be much reason to buy one unless you specifically need a small display or want native 4K120 right now (since no GPU has HDMI 2.1 yet, they are currently limited to 1440p120).

Freesync 2 HDR is the equivalent of G-Sync Ultimate, which is largely just a marketing term for G-Sync with HDR support. I don't know if Nvidia requires, say, FALD for G-Sync Ultimate. FALD has its own issues that won't be solved until we get Mini-LED backlights with a lot more dimming zones, or dual-layer LCDs, which would work closer to how OLED does. Since OLED is not available at desktop sizes for any reasonable cost, it's not much of an option unless you can mount a big TV far away from you.
 

Skyfireblaze

Member
Oct 25, 2017
11,257
I switched from a 75Hz monitor to a 144Hz Freesync monitor with LFC (G-Sync Compatible).
The move from 60 to 75Hz was already noticeable. But damn, 144Hz is truly a game changer.
I've seen a lot of people claiming that "30fps feels cinematic". BS. 144fps feels cinematic. Playing DMC5 or Ace Combat 7 at such a high refresh rate really gives the whole thing an entirely new look.

I went down a similar upgrade path, except I overclocked my old non-VRR monitor from 60 to 75Hz, and yeah, I was very surprised back then that 15Hz more already made 60 feel sluggish. It's stunning o.o
 

Pargon

Member
Oct 27, 2017
12,126
I haven't followed the topic fully but is there a scenario where you would recommend disabling fullscreen optimizations?
It mostly seems to be an issue for people who want to downsample games, as you sometimes need "real" exclusive mode for it to change the display resolution.
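
If you do want to disable them for a particular game, the "Disable fullscreen optimizations" checkbox on the EXE's Compatibility tab just writes a compatibility-layer value to the registry, so it can be scripted. A sketch using Python's winreg - the game path is a placeholder, and I'd verify the flag string against what the checkbox writes on your own system:

    # Writes the same compatibility-layer value the "Disable fullscreen
    # optimizations" checkbox does. Windows-only; the path is a placeholder.
    import winreg

    GAME_EXE = r"C:\Games\MyGame\game.exe"  # hypothetical path
    LAYERS = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"

    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, LAYERS) as key:
        winreg.SetValueEx(key, GAME_EXE, 0, winreg.REG_SZ,
                          "~ DISABLEDXMAXIMIZEDWINDOWEDMODE")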

Also, since you mentioned it, is it a good idea to leave Game Mode enabled? I can't test it at the moment, but last I tried, it made videos/streams stutter pretty badly on secondary screens whenever I played a fullscreen game. I'd gladly take multitasking over any minuscule performance gain, if that's the case.
Game Mode has largely been cancelled by Microsoft.
In older versions of Windows 10 what it did was reserve 3/4 of the available CPU cores to the game process, and move all non-game processes over to the first 1/4 of the available cores.
Here's an example of it in action, with me forcing an undemanding game to only run on core 15 while also running software in the background to put some work on all cores.

Game Mode Off. CPU usage is split evenly across all cores:
[Task Manager screenshot]


Game Mode On. All non-game processes are moved to the first four cores, with the rest reserved to the game (which was overridden to only run on core 15):
[Task Manager screenshot]


Though there were some issues with it, it was a great feature in my opinion - particularly in games which are bottlenecked by a single thread.
For example: SOMA runs most of its game logic on a single thread.
That means Game Mode provided a significant amount of headroom. Without it there was only 6% of a core free, so it was likely to drop frames if a more complex scene appeared or if CPU usage increased in one of the background programs.
With Game Mode enabled there was 22% of a core free, and the core was reserved so increased background application activity could not affect it - so even more demanding scenes would not drop frames.

The problem with Game Mode can be seen in that second screenshot, though, where the game is only running on core 15.
See how the first four cores are now hitting 100%? For whatever reason, certain critical Windows processes will only ever run on core 0. That includes things like interrupts, which affect your input devices.
If you had a lot running in the background, Game Mode could potentially put so much load on core 0 that it would cause the system to become extremely unresponsive; only responding to inputs every few seconds.
Or in some cases, Game Mode would move all non-game processes onto the first 1/4 of the CPU, but fail to move the game process off it, which led to core 0 being overloaded unless you manually moved the game process yourself (or used a tool like Process Lasso).

I think that with a bit of tweaking to what it did - such as keeping core 0 reserved exclusively for system processes (which is arguably something Windows should already be doing) rather than moving non-game applications onto it - Game Mode would have been a feature that most people used, especially on higher core-count CPUs.
But that idea has been abandoned now, and pretty much the only thing that Game Mode does is silence your notifications and prevent certain system processes like Windows Update from running in the background while a game is running.
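
For the curious, the old core-partitioning behaviour is easy enough to approximate yourself with process affinities. A rough sketch using Python's psutil - the process name is a placeholder, and this is only my approximation of what Game Mode did, not Microsoft's actual implementation:

    # Rough approximation of old Game Mode: push everything except the game
    # onto the first quarter of the cores, give the game the rest.
    # Needs admin rights to move most processes.
    import psutil

    GAME = "game.exe"  # placeholder process name
    cores = list(range(psutil.cpu_count()))
    split = max(1, len(cores) // 4)
    system_cores, game_cores = cores[:split], cores[split:]

    for p in psutil.process_iter(["name"]):
        try:
            p.cpu_affinity(game_cores if p.info["name"] == GAME else system_cores)
        except (psutil.AccessDenied, psutil.NoSuchProcess):
            pass  # protected system processes can't be moved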

Freesync 2 HDR is the equivalent of G-Sync Ultimate, which is largely just a marketing term for G-Sync with HDR support. I don't know if Nvidia requires, say, FALD for G-Sync Ultimate. FALD has its own issues that won't be solved until we get Mini-LED backlights with a lot more dimming zones, or dual-layer LCDs, which would work closer to how OLED does. Since OLED is not available at desktop sizes for any reasonable cost, it's not much of an option unless you can mount a big TV far away from you.
I believe G-Sync Ultimate has strict requirements for things like high-density local dimming (512 zones on a monitor-sized display is a huge amount) and HDR capabilities, while FreeSync 2 only requires things like low frame rate compensation and HDR support, with far less strict demands on the HDR experience.
LG are supposed to be releasing 48" OLEDs next year, which would be suitable as a large monitor (assuming you're willing to risk burn-in). The ideal would be 46", since that is 96 PPI (Windows is based around 96 DPI), but 48" 4K would still be 92 PPI and suitable for use as a giant monitor at 100% scale.
 

laxu

Member
Nov 26, 2017
2,785
I believe G-Sync Ultimate has strict requirements for things like high-density local dimming (512 zones on a monitor-sized display is a huge amount) and HDR capabilities, while FreeSync 2 only requires things like low frame rate compensation and HDR support, with far less strict demands on the HDR experience.
LG are supposed to be releasing 48" OLEDs next year, which would be suitable as a large monitor (assuming you're willing to risk burn-in). The ideal would be 46", since that is 96 PPI (Windows is based around 96 DPI), but 48" 4K would still be 92 PPI and suitable for use as a giant monitor at 100% scale.

Viewing distance matters for what PPI you need, but IMO anything at 96 PPI or below is too low at normal desktop viewing distances (where the monitor sits closer than a large TV would). I'm happy with around 110 PPI on a 1440p display, where it looks largely sharp without requiring DPI scaling. For desktop displays I'd rather have something in the 38-43" size range for 4K.

I made a little graphic comparing my super ultrawide 49" 32:9, 5120x1440 screen to some current and upcoming screen sizes:

[screen size comparison graphic]


To me the super ultrawide is as big as I would go horizontally and it really needs to be curved at that size too. The upcoming LG 48" is still pretty massive. I feel vertical size is what requires the display to be moved backwards a fair bit to make up for both low PPI and sheer physical size. It's just easier to look side to side than into corners.
 

Pargon

Member
Oct 27, 2017
12,126
Viewing distance matters for what PPI you need, but IMO anything at 96 PPI or below is too low at normal desktop viewing distances (where the monitor sits closer than a large TV would). I'm happy with around 110 PPI on a 1440p display, where it looks largely sharp without requiring DPI scaling. For desktop displays I'd rather have something in the 38-43" size range for 4K.
92 PPI is just on the edge of what's acceptable.
96 DPI has the advantage of being "native" to Windows, and it will display things at the intended scale. Anything in between 96 PPI and 192 PPI is going to look either a little too small or a little too big depending on the scaling setting you use in Windows - and non-integer scales in Windows result in legacy applications being blurred.
175% scaling will blur legacy applications, but 200% will be perfectly sharp. 200% scaling will look too big on anything less than 192 PPI, though. Likewise, 125% scaling will blur legacy applications, while 100% will be perfectly sharp; but things will be a little too small on displays above 96 PPI.
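
If you want to check the numbers for other sizes, the maths is just the pixel count along the diagonal divided by the diagonal in inches. A throwaway sketch:

    # PPI = pixels along the diagonal / diagonal inches.
    from math import hypot

    def ppi(width_px, height_px, diagonal_in):
        return hypot(width_px, height_px) / diagonal_in

    print(round(ppi(3840, 2160, 48)))  # ~92 - borderline at 100% scale
    print(round(ppi(3840, 2160, 46)))  # ~96 - Windows' "native" density
    print(round(ppi(2560, 1440, 27)))  # ~109 - slightly small at 100%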

I made a little graphic comparing my super ultrawide 49" 32:9, 5120x1440 screen to some current and upcoming screen sizes:

[screen size comparison graphic]


To me the super ultrawide is as big as I would go horizontally and it really needs to be curved at that size too. The upcoming LG 48" is still pretty massive. I feel vertical size is what requires the display to be moved backwards a fair bit to make up for both low PPI and sheer physical size. It's just easier to look side to side than into corners.
What I have found is that "regular" ultra-wide (~24:10) is the sweet-spot for aspect ratio.
With the way that our perception works, image height is largely what is used to determine how big a display feels, and what feels comfortable for us.
No matter the aspect ratio, the distance I find it comfortable to sit from a screen is determined by its height.

When I sit close enough to a display that its height is the same perceptual size, I find that a 16:9 display leaves a lot of empty space in my vision at the sides, and these new super ultra-wides push content too far out to the side of my vision. Sitting further back so the super ultra-wide display fits in my vision results in a smaller image that does not fill it vertically.
Regular 24:10 ultra-wides strike the right balance and fill roughly an equal amount of my vision horizontally and vertically - at least the central portion of my vision that is comfortable to use while gaming - especially when most games place HUD elements at the corners of the display, no matter the aspect ratio.
Display size for me largely only determines the viewing distance, and viewing distance sets the resolution requirements.
 

Skyfireblaze

Member
Oct 25, 2017
11,257
Say, this might not be the right topic for it, but does anyone here use Nvidia GameStream? Ever since I updated to the new drivers yesterday, the client side of GameStream has had some weird frame-skipping going on :/
 
Jan 20, 2019
260
This thread is a pleasant reminder that I need to upgrade my main display before I upgrade my GPU. Is it just me, though, or does it seem like a confusing time to be considering a next purchase (as someone on a pretty modest budget who won't be going near 4K for quite a while)? The mid-tier graphics card market is really heating up, wider ray-tracing support is on the way, 1440p-to-1080p display scaling seems hit-or-miss at times, and the subjectivity of which refresh rates are "playable" makes for a muddled mind when trying to do research.

So, uh...any tips on what to look out for around Black Friday sales?
 

furfoot

Member
Dec 12, 2017
600
Any G-Sync owners know how to fix flickering with static images? I've got issues with Total War and it's driving me mad.

This mostly happens because static images are displayed during loading screens, and that is when there is huge frame-time variance, causing the brightness to change rapidly.

The brightness differences are actually caused by LFC triggering, which produces a big spike in refresh rate - from, say, 60Hz to 120Hz - that some monitors don't play nicely with.

Making sure your frame limit is not close to the LFC trigger point is important during gameplay too, as otherwise you will notice it in-game. You can change the LFC behaviour, and when it switches to this mode, with CRU by editing the V rate bounds in the display properties. I set mine to 57-144 so I can frame-limit some games to 60 fps and continually play at 120Hz without minute frame dips/spikes triggering LFC.

Finally, with Freesync it's important to make sure you play at 100Hz or above, as that is where overdrive/response times are optimal.

All these oddities and tweaks are why the G-Sync module was invented, as it makes sure all of this is handled properly.
 

laxu

Member
Nov 26, 2017
2,785
What I have found is that "regular" ultra-wide (~24:10) is the sweet-spot for aspect ratio.
With the way that our perception works, image height is largely what is used to determine how big a display feels, and what feels comfortable for us.
No matter the aspect ratio, the distance I find it comfortable to sit from a screen is determined by its height.

When I sit close enough to a display that its height is the same perceptual size, I find that a 16:9 display leaves a lot of empty space in my vision at the sides, and these new super ultra-wides push content too far out to the side of my vision. Sitting further back so the super ultra-wide display fits in my vision results in a smaller image that does not fill it vertically.
Regular 24:10 ultra-wides strike the right balance and fill roughly an equal amount of my vision horizontally and vertically - at least the central portion of my vision that is comfortable to use while gaming - especially when most games place HUD elements at the corners of the display, no matter the aspect ratio.
Display size for me largely only determines the viewing distance, and viewing distance sets the resolution requirements.

I see super ultrawide more as adding peripheral vision. It adds immersion and allows you to look to the sides a bit without turning your character/car/whatever. It is certainly not for everyone, and I agree that 24:10 at a sufficiently large size and resolution is a really nice format. It's too bad that the only option for that is the still-unreleased 38" LG 38GL950G, which looks to have been pushed back again and is also overpriced for what it is. I hope in a few years I can get a 5K x 2K screen at 38-43" with a high refresh rate.
 

Pargon

Member
Oct 27, 2017
12,126
I see super ultrawide more as adding peripheral vision. It adds immersion and allows you to look to the sides a bit without turning your character/car/whatever. It is certainly not for everyone, and I agree that 24:10 at a sufficiently large size and resolution is a really nice format. It's too bad that the only option for that is the still-unreleased 38" LG 38GL950G, which looks to have been pushed back again and is also overpriced for what it is. I hope in a few years I can get a 5K x 2K screen at 38-43" with a high refresh rate.
While it is technically 43:18 (2.39:1), the 3440x1440 ultra-wide resolution is only very slightly different from 24:10 (2.40:1).
 

laxu

Member
Nov 26, 2017
2,785
While it is technically 43:18 (2.39:1), the 3440x1440 ultra-wide resolution is only very slightly different from 24:10 (2.40:1).

Yes, but if you want anything larger, your options are limited if you also want a high refresh rate. I see 3440x1440 as just a wider 27" 1440p 16:9.
 

Mercador

Member
Nov 18, 2017
2,840
Quebec City
This mostly happens because static images are displayed during loading screens, and that is when there is huge frame-time variance, causing the brightness to change rapidly.

The brightness differences are actually caused by LFC triggering, which produces a big spike in refresh rate - from, say, 60Hz to 120Hz - that some monitors don't play nicely with.

Making sure your frame limit is not close to the LFC trigger point is important during gameplay too, as otherwise you will notice it in-game. You can change the LFC behaviour, and when it switches to this mode, with CRU by editing the V rate bounds in the display properties. I set mine to 57-144 so I can frame-limit some games to 60 fps and continually play at 120Hz without minute frame dips/spikes triggering LFC.

Finally, with Freesync it's important to make sure you play at 100Hz or above, as that is where overdrive/response times are optimal.

All these oddities and tweaks are why the G-Sync module was invented, as it makes sure all of this is handled properly.
Oh wow, thanks. What does LFC mean? I notice this effect during loading times between turns, as the CPU might be under heavier load. But even a mouse-over tooltip can trigger it. Guess I'm too sensitive to graphics issues, but it annoys me.
 

Pargon

Member
Oct 27, 2017
12,126
Oh wow, thanks. What does LFC mean? I notice this effect during loading times between turns, as the CPU might be under heavier load. But even a mouse-over tooltip can trigger it. Guess I'm too sensitive to graphics issues, but it annoys me.
LFC is low framerate compensation.
When the frame rate of a game drops below the minimum supported refresh rate, LFC will display it at a higher refresh rate within the VRR range; e.g. 39 FPS doubled-up to 78Hz on a 40-120Hz display.
The issue is that some displays (more typically FreeSync displays) may change brightness slightly as the refresh rate changes. That won't be noticeable if the changes are small and gradual, but if a game is jumping between 39-40 FPS that means it will be switching between 78Hz and 40Hz, which may flicker.
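
The multiplication logic itself is simple enough to sketch - the VRR range here is just an example, as real ranges vary per monitor:

    # Below the VRR floor, frames are repeated until the effective refresh
    # rate lands back inside the supported range (the 40Hz floor is just an
    # example; real ranges vary per monitor).
    def lfc_refresh(fps, vrr_min=40):
        if fps >= vrr_min:
            return fps  # inside the native VRR range - no compensation
        multiplier = 2
        while fps * multiplier < vrr_min:
            multiplier += 1
        return fps * multiplier

    print(lfc_refresh(39))  # 78 - frame-doubled
    print(lfc_refresh(40))  # 40 - native range
    # A game oscillating between 39 and 40 FPS therefore jumps between
    # 78Hz and 40Hz, which is the flicker some panels show.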
 

Mercador

Member
Nov 18, 2017
2,840
Quebec City
OK, my display is a Dell S24D17 if I remember correctly - where can I access those parameters? I don't see any of them in the display options. Thanks!
 

BeI

Member
Dec 9, 2017
6,026
Is it normal for the refresh rate to double / jump around constantly when the fps doesn't change much? I've got Skyrim locked to ~60fps (with RivaTuner; the game has unlocked fps via the ini), but the monitor shows the refresh rate at around 120Hz, and it keeps skipping around constantly from 90-130Hz.
 

Pargon

Member
Oct 27, 2017
12,126
Is it normal for the refresh rate to double / jump around constantly when the fps doesn't change much? I've got Skyrim locked to ~60fps (with RivaTuner; the game has unlocked fps via the ini), but the monitor shows the refresh rate at around 120Hz, and it keeps skipping around constantly from 90-130Hz.
90-130Hz would be unusual for something locked to 60 FPS.
Are you running the game in borderless/windowed mode rather than full-screen exclusive mode? Windowed-mode G-Sync often suffers from the refresh rate fluctuating even if the frame rate is constant. It's why I recommend that you do not even enable it in the NVIDIA Control Panel.
Is it the legacy version of Skyrim or the Special Edition?
 

BeI

Member
Dec 9, 2017
6,026
90-130Hz would be unusual for something locked to 60 FPS.
Are you running the game in borderless/windowed mode rather than full-screen exclusive mode? Windowed-mode G-Sync often suffers from the refresh rate fluctuating even if the frame rate is constant. It's why I recommend that you do not even enable it in the NVIDIA Control Panel.
Is it the legacy version of Skyrim or the Special Edition?

Special Edition with normal fullscreen. I have G-Sync enabled for fullscreen and windowed apps though, because I think some Xbox Store games use borderless.

Edit: it seems to fix itself after tabbing out and back in, but sometimes a loading screen sets it off and it settles at ~120Hz afterwards.
 

Pargon

Member
Oct 27, 2017
12,126
Special Edition with normal fullscreen. I have G-Sync enabled for fullscreen and windowed apps though, because I think some Xbox Store games use borderless.
Edit: it seems to fix itself after tabbing out and back in, but sometimes a loading screen sets it off and it settles at ~120Hz afterwards.
I reinstalled Skyrim SE and can't seem to reproduce this. I do have to tab out of the game and back for the RTSS OSD to display, but the frame rate limiter is working as expected even before doing so.
I did notice that opening the config options automatically enabled borderless mode in the config file, even when I deselected it, so you might want to check that.
I don't think UWP games are supposed to use borderless mode, but it's possible that newer ones do now. I'd still recommend disabling it, because borderless G-Sync isn't smooth.
 

BeI

Member
Dec 9, 2017
6,026
I reinstalled Skyrim SE and can't seem to reproduce this. I do have to tab out of the game and back for the RTSS OSD to display, but the frame rate limiter is working as expected even before doing so.
I did notice that opening the config options automatically enabled borderless mode in the config file, even when I deselected it, so you might want to check that.
I don't think UWP games are supposed to use borderless mode, but it's possible that newer ones do now. I'd still recommend disabling it, because borderless G-Sync isn't smooth.

It's weird, it just seems like my monitor has a tendency to want to double the Hz. It even happened in State of Decay 2 on Xbox store.
 

Armadilo

Banned
Oct 27, 2017
9,877
Would the one recommended by the OP work with Stadia? I was looking for a 4K monitor that can take advantage of it - would this only work in 1080p if it doesn't support 1440p?
 

aevanhoe

Slayer of the Eternal Voidslurper
Member
Aug 28, 2018
7,389
I have for the longest time been gaming on my PC on an OLED at 60Hz, and while you can't beat the image an OLED provides, I recently got a new monitor, 1440p 144Hz, and this is probably the best purchase I have made in a long time. IQ does take a hit from 4K to 1440p, but playing a game at over 100 frames with G-Sync/Freesync enabled is the best thing a gamer can witness. If you are on the fence, do not hesitate anymore, especially with so many affordable options now. The monitor I got is relatively cheap and I think it's amazing. If you have a G-Sync/Freesync monitor, share which one it is and recommend some for those thinking about getting one! I have to thank Jeff6851, since I got the one this user recommended.

Not saying you're wrong to like what you like, but I'd take the picture quality of OLED over higher refresh rates. Depends what you value more.
 

Pargon

Member
Oct 27, 2017
12,126
It's weird, it just seems like my monitor has a tendency to want to double the Hz. It even happened in State of Decay 2 on Xbox store.
Is it a G-Sync or FreeSync monitor?
Is it possible that it's low frame rate compensation kicking in, and the monitor reporting the "real" refresh rate?
All G-Sync monitors are required to use LFC, but mine still reports the frame rate rather than the rate that the monitor is refreshing at. For example: no monitor refreshes at a true 24Hz - they will be displaying at least 48Hz. But my monitor's OSD still reports "24" and not 48/72. Yours might report something like 48/72 for 24 FPS though.

I suppose the main thing that matters is if it's smooth when this is happening.
 

Cyanity

Member
Oct 25, 2017
9,345
Not saying you're wrong to like what you like, but I'd take the picture quality of OLED over higher refresh rates. Depends what you value more.
Have you actually experienced what the user you quoted is describing, though? Because the switch from 60fps to high-fps, high-refresh-rate gaming with slightly worse IQ is more of a leap than a jump.

G/Freesync is a gamechanger that has to be experienced to really "get"
 

Nzyme32

Member
Oct 28, 2017
5,290
It's weird, it just seems like my monitor has a tendency to want to double the Hz. It even happened in State of Decay 2 on Xbox store.

The only thing I can think of... which probably isn't it... I have a Gigabyte Aorus 144Hz monitor that is G-Sync Compatible, but if the fps goes below 42, it has a feature where the refresh rate output by the monitor is doubled, so it can "appear" smoother.

Other than that, I have a similar issue with Dying Light. My computer should handle downsampling to 4K at 100+Hz no problem, but it runs at between 15 and 25fps. No idea why. I have to play the game at 1440p.
No other game does this.
 

aevanhoe

Slayer of the Eternal Voidslurper
Member
Aug 28, 2018
7,389
Have you actually experienced what the user you quoted is describing, though? Because the switch from 60fps to high-fps, high-refresh-rate gaming with slightly worse IQ is more of a leap than a jump.

G/Freesync is a gamechanger that has to be experienced to really "get"

While I did see it in action, I admit I haven't spent much time with it. That said, I never cared much about refresh rates, or whether a game is 30 or 60fps, for example. I always go for graphics over performance.

I know it's considered blasphemy here, but that's what I care about. Inky blacks are a game changer for me; fps is nice. For me, anyway.