
dgrdsv

Member
Oct 25, 2017
11,893

View: https://www.youtube.com/watch?v=LdrzRVFGR2w

A benchmark of the new patch from today.
There are still frametime issues in both GPU- and CPU-limited runs on a 10700F, though the new patch seems to be doing a bit better.
Also of note (if it's not a result of streaming and/or shader compilation): the latest patch seems to have reduced CPU load a bit.

Also a new Nvidia driver with official ReBAR profile is out today:
www.nvidia.com

GeForce Game Ready Driver | 552.12 | Windows 10 64-bit, Windows 11 | NVIDIA

Download the English (US) GeForce Game Ready Driver for Windows 10 64-bit, Windows 11 systems. Released 2024.4.4

Frame Gen is broken because it turns on Reflex and introduces micro stuttering.
After 60 hours of play I can safely say that the microstuttering is there without Reflex too; Reflex just amplifies it (On+Boost more so than plain On, btw).
But since you are getting higher performance with FG, the stutters are less noticeable as the frametimes are smaller.
So after playing ~20 hours without FG I've turned it on. Generally a better experience despite the judder, I'd say. But this may depend on what GPU you're using and what your base framerate is, of course.
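
To put rough numbers on that claim, here is a tiny sketch (illustrative figures only, not measurements from this thread) showing why a hitch that costs about one extra frame hurts less in absolute terms at a higher framerate:

```python
def frametime_ms(fps: float) -> float:
    """Average frame time in milliseconds at a given framerate."""
    return 1000.0 / fps

# Compare a hypothetical "one frame took twice as long" hitch at 60 fps
# versus at 116 fps (roughly what FG delivers at the Reflex cap).
for fps in (60, 116):
    normal = frametime_ms(fps)
    doubled = 2 * normal
    print(f"{fps:>3} fps: normal {normal:5.1f} ms, "
          f"hitched frame {doubled:5.1f} ms (+{normal:.1f} ms of extra delay)")
```

At 60 fps the hitched frame lingers for an extra ~16.7 ms; at 116 fps, only ~8.6 ms, so the same class of stutter is simply shorter on screen.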
 

dgrdsv

Member
Oct 25, 2017
11,893
Ran some short tests with the new patch and the driver.
FG + G-Sync + V-Sync seems to work noticeably better now, but there is still severe frametime judder without FG when V-Sync is on, and FG still doesn't really lock to the fps limit either.

Just to close off the idea that the CPU could be the issue here, this is how frametimes look right now when running at 60 Hz with V-Sync (no FG and thus no G-Sync):

[screenshot: Oq2Tro.jpeg]


What stable joy it is, eh?

Here's in-game half-rate V-Sync with 60 Hz output (so it should be locked to 30, which it's not):

[screenshot: Oq8Awb.jpeg]


And to finish up the set: 120 Hz + G-Sync, no FG:

[screenshot: Oq8YRb.jpeg]


And with FG:

[screenshot: Oq8aZk.jpeg]


Don't read into the CPU loads, as the game is compiling shaders in the background after the new driver installation; you can tell by the elevated storage reads, which should drop below 1 MB/s once that process finishes.

All in all, the issue remains. But at least after uploading these shots I noticed that I had forgotten to re-enable the Reflex latency detection stub in the OSD and fixed that, so there is some good news ¯\_(ツ)_/¯
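
For anyone who wants to watch for the same thing without an OSD, here is a minimal sketch (assuming the third-party psutil package; any disk monitor works just as well) that polls system-wide read throughput until it settles below the ~1 MB/s figure mentioned above:

```python
import time

import psutil  # third-party: pip install psutil

# Background shader compilation shows up as sustained storage reads; once the
# rate stays under ~1 MB/s, the process has most likely finished.
prev = psutil.disk_io_counters().read_bytes
while True:
    time.sleep(1.0)
    cur = psutil.disk_io_counters().read_bytes
    rate_mb_s = (cur - prev) / 1e6
    prev = cur
    print(f"storage reads: {rate_mb_s:7.1f} MB/s")
    if rate_mb_s < 1.0:
        print("reads have settled; shader compilation is likely done")
        break
```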
 

Shaz12567

Member
Jun 7, 2021
488

View: https://www.youtube.com/watch?v=LdrzRVFGR2w

A benchmark of the new patch from today.
There are still frametime issues in both GPU- and CPU-limited runs on a 10700F, though the new patch seems to be doing a bit better.
Also of note (if it's not a result of streaming and/or shader compilation): the latest patch seems to have reduced CPU load a bit.

Also a new Nvidia driver with official ReBAR profile is out today:
www.nvidia.com

GeForce Game Ready Driver | 552.12 | Windows 10 64-bit, Windows 11 | NVIDIA

Download the English (US) GeForce Game Ready Driver for Windows 10 64-bit, Windows 11 systems. Released 2024.4.4


After 60 hours of play I can safely say that the microstuttering is there without Reflex too; Reflex just amplifies it (On+Boost more so than plain On, btw).
But since you are getting higher performance with FG, the stutters are less noticeable as the frametimes are smaller.
So after playing ~20 hours without FG I've turned it on. Generally a better experience despite the judder, I'd say. But this may depend on what GPU you're using and what your base framerate is, of course.

On my 7800X3D, Frame Gen introduces more stuttering, which is evident in the frametime graph. Just standing still and looking at the graph, there are small spikes every 2-3 seconds which completely disappear if you turn off Frame Gen.

I had two options: DLAA with Frame Gen, or DLSS Quality with no Frame Gen. The latter felt smoother to me because there is no hitching, just judder.

I don't think the new patch did anything material about the stutter, and I had already enabled ReBAR on the older driver.
 

dgrdsv

Member
Oct 25, 2017
11,893
And some DLSS preset C vs E comparisons:
All three also include no-AA and TAA captures for comparison.

My conclusion here (and do note that this is for HFW only, as other games may fare differently) is that preset E has an extremely sharp resolve which produces a very sharp image in comparison to every other option, even the no-AA one.
This comes with the expected bag of caveats: there is more aliasing now, although the preset does try to fight it by favoring temporal stability. Anything in motion is notably less sharp, even less sharp than in preset C (which was made to keep moving things sharper and reduce ghosting).
In Performance mode, preset E produces pixelized shadows, for example, suggesting that it may be too sharp for that particular mode.
Quality mode with preset E is sharper than DLAA with preset C, and overall seems like a sweet spot for the preset, at least at 4K output resolution.
DLAA doesn't seem to be much different between C and E, so there's no point in using the latter with DLAA, at least in this game.

On my 7800X3D, Frame Gen introduces more stuttering, which is evident in the frametime graph.
FG tends to introduce "native" hitches around cutscenes and transitions in/out of menus, where generated frames are dropped to reduce artifacting (hence a hitch occurs).
In general play, though, I wouldn't say that FG produces more hitches than running without it on my system.
 

Cross-Section

Member
Oct 27, 2017
6,875
How is one actually supposed to change the DLSS preset from within Nvidia Profile Inspector? I've scanned it a few times and I haven't seen an option for it.
 

Shaz12567

Member
Jun 7, 2021
488
DLSS with 77% render scale and Model E looks virtually identical to Native. It's just insane how far ahead of FSR this is. AMD really needs to step up its game here. DLSS and RT have made Nvidia cards the only ones to buy at the high end.
 
Oct 27, 2017
1,323
United States
How is one actually supposed to change the DLSS preset from within Nvidia Profile Inspector? I've scanned it a few times and I haven't seen an option for it.

That method requires placing emoose's "CustomSettingsName.xml" file in the same folder as nvidiaProfileInspector.exe. With it, the program will read the file and show the DLSS-related options in section "5 - Common." For now, only modifications made to the global profile will take effect.

The most up-to-date version of the XML file is here, under "Optional Files": NvTrueHDR - RTX HDR for games at Modding Tools - Nexus Mods
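
As a quick sanity check of the setup described above, this sketch (the install path is hypothetical; only the file layout matters) verifies that the XML sits next to the executable, which is all Profile Inspector needs to pick it up on launch:

```python
from pathlib import Path

# Hypothetical install location; adjust to wherever you unpacked the tool.
inspector_dir = Path(r"C:\Tools\nvidiaProfileInspector")

exe = inspector_dir / "nvidiaProfileInspector.exe"
xml = inspector_dir / "CustomSettingsName.xml"  # emoose's custom settings file
print(f"exe present: {exe.is_file()}")
print(f"xml present: {xml.is_file()}")
```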
 

Cross-Section

Member
Oct 27, 2017
6,875
That method requires placing emoose's "CustomSettingsName.xml" file in the same folder as nvidiaProfileInspector.exe. With it, the program will read the file and show the DLSS-related options in section "5 - Common." For now, only modifications made to the global profile will take effect.

The most up-to-date version of the XML file is here, under "Optional Files": NvTrueHDR - RTX HDR for games at Modding Tools - Nexus Mods
Ah, thank you.
 
Oct 27, 2017
1,323
United States
Just a heads-up though: with that XML, it actually places the DLSS/HDR options in their own category at the top of Profile Inspector, not in "5 - Common".

Works like a charm otherwise!

Thank you for confirming! I noticed that and appended an edit to my earlier post.

Unless you want to get into the nitty-gritty of customizing DLSS, I find the NVIDIA Profile Inspector route to be more straightforward than working with DLSSTweaks. I also rely on DLSS Swapper for effortlessly keeping my games' DLSS files updated.
 

Shaz12567

Member
Jun 7, 2021
488
And some DLSS preset C vs E comparisons:
All three also include no-AA and TAA captures for comparison.

My conclusion here (and do note that this is for HFW only, as other games may fare differently) is that preset E has an extremely sharp resolve which produces a very sharp image in comparison to every other option, even the no-AA one.
This comes with the expected bag of caveats: there is more aliasing now, although the preset does try to fight it by favoring temporal stability. Anything in motion is notably less sharp, even less sharp than in preset C (which was made to keep moving things sharper and reduce ghosting).
In Performance mode, preset E produces pixelized shadows, for example, suggesting that it may be too sharp for that particular mode.
Quality mode with preset E is sharper than DLAA with preset C, and overall seems like a sweet spot for the preset, at least at 4K output resolution.
DLAA doesn't seem to be much different between C and E, so there's no point in using the latter with DLAA, at least in this game.


FG tends to introduce "native" hitches around cutscenes and transitions in/out of menus, where generated frames are dropped to reduce artifacting (hence a hitch occurs).
In general play, though, I wouldn't say that FG produces more hitches than running without it on my system.
Do you see something weird with the CPU usage in this game? There are spots across the map where GPU usage drops to 80% on my 4090, which really should not happen with a 7800X3D and CL30 DDR5 RAM. CPU load is moderate at 40% overall, and I can't even say the game isn't utilising all cores: they are all being used, albeit middling, and none of them are maxed out when this happens.

I tried lowering my resolution to 1440p and saw a 30 fps gain, which should not have happened if this were a CPU bottleneck. So what is causing this GPU bottleneck?
 
Oct 27, 2017
1,323
United States
Do you see something weird with the CPU usage in this game? There are spots across the map where GPU usage drops to 80% on my 4090, which really should not happen with a 7800X3D and CL30 DDR5 RAM. CPU load is moderate at 40% overall, and I can't even say the game isn't utilising all cores: they are all being used, albeit middling, and none of them are maxed out when this happens.

I tried lowering my resolution to 1440p and saw a 30 fps gain, which should not have happened if this were a CPU bottleneck. So what is causing this GPU bottleneck?

Are you using Reflex "On + Boost" by chance? That is one setting I found was similarly hindering the performance of my 4090.
 

Gitaroo

Member
Nov 3, 2017
8,031
DLSS with 77% render scale and Model E looks virtually identical to Native. It's just insane how far ahead of FSR this is. AMD really needs to step up its game here. DLSS and RT have made Nvidia cards the only ones to buy at the high end.

This game and Ratchet are potentially getting the new FSR 3.1, so it'll be good to compare the two.
 

Rickyrozay2o9

Member
Dec 11, 2017
4,424
Do you see something weird with the CPU usage in this game? There are spots across the map where GPU usage drops to 80% on my 4090, which really should not happen with a 7800X3D and CL30 DDR5 RAM. CPU load is moderate at 40% overall, and I can't even say the game isn't utilising all cores: they are all being used, albeit middling, and none of them are maxed out when this happens.

I tried lowering my resolution to 1440p and saw a 30 fps gain, which should not have happened if this were a CPU bottleneck. So what is causing this GPU bottleneck?
I have a 13700K/4090 and mine is basically the same at 4K with everything else on, including FG and DLSS Quality. Indoors with a lot of NPCs it'll go up to 90% GPU usage, but outdoors in the open world it's in the 80s.
 

Shaz12567

Member
Jun 7, 2021
488
Are you using Reflex "On + Boost" by chance? That is one setting I found was similarly hindering the performance of my 4090.
I disabled Reflex entirely because it causes frametime stutters easily visible on the graph. The game behaves very weirdly when it comes to the CPU. I have a 7800X3D, which realistically should not be a bottleneck, yet there are spots where GPU usage dips into the eighties while the CPU is still chilling, as is my GPU. Lowering the resolution from 5120x1440 to 2560x1440 actually increases my CPU and GPU usage and my hardware is utilised more, which doesn't make any sense if there were truly a CPU bottleneck. There is also the issue of camera judder in villages.

The game just doesn't know how to utilise high-end hardware. My 4090 and 7800X3D are sleeping through this game, but the utilisation is still not there.

I don't know why DF calls this an optimised port. It's a passable port. I personally thought Cyberpunk and Starfield launched more optimised than this.
 

Shaz12567

Member
Jun 7, 2021
488
I have a 13700K/4090 and mine is basically the same at 4K with everything else on, including FG and DLSS Quality. Indoors with a lot of NPCs it'll go up to 90% GPU usage, but outdoors in the open world it's in the 80s.
Does lowering the resolution increase the frame rate for you? If so, this is not a CPU bottleneck. I don't know what to make of it.
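
That's essentially a resolution-scaling test; in sketch form (placeholder numbers, not measurements from this thread), the reasoning goes:

```python
def bottleneck_hint(fps_native: float, fps_lowres: float,
                    threshold: float = 1.10) -> str:
    """Guess the limiting factor from a resolution-scaling test."""
    if fps_lowres >= fps_native * threshold:
        return "GPU-bound at native res (fps scaled with resolution)"
    return "not GPU-bound (CPU, engine logic, or an fps cap is in the way)"

print(bottleneck_hint(fps_native=82.0, fps_lowres=112.0))   # scales -> GPU-bound
print(bottleneck_hint(fps_native=100.0, fps_lowres=103.0))  # flat -> look elsewhere
```

The confusing part in this thread is that fps scales with resolution (pointing at the GPU) while reported GPU usage sits in the 80s, which is why neither the CPU nor the GPU looks obviously guilty.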
 

Rickyrozay2o9

Member
Dec 11, 2017
4,424
Does lowering the resolution increase the frame rate for you? If so, this is not a CPU bottleneck. I don't know what to make of it.
Yep, it does. I turned Reflex down to On instead of On+Boost and it gave me 3-4% more GPU usage, and my fps was a bit more stable. Funnily enough, my CPU usage stayed at about 24% in the first town area regardless of what resolution it was set to, but dropping the resolution to 1440p increased my fps quite a bit while GPU usage stayed roughly the same as at 4K.
 

dgrdsv

Member
Oct 25, 2017
11,893
Do you see something weird with the CPU usage in this game? There are spots across the map where GPU usage drops to 80% on my 4090, which really should not happen with a 7800X3D and CL30 DDR5 RAM. CPU load is moderate at 40% overall, and I can't even say the game isn't utilising all cores: they are all being used, albeit middling, and none of them are maxed out when this happens.
Yep, same here. I'm not sure that it's CPU usage per se, since the CPU, as you've said, isn't really loaded.
What's more, I've found a "funny" spot in one of the cauldrons (Kappa) where the CPU gets hammered hard for whatever reason during water level changes: CPU usage at these points jumps from 40% MT / 60% ST to around 75% MT / 90% ST, and this has absolutely zero effect on the framerate.

I tried lowering my resolution to 1440p and saw a 30 fps gain, which should not have happened if this were a CPU bottleneck. So what is causing this GPU bottleneck?
No idea. As I've said, the game doesn't look well optimized at all. We can only hope they'll fix this with patches.
 

Shaz12567

Member
Jun 7, 2021
488
Yep, same here. I'm not sure that it's CPU usage per se, since the CPU, as you've said, isn't really loaded.
What's more, I've found a "funny" spot in one of the cauldrons (Kappa) where the CPU gets hammered hard for whatever reason during water level changes: CPU usage at these points jumps from 40% MT / 60% ST to around 75% MT / 90% ST, and this has absolutely zero effect on the framerate.


No idea. As I've said, the game doesn't look well optimized at all. We can only hope they'll fix this with patches.
I dug around with my LG C1 at 4K and my OLED G9 at 5120x1440 (about 90% of the pixels of 4K) and noticed a huge bug w.r.t. CPU usage.

LG C1 at 4K: 101 fps with 99% GPU usage. The game runs like it should.

[screenshot: pyCPK6e.jpeg]

Same spot, same settings on the OLED G9 at 5120x1440: fps massively drops to 82 and GPU usage goes down to 82%.

[screenshot: KldGJu4.jpeg]

I have never seen this behavior in any game, where increasing the resolution increases my fps, lol.
 

NeoBob688

Member
Oct 27, 2017
3,646
I have never seen this behavior in any game, where increasing the resolution increases my fps, lol.

Since the engine presumably performs a heavy amount of offscreen culling, isn't it possible that with an ultrawide screen the engine needs to handle not just pixels but additional world detail in view at any point in time, which could increase how much the CPU is taxed? I don't know, just thinking through your results.
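
To put a number on that intuition: most games hold the vertical FOV fixed and widen the horizontal FOV with the aspect ratio (Hor+ scaling), so a 32:9 panel sees far more of the world per frame than a 16:9 one. A small sketch (the 50-degree vertical FOV is an assumption for illustration):

```python
import math

def horizontal_fov_deg(vertical_fov_deg: float, aspect: float) -> float:
    """Horizontal FOV from a fixed vertical FOV and an aspect ratio (Hor+)."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

for label, aspect in (("16:9 (3840x2160)", 16 / 9), ("32:9 (5120x1440)", 32 / 9)):
    print(f"{label}: {horizontal_fov_deg(50.0, aspect):6.1f} deg horizontal FOV")
```

Roughly 79 vs 118 degrees here: everything inside that wider frustum survives culling and has to be prepared by the CPU each frame, even though 5120x1440 pushes fewer pixels than 4K.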
 

P40L0

Member
Jun 12, 2018
7,631
Italy
No idea. As I've said, the game doesn't look well optimized at all. We can only hope they'll fix this with patches.
The game at v1.1.47, with max settings at 4K + DLSS (3.7.0) Quality + FG + Reflex On and all my PC optimizations applied, looks and runs like this all the time on my 7800X3D and 4080 (undervolted + overclocked):

[screenshot: Horizon-Forbidden-West-Complete-Edition-v1-1-47-0-07-04-2024-23-57-51.png]

Frametime is literally flat, with very small fluctuations totally sorted out by G-Sync on my LG G3 OLED.
Only in cutscenes are there some very small stutters on some scene transitions, and often you don't even notice them.

So yeah, maybe Nixxes could optimize it more and also solve those cutscene hiccups, but overall I think the game is among the best PS5/Console ports on PC so far.
 

Rickyrozay2o9

Member
Dec 11, 2017
4,424
The game at v1.1.47, with max settings at 4K + DLSS (3.7.0) Quality + FG + Reflex On and all my PC optimizations applied, looks and runs like this all the time on my 7800X3D and 4080 (undervolted + overclocked):

[screenshot: Horizon-Forbidden-West-Complete-Edition-v1-1-47-0-07-04-2024-23-57-51.png]

Frametime is literally flat, with very small fluctuations totally sorted out by G-Sync.
Only in cutscenes are there some very small stutters on some scene transitions, and often you don't even notice them.

So yeah, maybe Nixxes could optimize it more and also solve those cutscene hiccups, but overall I think the game is among the best PS5/Console ports on PC so far.
Yeah, pretty much; with these settings and my 13700K/4090, this game's optimization is quite good. Even when I was using On+Boost Reflex I wasn't really seeing THAT much performance degradation, as I was essentially still getting 4K/120+ everywhere with very, very little stuttering, except during cutscenes when FG activates or whatever is going on. Switching Reflex to just On did help a bit performance-wise though, so I'm leaving it there.
 

dgrdsv

Member
Oct 25, 2017
11,893
Frametime is literally flat, with very small fluctuations totally sorted out by G-Sync on my LG G3 OLED.
Frametime is fine when I'm GPU-limited with FG (so <116 fps); it only goes places when I'm hitting either the Reflex fps limit or *something* that results in the GPU being underutilized while still not hitting the Reflex limit.
I would say it's the CPU, but CPU load is <50% ST and <20% MT during these moments.

Btw, I've started Burning Shores, and there's basically zero difference in framerate, but CPU usage there is generally a notch higher than in the main campaign: it hangs around 25% MT / 50% ST most of the time with spikes to 30/60.
The main campaign is a bit all over the place (due to the size and variety of scenes, probably), but generally it's around 20/40.

The biggest difference I've noticed between the original game and BS is the streaming implementation: the original doesn't really stream much data after it finishes loading (which happens some minutes after you've actually loaded the save), with general figures at <1 MB/s.
BS, however, seems to stream in data whenever you pan the camera, with peak figures at ~550 MB/s (basically the SATA SSD limit). No idea what it's streaming in/out, as I don't see any lazy loading of anything, and the game certainly doesn't use either RAM or VRAM to the fullest on my 64/24 GB config.

So yeah, maybe Nixxes could optimize it more and also solve those cutscene hiccups, but overall I think the game is among the best PS5/Console ports on PC so far.
I'll give you that, sure. It is "among the best PS5/Console ports on PC so far". It is still not a good port though, just an okay one.
 

P40L0

Member
Jun 12, 2018
7,631
Italy
Frametime is fine when I'm GPU-limited with FG (so <116 fps); it only goes places when I'm hitting either the Reflex fps limit or *something* that results in the GPU being underutilized while still not hitting the Reflex limit.
I would say it's the CPU, but CPU load is <50% ST and <20% MT during these moments.

I'll give you that, sure. It is "among the best PS5/Console ports on PC so far". It is still not a good port though, just an okay one.
As you can see from the pic, I'm constantly hitting the Reflex cap (116 fps with my 120 Hz TV) with no hiccups whatsoever during gameplay, and this is with a 4080.
Maybe I'll post a video too later to better show it.
So, yeah, something may be off in your setup somewhere, but it should not be the game at this point.
 

P40L0

Member
Jun 12, 2018
7,631
Italy
It's not always flat; you have stutters in that clip during gameplay, and you can see the frametime spike up plenty of times.
The stutters in the clip are due to Xbox DVR (which sucks, but it was the only way to tone-map my HDR clip to SDR automatically, as I didn't want to install/use GeForce Experience or OBS; it was just a quick clip). The "spikes" you see in the frametime only happen during that brief cinematic; otherwise it's flat, or the fluctuation is so small as to be totally unnoticeable and/or smoothed out by G-Sync.

EDIT:
Oh, and in the town area with lots of NPCs it's the same:

[screenshot: Horizon-Forbidden-West-Complete-Edition-v1-1-47-0-08-04-2024-17-35-17.png]

[screenshot: Horizon-Forbidden-West-Complete-Edition-v1-1-47-0-08-04-2024-17-58-09.png]
 

dgrdsv

Member
Oct 25, 2017
11,893
I used Xbox DVR to record it, so the video is a bit choppy, but look at the frametime during the whole clip: it's literally always flat except for the small hiccups I mentioned during cutscenes.
I'm not disputing that the game seemingly runs better on Zen 4 3D chips for whatever reason.
However, this game is literally the only one that shows such judder on my system, so it is absolutely the game's issue; and if it's an issue with the game running on anything but Zen 4 3D CPUs, then it is obviously an optimization issue.
And that's before we get into some additional food for thought on the GPU side, like the fact that the 7900 XT seems to be on par with the 4080S here, or that my 4090 basically never exceeds 350 W power consumption (out of its 450 W limit) in this game despite showing 95-100% utilization.
 

P40L0

Member
Jun 12, 2018
7,631
Italy
I'm not disputing that the game seemingly runs better on Zen 4 3D chips for whatever reason.
However, this game is literally the only one that shows such judder on my system, so it is absolutely the game's issue; and if it's an issue with the game running on anything but Zen 4 3D CPUs, then it is obviously an optimization issue.
And that's before we get into some additional food for thought on the GPU side, like the fact that the 7900 XT seems to be on par with the 4080S here, or that my 4090 basically never exceeds 350 W power consumption (out of its 450 W limit) in this game despite showing 95-100% utilization.
I thought you also had the 7800X3D?

How does it run for you without FG, btw? Because in that case it's a near-constant judder fest here while being 100% GPU-limited.
Without FG I'm hovering around 90 fps, but still with an almost perfect frametime graph.
I honestly prefer the additional visual smoothness of FG at 116 fps, as the inevitable input-lag increase is barely noticeable.
 

P40L0

Member
Jun 12, 2018
7,631
Italy
5900X (as can be seen in all the screenshots I've posted).
I recalled you mentioning the 7800X3D too in a previous post, but if not, that's another story.

This is more interesting, honestly, as it suggests that the issue is in GPU utilization more than the CPU.
Without FG the frametime graph is not as flat as with it (the line is "thicker", with constant but very low fluctuations; stutters are still absent on the 7800X3D, as everything is still perfectly managed by G-Sync except the usual small drops in cinematics), so this, along with your 5900X, makes me think yours is a CPU utilization issue instead.
 

Shaz12567

Member
Jun 7, 2021
488
The game at v1.1.47, with max settings at 4K + DLSS (3.7.0) Quality + FG + Reflex On and all my PC optimizations applied, looks and runs like this all the time on my 7800X3D and 4080 (undervolted + overclocked):

[screenshot: Horizon-Forbidden-West-Complete-Edition-v1-1-47-0-07-04-2024-23-57-51.png]

Frametime is literally flat, with very small fluctuations totally sorted out by G-Sync on my LG G3 OLED.
Only in cutscenes are there some very small stutters on some scene transitions, and often you don't even notice them.

So yeah, maybe Nixxes could optimize it more and also solve those cutscene hiccups, but overall I think the game is among the best PS5/Console ports on PC so far.
Maybe I am just more sensitive to it, but Frame Gen + Reflex definitely stutters more than not using those features. It's clearly visible on the frametime graph. I am using a 7800X3D and a 4090; see the video below. With Frame Gen disabled, the frametime graph is a flat line. When I enable it (skip to 1:28 in the video for the result), you can see those very small spikes every 2-3 seconds or so in the frametime graph. They may look small, but because the frame rate is so high, they feel like hitches.

View: https://www.youtube.com/watch?v=5x6pf8NlUKM&ab_channel=BZRT

There is also another broken CPU issue with this game. As you can see in my video, GPU usage goes down to 90% in one spot, but the CPU is barely being used, at 28%, for some reason, so I'm not sure why the drop. Funnily enough, I tried the same scene on my LG C1 OLED (my OLED G9 renders about 90% of the pixels of 4K, so it's close enough) and my frame rate increased and GPU usage jumped to 100%. Increasing the resolution increases my frame rate in this case.
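
The "small spikes every 2-3 seconds" claim is easy to check against a captured frametime log (e.g. exported from CapFrameX or PresentMon) rather than by eyeballing the OSD. A minimal sketch with a synthetic trace:

```python
import statistics

def find_spikes(frametimes_ms: list[float], factor: float = 1.5) -> list[int]:
    """Return indices of frames that took over `factor` times the median."""
    median = statistics.median(frametimes_ms)
    return [i for i, ft in enumerate(frametimes_ms) if ft > factor * median]

# Synthetic trace: a flat ~8.6 ms line (116 fps) with two injected hitches.
# The 1.5x-median threshold is an arbitrary illustrative choice.
trace = [8.6] * 300
trace[100] = trace[250] = 18.0
print(f"spikes at frame indices: {find_spikes(trace)}")
```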
 

dgrdsv

Member
Oct 25, 2017
11,893
Maybe I am just more sensitive to it, but Frame Gen + Reflex definitely stutters more than not using those features. It's clearly visible on the frametime graph. I am using a 7800X3D and a 4090; see the video below. With Frame Gen disabled, the frametime graph is a flat line. When I enable it, you can see those very small spikes every 2-3 seconds or so in the frametime graph. They may look small, but because the frame rate is so high, they feel like hitches.
These certainly aren't hitches; this is in general how you'd expect Reflex to operate, as it tries to marry CPU output to GPU input (to minimize latency), which means some variability in frametime is inevitable.

I've seen some reports online that recent Nvidia drivers have introduced an issue with HAGS that leads to microstuttering, which would basically be what I see in HFW at the fps limit when running with FG. But alas, testing this in Starfield didn't show anything similar; there is some minor frametime variation, as you'd expect from Reflex+FG, but nothing at the level of the constant judder in HFW.
So that's another theory disproved.

There is a note in Nvidia's driver release thread that they are looking into low GPU utilization in HFW when Reflex is in On+Boost mode. Maybe this will fix it...
 

P40L0

Member
Jun 12, 2018
7,631
Italy
I've found a way to directly record gameplay in 4K/60 with color-accurate HDR (which is still oversaturated when using NVIDIA ShadowPlay instead), while still getting much smoother (though still not 100% smooth) video encoding than native Windows/Xbox DVR at 1080p/60/SDR.


View: https://youtu.be/npNhx7F-pgo?si=WA4j6FA8ZMfgvg4f


View: https://www.youtube.com/watch?v=8tVttlTwFdc

I highly recommend watching them in HDR from a native HDR display!


The only drawback to having those good-looking clips is that the recorder kills Reflex, and the overall performance of the game takes a hit of around 6-7%.
I don't care. It just looks too good not to share. :)
 

Shaz12567

Member
Jun 7, 2021
488
These certainly aren't hitches; this is in general how you'd expect Reflex to operate, as it tries to marry CPU output to GPU input (to minimize latency), which means some variability in frametime is inevitable.

I've seen some reports online that recent Nvidia drivers have introduced an issue with HAGS that leads to microstuttering, which would basically be what I see in HFW at the fps limit when running with FG. But alas, testing this in Starfield didn't show anything similar; there is some minor frametime variation, as you'd expect from Reflex+FG, but nothing at the level of the constant judder in HFW.
So that's another theory disproved.

There is a note in Nvidia's driver release thread that they are looking into low GPU utilization in HFW when Reflex is in On+Boost mode. Maybe this will fix it...
I don't see the same behaviour in Starfield, TLOU Part 1, or Cyberpunk, all of which use Reflex. I have modded FG into RDR2, which also doesn't show this.
 

Shaz12567

Member
Jun 7, 2021
488
I've found a way to directly record gameplay in 4K/60 with color-accurate HDR (which is still oversaturated when using NVIDIA ShadowPlay instead), while still getting much smoother (though still not 100% smooth) video encoding than native Windows/Xbox DVR at 1080p/60/SDR.


View: https://youtu.be/npNhx7F-pgo?si=WA4j6FA8ZMfgvg4f


View: https://www.youtube.com/watch?v=8tVttlTwFdc

I highly recommend watching them in HDR from a native HDR display!


The only drawback to having those good-looking clips is that the recorder kills Reflex, and the overall performance of the game takes a hit of around 6-7%.
I don't care. It just looks too good not to share. :)

What settings are you using for HDR?
 

dgrdsv

Member
Oct 25, 2017
11,893
What settings are you using for HDR?
I experimented with that quite a bit on my C1 and settled on 825 max, 260 paper white (basically the default), and 0/0 for the boosts.
Boosts lead to color-range compression, which results in a loss of color information in either dark or bright tones.
And setting paper white lower than the default essentially kills contrast.
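
For a sense of what those numbers mean (values taken from the post above; the math is just arithmetic): the ratio of peak brightness to paper white sets how much headroom highlights get.

```python
import math

peak_nits = 825.0         # calibrated peak brightness
paper_white_nits = 260.0  # reference/diffuse white level

headroom = peak_nits / paper_white_nits  # ~3.2x
print(f"highlight headroom: {headroom:.2f}x "
      f"(~{math.log2(headroom):.1f} stops above paper white)")
```

Dropping paper white below the default would widen that ratio, but only by dimming everything below it, which seems to be the contrast loss being described.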
 


P40L0

Member
Jun 12, 2018
7,631
Italy
Your screenshot shows frametime judder when the game is at the Reflex limit. Precisely the issue I was talking about.
Whether you personally notice it on your display is a different question.
That judder is present only in "extreme" cases like this village, and it's so small that it's fully compensated for by G-Sync. No one would notice it when it's like that.

In more "relaxed" environments it's completely flat with the same config (DLSS Quality + FG + Reflex: On):

[screenshot: Horizon-Forbidden-West-Complete-Edition-v1-1-47-0-07-04-2024-23-57-51.png]


"Stutters" are a different thing: you would see high vertical lines literally interrupting the Frametime you see: you will notice that even with G-Sync. They're absent on both scenarios (but there are some in cutscenes' camera changes).

I think that in the most complex scenarios (open world + lots of NPCs) the game is just constantly streaming assets in and out of memory, and it's doing so intelligently enough that fluctuations stay that minimal, at least in my case. Your mileage may vary with different CPUs and PC configurations.
 

dgrdsv

Member
Oct 25, 2017
11,893
That judder is present only in "extreme" cases like this village
For you. It is present pretty much everywhere for me, and I'd wager that this is because I have a 4090, which hits the Reflex limit more often.

and it's so small that it's fully compensated for by G-Sync
It's not "compensated by Gsync" at all because it's exactly what you see with Gsync - a constant frametime judder. It looks like you're getting less fps than you actually are, basically the good old definition of AFR microstutter.
 

P40L0

Member
Jun 12, 2018
7,631
Italy
For you. It is present pretty much everywhere for me, and I'd wager that this is because I have a 4090, which hits the Reflex limit more often.
I almost always stay at a locked 116 fps, so I'm hitting that Reflex ceiling with my undervolted + OC 4080 as well.
No noticeable stutters, either in the graph or by eye. We discussed this already: it seems more CPU- or PC-config-related than the GPU or the game itself.
Maybe you're polling too much data from the sensors for all those real-time stats, creating a CPU bottleneck somewhere?

It's not "compensated by Gsync" at all because it's exactly what you see with Gsync - a constant frametime judder. It looks like you're getting less fps than you actually are, basically the good old definition of AFR microstutter.
It is on my LG G3 OLED with G-Sync enabled. There's no noticeable stutter or "microstutter" when the frametime graph looks like those cases (and for the record: it's 99% of the time flat like one of the two screenshots I posted, 1% like the other).
Everything is always smooth as butter except cutscene camera cuts (which literally "cut" the frametime with an actual, noticeable stutter as well).
 

dgrdsv

Member
Oct 25, 2017
11,893
We discussed this already: it seems more CPU- or PC-config-related than the GPU or the game itself.
It's 100% down to the way the game handles the GPU. The issue is exacerbated by the "Boost" mode to the point where it literally stutters all the time with no CPU load to speak of. This is already documented among the Nvidia driver issues.

Maybe you're polling too much data from the sensors for all those real-time stats, creating a CPU bottleneck somewhere?
C'mon man, it's exactly the same without any OSDs.