VRAM Testing Results
OP
Darktalon
Member
Oct 27, 2017
3,266
Kansas
GearsTactics_2020_10_21_17_29_59_148.png


Gears Tactics, max settings at 3440x1440, Benchmark
4767MB VRAM

Interesting note here: the benchmark tells us the VRAM usage on the left side, and we can see that even the per-process metric is overshooting a little. 4.65GB was the benchmark's reported high, while RTSS lists the current usage as 4.76GB.

RTSS's per-process metric overestimating slightly is something I found to hold true almost every time I compared it to Special K's method or to internal stats such as those in FS2020 and Gears.
 
OP
Darktalon
Control_DX12_2020_10_21_04_12_52_857.png

Control Ultimate Edition
Max Settings, Quality DLSS 3440x1440
4164MB VRAM

This was during OC testing: 450W of fun.
 
OP
Darktalon
guyBXyg.jpeg

92Mmt7O.jpeg

K3gyDZE.jpeg

Age of Empires III Definitive Edition
Max Settings, First 2 are 3440x1440, 3rd is 6880x2880.

You can see how misleading VRAM stats can be: I have a total system allocation of 7811MB in the 2nd picture, yet only 1680MB of it is from AoE 3. When I took this screenshot, I had a massive number of Chrome tabs open and had not restarted my computer in quite some time. If someone did not use the per-process number, they would mislead people about how much VRAM AoE 3 was actually using.

This is also a good example of the effect resolution has: VRAM usage almost doubles between screens 2 and 3, despite using 4x the pixels.
 
OP
Darktalon
RXRnXzw.jpg

csNHVg4.jpeg

Death Stranding
Max Settings, 3440x1440 DLSS Quality.

I've included the 1st screenshot to show why you should not trust the in-game bars: it says my settings will use 4.1GB. RTSS reports 6684MB of system allocation, but the truth is only 1594MB is being used by Death Stranding.

The 2nd screenshot looks weird because I was playing in HDR. You can see in-game that we are only using 3764MB of VRAM, despite HDR and max settings. DLSS reduces VRAM usage, much as DSR increases it. And yet all the previous VRAM measurements would have reported 8665MB... is anyone seeing the pattern here?
 
OP
Darktalon
CEAzTAp.jpeg

Hearthstone
Max Settings, 2560x1440

One of the most egregious examples of "Allocated VRAM" vs. "Usage": it is off by nearly 10x!
 
OP
Darktalon
r9ON30K.jpg

Horizon Zero Dawn
Max Settings, 3440x1440 HDR

This is one of the highest VRAM using games I have found so far. Screenshot looks funny due to playing in HDR. Allocation is still off by 1.7GB!!!
 
OP
Darktalon
nN48r6C.png

qVJR95o.jpeg


Gears 5
Max Settings (Yes, even the insane settings, which are insane)
3440x1440
4.98GB VRAM

The benchmark tells us the VRAM usage on the left side. For whatever reason (Async Compute, perhaps?), per-process VRAM was not displaying, so the metric shown is just the standard RTSS allocation. Yes, a game that is borderline next-gen is using only 5GB of VRAM.
 

RCSI

Avenger
Oct 27, 2017
1,839
The Witcher 3 with the HD Reworked textures mod and Tweaks mod, hairworks off, DSR'ed to 8K (8192x4320) as I'm too lazy to change the EDID stuff. Using a 3080.

VRAM usage: 7579 MB, VRAM allocated: 9923 MB

swV8KgM.jpg

I'm fairly sure I picked the right memory option to display, and the framerate was around 25 when looking at the horizon.
 

mordecaii83

Avenger
Oct 28, 2017
6,862
I just want to pop in here and say that I'm so happy this thread exists. I know a ton of people will never see it and keep overestimating VRAM usage, but it's definitely been eye-opening.
 

Alexx

Member
Oct 27, 2017
237
r9ON30K.jpg
Horizon Zero Dawn
Max Settings, 3440x1440 HDR

This is one of the highest VRAM using games I have found so far. Screenshot looks funny due to playing in HDR. Allocation is still off by 1.7GB!!!

Could you re-run this with the game running in 4K? 7.5GB at 3440x1440 seems pretty high, and I wonder whether 4K would match or exceed 10GB.
 
Dec 13, 2018
1,521
Sorry, just poked my head in and got curious: couldn't you just have nvidia-smi polling on a second display for this (at least for Nvidia cards)? It seems like people are only interested in the per-process VRAM info anyway.
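For what it's worth, a minimal sketch of that polling idea. Hedged: `--query-compute-apps` is a real nvidia-smi query, but on Windows it typically only lists CUDA/compute processes rather than D3D games, so this is mostly a Linux-side convenience; `parse_per_process` and `poll_per_process_vram` are my own hypothetical helper names.

```python
import subprocess

def parse_per_process(csv_text):
    """Parse 'pid, used_memory' CSV rows (MiB) from nvidia-smi into a dict."""
    usage = {}
    for line in csv_text.strip().splitlines():
        if not line.strip():
            continue
        pid, mem = (field.strip() for field in line.split(","))
        usage[int(pid)] = int(mem)  # memory reported in MiB with nounits
    return usage

def poll_per_process_vram():
    """One poll of per-process VRAM usage via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-compute-apps=pid,used_memory",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_per_process(out)
```

Looping that once a second on a second display would give a rough per-process readout, though without any budget information.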
 

Jonnax

Member
Oct 26, 2017
4,920
I imagine gamers are gonna start harassing Devs now.

"How dare you allocate more ram than you're actually using!"
 

LavaBadger

Member
Nov 14, 2017
4,988
Could you re-run this with the game running in 4K? 7.5GB at 3440x1440 seems pretty high, and I wonder whether 4K would match or exceed 10GB.

It would be nice to see results for 4K. I don't know that anyone was worried about 1440p (even ultrawide) being a bottleneck.

That said, given how far below 10GB most of these numbers are, it shows there's a lot of room for usage to grow.
 

Puggles

Sometimes, it's not a fart
Member
Nov 3, 2017
2,870
Anyone try Resident Evil 2? At 4K Max it says it's using over 10GB of VRAM but that has to be a lie.
 

Jedi2016

Member
Oct 27, 2017
15,669
I must be using the wrong version of Afterburner, because I don't have those per-process options. It is the beta, but I guess it's the wrong beta? lol

I might just wait until the feature makes it into the full release.

On a side note, I was able to use HWInfo and a plugin to get a lot of this info (minus the per-process stuff) to show up on my Stream Deck, so I can see all the usage graphs without putting anything up on the screen (and the graphs are a bit easier to read at a quick glance). It's interesting to see games with odd usage statistics that push the card in different ways. Clock speed and overall usage percentage definitely don't scale linearly.
 

firstseeker

Member
Dec 4, 2019
266
The only reason I went with the RTX 3090 is for heavily modded games. I can't run my 1k+ modded Skyrim @4K on a 3080 10GB.
 

Skyfireblaze

Member
Oct 25, 2017
11,257
The only reason I went with the RTX 3090 is for heavily modded games. I can't run my 1k+ modded Skyrim @4K on a 3080 10GB.

Are you certain this is related to VRAM and not the engine collapsing on itself? My super heavily modded version of Oblivion runs at 20fps on my GTX 1070 + 8600K; there aren't any hardware bottlenecks, it's literally just the engine getting crushed under the load of mods.
 

firstseeker
Are you certain this is related to VRAM and not the engine collapsing on itself? My super heavily modded version of Oblivion runs at 20fps on my GTX 1070 + 8600K; there aren't any hardware bottlenecks, it's literally just the engine getting crushed under the load of mods.

When I run the same mod build with a GTX 1080, it dips to single-digit FPS. On my 3090, I get 60 FPS most of the time, with VRAM usage fluctuating between 13 and 18GB.
 

Jedi2016
That's the problem with just dumping a bunch of high-res textures into a game: it throws away whatever optimization the developers did with the included files. I'm not sure that really counts when you're talking about BGS, but it should hold for most games. My Skyrim isn't modded all that much, and I could hit 60 on my old 1080 no problem.
 

Jedi2016
I cover it step by step in the first post; you just need to enable the gpu.dll plug-in and then the new options will show up.
I'm not saying it doesn't work, I'm saying it doesn't work for me because I know I have the wrong version of the program for it to work. I'm just making a point, not asking for help.
 

dgrdsv

Member
Oct 25, 2017
11,884
Some of these differences are eye-opening. So can we be sure that this reading is accurate?
Well, the elephant in the room here is that even though the actual process uses significantly less VRAM than what is allocated in total, that total is still allocated, meaning it may actually be used at some point, under some combination of software and user inputs/choices.
So while the per-process number gives us some insight into how much VRAM games are actually using, as opposed to just allocating, it doesn't mean your system as a whole would be fine with just that amount of VRAM.
 

Alexandros

Member
Oct 26, 2017
17,811
Well, the elephant in the room here is that even though the actual process uses significantly less VRAM than what is allocated in total, that total is still allocated, meaning it may actually be used at some point, under some combination of software and user inputs/choices.
So while the per-process number gives us some insight into how much VRAM games are actually using, as opposed to just allocating, it doesn't mean your system as a whole would be fine with just that amount of VRAM.

I see, thanks!
 

Kaldaien

Developer of Special K
Verified
Aug 8, 2020
298
that total is still allocated, meaning it may actually be used at some point, under some combination of software and user inputs/choices.
Yes, this is true. It's why I remain firm when I say the lengths that this update goes to are inadequate.

Code:
10/23/2020 03:23:42.694: --------------------
10/23/2020 03:23:42.695: Shutdown Statistics:
10/23/2020 03:23:42.696: --------------------

10/23/2020 03:23:42.697:  Memory Budget Changed 2 times

10/23/2020 03:23:42.698:  GPU0: Min Budget:        10281 MiB
10/23/2020 03:23:42.699:        Max Budget:        10281 MiB
10/23/2020 03:23:42.700:        Min Usage:         00000 MiB
10/23/2020 03:23:42.702:        Max Usage:         06088 MiB
10/23/2020 03:23:42.703: ------------------------------------
10/23/2020 03:23:42.704:  Minimum Over Budget:     00000 MiB
10/23/2020 03:23:42.705:  Maximum Over Budget:     00000 MiB
10/23/2020 03:23:42.706: ------------------------------------

Special K monitors things a bit differently: it tracks budget high/low marks and records peak usage every time the driver informs the app that the budget has changed. And because I was doing more than just tracking usage over time, I can generate these stats at game shutdown.

At minimum, MSI Afterburner needs to start reporting the budget capacity. Until it does, it's only half a solution.
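As a rough illustration of what tracking those high/low water marks involves (my own sketch, not Special K's actual code; on a real system the samples would presumably come from DXGI's `IDXGIAdapter3::QueryVideoMemoryInfo`, which reports both `Budget` and `CurrentUsage` per memory segment):

```python
class BudgetTracker:
    """Track VRAM budget/usage high-low marks in the spirit of the
    shutdown statistics above. Illustrative sketch only."""

    def __init__(self):
        self.budget_changes = 0
        self._last_budget = None
        self.min_budget = float("inf")
        self.max_budget = 0
        self.min_usage = float("inf")
        self.max_usage = 0
        self.max_over_budget = 0

    def sample(self, budget_mib, usage_mib):
        # Call whenever the driver signals a budget change, or on a timer.
        if budget_mib != self._last_budget:
            self.budget_changes += 1
            self._last_budget = budget_mib
        self.min_budget = min(self.min_budget, budget_mib)
        self.max_budget = max(self.max_budget, budget_mib)
        self.min_usage = min(self.min_usage, usage_mib)
        self.max_usage = max(self.max_usage, usage_mib)
        self.max_over_budget = max(self.max_over_budget,
                                   max(0, usage_mib - budget_mib))
```

Feeding it two samples like `sample(10281, 0)` then `sample(10281, 6088)` reproduces the shape of the log above: a 10281 MiB budget, a 6088 MiB usage peak, and zero over-budget.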
 
OP
Darktalon
MetroExodus_2020_10_22_23_34_01_985.png

Metro Exodus

3440x1440, Max Settings, Including RTX Ultra, DLSS OFF

5.1GB VRAM

Game FPS really takes a hit moving from RTX High to Ultra, but here we have it.
 
OP
Darktalon
MonsterHunterWorld_2020_10_23_00_56_09_247.png
MonsterHunterWorld_2020_10_23_01_01_27_316.png

Monster Hunter World
3440x1440, Max Settings (Volumetric Low), No DLSS, Performance Booster Mod Installed
HD Texture Pack Installed
7.3GB VRAM

I finally feel like I can afford to have all the settings maxed out; I had to make many compromises on the 2080 Ti.
 
OP
Darktalon
Why? My 2080 Ti K|NGP|N ran it at 4K/60 Maxed. That is a _heavily_ overclocked card, but I don't think a normal 2080 Ti would struggle too much. My 1080 Ti would hold around 50-55 FPS at 4K max.
Because I try to play at 144 fps, not 60. I know you love your OLEDs, but 60 fps is not an acceptable PC gaming experience for me.
 

Kaldaien
Because I try to play at 144 fps, not 60. I know you love your OLEDs, but 60 fps is not an acceptable PC gaming experience for me.
Ahem... my OLEDs are 120 Hz :P

I just like Black Frame Insertion better than brute forcing motion w/ higher framerates. The inevitable frame instability that comes from rendering at high framerates is kind of the antithesis of what I want for motion in games. 60 FPS locked w/ BFI looks a million times better than 120 Hz mostly stable. I won't have to choose anymore whenever the hell NVIDIA decides to start selling GPUs again though :)

I was going to get a 3090 at launch, then run 120 Hz Black Frame Insertion for 240 Hz motion performance. But no NVIDIA unicorns have shown up at my front door yet, and that includes the LDAT I've been waiting on for > 1 month now..
 

Jedi2016
High framerates + Gsync = Butter

On a whim, I brought up the display on Blade Runner. It was actually allocating about 1.5GB (!!), but actual usage? 35MB. And even that's way above the game's original PC specs, so I'm guessing most of that usage is coming from ScummVM.
 

Kaldaien
That would be insanely inefficient, for something that emulates a C64 / MS-DOS era adventure engine.

Measuring general graphics memory has never been particularly useful, however. With the way it grows, and how much responsibility the driver has for managing this memory, there's not much reason for a developer to measure that statistic, much less an end-user. The technique used now has been around since Windows 10 first shipped; it's taken a while for devs to get the memo, and now users are starting to understand how lopsided these numbers have been for years :)
 
OP
Darktalon
Ahem... my OLEDs are 120 Hz :P

I just like Black Frame Insertion better than brute forcing motion w/ higher framerates. The inevitable frame instability that comes from rendering at high framerates is kind of the antithesis of what I want for motion in games. 60 FPS locked w/ BFI looks a million times better than 120 Hz mostly stable. I won't have to choose anymore whenever the hell NVIDIA decides to start selling GPUs again though :)

I was going to get a 3090 at launch, then run 120 Hz Black Frame Insertion for 240 Hz motion performance. But no NVIDIA unicorns have shown up at my front door yet, and that includes the LDAT I've been waiting on for > 1 month now..
Sure, but Monster Hunter World wouldn't be running at 4K 120Hz on a 2080 Ti without reducing settings :)
 

Kaldaien
Which brings me back to 60 Hz Black Frame Insertion. I'm totally happy with that mode; it requires a rock-solid framerate to avoid motion artifacts, but rock-solid framerates are something I'm pretty good at. 120 Hz's biggest benefit, as far as I'm concerned, is the imperceptible flicker if you use OLED motion. I can already get 120 Hz latency at lower screen refresh rates just by tuning my framerate limiter ;)
 
OP
Darktalon
Which brings me back to 60 Hz Black Frame Insertion. I'm totally happy with that mode; it requires a rock-solid framerate to avoid motion artifacts, but rock-solid framerates are something I'm pretty good at. 120 Hz's biggest benefit, as far as I'm concerned, is the imperceptible flicker if you use OLED motion. I can already get 120 Hz latency at lower screen refresh rates just by tuning my framerate limiter ;)
Next year I'll be getting the C11, or w/e they call it, to have 4K120 to go with my PS5 and to hook the PC up to. My current monitor can do ULMB at both 120 and 144Hz, but of course most recent games aren't capable of rock-solid framerates at that resolution, even with a 3080.
 
OP
Darktalon
Ghostrunner-Win64-Shipping_2020_10_27_15_51_43_613.png
Ghostrunner-Win64-Shipping_2020_10_27_15_51_56_284.png
Ghostrunner-Win64-Shipping_2020_10_27_15_52_12_632.png


Ghostrunner 3440x1440, RTX On, Max Settings

DLSS Quality, Performance, and Off, respectively.

Please, take notice of how VRAM scales with DLSS.

4.8GB - Performance
5.3GB - Quality
6.2GB - Native
 

Kaldaien
Of course DLSS eats VRAM measurable in the budget ;)

The number one thing that shows up when you track VRAM by budget is render targets; the stuff that's been allocated but isn't counting against the budget tends to be textures that can be transiently moved between VRAM and system memory. Render targets ALWAYS have to be in VRAM or the software doesn't work. That means this number is primarily a measure of your screen resolution, plus the minuscule amount of texture memory required to draw the scene each frame.

Budgets are basically a measure of the total VRAM required to draw a single frame. Any resource that can be moved out of VRAM during the process of drawing a frame won't be listed.
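A back-of-the-envelope way to see the resolution dependence. Every number here is a made-up illustrative assumption (the per-pixel size and target count are arbitrary guesses, not measurements from any game in this thread); the only real point is the linear scaling with pixel count:

```python
def render_target_mib(width, height, bytes_per_pixel=8, num_targets=6):
    """Rough render-target footprint for a deferred-style pipeline.
    bytes_per_pixel and num_targets are arbitrary illustrative guesses,
    e.g. a handful of G-buffer/HDR targets at 8 bytes per pixel."""
    return width * height * bytes_per_pixel * num_targets / (1024 * 1024)

# Footprint scales linearly with pixel count, so 4x the pixels means 4x
# the MiB: render_target_mib(6880, 2880) == 4 * render_target_mib(3440, 1440)
```

That matches the AoE 3 observation earlier in the thread only loosely (total VRAM didn't quite double at 4x pixels there, since textures don't scale with resolution), which is exactly why the budget number tracks resolution more tightly than the allocation number does.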
 

Spoit

Member
Oct 28, 2017
3,987
Unfortunately, the anti-cheat precluded getting the "Dedicated GPU memory / process" metric from the OP, but "GPU Dedicated memory" was reporting ~9GB (presumably total?) in WDL @4K with very high settings, the HD texture DLC, no DLSS, and the lowest RT setting (forgot what it was labelled). Not sure if I'm not measuring it correctly or if it's just unusually VRAM hungry.

FWIW, it was reporting ~2GB in the menus before loading into the game proper, and the in-game estimate said ~7GB, so it's not entirely impossible that it really was using that much.