
Darktalon

Member
Oct 27, 2017
3,266
Kansas
2021 Edit:
Stop making threads or comments with false data. Please back up claims with this tool.

[Image: 7IjuEOj.png]





IMPORTANT EDIT: MSI Afterburner now has a way to display "Per Process VRAM Commit", which I refer to in this article as "VRAM Usage".
Please see https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/ for details of how to enable this feature.

Part 1: Historical evidence of consoles vs PC VRAM requirements
Historically, PC requirements have been at least partly dictated by the games being created for consoles.

I do believe we will still see ultra texture packs on PC that push further than consoles, and mods will always push boundaries too, but establishing where the consoles will be is critical to understanding what pushing further really means.

For simplicity, I will be referring to the shared system memory in the XO/XSX and PS4/PS5 as VRAM; just remember that under this design, the CPU and GPU must share it.

First, we must put things into context. The jump from PS3 to PS4 was 256MB to 8GB, a 32x increase in VRAM!

The main reason early-gen GPUs such as the 780 could not keep up for the entire seven years was that 8GB of VRAM was absolutely insane for 2013, when the PS4 was released. In fact, this was a surprise to almost everyone.
The official specs are in for the PlayStation 4 and what we have is, by and large, confirmation of existing Digital Foundry stories - with one outstanding, exciting exception. At the PlayStation Meeting yesterday, Sony revealed that its new console ships with 8GB of GDDR5 RAM, not the 4GB we previously reported. It was a pleasant surprise not just for us, but also for many game developers out there working on PS4 titles now and completely unaware of the upgrade - a final flourish to the design seemingly added in at the last moment to make PlayStation 4 the most technologically advanced games console of the next gaming era.
Taken from https://www.eurogamer.net/articles/df-hardware-spec-analysis-playstation-4

The PS4 came out in Nov 2013 with an unprecedented 8GB of VRAM, with 3GB allocated to the OS and 5GB available to developers, split between CPU and GPU however the developers chose. Using this metric, we went from 256MB to 5GB, a 20x increase.

The jump from PS4/XO to PS5/XSX is 8GB to 16GB, a 2x increase.

Let's get even more specific. The 8GB in the Xbox One is also shared; in fact, we know that the Xbox One reserves 3GB for the OS and leaves 5GB for the rest of the system.
The XSX has 13.5GB of VRAM to allocate to games, and it must share this amount with the CPU. Also, only 10GB of the XSX's VRAM has the faster 560GB/s bandwidth; the other 3.5GB runs at 336GB/s. Developers are unlikely to allocate more than 10GB to the GPU, or they run into a bandwidth penalty.

Using 13.5GB, we now have a 2.7x VRAM generational increase.
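
If you want to sanity-check these ratios (or swap in your own numbers), here is a minimal C++ sketch of the arithmetic above; the pool sizes are the developer-available figures quoted in this post, not total installed memory:

Code:
#include <cstdio>

int main() {
    // Developer-available memory pools in GB, as quoted above.
    const double ps3_vram = 0.25;  // 256MB of dedicated VRAM
    const double ps4_pool = 5.0;   // 8GB total minus ~3GB OS reservation
    const double xsx_pool = 13.5;  // 16GB total minus 2.5GB OS reservation

    printf("PS3 -> PS4: %.1fx\n", ps4_pool / ps3_vram);  // 20.0x
    printf("XO  -> XSX: %.1fx\n", xsx_pool / ps4_pool);  // 2.7x
    return 0;
}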

What were PC's up to during this time?
Nvidia's flagship GeForce 780 was released in May 2013 with 3GB of VRAM. In Nov 2013, AMD released the Radeon R9 290 with 4GB of VRAM. It wasn't until nearly a year later, in September 2014, that Nvidia's GeForce 980 was released with a more comfortable 4GB of VRAM, which held its own without issue for many years at 1080p.

Over the 7 year PS4/XO generation, Nvidia went from 3GB on the 780 to 10GB on the 3080, a 3.33x increase.
Remember, PS3 to PS4 was a 20x generational jump compared to the 2.7x generational jump from XO to XSX.
I'm using these comparisons because the X360 had the weird thing with eDRAM, and I currently do not know how much of the PS5's VRAM is reserved for the OS, but I believe we can expect similar numbers.

Screen resolution from PS3 to PS4 went from 720p to 1080p, a 2.25x jump in pixel count, and PS4 to PS5 goes from 1080p to 2160p, a 4x jump. More VRAM is being used due to the massive resolution increase, not just texture quality. Yet we are only getting a 2.7x increase in VRAM on consoles? What gives?
The answer is found in I/O.
The PlayStation 5 features 16GB of GDDR6 unified RAM with 448GB/sec memory bandwidth. This memory is synergized with the SSD on an architectural level and drastically boosts RAM efficiency. The memory is no longer "parking" data from an HDD; the SSD can deliver data right to the RAM almost instantaneously.
Essentially the SSD significantly reduces latency between data delivery and memory itself. The result sees RAM only holding assets and data for the next 1 second of gameplay. The PS4's 8GB of GDDR5 memory held assets for the next 30 seconds of gameplay.
"There's no need to have loads of data parked in the system memory waiting to potentially be used. The other way of saying that is the most of the RAM is working on the game's behalf."
The SSD allows Sony to keep RAM capacity down and reduce costs.
"The presence of the SSD reduces the need for a massive inter-generational increase in size."
Excerpt from https://www.tweaktown.com/news/7134...ep-dive-into-next-gen-storage-tech/index.html, which was an analysis of the Mark Cerny video "Road to PS5"

So, we have only a 2.7x increase in VRAM because the I/O improvements change the paradigm of how developers utilize memory next-gen.
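
To make the "next 30 seconds vs next 1 second" framing concrete, here is a hedged back-of-the-envelope sketch in C++. The 2GB burst size and the ~100MB/s HDD rate are my own illustrative assumptions; only the 5.5GB/s SSD figure and the two streaming windows come from the Road to PS5 coverage above:

Code:
#include <cstdio>

int main() {
    // Assumption: a fast camera turn can suddenly demand ~2GB of new assets.
    const double burst_gb = 2.0;

    const double hdd_gbps = 0.1;  // assumed ~100MB/s mechanical HDD
    const double ssd_gbps = 5.5;  // PS5 SSD raw rate, per Road to PS5

    // Anything storage cannot deliver in time must already sit in RAM,
    // so slower storage forces a much bigger resident "parking lot".
    printf("HDD needs %.1f s to stream the burst -> park it in RAM early\n",
           burst_gb / hdd_gbps);  // 20.0 s
    printf("SSD needs %.2f s to stream the burst -> fetch it just in time\n",
           burst_gb / ssd_gbps);  // ~0.36 s
    return 0;
}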

We recently learned that this same amazing I/O revolution will also be enabled on the RTX 30 series.
For those of us who pair an NVMe SSD with a 30-series GPU, VRAM isn't going to be a limiting factor in performance, thanks to technologies such as DirectStorage, RTX IO, and Sampler Feedback Streaming. See the end of this post for links to articles that go into more detail on what these technologies do.
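
As a rough idea of what Sampler Feedback Streaming enables, here is a conceptual C++ sketch; this is not the real D3D12 API, and mips_actually_sampled stands in for the feedback map the GPU writes. Real engines also age mips out gradually rather than evicting them immediately:

Code:
#include <cstddef>
#include <vector>

// Conceptual only: the GPU records which mip levels it actually sampled
// last frame, so the engine streams in just those, instead of keeping
// a texture's whole mip chain resident "just in case".
struct Texture {
    std::vector<bool> resident;               // one flag per mip level
    std::vector<bool> mips_actually_sampled;  // hypothetical GPU feedback
};

void stream_texture(Texture& t) {
    for (std::size_t mip = 0; mip < t.resident.size(); ++mip) {
        if (t.mips_actually_sampled[mip] && !t.resident[mip]) {
            // load_mip_from_ssd(t, mip);  // hypothetical streaming read
            t.resident[mip] = true;
        } else if (!t.mips_actually_sampled[mip] && t.resident[mip]) {
            // evict_mip(t, mip);          // hypothetical eviction
            t.resident[mip] = false;
        }
    }
}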

  • Conclusion of Part 1: 10GB of VRAM in 2021+ is not the same as 10GB of VRAM in 2020.



Part 2: Establishing why people are concerned
There is currently a very large misconception that runs through the PC gaming community.

Many people currently believe that the games they are playing today use more VRAM than they actually do. This has people worried: since they see games already "using" 8-11GB of VRAM, they conclude there is no possible way 10GB could be enough for next-gen, even considering I/O improvements.
Luckily, people are mistaken.

A long, long time ago, we old folks used a program called FRAPS to measure our framerate in games. FRAPS has been abandonware for almost 8 years now, and even before then, almost everyone had migrated to a program called RivaTuner Statistics Server. RTSS has been the premier choice for displaying FPS and other statistics for over a decade. But there is an inherent issue with RTSS, and with almost every other monitoring program on your computer, from GPU-Z to EVGA Precision to HWiNFO64: all of these tools report "Allocated VRAM", which is not the same as "VRAM Usage", which is why we see a number far higher than the reality.

GPU-Z: An imperfect tool
GPU-Z claims to report how much VRAM the GPU actually uses, but there's a significant caveat to this metric. GPU-Z doesn't actually report how much VRAM the GPU is actually using — instead, it reports the amount of VRAM that a game has requested. We spoke to Nvidia's Brandon Bell on this topic, who told us the following: "None of the GPU tools on the market report memory usage correctly, whether it's GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards will (sic) larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available."
Excerpt from https://www.extremetech.com/gaming/...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x

Brandon Bell has been with Nvidia since 2010; he is a Senior Technical Marketing Manager there and helps write Nvidia's architecture whitepapers.

Q: Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?

We're constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price. In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples - if you look at Shadow of the Tomb Raider, Assassin's Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory. Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.
Excerpt from https://www.nvidia.com/en-us/geforce/news/rtx-30-series-community-qa/

This was answered by Justin Walker, who has been with Nvidia since 2005 and is a product manager for GeForce desktop GPUs.

So how do we find out how much VRAM our games are actually using, if our current tools are misleading us?
If you want to see the actual amount of VRAM in use, not just allocated, you need to either use a built-in monitoring overlay, such as those found in Doom Eternal or Flight Simulator 2020, or, for games that do not offer their own real-time monitors, use Special K, which uses memory budgets to report VRAM.
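
For reference, the "memory budgets" Special K reads come from the DXGI video-memory interface Windows exposes. Below is a minimal C++ sketch of querying it (error handling omitted; assumes Windows 10 and a DXGI 1.4-capable adapter). These counters are per-process, which is why Special K works as an overlay injected into the game's process rather than as a separate tool; note that CurrentUsage is still a commit figure granted by the OS, not bytes the GPU is actively touching:

Code:
#include <windows.h>
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory4* factory = nullptr;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    IDXGIAdapter* base = nullptr;
    factory->EnumAdapters(0, &base);      // first (primary) adapter

    IDXGIAdapter3* adapter = nullptr;     // DXGI 1.4 interface
    base->QueryInterface(IID_PPV_ARGS(&adapter));

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // Budget: how much VRAM the OS will let this process use right now.
    // CurrentUsage: how much this process currently has committed.
    printf("Budget:       %llu MB\n", info.Budget / (1024ull * 1024ull));
    printf("CurrentUsage: %llu MB\n", info.CurrentUsage / (1024ull * 1024ull));

    adapter->Release(); base->Release(); factory->Release();
    return 0;
}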




I've included an example of how these tools report different numbers. This is using a 980 Ti, which has 6GB of VRAM, in the Game Pass version of Flight Simulator 2020.

[Image: B5581Tn.png]


In this example, the first number is from the FS2020 developer overlay and reports 2426MB in use.
The second number is from the Special K GPU widget and reports a rolling average of 2426.4MB in use.
The third number is the one most people recognize: the VRAM currently allocated, 5926MB.

In this example, the reported number that everyone uses is off by a factor of 2.44 (5926 / 2426 ≈ 2.44)!

  • Conclusion of Part 2: Actual VRAM usage in PC games is far less than believed, thanks to misleading software.



Part 3: But what does it mean?!
We have now established that current PC games actually use less VRAM than we think, and that thanks to imminent improvements like RTX IO and Sampler Feedback Streaming, our VRAM will go even further.


I am extremely confident you will still be able to play any game with ultra/max textures at 4K on a 10GB 3080 for the next two years, when Nvidia Hopper is scheduled to come out, and I am very confident you will still be able to do so in 2024, when the next-next GPUs are expected to be released.


For the people who use VRAM for things other than gaming, or for those who are worried about keeping their GPU beyond 4 years, or for those who are still distressed by the VRAM amount despite the evidence before them, Nvidia does have a product for you: the 3080 20GB, or the 3090.

For the rest of us (many of whom are not even gaming at 4K right now, but at lower resolutions such as 1440p or 1080p), the 3080 10GB will not be holding us back.

  • Conclusion of Part 3: 10GB is enough (for now)


If you actually made it this far without skipping to the end, I truly thank you for spending your time reading this, and I hope it has been enlightening.



Supplemental Reading:
Sampler Feedback Streaming: https://devblogs.microsoft.com/dire...edback-some-useful-once-hidden-data-unlocked/
DirectStorage: https://devblogs.microsoft.com/directx/directstorage-is-coming-to-pc/
RTX I/O: https://www.nvidia.com/en-us/geforce/news/rtx-io-gpu-accelerated-storage-technology/
NVIDIA Community Q/A: https://www.nvidia.com/en-us/geforce/news/rtx-30-series-community-qa/
Guide to Special K: https://www.pcgamingwiki.com/wiki/Special_K

Edit v1.1: Minor typo and grammatical corrections. Enhanced layout. Added conclusion to part 3.
Edit v1.2: Added sentence about calling console system memory, vram.
Edit v1.3: 2.45x is not 245%. My bad.
Edit v2: Updated with new information about MSI Afterburner Beta
Edit v2.1: How do I pin this picture at the top of every PC gaming Resetera thread?
 
Last edited:

delicious

Member
Apr 2, 2018
139
Thanks for the informative post. I'll take this into consideration and hopefully upgrade my PC sooner rather than later :D
 

PS9

Banned
Oct 28, 2017
5,066
I'm just keen to play Borderlands 3 ultra settings @ native 4K60 lol
 

Deleted member 56752

Attempted to circumvent ban with alt account
Banned
May 15, 2019
8,699
Does it make sense to keep the 9700K and upgrade the GPU in a couple of years, or should I just build a completely new rig by then?
 

TaySan

SayTan
Member
Dec 10, 2018
31,522
Tulsa, Oklahoma
More and more, I'm thinking about going with a nice custom 3080 instead and putting some of the money saved toward a Ryzen 4000 CPU.
 

RedOnePunch

Member
Oct 26, 2017
2,628
More and more, I'm thinking about going with a nice custom 3080 instead and putting some of the money saved toward a Ryzen 4000 CPU.

Video card upgrades are easier, unless it's an existing system where you can upgrade the CPU/motherboard later. Cheaping out on the CPU when building a PC can end up costing more in the long run.
 

sweetmini

Member
Jun 12, 2019
3,921
We always frame our minds around what we know now.
However, with big memory pools and big processing capabilities, we may see new use cases emerge :)
For example, I could easily see two players playing two different games on the same computer.
One outputting to the TV (sound from the TV), one to a monitor (sound from the PC).
10GB is ample, in my opinion, for single-player use cases.
 
OP
OP
Darktalon

Darktalon

Member
Oct 27, 2017
3,266
Kansas
Just completely shitting on the 3070 buyers tbh
8GB will be more than enough for resolutions below 4K.
This is mainly to inform people of the real numbers for VRAM usage, and to confirm that you will not need to purchase a 20GB 3080 if you are only playing at 4K and replacing your GPU at least every other generation.
 

Wollan

Mostly Positive
Member
Oct 25, 2017
8,816
Norway but living in France
Even on a PS5, if it's a non-exclusive (multiplatform) game, it likely won't use the system's memory + SSD I/O anywhere near the max, as it needs to work on other platforms. Only exclusives can truly design around the low latency and high throughput of the PS5's SSD I/O, i.e. keeping only '1 second' of gameplay in memory vs the traditional 30+ seconds.
So we will see whether 10GB is enough for Nvidia to last the generation, given that there are zero exclusives and the avid-gamer PC baseline is nowhere near ready for this I/O dream just yet (five years, maybe?). VRAM might just continue to balloon until the PC avid-gamer baseline at least reaches XSX levels of I/O. Or the PC might end up driving the multiplatform baseline here (being the lowest common denominator) for several years (before finally surpassing XSX levels).
 
Last edited:

leecming

Banned
Oct 31, 2017
74
I find it kinda amazing that non-technical folks are putting in so much effort to try to argue 10GB is enough.
 

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
To think that the big AAA games built for the old PS4/XB1, with their netbook CPUs and HDDs, can already easily eat 6GB of your GPU's memory @ 4K today.

So can you imagine something built only for consoles that are about 5-6x faster than those, and what memory those games are going to eat @ 4K on PC? You'll be pushing right up against that 10GB limitation.
 

dex3108

Member
Oct 26, 2017
22,630
I would love to understand how you come to the conclusion that 8GB of VRAM being recommended in any way invalidates the statement that 10GB is enough.

He says that 8GB is the minimum, or soon will be. If 10GB soon becomes the recommended amount, we could end up in a situation where even 10GB becomes the minimum soon enough. And where does that leave the most popular GPU segment (up to the 3060), which will get even less than 8GB of VRAM? Nvidia has been pushing the same amount of RAM on its GPUs for three generations now. AMD, on the other hand, was pushing more VRAM, but its cards couldn't actually use the full potential of that VRAM due to architecture and various other things. But we won't know how things stand until the first full next-gen AAA titles are out, and that will take a year or two.

In the end, my opinion is that anyone buying a GPU this year should look ahead two years at most, just in case. You won't be buying one GPU for the entire generation or, as you said, until 2024.
 

leecming

Banned
Oct 31, 2017
74
Can't wait for people to make this argument when we start seeing the 16GB 3070 and 20GB 3080 cards show up
 

Simulmondo

Member
Jun 8, 2020
14
Between USA and Europe
He says that 8GB is the minimum, or soon will be. If 10GB soon becomes the recommended amount, we could end up in a situation where even 10GB becomes the minimum soon enough. And where does that leave the most popular GPU segment (up to the 3060), which will get even less than 8GB of VRAM? Nvidia has been pushing the same amount of RAM on its GPUs for three generations now. AMD, on the other hand, was pushing more VRAM, but its cards couldn't actually use the full potential of that VRAM due to architecture and various other things. But we won't know how things stand until the first full next-gen AAA titles are out, and that will take a year or two.

In the end, my opinion is that anyone buying a GPU this year should look ahead two years at most, just in case. You won't be buying one GPU for the entire generation or, as you said, until 2024.
Technically 8GB will be enough for 4k... FOREVER. Is math!
BTW the marketing machine will work to wash brains.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
Meanwhile, the lead id Tech engine programmer doesn't agree




id Tech memory allocation works quite differently compared to other engines.

It's understandable why he thinks that, coming from his engine, which locks Ultra Nightmare texture streaming behind 8GB for no logical reason, as the texture quality between High and Ultra Nightmare does not change in the slightest. Same with Wolfenstein.

I would not take his word as the industry standard here, especially because id Tech likely won't use DirectStorage and Sampler Feedback, as it's based on Vulkan.

I really hope that gets sorted out and Sampler Feedback/DirectStorage get a Vulkan equivalent; otherwise, id Tech games will have trouble running on Xbox Lockhart, which apparently has only 4GB of GPU-optimized memory.
 

elelunicy

Member
Oct 27, 2017
175
Sure, 10GB is enough for people who play games at low resolutions. I play all my games at 5K, 8K, or even higher, and there are lots of games where 12GB is not enough (i.e. FPS drops below 1, GPU usage drops below 10%, system RAM usage starts to skyrocket, etc.). I remember back in 2015, when I had a pair of 12GB Titan X Maxwells, I managed to run out of all 12GB in AC: Syndicate, a game released that same year. I plan to get a pair of 3090s; I wouldn't be surprised at all if I run into games where 24GB is not enough.
 

Simulmondo

Member
Jun 8, 2020
14
Between USA and Europe
Sure, 10GB is enough for people who play games at low resolutions. I play all my games at 5K, 8K, or even higher, and there are lots of games where 12GB is not enough (i.e. FPS drops below 1, GPU usage drops below 10%, system RAM usage starts to skyrocket, etc.). I remember back in 2015, when I had a pair of 12GB Titan X Maxwells, I managed to run out of all 12GB in AC: Syndicate, a game released that same year. I plan to get a pair of 3090s; I wouldn't be surprised at all if I run into games where 24GB is not enough.
L O L
 

Fredrik

Member
Oct 27, 2017
9,003
He says that 8GB is the minimum, or soon will be. If 10GB soon becomes the recommended amount, we could end up in a situation where even 10GB becomes the minimum soon enough. And where does that leave the most popular GPU segment (up to the 3060), which will get even less than 8GB of VRAM? Nvidia has been pushing the same amount of RAM on its GPUs for three generations now. AMD, on the other hand, was pushing more VRAM, but its cards couldn't actually use the full potential of that VRAM due to architecture and various other things. But we won't know how things stand until the first full next-gen AAA titles are out, and that will take a year or two.

In the end, my opinion is that anyone buying a GPU this year should look ahead two years at most, just in case. You won't be buying one GPU for the entire generation or, as you said, until 2024.
That last bit I can agree with. My thinking is that 10GB will be plenty for now, and if it turns out to be less than enough in a few years, then I'll just upgrade to that year's 3080 equivalent. Tech moves so fast on PC; I've had three GPUs over the last 7 years, and there is no reason to think I'll stay on a 3080 until 2027.
 

Burai

Member
Oct 27, 2017
2,090
I think a good rule of thumb is that by the time you really need more than 10GB, the rest of the 3080 will be the bottleneck. 10GB might not be enough to last the generation, but if that does turn out to be the case, the 3080 won't be enough either. So why pay more for a 3080 with more than 10GB?

Anyone who's ever tried to "futureproof" a PC will know that sinking feeling: having overpaid for stuff like extra RAM, SLI'd GPUs, or dual Pentium 4s, only to find that by the time that tech became useful for real-world gaming, the components were already slow and outdated. I'm sure a good few people who bought RTX 20 series cards are feeling that same burn.

I'd far rather pay less for a well-balanced card now, knowing I may need to upgrade again in 3-5 years, than pay more for a load of VRAM that this card will never touch.
 
OP
OP
Darktalon

Darktalon

Member
Oct 27, 2017
3,266
Kansas
Sure, 10GB is enough for people who play games at low resolutions. I play all my games at 5K, 8K, or even higher, and there are lots of games where 12GB is not enough (i.e. FPS drops below 1, GPU usage drops below 10%, system RAM usage starts to skyrocket, etc.). I remember back in 2015, when I had a pair of 12GB Titan X Maxwells, I managed to run out of all 12GB in AC: Syndicate, a game released that same year. I plan to get a pair of 3090s; I wouldn't be surprised at all if I run into games where 24GB is not enough.
I agree that for resolutions above 4K, more VRAM will be needed. This is why we see the 3090 marketed as both an 8K card and a content creator's card. 8K is 4x the pixels of 4K, and that will choke nearly anything.
 

CreepingFear

Banned
Oct 27, 2017
16,766
Thanks. You've made me feel much better about having to "settle for a 3080" if I can't get a 3090.
 

Terbinator

Member
Oct 29, 2017
10,262
It is a great write-up, but I feel like the major takeaway here is the use of Special K for reporting rather than other tools.

If you're about to drop £500-700 on a card, just wait, if you can, for the competitor card releases and the next-gen games launching over the Christmas period.

We already have games that internally ask for a lot of VRAM @ 4K (GTA5 comes to mind) with an internal monitor (GR Breakpoint results removed; seems more like a bug).

DirectStorage and RTX IO are nice, but we also don't know exactly how these will play out from a component point of view with regard to the rest of users' systems.
 
Last edited:

Wollan

Mostly Positive
Member
Oct 25, 2017
8,816
Norway but living in France
I think a good rule of thumb is that by the time you really need more than 10GB, the rest of the 3080 will be the bottleneck. 10GB might not be enough to last the generation, but if that does turn out to be the case, the 3080 won't be enough either. So why pay more for a 3080 with more than 10GB?

Anyone who's ever tried to "futureproof" a PC will know that sinking feeling: having overpaid for stuff like extra RAM, SLI'd GPUs, or dual Pentium 4s, only to find that by the time that tech became useful for real-world gaming, the components were already slow and outdated. I'm sure a good few people who bought RTX 20 series cards are feeling that same burn.

I'd far rather pay less for a well-balanced card now, knowing I may need to upgrade again in 3-5 years, than pay more for a load of VRAM that this card will never touch.

A sensible post, Burai. The 3080 seems like a really well-balanced card and price point when one keeps in mind that nothing lasts forever.
Personally, I expect the 3080 will start to show its age by the time RTX I/O and DirectStorage actually start to take off properly at scale on PC.
 
Last edited:

Deleted member 16908

Oct 27, 2017
9,377
Sure 10GB is enough for people who play games at low resolutions. I play all my games at 5k, 8k, or even higher and there are lots of games where 12GB is not enough (i.e. FPS drops to below 1, GPU usage drops to below 10%, system RAM usage starts to skyrocket, etc.). I remember back in 2015 when I had a pair of 12GB Titan X Maxwell's and I managed to run out of all 12GB in AC: Syndicate, a game released in the same year. I plan to get a pair of 3090s I wouldn't be surprised at all if I run into games where 24GB is not enough.

Ladies and gentlemen, the Patrick Bateman of PC gaming.
 

sweetmini

Member
Jun 12, 2019
3,921
By the way, off the top of my head, here is what I think I remember as the acceptable sizes in the past. I am out of sync now with my old 970, so I may be wrong here and there; still, this is my feeling. To think that early on I was changing my GPU every year; it's crazy :)
1997: 12 MiB (4 + 8)
1998: 16 MiB (8 + 8)
1999: 16 MiB (8 + 8)
2000: 16 MiB
2001: 32 MiB
2002: 64 MiB
2003: 128 MiB
2004: 128 MiB
2005: 256 MiB
2006: 256 MiB
2007: 512 MiB
2008: 512 MiB
2009: 512 MiB
2010: 512 MiB
2011: 1 GiB
2012: 1 GiB
2013: 1 GiB
2014: 1 GiB
2015: 2 GiB
2016: 2 GiB
2017: 3 GiB
2018: 4 GiB
2019: 5 GiB
2020: 5 GiB
 

liquidmetal14

Banned
Oct 25, 2017
2,094
Florida
It's never been a concern, due to the paradigm shift we are seeing with the new consoles and those techniques/advancements trickling down to PC. I said it a while back: consoles, especially the PS5, have something that even a PC gamer is envious of. An advancement that will change the way games are designed and how fast the HW and SW interact with each other.

The only question was when we were going to see such advancements come to our powerful PCs. Well, it looks like soon, and I can't wait until we are well into this new standard, where the HW in our PCs will finally be utilized to a greater extent.
 

dmix90

Member
Oct 25, 2017
1,885
I am the person who got fucked by poor VRAM-related choices multiple times, and I can tell you that each time I went in thinking it would be "just enough"! I get the same feeling now with the 3080 10GB version...

ATI Radeon HD 4870, 512MB version -> GTA IV -> Barely playable, with missing or flickering textures; I had to use .ini hacks to offload some data to system RAM to get it even close to the stability of the console version installed on an HDD.

NVIDIA GTX 680 2GB -> Ryse: Son of Rome -> Couldn't even select the High texture setting, even though this card is a few times more powerful than the Xbox One. Pretty sure Quantum Break is unplayable on this card as well.

NVIDIA GTX 970 3.5GB -> This one chokes on a lot of stuff... where it was supposed to be on par with the enhanced consoles, it falls short because of VRAM capacity and bandwidth.

RTX IO/DirectStorage is not even ready on PC yet, and it will only be available on RTX cards and new AMD cards, while the console APIs are already available to every developer and there is a single spec to target (fuck Lockhart), so the console version can use every VRAM-saving trick available...

I am still thinking about the 3080 though... it will probably be another entry on my list of fuckups lol
 

Magio

Member
Apr 14, 2020
647
RTX IO might make more efficient use of VRAM, but by bypassing regular memory it will also make having a large amount of VRAM crucial.

I'm not into high end PC gaming but if I was I'd wait for Ampere Ti versions or SKUs with more VRAM in general.
 

Spence

Member
Oct 27, 2017
1,119
Sweden
I am the person who got fucked by poor VRAM-related choices multiple times, and I can tell you that each time I went in thinking it would be "just enough"! I get the same feeling now with the 3080 10GB version...

ATI Radeon HD 4870, 512MB version -> GTA IV -> Barely playable, with missing or flickering textures; I had to use .ini hacks to offload some data to system RAM to get it even close to the stability of the console version installed on an HDD.

NVIDIA GTX 680 2GB -> Ryse: Son of Rome -> Couldn't even select the High texture setting, even though this card is a few times more powerful than the Xbox One. Pretty sure Quantum Break is unplayable on this card as well.

NVIDIA GTX 970 3.5GB -> This one chokes on a lot of stuff... where it was supposed to be on par with the enhanced consoles, it falls short because of VRAM capacity and bandwidth.

So you are comparing 3.5GB with 10GB and you're worried? 🙄

It's not like you are being offered a bunch of choices and have to pick one; the industry is setting a standard here at the moment, so you can just go with the flow.
 

dmix90

Member
Oct 25, 2017
1,885
So you are comparing 3.5GB with 10GB and you're worried? 🙄

It's not like you are being offered a bunch of choices and have to choose one, the industry is setting a standard here at the moment so you can just go with the flow.
I am worried, yes. It's just NVIDIA hype at the moment... soon they will offer enhanced cards with more VRAM, or even the same cards with more VRAM, and suddenly it will matter. That standard won't be the standard for long, and I know for a fact that it will be painful to see an Ultra textures option in the game menu that is on par with consoles but that you can't use because it chokes your GPU... even if it has a few times more raw power than these consoles.

People keep forgetting that what you are playing now was designed for the Xbox One... the next-gen jump is coming soon, and it's going to be brutal.
 

Laiza

Member
Oct 25, 2017
2,171
I talked to some dev colleagues and they all pretty much said the same. How can people still defend that low amount of VRAM for the next generation?!
They didn't say 8GB is "low", though. They just said it is the floor (which, currently, with the 3070, it is). And to be honest, that's going to be true for quite a long time, especially since we won't see the same ballooning of texture sizes or resolution we saw this generation, as we're not going ABOVE 4K any time soon.

I keep saying: you have to have an argument for why we'd suddenly start dumping post-4K assets into games when we're stuck trying to squeeze decent frame rates out of 4K resolution. We're not looking at the same situation as the current gen at all.
I am worried, yes. It's just NVIDIA hype at the moment... soon they will offer enhanced cards with more VRAM, or even the same cards with more VRAM, and suddenly it will matter. That standard won't be the standard for long, and I know for a fact that it will be painful to see an Ultra textures option in the game menu that is on par with consoles but that you can't use because it chokes your GPU... even if it has a few times more raw power than these consoles.

People keep forgetting that what you are playing now was designed for the Xbox One... the next-gen jump is coming soon, and it's going to be brutal.
Not happening. 10GB is enough to play 100% of games at console parity.

People have said this in the past, and they always end up with GPUs that can't hit decent frame rates anyway, despite having tons of VRAM. You'll end up overpaying for something you can't even use.

Even beyond that, we're talking about post-4K textures being needed to saturate that VRAM. Let's be real: how much are you even going to notice the difference? Would it even bother you that much if you could "only" run textures with 1:1 texel density at 4K resolution? Like, how greedy are we being here?
 

Engin

Member
Oct 25, 2017
193
I see full VRAM usage on my 2080 Ti while playing FS2020... I still believe that next-gen games will require more VRAM than current games (Shadow of the Tomb Raider, etc.).
 

Laiza

Member
Oct 25, 2017
2,171
I see full VRAM usage on my 2080 Ti while playing FS2020... I still believe that next-gen games will require more VRAM than current games (Shadow of the Tomb Raider, etc.).
Did you not read the part about how inaccurate OSDs can be in reporting VRAM usage?

Fact is, FS2020 doesn't use more than 6GB of VRAM at 4K ultra. Your 2080 Ti is nowhere close to saturated. Get the facts straight before you start fear-mongering.
 
OP
OP
Darktalon

Darktalon

Member
Oct 27, 2017
3,266
Kansas
I see full VRAM usage on my 2080 Ti while playing FS2020... I still believe that next-gen games will require more VRAM than current games (Shadow of the Tomb Raider, etc.).
Please, it couldn't possibly be more obvious that you did not read my post, or even skim it. I covered this exact game, with a screenshot even.