
GhostofWar

Member
Apr 5, 2019
512
You can clearly see how the GPU is underutilized (50-70%) with 10GB of VRAM in use. Afterburner isn't the best tool to monitor VRAM usage, but come the fuck on 😂 you can clearly see it at 5:48 in the video, where he says it stutters because of VRAM.
I don't understand why you're all so stubborn. This is a 13-year-old game; now imagine a 2022 next-gen game at 4K 🤷🏻‍♂️

Imagine a world where GPU utilization could be impacted by other factors.

www.dsogaming.com

Crysis Remastered suffers from single-thread CPU issues, just like the original game

Crysis is heavily CPU bottlenecked, even on modern-day systems. Unfortunately, Crysis Remastered suffers from the very same CPU optimization issues.
 

Flygon

Member
Oct 28, 2017
1,384
Isn't the main reason the RTX 3080 doesn't have 20GB of VRAM that memory modules of the required density aren't manufactured yet?

Unless I've dramatically misread the situation, I was assuming the only reason the VRAM count was so low was that they couldn't add more without doubling the number of memory chips.
 
OP

Darktalon

Member
Oct 27, 2017
3,274
Kansas
Isn't the main reason the RTX 3080 doesn't have 20GB of VRAM that memory modules of the required density aren't manufactured yet?

Unless I've dramatically misread the situation, I was assuming the only reason the VRAM count was so low was that they couldn't add more without doubling the number of memory chips.
They could have put chips on both sides of the PCB, just like the 3090. And if the 3080 20GB is real, that is what they will be doing.

But GDDR6X is also expensive, and if 20GB were the default, the MSRP would not be $700. I honestly believe that a 20GB 3080 will be $900.
 

Zips

Member
Oct 25, 2017
3,917
You can clearly see how the GPU is underutilized (50-70%) with 10GB of VRAM in use. Afterburner isn't the best tool to monitor VRAM usage, but come the fuck on 😂 you can clearly see it at 5:48 in the video, where he says it stutters because of VRAM.
I don't understand why you're all so stubborn. This is a 13-year-old game; now imagine a 2022 next-gen game at 4K 🤷🏻‍♂️
The Crysis Remaster is a fucking abomination of a "Remaster" and is more akin to a port, having used the X360/PS3 version of Crysis as its basis. Hell, the damn thing is still super reliant on a single CPU thread, which is one of the issues this "Remaster" was supposed to address and didn't actually touch. Its poor performance has nothing to do with VRAM allocation. It's just shit through and through.

You can see for yourself by visiting our own thread about the game here, where literally everyone is complaining about performance across all varieties of graphical options. Hitching and performance drops are reported in pretty much every reply.

www.resetera.com

Crysis Remastered |OT| I Know Now Why You CryEngine OT

👩‍💻 Developer: Crytek 🎮 Platforms: PC (EGS, Steam); PlayStation 4 (PSN); Switch (eShop); Xbox One (XBL) 🏷️ Price: $29.99/€29.99/£24.99 or your regional equivalent* 📅 Release date: July 23rd, 2020 (Switch); September 18th, 2020 (all other platforms) * Pricing on platforms/stores other than Steam...
 

mhayze

Member
Nov 18, 2017
555
Not sure how many of you run multi-monitor. I have a 4K display plus a 1440p screen, which will eventually be switched out for 2x 4K. In Windows 10, DWM (the desktop window manager), with a series of apps running including browsers and productivity apps, uses over 2GB of dedicated VRAM, plus some shared VRAM. I have 32GB of system RAM so that I can jump in and out of games with all my browsers and editors up. Sometimes, when I've left my PC on for a few days and changed resolutions a few times, it goes up to 3GB+ before I start a single game. I've already throttled the browsers' GPU settings to stop them from grabbing too much VRAM; otherwise Firefox alone can use 2GB. For those who think this isn't "real" and games can use all the VRAM they need: try starting Alyx with all this stuff in use. The stutter makes me want to hurl.

Now, this is with an 8GB card, but I think a comfortable multi-monitor setup where I don't need to close every app before playing a game is a good example of why I would prefer more than 10GB. I agree that at this time 20GB is going to be expensive and not the best bang for the buck, but I wish 16GB had been supported. As far as I can tell, 15GB should be possible, using half-density chips for the back of the board, similar to how you can run 4x8GB + 4x4GB on a quad-channel X99 board to get 48GB at full speed.
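A rough sketch of that 15GB arithmetic: ten 32-bit channels on a 320-bit bus is the 3080's actual layout, while the half-density 0.5GB back-side modules are the poster's hypothesis, used here purely for illustration.

#include <cstdio>

int main() {
    // RTX 3080: 320-bit bus = ten 32-bit channels, each with one 1GB GDDR6X chip.
    const int channels = 10;
    const double front_gb = 1.0; // 8Gb (1GB) modules, as shipped
    const double back_gb = 0.5;  // hypothetical 4Gb (0.5GB) modules, clamshelled on the back

    printf("stock:           %.0f GB\n", channels * front_gb);              // 10 GB
    printf("mixed clamshell: %.0f GB\n", channels * (front_gb + back_gb));  // 15 GB
    printf("full clamshell:  %.0f GB\n", channels * (front_gb * 2.0));      // 20 GB
    return 0;
}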
 

Flygon

Member
Oct 28, 2017
1,384
They could have put chips on both sides of the PCB, just like the 3090. And if the 3080 20GB is real, that is what they will be doing.

But GDDR6X is also expensive, and if 20GB were the default, the MSRP would not be $700. I honestly believe that a 20GB 3080 will be $900.
I'm running under the assumption that Nvidia is betting that double-capacity chips will be substantially cheaper than engineering in double the number of existing chips.
 

Shyotl

Member
Oct 25, 2017
1,272
I'll just wait. I mean, even if 10 gigs is "enough", I can't justify going from a 16-gig card to a 10-gig card. I'll simply wait until something else comes out. I've always bought cards with extra RAM since, like, the old 9800 Pro days, because historically I've found that it eventually does matter after 2-3 years. My Radeon VII will hold me over unless it decides to finally catch fire sometime soon. I also run two 1440p monitors and usually have multiple GPU-accelerated applications open at once, so spare VRAM is useful for me.
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,650
Is VRAM utilisation lower at 1440p vs 4K, even if the same "high definition" textures are used?
 
Oct 25, 2017
41,368
Miami, FL
Of course the Nvidia employee will say on Reddit that "10GB of VRAM is enough"; that's why they're an Nvidia employee.

They also have to market these not-future-proof $700 GPUs now so they can sell the 20GB version for $900 later.

Steve Burke from GN was also wrong to answer the question "Will 10GB be enough for the future?" with "Yes, it is enough today, and we tested with games" (current-gen games, not future ones!).

The explanation is simple.
- DirectStorage is only coming to PC in beta sometime next year, and won't be ready until sometime in 2022. DirectStorage is required to access the NVMe SSD directly through the protocol, lessen the I/O strain on the devices, enable more IOPS, etc.
- VRS and Sampler Feedback Streaming would be needed as well, but it is not known how many developers will use them.
Add these two things together and it is clear that, at least until sometime in 2022, more VRAM will still be very, very helpful.

That also means PC gamers need more VRAM so the graphics engine can allocate more data and textures, even if they are not used.

Not enough VRAM for allocation means a higher probability of stutters, frame-time spikes, and so on.


That's why Nvidia is preparing 16/20GB versions of their cards. They know it. Don't fall for it by buying these VRAM-bottlenecked cards.
mmmmmmmmmmmmmm

nah.
 

Jroc

Member
Jun 9, 2018
6,145
Isn't the main reason the RTX 3080 doesn't have 20GB of VRAM that memory modules of the required density aren't manufactured yet?

Unless I've dramatically misread the situation, I was assuming the only reason the VRAM count was so low was that they couldn't add more without doubling the number of memory chips.

GDDR6X was too expensive. The 3090 can get away with it because it's already expensive as hell. The 3070 was locked to 8GB of GDDR6 because they couldn't realistically market a 16GB card when the higher-end model only had 10GB.

AMD isn't using GDDR6X, so they can afford to crank the VRAM without worrying about marketing optics or cost issues.
 
MSI Afterburner can now show per-process VRAM
OP

Darktalon

Member
Oct 27, 2017
3,274
Kansas
Good news, everyone: MSI Afterburner developer Unwinder saw my thread and added a way to see per-process VRAM in the current beta!

  1. Install MSI Afterburner 4.6.3 Beta 2 Build 15840 from https://www.guru3d.com/files-details/msi-afterburner-beta-download.html
  2. Enter the MSI Afterburner settings/properties menu
  3. Click the monitoring tab (should be 3rd from the left)
  4. Near the top and next to "Active Hardware Monitoring Graphs" click the "..."
  5. Click the Checkmark next to "GPU.dll", and hit OK
  6. Scroll down the list until you see "GPU Dedicated Memory Usage", "GPU Shared Memory Usage", "GPU Dedicated Memory Usage \ Process", "GPU Shared Memory Usage \ Process"
  7. Pick and choose what you want tracked using the checkmarks next to them. "GPU Dedicated Memory Usage \ Process" is the number that most closely reflects the number we find in the FS2020 developer overlay and Special K (DXGI_Budget, except Unwinder uses the D3DKMT API; see the sketch below).
  8. Click show in On-Screen Display, and customize as desired.
  9. ???
  10. Profit

Important note:
"GPU dedicated memory \ process" and "GPU shared memory \ process" are currently not supported for EAC/BattlEye-protected games (they require opening the game process from an external application, and such requests won't work for EAC/BE-protected games).
-Unwinder
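For context on that DXGI_Budget number: any D3D application can query its own VRAM budget and current usage in-process through IDXGIAdapter3::QueryVideoMemoryInfo, which is the value tools like Special K surface. A minimal standalone sketch, assuming the GPU of interest is the primary adapter at index 0 (Afterburner reads other processes through the kernel-side D3DKMT API instead, which is why it can monitor games externally):

#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    // Assumes the GPU of interest is the primary adapter (index 0).
    ComPtr<IDXGIAdapter1> adapter;
    if (factory->EnumAdapters1(0, &adapter) == DXGI_ERROR_NOT_FOUND) return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    // LOCAL = dedicated VRAM; NON_LOCAL would be shared system memory.
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) {
        printf("dedicated VRAM budget: %llu MB, this process's usage: %llu MB\n",
               info.Budget >> 20, info.CurrentUsage >> 20);
    }
    return 0;
}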
 
OP

Darktalon

Member
Oct 27, 2017
3,274
Kansas
I created a thread over here to specifically discuss MSI Afterburner's new capabilities; this is big news.

www.resetera.com

MSI Afterburner can now display per process VRAM!

Nov 17th 2020: Per-process VRAM monitoring is now supported internally, and works in all games. Install MSI Afterburner 4.6.3 Beta 4 Build 15910 from over here. Enter the MSI Afterburner settings/properties menu Click the monitoring tab (should be...
 
OP

Darktalon

Member
Oct 27, 2017
3,274
Kansas
Now that MSI Afterburner and Special K can both display per-process VRAM usage, I'd love to see people post examples from their games. I haven't been able to get a 3080 yet, so I'm still unable to show examples from cards with more than 6GB of VRAM available.
 
OP

Darktalon

Member
Oct 27, 2017
3,274
Kansas
Oh wow, I'm YouTube famous now! Just found this video linked from a Reddit post talking about Doom Eternal's 8GB VRAM limitation... makes me wonder how many other videos covering this are out there that I don't know about.
 

Azerare

Member
Oct 25, 2017
1,713
Wow, this thread is super cool. I definitely haven't been in the Discord for weeks and already know all of this shit. More people should see this so they don't think they NEED 20GB so soon.
 
OP

Darktalon

Member
Oct 27, 2017
3,274
Kansas
Oh look, Watch Dogs: Legion updated its system requirements, and now that the 3080 is the recommended card instead of the 2080 Ti, the VRAM line is magically listed as 10GB.

 

vitormg

Member
Oct 26, 2017
1,948
Brazil
Oh look, Watch Dogs: Legion updated its system requirements, and now that the 3080 is the recommended card instead of the 2080 Ti, the VRAM line is magically listed as 10GB.
Lol. Who would've thought?

On a side note, I'm looking forward to when you get your 3080 and make those sweet VRAM tests for us to clear things up and shut up trolls.

Will you do it for us? :)
 
OP

Darktalon

Member
Oct 27, 2017
3,274
Kansas
Lol. Who would've thought?

On a side note, I'm looking forward to when you get your 3080 and make those sweet VRAM tests for us to clear things up and shut up trolls.

Will you do it for us? :)
Yes, I actually already have the 3080, but I've been really busy. Hopefully I'll be able to crack into it this week.

Any particular game requests?
 

gozu

Banned
Oct 27, 2017
10,442
America
Will anybody here pay $200 more for 10 extra GB? Because that's how much it will likely cost: $900 instead of $700.

I very much doubt it. The $700 GPU with 10GB is the sweet spot for the next two generations of GPUs. Worst-case scenario, you might have to drop the internal render resolution a tad, but you will likely reap frame rate as a reward.

I wish more people thought 10GB was too little; then maybe I could score one :)
 
OP
OP
Darktalon

Darktalon

Member
Oct 27, 2017
3,274
Kansas
DOOM Eternal and Flight Simulator, because of their VRAM-hungry fame.

Personally, I'd like to see AC: Odyssey and Horizon.
Those first two are boring answers; they both already have developer overlays that tell us, haha, but I'll do it so we can compare against the new MSI Afterburner per-process option.

I'll see what I can do about all four. I have them all installed except Odyssey, I think.
 

Cyanity

Member
Oct 25, 2017
9,345
Sure, 10GB is enough for people who play games at low resolutions. I play all my games at 5K, 8K, or even higher, and there are lots of games where 12GB is not enough (e.g. FPS drops to below 1, GPU usage drops below 10%, system RAM usage starts to skyrocket, etc.). I remember back in 2015 when I had a pair of 12GB Titan X Maxwells and managed to run out of all 12GB in AC: Syndicate, a game released that same year. I plan to get a pair of 3090s, and I wouldn't be surprised at all if I run into games where 24GB is not enough.
Lmfao are you serious?

edit - it's good to know that 10GB is good enough for us peasants who only run games at 1440p for the foreseeable future.
 

super-famicom

Avenger
Oct 26, 2017
25,338
Sure, 10GB is enough for people who play games at low resolutions. I play all my games at 5K, 8K, or even higher, and there are lots of games where 12GB is not enough (e.g. FPS drops to below 1, GPU usage drops below 10%, system RAM usage starts to skyrocket, etc.). I remember back in 2015 when I had a pair of 12GB Titan X Maxwells and managed to run out of all 12GB in AC: Syndicate, a game released that same year. I plan to get a pair of 3090s, and I wouldn't be surprised at all if I run into games where 24GB is not enough.

Weird flex, but OK.
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,650
Yes, I actually already have the 3080, but I've been really busy. Hopefully I'll be able to crack into it this week.

Any particular game requests?
Everything you've got!

I'd love to know how much Watch Dogs 2 takes up, but that's one of the games that isn't compatible with Afterburner's tools atm.
 

Carbon

Deploying the stealth Cruise Missile
Member
Oct 27, 2017
10,967
Will anybody here pay $200 more for 10 extra GB? Because that's how much it will likely cost: $900 instead of $700.

I very much doubt it. The $700 GPU with 10GB is the sweet spot for the next two generations of GPUs. Worst-case scenario, you might have to drop the internal render resolution a tad, but you will likely reap frame rate as a reward.

I wish more people thought 10GB was too little; then maybe I could score one :)
I'm sure some people bought the 3090 because it had extra VRAM. Of course some will pay a couple hundred more for "peace of mind" memory capacity.

And I can kinda sorta understand it if they're the type who plan on keeping graphics cards for 4+ years. But even then, a card in the $700 range will come along in another year or two that beats the 3080 and has more VRAM. I've gotten caught up in the "paying extra for future-proofing" game before as well, and it rarely works out as well as I'd hoped.
 

elelunicy

Member
Oct 27, 2017
175
Lmfao are you serious?

edit - it's good to know that 10GB is good enough for us peasants who only run games at 1440p for the forseeable future.
Lol, 1440p is absolutely, utterly a joke in 2020.

Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceeding to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.
 

Myself

Member
Nov 4, 2017
1,282
Lol, 1440p is absolutely, utterly a joke in 2020.

Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceeding to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.

Why is that? I would much rather have better-looking graphics, higher framerates, or RT at 1440p than 4K. For a PC attached to a monitor at a desk, 4K just seems overkill, unless you somehow do desktop computing on some 40" screen, which seems ludicrous.
 

Yerffej

Prophet of Regret
Member
Oct 25, 2017
23,851
Lol, 1440p is absolutely, utterly a joke in 2020.

Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceeding to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.
Is this a joke post? Ridiculous.
 

Alexx

Member
Oct 27, 2017
237
Lol, 1440p is absolutely, utterly a joke in 2020.

Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceeding to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.

PC gamers tend to care about frame rate over resolution, and 1440p is the sweet spot between resolution and performance.
 
OP

Darktalon

Member
Oct 27, 2017
3,274
Kansas
Lol, 1440p is absolutely, utterly a joke in 2020.

Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceeding to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.
I'll take 3440x1440 every single day over 3840x2160.

And I'd take 2560x1440 at 144Hz over 4K at 60Hz every single day too.

Edit: you're the same guy who said you play at 5K-8K all the time and had SLI Titans.

Normal PC enthusiasts wouldn't say such outrageous things.

Edit 2: You are literally the Lucille Arrested Development meme

 

alphacat

One Winged Slayer
Member
Oct 27, 2017
4,940
Lol, 1440p is absolutely, utterly a joke in 2020.

Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceeding to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.

1440p is fine for PC gaming, especially if you're using a monitor on a desk.
 

dgrdsv

Member
Oct 25, 2017
12,036
Lol, 1440p is absolutely, utterly a joke in 2020.

Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceeding to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.
27" 1440p on the desk looks a lot better/bigger than 65" 4K from a couch, so I dunno.
 

elelunicy

Member
Oct 27, 2017
175
Why is that? I would much rather have better-looking graphics, higher framerates, or RT at 1440p than 4K. For a PC attached to a monitor at a desk, 4K just seems overkill, unless you somehow do desktop computing on some 40" screen, which seems ludicrous.
1440p is fine for PC gaming, especially if you're using a monitor on a desk.
Resolution is typically more important for a PC gamer who sits 2-3 ft from their monitor than for a console gamer who sits 6-12 ft from their TV.

For example:

2.5' from a 27" monitor corresponds to a horizontal viewing angle of 42.8° and a vertical viewing angle of 24.1°

7' from a 65" TV corresponds to a horizontal viewing angle of 37.3° and a vertical viewing angle of 21.0°

As you can see, the 65" TV fills less of your vision in this scenario by quite a significant margin, making resolution less important.
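Those angles follow from angle = 2·atan(half-size / distance). A quick sketch that reproduces them, assuming flat 16:9 panels measured by their full dimensions (the horizontal figures match the ones above exactly; the vertical ones land within about a degree, likely because the post measured viewable area):

#include <cmath>
#include <cstdio>

const double PI = 3.14159265358979323846;

// Viewing angles for a 16:9 display of a given diagonal, seen from a given
// distance (both in inches): angle = 2 * atan(half-size / distance).
static void angles(double diag, double dist, const char* label) {
    const double k = std::sqrt(16.0 * 16.0 + 9.0 * 9.0); // diagonal in 16:9 units, ~18.36
    double w = diag * 16.0 / k;                           // panel width
    double h = diag * 9.0 / k;                            // panel height
    double hdeg = 2.0 * std::atan(w / 2.0 / dist) * 180.0 / PI;
    double vdeg = 2.0 * std::atan(h / 2.0 / dist) * 180.0 / PI;
    printf("%s: horizontal %.1f deg, vertical %.1f deg\n", label, hdeg, vdeg);
}

int main() {
    angles(27.0, 2.5 * 12.0, "27in monitor at 2.5 ft"); // ~42.8 / ~24.9
    angles(65.0, 7.0 * 12.0, "65in TV at 7 ft");        // ~37.3 / ~21.5
    return 0;
}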
 

gozu

Banned
Oct 27, 2017
10,442
America
Lol, 1440p is absolutely, utterly a joke in 2020.

Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceeding to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.

I understand that 4K TVs have been out for a while, but beware of marketing traps!

Judging a game by its pixel count is not a good idea. You must look at image quality. How's them particles? Lighting? Blur? Hair and cloth physics? Environment physics? The list is endless.

1440p is way more than enough.