> K. Jack's post is required reading. It really explains this well :)

Yes, I agree; I made it a threadmark.
You can clearly see how the GPU is underutilized (50-70%) with 10 GB of VRAM usage. Afterburner isn't the best tool to monitor VRAM usage, but come the fuck on 😂 you can clearly see it. At 5:48 into the video, he says it stutters because of VRAM.
I don't understand why you're all so stubborn. This is a 13-year-old game; now imagine a 2022 next-gen game at 4K 🤷🏻♂️
> Isn't the main reason the RTX 3080 doesn't have 20 GB of VRAM that memory modules of the required density aren't manufactured yet?
> Unless I've dramatically misread the situation, I assumed the only reason the VRAM was so low was that they couldn't get more without doubling the number of VRAM chips.

They could have put them on both sides of the PCB, just like the 3090. And if the 20 GB 3080 is real, that is what they will be doing.
> You can clearly see how the GPU is underutilized (50-70%) with 10 GB of VRAM usage. ...

The Crysis Remaster is a fucking abomination of a "Remaster" and is more akin to a port, having used the X360/PS3 version of Crysis as its basis. Hell, the damn thing is still heavily reliant on a single CPU thread, which is one of the issues this "Remaster" was supposed to address and didn't actually touch. Its poor performance has zero to do with VRAM allocation. It's just shit through and through.
> They could have put them on both sides of the PCB, just like the 3090. And if the 20 GB 3080 is real, that is what they will be doing.

I'm running under the assumption that Nvidia is betting that double-capacity chips are substantially cheaper than engineering for double the number of existing chips. But GDDR6X is also expensive, and if 20 GB were the default, the MSRP would not be $700. I honestly believe that a 20 GB 3080 will be $900.
Of course the Nvidia employee will claim on Reddit that "10 GB of VRAM is enough"; that's what an Nvidia employee would say.
They also have to market the not-future-proof $700 GPUs now in order to sell the 20 GB version for $900 later.
Steve Burke from GN was also wrong to answer the question "Will 10 GB be enough for the future?" with "Yes, it is enough today", based on tests with current-gen games, not future ones.
The explanation is simple.
- DirectStorage is only coming to PC as a beta sometime next year, and it won't be ready until sometime in 2022. DirectStorage is required to access the NVMe SSD directly through the protocol, lessen the I/O strain on the devices, enable more IOPS, etc.
- VRS and Sampler Feedback Streaming are also needed, but it is not known how many developers will actually use them.
Add these two things together and it is clear that, at least until sometime in 2022, more VRAM will still be very helpful.
That also means PC gamers need more VRAM so the graphics engine can allocate more data and textures, even if they are not all used.
Not enough VRAM for allocation means a higher probability of stutters, frame-time spikes, and so on.
That's why Nvidia is preparing 16/20 GB versions of their cards. They know it. Don't fall for it by buying these VRAM-bottlenecked cards.
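The allocation-headroom argument above can be sketched with a toy model (all numbers here are invented for illustration, not benchmarks of any real card or game): once the working set no longer fits in VRAM, the overflow spills to system RAM over PCIe, and every frame that touches spilled resources pays for it.

```python
# Toy model of frame time vs. VRAM headroom -- illustrative numbers only,
# not a benchmark of any real card or game.

def frame_time_ms(working_set_gb: float, vram_gb: float,
                  base_ms: float = 10.0,
                  spill_penalty_ms_per_gb: float = 15.0) -> float:
    """Frame time grows with the amount of data spilled out of VRAM."""
    spilled_gb = max(0.0, working_set_gb - vram_gb)
    return base_ms + spilled_gb * spill_penalty_ms_per_gb

# A 10 GB card is fine until the working set exceeds 10 GB, then it stutters:
for ws in (8.0, 10.0, 12.0):
    print(f"{ws:.0f} GB working set on a 10 GB card: "
          f"{frame_time_ms(ws, 10.0):.0f} ms/frame")
```

The point of the sketch is only the shape of the curve: frame time is flat while there is headroom and degrades sharply once allocation exceeds capacity, which is what shows up in practice as stutter and frame-time spikes.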
> Is VRAM utilisation lower at 1440p vs 4K, even if the same "high definition" textures are used?

Yes. The framebuffer doesn't have to hold as much data.
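Rough arithmetic backs this up. Assuming 4 bytes per pixel and eight full-resolution render targets (both are illustrative guesses, since real pipelines vary widely), per-pixel memory scales directly with resolution:

```python
# Back-of-the-envelope render-target memory vs. resolution.
# 4 bytes/pixel (e.g. RGBA8) and 8 full-res buffers are illustrative guesses.

def render_targets_mb(width: int, height: int,
                      bytes_per_pixel: int = 4, num_buffers: int = 8) -> float:
    return width * height * bytes_per_pixel * num_buffers / 2**20

mb_1440p = render_targets_mb(2560, 1440)
mb_4k = render_targets_mb(3840, 2160)
print(f"1440p: {mb_1440p:.1f} MB, 4K: {mb_4k:.1f} MB ({mb_4k / mb_1440p:.2f}x)")
```

4K has exactly 2.25x the pixels of 1440p, so anything sized per-pixel grows by that factor, while textures at a fixed quality setting stay the same size. Total VRAM use at 1440p is therefore lower, but by less than 2.25x overall.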
Isn't the main reason the RTX 3080 doesn't have 20 GB of VRAM that memory modules of the required density aren't manufactured yet?
Unless I've dramatically misread the situation, I assumed the only reason the VRAM was so low was that they couldn't get more without doubling the number of VRAM chips.
Thank you! :)
> Is VRAM utilisation lower at 1440p vs 4K, even if the same "high definition" textures are used?

Yes, and the same goes for DLSS: less VRAM will be used.
Thank you. 10GB seems especially fine for 1440p gaming then.
> "GPU dedicated memory \ process" and "GPU shared memory \ process" are currently not supported for EAC/BattlEye-protected games (as they require opening the game process from an external application, and such a request won't work for EAC/BE-protected games). -Unwinder
Updated OP to reflect that MSI Afterburner now has this option.
> Oh look, Watch Dogs: Legion updated their system requirements, and now that they have the 3080 as the recommended card instead of the 2080 Ti, magically the VRAM spot is listed as 10 GB.

Obviously it's because there's a downgrade.
> Oh look, Watch Dogs: Legion updated their system requirements ...

Lol. Who would've thought?
> Lol. Who would've thought? On a side note, I'm looking forward to when you get your 3080 and run those sweet VRAM tests for us, to clear things up and shut up the trolls. Will you do it for us? :)

Yes, I actually already have the 3080, but I've been really busy. Hopefully I'll be able to crack into it this week.
Yeah, the game would've looked insane if the 3080 had 11 GB.
> Yes, I actually already have the 3080, but I've been really busy. Hopefully I'll be able to crack into it this week. Any particular game requests?

DOOM Eternal and Flight Simulator, because of their VRAM-hungry reputation.
> DOOM Eternal and Flight Simulator, because of their VRAM-hungry reputation. Personally I'd like to see AC: Odyssey and Horizon.

Those first two are boring answers (they both already have developer overlays that tell us, haha), but I'll do it so we can compare them against the new MSI Afterburner per-process option.
> Sure, 10 GB is enough for people who play games at low resolutions. I play all my games at 5K, 8K, or even higher ...

Lmfao, are you serious?
> Yes, I actually already have the 3080, but I've been really busy. Hopefully I'll be able to crack into it this week. Any particular game requests?

FFXV with 4K textures.
Sure, 10 GB is enough for people who play games at low resolutions. I play all my games at 5K, 8K, or even higher, and there are lots of games where 12 GB is not enough (i.e. FPS drops below 1, GPU usage drops below 10%, system RAM usage starts to skyrocket, etc.). I remember back in 2015, when I had a pair of 12 GB Titan X Maxwells, I managed to run out of all 12 GB in AC: Syndicate, a game released that same year. I plan to get a pair of 3090s, and I wouldn't be surprised at all if I run into games where 24 GB is not enough.
> Yes, I actually already have the 3080, but I've been really busy. Hopefully I'll be able to crack into it this week. Any particular game requests?

Everything you've got!
> Will anybody here pay $200 more for 10 extra GB? Because that's how much it will likely cost: $900 instead of $700.

I'm sure some people bought the 3090 because it had the extra VRAM. Of course some will pay a couple hundred more for "peace of mind" memory capacity.
I very much doubt it. The $700 GPU with 10 GB is the sweet spot for the next two generations of GPUs. Worst case, you might have to drop the internal render resolution a tad, but you will likely reap frame rate as a reward.
I wish more people thought 10 GB was too little; then maybe I could score one :)
> Lmfao, are you serious?

Lol, 1440p is absolutely, utterly a joke in 2020. Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceeding to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.

edit - it's good to know that 10 GB is good enough for us peasants who only run games at 1440p for the foreseeable future.
> Lol 1440p is absolutely, utterly a joke in 2020. Imagine paying $700 for a graphics card ...

This post is embarrassing.
> Lol 1440p is absolutely, utterly a joke in 2020. Imagine paying $700 for a graphics card ...

Is this a joke post? Ridiculous.
> Lol 1440p is absolutely, utterly a joke in 2020. Imagine paying $700 for a graphics card ...

I'll take 3440x1440 every single day over 3840x2160.
> Lol 1440p is absolutely, utterly a joke in 2020. Imagine paying $700 for a graphics card ...

Your privilege is showing, SLI 3090 man.
> Lol 1440p is absolutely, utterly a joke in 2020. Imagine paying $700 for a graphics card ...

27" 1440p on a desk looks a lot better/bigger than 65" 4K from a couch, so I dunno.
Why is that? I would much rather have better-looking graphics, higher framerates, or RT at 1440p than 4K. For a PC attached to a monitor at a desk, 4K just seems like overkill, unless you somehow do desktop computing on some 40" screen, which seems ludicrous.
> 1440p is fine for PC gaming, especially if you're using a monitor on a desk.

Resolution typically matters more for a PC gamer who sits 2-3 ft from their monitor than for a console gamer who sits 6-12 ft from their TV.