rashbeep

Member
Oct 27, 2017
9,590
I feel like AMD has an opening here. While the 3000 series cards are monsters in the end, the 3080's performance is not near Nvidia's 2X-over-2080 claims, and the power usage is very high. According to Techspot they may have been designed a bit more as server chips. We'll see.

I didn't think Big Navi could compete with the 3090, but now I think it's easily possible (of course, then we'll get a 3090 Ti and the rat race continues)

I felt the same way for years vs Intel. I saw Intel as potentially "wasting" all that die space by putting an IGP on every single CPU, even for discrete GPU users, for example.

others have mentioned this but imo if amd truly had something at that level we would know by now, as they are about to lose a lot of potential customers tomorrow.

but we will see, obviously competition is needed rn in the gpu space
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Is it the exact same benchmarking suite as before? If not, then I'm not sure how anyone can arrive at the conclusion that the result is due to driver updates.

Afaik yes, he mentioned on Ass Creed that he actually upped the quality presets vs the last tests, where the 5700 XT lagged behind even the 2060s, and this time it beat both the 2070s and 2060s. Outside of drivers, I don't know what explains the pretty much across-the-board increases in performance over time vs the 2060s and 2070s.
 

Spoit

Member
Oct 28, 2017
4,193
Can some of the people who are down on the 1080p/1440p numbers explain something to me? I can understand why that would be a good reason not to bother upgrading at all, but if it's CPU-limited, why would you want to wait for the RDNA2 chips, which would probably be similarly CPU-limited? Why wouldn't you be disappointed in the CPU performance instead of the part that is being limited?
 

dgrdsv

Member
Oct 25, 2017
12,267
I feel like AMD has an opening here. While the 3000 series cards are monsters in the end, the 3080's performance is not near Nvidia's 2X-over-2080 claims, and the power usage is very high.
+72% across all 4K benchmarks is actually pretty close to 2X - and it's even closer to 2X in the games which hit the h/w the hardest. The gap between 2080 and 3080 will grow as more next-gen games arrive on PC.

According to Techspot they may have been designed a bit more as server chips. We'll see.
They really haven't. A100 is a server chip and the differences are apparent. GA10x aren't purely gaming chips of course, but no NV GPU has been since around G80.

I didn't think Big Navi could compete with the 3090, but now I think it's easily possible
It's not going to be "easily possible". Navi 21 will have its own issues with scaling performance across the WGPs.
Projections so far put Navi 21 at double the number of SIMDs of Navi 10 - which, coincidentally, is what you can get from a +50% perf/watt gain within 300W.
This won't result in a doubling of actual performance, though, similar to how doubling the FP32 SIMDs hasn't with Ampere (albeit Navi 2x has a higher chance of getting more here since it's doubling WGPs, not the SIMDs inside them).
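The 300W figure above checks out as simple arithmetic. A back-of-envelope sketch (the ~225W Navi 10 board power and perfect SIMD scaling are my assumptions, not from the post):

```python
# Back-of-envelope: can doubled Navi 10 SIMDs fit in 300W with +50% perf/watt?
# All figures are illustrative assumptions, not official specs.
navi10_power = 225.0       # W - assumed Navi 10 (5700 XT) board power
perf_scale = 2.0           # doubling the SIMD count, assuming perfect scaling
perf_per_watt_gain = 1.5   # AMD's claimed +50% perf/watt for RDNA2

# Doubling performance at RDNA1 efficiency would need 2x the power;
# the RDNA2 efficiency gain divides that back down.
projected_power = navi10_power * perf_scale / perf_per_watt_gain
print(f"{projected_power:.0f} W")  # 300 W
```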

I felt the same way for years vs Intel. I saw Intel as potentially "wasting" all that die space by putting an IGP on every single CPU, even for discrete GPU users, for example.
Unless RDNA2 is significantly more area efficient than RDNA1, NV won't be "wasting" any die space in comparison despite having more h/w on it. So it's a bad comparison really.
 

Hadoken

Member
Oct 25, 2017
306
How much gain can we expect from a 3x8 pin AIB vs the FE? 10% at least?

The reviews show the 3080 FE at around 1950 MHz out of the box even with the power limit. So unless Ampere can go past 2000 MHz on air, I'm not sure it can get much higher due to the chip itself. AIB cards will still provide better cooling and lower noise levels though.
 

dgrdsv

Member
Oct 25, 2017
12,267
Afaik yes, he mentioned on Ass Creed that he actually upped the quality presets vs the last tests, where the 5700 XT lagged behind even the 2060s, and this time it beat both the 2070s and 2060s.
Changing the quality presets may produce a completely different picture of the end results. So this isn't the same benchmark anymore.

Outside of drivers, I don't know what explains the pretty much across-the-board increases in performance over time vs the 2060s and 2070s.
Again, if the benchmarking suite is different then there can be no attribution of performance changes to drivers.
Different set of games - or even the same games with different settings - will produce a different set of results which may or may not be comparable to the previous ones.

And AFAIR HWU's benchmarking is mostly DX12/VK these days - and those APIs are not very "friendly" to driver-level optimizations, which throws another wrench into this theory.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Changing the quality presets may produce a completely different picture of the end results. So this isn't the same benchmark anymore.


Again, if the benchmarking suite is different then there can be no attribution of performance changes to drivers.
Different set of games - or even the same games with different settings - will produce a different set of results which may or may not be comparable to the previous ones.

He only mentioned changing presets for Ass Creed, and even then it was to higher presets... The new games were Death Stranding and Horizon - big AAA releases that people care about the perf for. All the other games were previously tested.
 
Oct 29, 2017
3,561
Got an alert about Gigabyte cards being available on Newegg, went to the site, clicked "add to cart", and was met with "this item is no longer available" -_-
 

Bane

The Fallen
Oct 27, 2017
5,910
I got an ASUS Strix 1080 Ti a few years back and it's done well for me, so I was thinking of getting theirs again, but now I see it's $100 more than other brands. What are you paying for in this case? My understanding is other brands are just as good. I've not kept up with PC stuff since my last build in 2015, so I'd like a bit of help here.
 

Mórríoghain

Member
Nov 2, 2017
5,180
So about pin connectors...

I have one 8-pin and one 6+2 (the 2 is not connected) hooked up to my 2070. And there is another 8-pin connector doing fuck all in the case. Am I set for a 3080? Or do I need proper 8-pins?

This whole thing made me realize I don't know anything about psus.
 
Oct 25, 2017
192
Do we have a somewhat conclusive answer on whether the 10 GB of VRAM is enough for... let's say, the next 3 to 4 years?

I know this has been talked about to death all over, but I (a person not particularly knowledgeable about subjects such as graphics cards and VRAM) have gotten really lost in the weedsy back-and-forth of that whole discussion.
 

TheDutchSlayer

Did you find it? Cuez I didn't!
Member
Oct 26, 2017
7,117
The Hague, The Netherlands
All testing should be done with the Metro Exodus train ride and in Crysis 3 only.

Bonus rumor: https://videocardz.com/newz/nvidia-teases-geforce-rtx-3080-with-20gb-memory
Sadly, Crysis 3 would not run; Digital Foundry tried in a lot of ways but was not able to get the game running on the 3080 FE.
Do we have a somewhat conclusive answer on whether the 10 GB of VRAM is enough for... let's say, the next 3 to 4 years?

I know this has been talked about to death all over, but I (a person not particularly knowledgeable about subjects such as graphics cards and VRAM) have gotten really lost in the weedsy back-and-forth of that whole discussion.
Check out this thread, it's very good

www.resetera.com

VRAM in 2020-2024: Why 10GB is enough.

2021 Edit: Stop making threads or comments with false data. Please back up claims with this tool. IMPORTANT EDIT: MSI Afterburner now has a way to display "Per Process VRAM Commit", which I refer to in this article as "VRAM Usage" Please see...
 

Buggy Loop

Member
Oct 27, 2017
1,232
others have mentioned this but imo if amd truly had something at that level we would know by now, as they are about to lose a lot of potential customers tomorrow.

but we will see, obviously competition is needed rn in the gpu space

But right now it feels like a paper launch on Nvidia's side, honestly. Not sure the coming months will count that much number-wise. (I don't think AMD has something in the 3080 range either.)
 
Oct 27, 2017
3,731
I'm sure this has been asked before but is the review embargo for AIBs the same moment they're available for purchase? Do reviewers even have any of these cards?
 
Oct 28, 2017
2,816
So I am not planning to OC my card. Is the FE card my best bet? Does the boost in core clocks (in base AIB models) make any discernible difference, or only a minimal FPS gain?
 

closure

Alt account
Banned
Jul 21, 2020
15
Do we have a somewhat conclusive answer on whether the 10 GB of VRAM is enough for... let's say, the next 3 to 4 years?

I know this has been talked about to death all over, but I (a person not particularly knowledgeable about subjects such as graphics cards and VRAM) have gotten really lost in the weedsy back-and-forth of that whole discussion.


At this point I would assume that buying a 30xx card (excluding the 90) locks you into a 40xx purchase. If the rumors of a 20GB 3080 are true, then that'll be the only long-lasting card.
 

OmegaDL50

One Winged Slayer
Member
Oct 25, 2017
9,832
Philadelphia, PA
Please don't link MLID videos here, he has posted false and misleading information in the past.

Gamers Nexus has addressed the fact that memory allocation and memory usage are two completely different things. A game can allocate all of the VRAM you have available, and Windows Task Manager will show this as well, but actual memory usage is completely different from what is being allocated.
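The allocation-vs-usage distinction isn't VRAM-specific; it shows up with ordinary memory too. A minimal sketch (Python, using an anonymous mmap as a stand-in for a GPU allocator - an illustrative analogy, not how games actually manage VRAM):

```python
import mmap

# Reserving memory is not the same as using it: an anonymous mmap
# reserves 256 MiB of address space, but the OS typically commits
# physical pages lazily, only when they are actually touched.
buf = mmap.mmap(-1, 256 * 1024 * 1024)

# A task-manager-style "allocated" number reports the full reservation...
allocated = len(buf)   # 268435456 bytes

# ...but until pages are written, real usage is near zero.
buf[0:4] = b"used"     # touching a page is what actually consumes memory
print(allocated, buf[0:4])
```

Same idea as the monitoring-tool situation: a counter that reports the reservation will always look "full" long before the memory is genuinely needed.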
 
Last edited:

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Or don't make assumptions about drivers when you're changing your benchmarking suite with games which clearly favor one IHV over another. Just a thought.

Isn't Death Stranding an Nvidia-sponsored title? There is marked improvement in several Nvidia titles. It isn't just the inclusion of Horizon, as the 5700 XT doesn't have some huge gap there that creates a big outlier anyway. He always points out outliers as well. Hardware Unboxed noted drivers as a source of improvement over time. I'm sorry for posting a vid that shows solid 5700 XT performance and a 3080 unboxing, I guess...
 

Hasney

One Winged Slayer
The Fallen
Oct 25, 2017
19,366
www.techpowerup.com

MSI GeForce RTX 3080 Gaming X Trio Review

The GeForce RTX 3080 Gaming X Trio is MSI's flagship RTX 3080. It comes with a large overclock out of the box, and the cooler is massive. This is the fastest RTX 3080 we've tested so far, and it's the quietest as well. Wow, and that with a classic triple-fan design that doesn't use any fancy...


Looking at the actual performance gains, at 4K resolution in all our games, the MSI RTX 3080 Gaming X Trio is 4% faster than the Founders Edition—not a whole lot, and that's the highest gain out of all reviews today.

So the MSI is the best card they've tested so far, but it's not a wide range
 

xyla

Member
Oct 27, 2017
8,525
Germany
Anybody with a quick tip on the Ventus?
I could check out with one, but no idea if it's a good one - wanted the Gaming X Trio.
 

xyla

Member
Oct 27, 2017
8,525
Germany
If you're not OCing, it's usually good. Looks like the OC headroom isn't great on any card reviewed today anyway.

I would rather under- than overclock. Noise level is what's keeping me from clicking buy.

Edit: Bought it - we'll see. I can always send it back. The timing is the worst though; I'm going on a one-week vacation tomorrow...
 
Last edited:

Rente

Member
Oct 31, 2017
950
Cologne, Germany
The power limit makes it practically pointless to look for anything other than the quietest GPU; the range for overclocking is 2-5% max.
I think I will actually go for a 3090; watercooling is also nearly pointless on a 3080 because of this.

You may be able to get a little more by lowering the voltage via the curve editor, because the GPU needs less power that way, but Nvidia has probably optimized it to the edge this time (unlike with the 1080 and 2080).
 
Last edited:

Gashprex

Member
Oct 25, 2017
1,038
Gigabyte review - looks solid.



www.overclock3d.net

Gigabyte RTX 3080 Gaming OC 10G Review - OC3D

Introduction We’re sure that anyone with half an interest in the new Ampere GPU from Nvidia will be aware that the RTX 3080 is an absolute beast. Not only does it out-perform the RTX 2080 Ti that was the object of desire for the hardcore gamers over the last two years, but it does so […]

Excellent cooling has clear benefits in performance terms too. Nvidia GPU Boost has been a boon to anyone who just wants more performance without spending days fine-tuning their clock speeds. This graphics card relies largely upon thermal headroom to achieve even higher speeds, and the Gigabyte Gaming makes full use of that technology. Out of the box, it's hardly a sloth with a 1770 MHz rated boost speed, but as you can see from our graph back on page three, the Gigabyte card gives us the highest average speed of the four cards we've reviewed during this busy launch week, with an average boost clock of 1975 MHz, peaking at 2040. It's 4 MHz higher than our overclocked Founders Edition and 30 MHz faster than the next best card, the MSI Gaming X Trio.

That performance on paper is reflected in the results we saw throughout our testing. All of the RTX 3080s come with true, jaw-dropping levels of performance thanks to their increased core counts in almost every regard going along with the faster GDDR6X, but the Gigabyte is right there at the top fairly regularly. Considering this is by no means the most expensive card we've looked at, nor will it be the most expensive once the Strix/Lightning/Aorus Ultra's appear, yet the performance is gob-smacking.

With quality component choices allied to an excellent cooler and all based upon the blazing speed of the new Ampere GPU from Nvidia, the Gigabyte RTX 3080 Gaming OC 10G would be our choice of the three cards we have reviewed on launch day with an awesome combination of price, performance and cooling. It's a no-frills fit and forget card that will allow you to concentrate on what's important... Gaming.
 

Deleted member 1594

Account closed at user request
Banned
Oct 25, 2017
5,762

super-famicom

Avenger
Oct 26, 2017
25,606
www.youtube.com

Custom RTX 3080 cards are here!

With a new family of video cards comes the custom AIB options from the board partners! We kick off the custom 3080 cards with the EVGA RTX 3080 XC3 card feat...

Not impressed by this EVGA card.

Why? I've only heard of them recently. Are they often wrong about things?

MLID just takes info from tweets and other sources, then repeats it back to you while adding some of his own bullshit on top. He basically flings shit at the wall and hopes some of it sticks.