Razgriz417

Member
Oct 25, 2017
9,154
www.youtube.com

Custom RTX 3080 cards are here!

With a new family of video cards comes the custom AIB options from the board partners! We kick off the custom 3080 cards with the EVGA RTX 3080 XC3 card feat...

Not impressed by this EVGA card.

Why? I've only heard of them recently. Are they often wrong about things?
He's said in previous videos that he just makes up things he thinks are likely to happen. Dude pulls info from his ass.
 

dgrdsv

Member
Oct 25, 2017
12,267
Isn't Death Stranding an Nvidia-sponsored title?
It's in NV's devrel program, yes. NV doesn't "sponsor" titles. I haven't heard of even one case where NV has actually paid anything to a game dev/publisher.
Death Stranding is also running fine on AMD h/w.

It isn't just the inclusion of Horizon, as the 5700 XT doesn't have some huge gap there that creates a big outlier anyway.
The point remains: if you want to talk about driver improvements, DON'T change the benchmarking suite between benchmarks. If you do, then all your driver-related changes are essentially washed out and you can't draw any conclusions from them.
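To see why with a toy example (hypothetical FPS numbers, not taken from any review): swapping even one title moves the suite average on its own, so any real driver uplift is confounded with the composition change.

```python
# Toy illustration with made-up FPS numbers: swapping one title in the
# suite shifts the average even when the driver changes nothing.
old_suite = {"Game A": 100, "Game B": 80, "Game C": 60}
new_suite = {"Game A": 100, "Game B": 80, "Horizon": 140}  # one title swapped

def avg(suite):
    return sum(suite.values()) / len(suite)

print(f"old suite avg: {avg(old_suite):.1f} fps")   # 80.0
print(f"new suite avg: {avg(new_suite):.1f} fps")   # 106.7
# The apparent ~33% "gain" is purely a composition effect; a genuine
# driver uplift cannot be separated from it without a fixed suite.
```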
 

Frozen Viper

Member
Feb 7, 2019
279
So what's the consensus on AIB cards? I'm not a huge overclocker, just looking for something that runs well, OCs a little, and is nice and quiet. Is there a place where AIB reviews are consolidated?
 

PlayBee

One Winged Slayer
Member
Nov 8, 2017
5,728
So what's the consensus on AIB cards? I'm not a huge overclocker, just looking for something that runs well, OCs a little, and is nice and quiet. Is there a place where AIB reviews are consolidated?
videocardz.com

NVIDIA GeForce RTX 3080 CUSTOM Graphics Cards Review Roundup - VideoCardz.com

Reviews of CUSTOM NVIDIA GeForce RTX 3080 graphics card Yesterday NVIDIA lifted the embargo on GeForce RTX 3080 Founders Edition. Today the embargo on custom designs officially lifts. Starting from now you can also order the first Ampere-based graphics card. Custom NVIDIA GeForce RTX 3080...
 

taggen86

Member
Mar 3, 2018
464
Has anyone managed to get G-Sync working with a C9 + 3080 combo? My card arrives next week but I am worried it won't work.
 

Hasney

One Winged Slayer
The Fallen
Oct 25, 2017
19,365
Has anyone managed to get G-Sync working with a C9 + 3080 combo? My card arrives next week but I am worried it won't work.

It won't work at 4K/120; it may work at other resolutions and framerates, like current cards do. Going to be waiting on a firmware update or driver update to fix it. Hoping the latter, as it will be quicker.

Not sure if the CX exhibits the same behaviour or not. Don't believe it's been tested.
 

pswii60

Member
Oct 27, 2017
26,897
The Milky Way
It won't work at 4K/120; it may work at other resolutions and framerates, like current cards do. Going to be waiting on a firmware update or driver update to fix it. Hoping the latter, as it will be quicker.

Not sure if the CX exhibits the same behaviour or not. Don't believe it's been tested.
Oh fuck. Is that your personal experience, or is that mentioned in a review?

Currently no issues with 1440p/120Hz/G-Sync on my C9, but my main reason to buy this card was the HDMI 2.1. 120Hz is pretty useless without G-Sync to fall back on, though.
 

taggen86

Member
Mar 3, 2018
464
It won't work at 4K/120; it may work at other resolutions and framerates, like current cards do. Going to be waiting on a firmware update or driver update to fix it. Hoping the latter, as it will be quicker.

Not sure if the CX exhibits the same behaviour or not. Don't believe it's been tested.

Really doesn't make sense why it would work at 4K/60 but not 4K/120. Isn't it more likely it won't work at all, then? Also, isn't VRR part of HDMI 2.1?
 

Hasney

One Winged Slayer
The Fallen
Oct 25, 2017
19,365
Oh fuck. Is that your personal experience, or is that mentioned in a review?

Currently no issues with 1440p/120Hz/G-Sync on my C9, but my main reason to buy this card was the HDMI 2.1. 120Hz is pretty useless without G-Sync to fall back on, though.



Really doesn't make sense why it would work at 4K/60 but not 4K/120. Isn't it more likely it won't work at all, then? Also, isn't VRR part of HDMI 2.1?

It could easily make sense if it's just a bug. Either the drivers or the TV isn't expecting G-Sync over HDMI at 4K/120Hz. Either way, I have the card and a cable coming tomorrow, so I'll make sure to test it (assuming the cable actually works at the claimed bandwidth).
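For the bandwidth side of the question above, a rough back-of-the-envelope sketch (assuming the standard 4400x2250 total timing for 4K, 8-bit RGB, and TMDS-style 8b/10b overhead; HDMI 2.1's actual FRL signalling differs in the details):

```python
# Rough link-rate estimate for 4K over HDMI. Assumes CTA-style
# 4400x2250 total timing, 8-bit RGB (24 bpp), 8b/10b encoding overhead.
def link_gbps(h_total, v_total, refresh_hz, bpp=24, overhead=10 / 8):
    pixel_clock_hz = h_total * v_total * refresh_hz
    return pixel_clock_hz * bpp * overhead / 1e9

print(f"4K/60 : {link_gbps(4400, 2250, 60):5.1f} Gbps")   # ~17.8, fits HDMI 2.0's 18
print(f"4K/120: {link_gbps(4400, 2250, 120):5.1f} Gbps")  # ~35.6, needs HDMI 2.1's 48
# Which is why 4K/60 G-Sync can work on the old link while 4K/120
# exercises the new HDMI 2.1 path, where a fresh bug can hide.
```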
 

taggen86

Member
Mar 3, 2018
464
It could easily make sense if it's just a bug. Either the drivers or the TV isn't expecting G-Sync over HDMI at 4K/120Hz. Either way, I have the card and a cable coming tomorrow, so I'll make sure to test it (assuming the cable actually works at the claimed bandwidth).


Awesome. Please report your findings in this thread as soon as possible :)
 

pswii60

Member
Oct 27, 2017
26,897
The Milky Way
It could easily make sense if it's just a bug. Either the drivers or the TV isn't expecting G-Sync over HDMI at 4K/120Hz. Either way, I have the card and a cable coming tomorrow, so I'll make sure to test it (assuming the cable actually works at the claimed bandwidth).

Ah thanks, just after the 7 minute mark.

Also, I've got a fibre HDMI cable (due to the length) routed through the wall. No idea if it'll cope with the HDMI 2.1 throughput; I have a horrible feeling I'm going to have to replace it, and I'm not sure HDMI 2.1-rated fibre HDMI cables even exist yet...
 

Gitaroo

Member
Nov 3, 2017
8,303
It could easily make sense if it's just a bug. Either the drivers or the TV isn't expecting G-Sync over HDMI at 4K/120Hz. Either way, I have the card and a cable coming tomorrow, so I'll make sure to test it (assuming the cable actually works at the claimed bandwidth).

So no TV VRR? Gsync only?
 
Oct 25, 2017
41,368
Miami, FL
www.youtube.com

Custom RTX 3080 cards are here!

With a new family of video cards comes the custom AIB options from the board partners! We kick off the custom 3080 cards with the EVGA RTX 3080 XC3 card feat...

Not impressed by this EVGA card.

Why? I've only heard of them recently. Are they often wrong about things?
zero reason to base any opinions on a single review, which contains a single sample of the product. he may have simply had a mediocre chip, as many will.

I see no reason to believe these cards will perform any better or worse than anything else in this price range, varying based on the quality of silicon you win in the lottery.

wait for more reviews.
 
Oct 25, 2017
2,974
Crossposting with the buyers thread -
What's the consensus on AIB cards with regard to cooling/lower heat output and fan quietness, rather than overclocking? Undervolting?
Dimension spreadsheet

My case has side panel spacer accessories available now for $15 a pop. Based on what I can fit:

+0 spacers = FE OR XC3, I'm good to go already
+1 spacer = Asus TUF
+2 spacers = EVGA FTW3
+3 spacers = MSI Ventus 3X

Anything else is too long at over 315mm rather than too wide. Cards that are too big:
Gigabyte Eagle series
Gigabyte Aorus series
MSI Gaming Trio series
Galax cards
INNO3D cards
 

super-famicom

Avenger
Oct 26, 2017
25,606
Crossposting with the buyers thread -
What's the consensus on AIB cards with regard to cooling/lower heat output and fan quietness, rather than overclocking? Undervolting?
Dimension spreadsheet

My case has side panel spacer accessories available now for $15 a pop. Based on what I can fit:

+0 spacers = FE OR XC3, I'm good to go already
+1 spacer = Asus TUF
+2 spacers = EVGA FTW3
+3 spacers = MSI Ventus 3X

Anything else is too long at over 315mm rather than too wide. Cards that are too big:
Gigabyte Eagle series
Gigabyte Aorus series
MSI Gaming Trio series
Galax cards
INNO3D cards

From the few reviews I've seen so far, the TUF does a good job of staying cool. It has an alternate quiet BIOS which runs quieter at the expense of +4-5C. Of course, undervolting any card will help a lot and usually lowers temps anywhere from 4-7C in my experience. I'm curious about the Ventus (since I still have an Amazon pre-order up) but haven't seen any reviews for it.

Edit - based on that spreadsheet, you have an NZXT H1? This guy on Reddit has the 3080 Ventus in his H1 and claims that temps peak at 75C in a Unigine benchmark.

 

taggen86

Member
Mar 3, 2018
464
It won't work at 4K/120; it may work at other resolutions and framerates, like current cards do. Going to be waiting on a firmware update or driver update to fix it. Hoping the latter, as it will be quicker.

Looks like you were correct, based on the AVSForum thread. G-Sync at 4K/60 works but not at 120.
 

taggen86

Member
Mar 3, 2018
464
Ah, there we go then. I wonder what's at fault, the TV or the driver. Have you seen anything about the CX? If that's working, it's probably the TV software.

Exactly. For us C9 owners out there, I really hope it is not working on the CX either. That would increase the likelihood that the bug is on the NV side, and also that we get a firmware update if it is on the LG side.
 

SmartWaffles

Member
Nov 15, 2017
6,267
Best part of the TUF is that it ticks all the right boxes: it didn't waste anything on RGB or super-duper fancy cooling, and it's priced at MSRP.
 

Yibby

Member
Nov 10, 2017
1,812
The power limit makes it practically pointless to look for anything other than the quietest GPU; the range for overclocking is 2-5% max.
I think I will actually go for a 3090; watercooling is also nearly pointless on a 3080 because of this.

You may be able to do a little more by lowering the voltage via the curve editor, because the GPU needs less power that way, but Nvidia has probably optimized it to the edge this time (unlike with the 1080 and 2080).
That's what it looks like. Everything runs at the same level with almost no overclocking advantage. I guess that will be the same for the 3090; maybe it's even worse? Maybe the 3090's increased TDP is only there to reach 3080-level clock speeds. That would mean there is even less OC potential.
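For anyone who'd rather script the power-limit experiments than click through Afterburner, a minimal sketch using NVML through the pynvml bindings (assumes nvidia-ml-py is installed and you have admin rights; the 270 W target is purely illustrative):

```python
# Minimal power-limit sketch via NVML (pip install nvidia-ml-py).
# Needs admin/root; NVML clamps values to the board's allowed range.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Query the min/max power limits the board allows, in milliwatts.
lo, hi = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
print(f"allowed power limit range: {lo / 1000:.0f}-{hi / 1000:.0f} W")

# Illustrative target: cap a 320 W card at 270 W, then benchmark to see
# how small the fps loss is compared to the power (and noise) saved.
target_mw = min(max(270_000, lo), hi)
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)

pynvml.nvmlShutdown()
```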
 

xyla

Member
Oct 27, 2017
8,525
Germany
Are there any reviews on the Ventus yet?
Has it shipped for anyone? Can't find anything on this card when it comes to reviews.
 
Oct 25, 2017
41,368
Miami, FL
That's what it looks like. Everything runs at the same level with almost no overclocking advantage. I guess that will be the same for the 3090; maybe it's even worse? Maybe the 3090's increased TDP is only there to reach 3080-level clock speeds. That would mean there is even less OC potential.
Do keep in mind that "everything", as you put it, refers to like 4 cards out of about 30. Yes, all the cards in the same price range perform about the same. 🤣

We have yet to see a single review of the biggest, highest stock-clocked, top-end cards. No FTW3, Strix, Aorus, Hybrid, etc. We should see those reviews pop up in the next few days.

That said, if it turns out that all cards are within 5% of each other, then the correct play is indeed to get the quietest, coolest-running card whose looks you can live with. But we are many reviews away from definitively determining which cards those are. Let's try to speak less like we're at the end of the review process. It's just begun.
 

Xiaomi

Member
Oct 25, 2017
7,237
The TUF is so ugly I would really consider deshrouding it and slapping two Noctua NF-A12x25s on it. But that goes for several of the cards and their hideous shrouds.
 

Deleted member 9330

User requested account closure
Banned
Oct 26, 2017
6,990
If the lack of 4K120 G-Sync on the C9 is an LG issue, they're gonna take forever fixing it. They finally fixed the LPCM 5.1/7.1 thing this summer, so it makes me hopeful they won't just leave the C9 to die and update only the CX...
 

Hasney

One Winged Slayer
The Fallen
Oct 25, 2017
19,365
If the lack of 4K120 on the C9 is an LG issue, they're gonna take forever fixing it. They finally fixed the LPCM 5.1/7.1 thing this summer, so it makes me hopeful they won't just leave the C9 to die and update only the CX...

4K120 works out of the box; it's just adding G-Sync into the mix that makes it black-screen.
 

Mórríoghain

Member
Nov 2, 2017
5,180
Got my Palit GamingPro. I'm still waiting for the CPU and mobo as well, but I needed to pop it in and muck around a bit. Playing Control at 1440p, everything ultra, RTX on is... glorious.
 

TheNerdyOne

Member
Oct 28, 2017
521
Not sure where to post this, but it seems the "2nd gen RT cores" are doing exactly nothing differently than the first-gen RT cores, and the same goes for the tensor cores with regard to DLSS performance. The performance impact of enabling RT is exactly the same, percentage-wise, as it is on Turing, and the gains from DLSS are also roughly the same versus Turing. This isn't exactly great news for consumers hoping for a significant change in RT acceleration capabilities: the only reason RT is faster on Ampere is that the Ampere GPU is bigger and faster in general, not because there was any gain at all in RT performance per RT core or per clock. It also means that the AMD RDNA2 competition should hit even harder, seeing as Nvidia seems to have stood still on its RT cores for the last two years in terms of IPC. The ratio of RT cores per SM remains identical to Turing as well, as does the relative RT performance for any given amount of raster performance, evidenced by the percentage drop when enabling RT on both GPUs. They brute-forced it rather than actually making an architectural change, which leaves them about as wide open for a counter as going with Samsung 8nm did.
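To make the percentage-drop argument concrete with hypothetical numbers (illustrative only, not figures from any specific review):

```python
# Hypothetical fps figures illustrating the argument: if enabling RT
# costs the same relative hit on both architectures, per-core RT
# throughput relative to raster hasn't improved, only absolute speed.
turing = {"rt_off": 100.0, "rt_on": 60.0}
ampere = {"rt_off": 160.0, "rt_on": 96.0}

for name, fps in (("Turing", turing), ("Ampere", ampere)):
    drop = 1 - fps["rt_on"] / fps["rt_off"]
    print(f"{name}: enabling RT costs {drop:.0%}")  # 40% in both cases
# Ampere is faster with RT on, but only in proportion to its raster
# uplift, which is what "brute forcing it" means here.
```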
 

Calabi

Member
Oct 26, 2017
3,531
Not sure where to post this, but it seems the "2nd gen RT cores" are doing exactly nothing differently than the first-gen RT cores, and the same goes for the tensor cores with regard to DLSS performance. The performance impact of enabling RT is exactly the same, percentage-wise, as it is on Turing, and the gains from DLSS are also roughly the same versus Turing. This isn't exactly great news for consumers hoping for a significant change in RT acceleration capabilities: the only reason RT is faster on Ampere is that the Ampere GPU is bigger and faster in general, not because there was any gain at all in RT performance per RT core or per clock. It also means that the AMD RDNA2 competition should hit even harder, seeing as Nvidia seems to have stood still on its RT cores for the last two years in terms of IPC. The ratio of RT cores per SM remains identical to Turing as well, as does the relative RT performance for any given amount of raster performance, evidenced by the percentage drop when enabling RT on both GPUs. They brute-forced it rather than actually making an architectural change, which leaves them about as wide open for a counter as going with Samsung 8nm did.


Yeah I saw that review and it kind of made me feel better about holding off from getting a 3080 currently.
 

TheNerdyOne

Member
Oct 28, 2017
521
Yeah I saw that review and it kind of made me feel better about holding off from getting a 3080 currently.

Not like you can find one even if you wanted one at the moment, and by the time regular stock is available AMD will have launched, so it's the perfect time to wait and see what both the 3070 and RDNA2 look like. (The 3090 is out of reach for 99.9999999999999% of people, especially in a post-COVID world; $1500 on a GPU is... yikes. And the retail price of AIB cards will likely be $1800-2000+ for months or years anyway, if Turing was anything to go by, with basically zero cards actually at $1000 for like a year.) In any event, in the end the consumers win. Nvidia got scared of what AMD could be up to, so it pushed pricing down relative to the performance on offer. They're also scared, evidenced by the fact that they're willing to push 350W (with spikes over 400W) on what isn't even their highest-end part, while undervolting the card or easing up on the power limit sees a minor performance drop for a huge power-draw drop. Nvidia clearly felt they needed to completely ignore efficiency and push as hard as possible; that's a move of desperation and fear, not a company confident it will have the obviously faster product.
 

Grassy

Member
Oct 25, 2017
1,079
videocardz.com

NVIDIA GeForce RTX 3080 CUSTOM Graphics Cards Review Roundup - VideoCardz.com

Reviews of CUSTOM NVIDIA GeForce RTX 3080 graphics card Yesterday NVIDIA lifted the embargo on GeForce RTX 3080 Founders Edition. Today the embargo on custom designs officially lifts. Starting from now you can also order the first Ampere-based graphics card. Custom NVIDIA GeForce RTX 3080...

Awesome, thanks heaps for this. The MSI 3080 Gaming X looks like a fantastic card, and their cooling solution is again top-tier in terms of acoustics. Pretty crazy for being the fastest 3080 out there.


From the Guru3d review - https://www.guru3d.com/articles_pages/msi_geforce_rtx_3080_gaming_x_trio_review,9.html
 

dgrdsv

Member
Oct 25, 2017
12,267
Not sure where to post this, but it seems the "2nd gen RT cores" are doing exactly nothing differently than the first-gen RT cores, and the same goes for the tensor cores with regard to DLSS performance. The performance impact of enabling RT is exactly the same, percentage-wise, as it is on Turing, and the gains from DLSS are also roughly the same versus Turing. This isn't exactly great news for consumers hoping for a significant change in RT acceleration capabilities: the only reason RT is faster on Ampere is that the Ampere GPU is bigger and faster in general, not because there was any gain at all in RT performance per RT core or per clock. It also means that the AMD RDNA2 competition should hit even harder, seeing as Nvidia seems to have stood still on its RT cores for the last two years in terms of IPC. The ratio of RT cores per SM remains identical to Turing as well, as does the relative RT performance for any given amount of raster performance, evidenced by the percentage drop when enabling RT on both GPUs. They brute-forced it rather than actually making an architectural change, which leaves them about as wide open for a counter as going with Samsung 8nm did.

As I've already said about this in another thread: this video is completely wrong in its conclusions.

Hybrid RT renderers were never limited by the RT h/w on Turing, and they of course won't be limited by the much faster RT h/w on Ampere.
There are plenty of results which showcase Ampere's RT h/w improvements, it's just that they aren't games like BFV. Something like Blender is a much better fit for such testing, since even Q2RTX was aimed at Turing's RT h/w capabilities, meaning it likely won't get many benefits from faster BVH h/w.

DLSS on Ampere runs similarly because Ampere's TCs are cut down in number compared to Turing and GA100 - the throughput is similar unless there is a possibility to apply the sparsity feature.
But DLSS can be run in parallel with shading now, which is impossible on Turing - and games have to be patched for that, which means you can't arrive at these conclusions by running Turing tensor code on Ampere.
So again, a false statement on HWU's side here.

stood still on its RT cores for the last two years in terms of IPC
Yeah, no, I'm out.
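For reference, the tensor-core point above in numbers, using per-SM figures as commonly cited for TU102 vs GA102 (treat these as reported specs, not my own measurements):

```python
# Per-SM FP16 tensor throughput, FMA ops per clock, as commonly cited:
# Turing has more but narrower tensor cores, Ampere fewer but wider.
turing_tcs, turing_fma_per_tc = 8, 64
ampere_tcs, ampere_fma_per_tc = 4, 128

turing_dense = turing_tcs * turing_fma_per_tc      # 512
ampere_dense = ampere_tcs * ampere_fma_per_tc      # 512 -> dense parity
ampere_sparse = ampere_dense * 2                   # 1024 with 2:4 sparsity

print(turing_dense, ampere_dense, ampere_sparse)
# Dense workloads like current DLSS land at per-SM, per-clock parity,
# so DLSS gains track overall GPU scaling unless sparsity is used.
```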
 

JudgmentJay

Member
Nov 14, 2017
5,301
Texas
Not like you can find one even if you wanted one at the moment, and by the time regular stock is available AMD will have launched, so it's the perfect time to wait and see what both the 3070 and RDNA2 look like. (The 3090 is out of reach for 99.9999999999999% of people, especially in a post-COVID world; $1500 on a GPU is... yikes. And the retail price of AIB cards will likely be $1800-2000+ for months or years anyway, if Turing was anything to go by, with basically zero cards actually at $1000 for like a year.) In any event, in the end the consumers win. Nvidia got scared of what AMD could be up to, so it pushed pricing down relative to the performance on offer. They're also scared, evidenced by the fact that they're willing to push 350W (with spikes over 400W) on what isn't even their highest-end part, while undervolting the card or easing up on the power limit sees a minor performance drop for a huge power-draw drop. Nvidia clearly felt they needed to completely ignore efficiency and push as hard as possible; that's a move of desperation and fear, not a company confident it will have the obviously faster product.

Aren't you the guy who was claiming that console RT performance would be 4x that of a 2080Ti a few months ago?
 

Darktalon

Member
Oct 27, 2017
3,301
Kansas
Not like you can find one even if you wanted one at the moment, and by the time regular stock is available AMD will have launched, so it's the perfect time to wait and see what both the 3070 and RDNA2 look like. (The 3090 is out of reach for 99.9999999999999% of people, especially in a post-COVID world; $1500 on a GPU is... yikes. And the retail price of AIB cards will likely be $1800-2000+ for months or years anyway, if Turing was anything to go by, with basically zero cards actually at $1000 for like a year.) In any event, in the end the consumers win. Nvidia got scared of what AMD could be up to, so it pushed pricing down relative to the performance on offer. They're also scared, evidenced by the fact that they're willing to push 350W (with spikes over 400W) on what isn't even their highest-end part, while undervolting the card or easing up on the power limit sees a minor performance drop for a huge power-draw drop. Nvidia clearly felt they needed to completely ignore efficiency and push as hard as possible; that's a move of desperation and fear, not a company confident it will have the obviously faster product.
Ignorance, trolling, or both.

Edit2: sorry
 