Status
Not open for further replies.

Atolm

Member
Oct 25, 2017
5,868
I'm guessing Nvidia isn't prioritising that due to the sorry state of HDR on PC monitors. HDR support on PC will be hamstrung until we get nice OLED monitors.

I swear I've been hearing about OLEDs on PC for years now. That ship just isn't coming in, I think. If anything, maybe micro-LED in 5-10 years.
 

Deleted member 49611

Nov 14, 2018
5,052
the only game i really want to upgrade for is Cyberpunk 2077 so i need the 3080 Ti. that game will make even the most powerful PC struggle.

i'm gonna need all the performance Nvidia can give if i want to play at 1440p 80-140fps, RTX on, and settings as high as possible. honestly i don't think the 3080 Ti will be enough, but if it's the best i can buy then i guess that's what i'll have to settle for.
 

StreamedHams

Member
Nov 21, 2017
4,359
Can't believe I never actually played Crysis. I was console-only when it hit 360/PS3 in 2011, and critics complained about the performance of those ports, so I skipped it. Now I'm ready to finally play it via the remaster on PC :)
From a sandbox perspective, it's cool. Gameplay and story are pretty meh.

It's the tech that kept me coming back for years just trying to see everything maxed out.
 

captainmal01

Member
Oct 28, 2017
1,341
Still waiting for 2080 Ti performance at sub-£300. Hopefully the series after Ampere will give me that.
 

Serious Sam

Banned
Oct 27, 2017
4,354
Anyone else feel a bit underwhelmed by this new lineup when you consider that you'll most likely be able to get both a PS5 and an XSX for the price of an RTX 3080 (Ti)?
 

Xx 720

Member
Nov 3, 2017
3,920
That 3080 Ti, with DLSS 2.0, full ray tracing, HDR etc. enabled, should be able to approximate Pixar-level visuals. Going to be a stunner.
 

zeomax

Member
Oct 28, 2017
188
I swear I've been hearing about OLEDs on PC for years now. That ship just isn't coming in, I think. If anything, maybe micro-LED in 5-10 years.
Samsung is going to stop production of LCD panels by the end of 2020 and in 2021 will only produce QLED (real QLED, with self-illuminating pixels). Maybe next year or in 2022 we will have QLED monitors.
 

Praetorpwj

Member
Nov 21, 2017
4,401
Knowing very little as I do (and assuming rumoured specs to be correct) which card would be recommended to drive my 1440 ultra wide utilising RT?
I'm coming from a 980Ti.
 

Serious Sam

Banned
Oct 27, 2017
4,354
not really? considering it will trounce both consoles

i mean, yeah, price will most likely suck but power wise, doubt the ti will underwhelm
Yeah sure, consoles will run games at 4K with ray tracing, and the 3080 Ti will run games at 4K with ray tracing (at a higher framerate). Big whoop! Buying a GPU this fall sounds like burning money, but that's just me.
 

Jimrpg

Member
Oct 26, 2017
3,280
Dang, my 1080 Ti is 11.3 TF apparently. So even if I go 3080 it will be a huge jump.
That 26 TF 3080 Ti should at least do 1440p/60/RTX full without even sweating.

Well, ray tracing performance depends on the dedicated RT cores (with the tensor cores handling DLSS), and we didn't really get much information on those for the RTX 20-series cards, so we don't know how much they've improved.

It also depends how much ray tracing the devs want to program in. We saw them struggle with it when the 20-series cards launched, so it seems like there's not enough power. The improvement in the 30-series cards will help, but they're probably not going to make too big a leap because it might render the older cards useless. We saw Battlefield V's framerate tank from 100+ fps to 60 when ray tracing was turned on, so ray tracing seems to be the bottleneck.

If this even turns out to be true, that is. It seems too good. Watch it turn out to be more like 20 TFLOPS in the end.

Yeah, I was really surprised seeing these rumoured specs (maybe that won't actually be the 3080 Ti, because it's the GA100 chip and not the GA102 chip, as a few people have said), but it's looking like the 3080 Ti is 20+ TFLOPS, which is beastly.

This 3080Ti has double the number of CUDA cores as the 2080Ti.

I have a feeling it's going to be a smash hit, even though it's the start of the gen and there really aren't many games that will use 26Tflops.
 

Jimrpg

Member
Oct 26, 2017
3,280
3600X will bottleneck the 3080Ti ? Should I sell my 3600X and pick the 3700X ?

Depends how you define bottleneck. Is going from 120fps to 115fps a bottleneck? That's probably the real-world difference when you have a 3080 Ti. It's kind of a non-issue imo, and you won't see it be a problem until games demand all 8 cores of the 3700X, and you can upgrade then.
 

Drelkag

Member
Oct 25, 2017
527
Liking what I see of these cards. Will probably get a 3070 if budget permits, you never know in these times.
 

Spark

Member
Dec 6, 2017
2,625
Yeah sure, consoles will run games at 4K with ray tracing, and the 3080 Ti will run games at 4K with ray tracing (at a higher framerate). Big whoop! Buying a GPU this fall sounds like burning money, but that's just me.
That's pretty much how it is now? Minus the ray tracing, but PC ray tracing will no doubt be higher quality than on the new consoles. Games will look and run better. The difference is that lots of high-end gamers have 100Hz+ monitors, and if they want to play games with next-generation visuals while still getting the high framerates they're accustomed to, they'll have to pay a premium. All things considered, those people likely won't be disappointed seeing Halo Infinite or whatever other next-gen game running smoothly at better-than-next-gen visual fidelity. Compared to PC games now, it'll be a world of difference.
 

Serious Sam

Banned
Oct 27, 2017
4,354
i think you may be in the wrong thread then buddy
How so? This is a discussion forum where counter-points are permitted, and this isn't an Nvidia-worship site, last I checked. I've owned high-end gaming PCs and all consoles for the last two decades. I'm sure hardcore PC gamers will line up to buy Nvidia's overpriced cards day 1 like they always do, but I can't wait to see how the average price-conscious gamer reacts to the console propositions this fall. There are so many people who aren't emotionally invested in PC, who can easily lean towards consoles, and who just want the best bang-for-buck gaming.
 

mutantmagnet

Member
Oct 28, 2017
12,401
Nvidia is kind of gambling on everyone being happy with ray tracing. With the specs as they are there are big improvements in ray tracing capabilities but everything else got small improvements that wouldn't make an upgrade worth it.

I expected the ray tracing to improve but I thought at the very least rasterization improvements would match the jump from 10 series to 20 series but that might not be happening if these specs end up being mostly true.
 

Kuosi

Member
Oct 30, 2017
2,377
Finland
Anyone else feel a bit underwhelmed by this new lineup when you consider that you'll most likely be able to get both a PS5 and an XSX for the price of an RTX 3080 (Ti)?
That's to each their own. In my case, I can tax-deduct PC hardware, and having a PC removes the need for an Xbox from the equation; I vastly prefer to play all games on PC if possible, thanks to everything PC brings to the table. If you are just looking for the console experience and aren't interested in PC gaming perks like higher refresh rates, mods and other customization, then yeah, grab an Xbox + PS. That just isn't the story for everyone.
 

GhostofWar

Member
Apr 5, 2019
512
Nvidia is kind of gambling on everyone being happy with ray tracing. With the specs as they are there are big improvements in ray tracing capabilities but everything else got small improvements that wouldn't make an upgrade worth it.

I expected the ray tracing to improve but I thought at the very least rasterization improvements would match the jump from 10 series to 20 series but that might not be happening if these specs end up being mostly true.

The Ti is packing some raster improvements, but the price on that is gonna be rough. I agree with your point for the 60/70/80 though: they are pushing bigger jumps in RT performance, and I'm guessing they're hoping to use DLSS to take the framerate crown in both raster and RT.
 

Sanctuary

Member
Oct 27, 2017
14,481
I expected the ray tracing to improve but I thought at the very least rasterization improvements would match the jump from 10 series to 20 series but that might not be happening if these specs end up being mostly true.

I'm confused. Are you saying the rasterization looks better than expected, or worse to you? The 3080 looks like it will at the very least match the 2080 Ti. The rasterization jump from the 10 series to the 20 series was abysmal.
 

Fall Damage

Member
Oct 31, 2017
2,134
How so? This is a discussion forum where counter-points are permitted, and this isn't an Nvidia-worship site, last I checked. I've owned high-end gaming PCs and all consoles for the last two decades. I'm sure hardcore PC gamers will line up to buy Nvidia's overpriced cards day 1 like they always do, but I can't wait to see how the average price-conscious gamer reacts to the console propositions this fall. There are so many people who aren't emotionally invested in PC, who can easily lean towards consoles, and who just want the best bang-for-buck gaming.

Being a price-conscious gamer myself I try not to get too hung up on the high end stuff. It exists because there is a market for it in the same way there is a market for 100k cars. The majority will be going with the xx60 as usual.
 

starblue

Member
Oct 28, 2017
1,754
Depends how you define bottleneck. Is going from 120fps to 115fps a bottleneck? That's probably the real-world difference when you have a 3080 Ti. It's kind of a non-issue imo, and you won't see it be a problem until games demand all 8 cores of the 3700X, and you can upgrade then.

A bottleneck, for me, is when the game struggles and the FPS stops being stable. For example, with an i5 that only has 4 cores, some games start to stutter because the CPU can't keep up with the GPU. That's my big concern. Nowadays 6 cores (12 threads) is more than enough, but who knows in 2 years; I'm afraid 6 cores won't be enough, especially for higher framerates.

I'm not an expert on hardware, so maybe I'm wrong. That's why I'm asking :) thank you
 

Tovarisc

Member
Oct 25, 2017
24,628
FIN
In terms of Turing, how long was it from the leaks (that were right) to release^^?

If I remember right, the first leaks that ended up being most accurate in hindsight came out a month or less before the keynote, so about two months before release.

A good estimate for Ampere's release window is keynote month + 2 months, which would put it around July.
 

mutantmagnet

Member
Oct 28, 2017
12,401
That's what I thought you were saying, and that doesn't make any sense. It looks like it's going to have a way bigger improvement than 10 to 20 did, and is more in line with the 30%+ jump we typically see.

I'll have to retract my previous statement. I looked more closely at the numbers, and it's only the 60 card that gets shafted here. All the others actually see a reasonable improvement, even if not as big as the 80 Ti's spec jump.
 

Deleted member 49611

Nov 14, 2018
5,052
Anyone else feel a bit underwhelmed by this new lineup when you consider that you'll most likely be able to get both a PS5 and an XSX for the price of an RTX 3080 (Ti)?
maybe but they will still be weaker than a 3080 Ti. so no. plus i use my GPU for more than gaming. so that extra cost is totally worth it. a PS5/XSX isn't going to help accelerate programs for work i do on PC.
 

pswii60

Member
Oct 27, 2017
26,894
The Milky Way
Anyone else feel a bit underwhelmed by this new lineup when you consider that you'll most likely be able to get both a PS5 and an XSX for the price of an RTX 3080 (Ti)?
Well you can get a PS4 Pro, Xbox One X and a Switch for the price of a 2080Ti. It's not really the point.

It's like saying how many Fiats you could get for the price of one Ferrari.
 

orava

Alt Account
Banned
Jun 10, 2019
1,316
My quick calcs, going off the increased CUDA core counts and clock speeds:

RTX 2060 6.5 Tflops
RTX 2070 7.5Tflops
RTX 2080 10 Tflops
RTX 2080 Ti 13.4Tflops

RTX 3060 10TFlops
RTX 3070 13TFlops
RTX 3080 17Tflops
RTX 3080Ti 26Tflops

Even if you had a 20 series card - there's good reason to upgrade. Which was not the case going from the 10 series to 20 series.

So the XSX GPU is comparable to 2070-2080 speeds, which would mean that even the 3060 can match it. Also, Nvidia has been quite conservative with reported boost clocks compared to actual. If you could overclock that 3080 Ti to close to 2 GHz, it would break the 30 TFLOPS barrier.
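For reference, these quick calcs follow the standard peak-FP32 formula: TFLOPS = CUDA cores × 2 FMA ops per clock × boost clock. A minimal sketch, using Nvidia's official 20-series reference boost clocks; the 30-series core counts and clocks are still rumours, so the 26 TFLOPS line is only a what-if:

```python
def fp32_tflops(cuda_cores: int, boost_mhz: int) -> float:
    """Peak FP32 throughput: cores x 2 ops/clock (FMA) x clock (MHz) -> TFLOPS."""
    return cuda_cores * 2 * boost_mhz / 1_000_000

# Official 20-series Founders Edition specs: (CUDA cores, reference boost MHz)
turing = {
    "RTX 2060":    (1920, 1680),
    "RTX 2070":    (2304, 1620),
    "RTX 2080":    (2944, 1710),
    "RTX 2080 Ti": (4352, 1545),
}

for name, (cores, mhz) in turing.items():
    print(f"{name}: {fp32_tflops(cores, mhz):.1f} TFLOPS")

# What-if for the rumoured big Ampere part: ~8192 cores at ~1.6 GHz
# (hypothetical numbers from the leak, not confirmed specs)
print(f"Rumoured 3080 Ti: {fp32_tflops(8192, 1600):.1f} TFLOPS")
```

This also shows why overclocking matters so much to the headline number: at a hypothetical 8192 cores, every extra 100 MHz adds about 1.6 TFLOPS, so ~1.85 GHz would already clear 30.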
 

dgrdsv

Member
Oct 25, 2017
12,260
I'll have to retract my previous statement. I looked more closely at the numbers, and it's only the 60 card that gets shafted here. All the others actually see a reasonable improvement, even if not as big as the 80 Ti's spec jump.
The 3060 in this leak will be around +50% in that "rasterization" of yours and will hit 2080 levels of performance. How is this "shafted" exactly?
 

Jimrpg

Member
Oct 26, 2017
3,280
A bottleneck, for me, is when the game struggles and the FPS stops being stable. For example, with an i5 that only has 4 cores, some games start to stutter because the CPU can't keep up with the GPU. That's my big concern. Nowadays 6 cores (12 threads) is more than enough, but who knows in 2 years; I'm afraid 6 cores won't be enough, especially for higher framerates.

I'm not an expert on hardware, so maybe I'm wrong. That's why I'm asking :) thank you

Yeah, it feels a bit early imo to upgrade, and it'd be a kinda small upgrade to go from a 6-core CPU to an 8-core one, such as the 3600X to the 3700X.

The next-gen consoles have 8-core CPUs clocked at around 3.5 GHz, which is a huge improvement over the current console CPUs. Seems like it'd be better to wait and see what the new games are like and how CPU-hungry they are.

There's the 3950X with 16 cores/32 threads right now. In a couple of years that's probably going to be the new Ryzen 7, and that'd be a big jump from a 6-core CPU.

So the XSX GPU is comparable to 2070-2080 speeds, which would mean that even the 3060 can match it. Also, Nvidia has been quite conservative with reported boost clocks compared to actual. If you could overclock that 3080 Ti to close to 2 GHz, it would break the 30 TFLOPS barrier.

Yeah, the XSX will be around the 2080 level, which is pretty crazy, and might be even more so depending on whether it's $499-599. Man, I hope they do $399 just to make Sony nervous.

I don't know if Nvidia will allow users to boost that high. Starting with the 10-series cards they've been really limiting the power allowed, so we weren't seeing users overclock their cards by 25-30% like with the Maxwell 900-series cards and prior.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
Is anyone really believing these rumors? I believe the GA100 chip is correct, but that will be an HPC GPU with 48 GB of HBM2 memory (see here: https://www.notebookcheck.net/NVIDI...-GA102-40-up-on-the-RTX-2080-Ti.456402.0.html ), not a 3080 Ti (that would be stupidly expensive); have you ever heard of a consumer GPU with 48 GB of HBM2 VRAM? That's obviously the successor to the V100. I believe this configuration is far more likely:

GA102 - 84 SMs / 5376 CUDA cores / 12GB GDDR6 / 384-bit bus - 40% faster than RTX 2080 Ti - RTX 3080 TI
GA103 - 60 SMs / 3840 CUDA cores / 10GB GDDR6 / 320-bit bus - 10% faster than RTX 2080 Ti - RTX 3080
GA104 - 48 SMs / 3072 CUDA cores / 8GB GDDR6 / 256-bit bus - 5% slower than RTX 2080 Ti -RTX 3070

Read more: https://www.tweaktown.com/news/7215...ecs-teases-an-absolute-monster-gpu/index.html

Rumors like these will just create unrealistic expectations and disappointment... For what it's worth, we will definitely see the GA100 chip at the GTC presentation in two weeks.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Is anyone really believing these rumors? I believe the GA100 chip is correct, but that will be an HPC GPU with 48 GB of HBM2 memory (see here: https://www.notebookcheck.net/NVIDI...-GA102-40-up-on-the-RTX-2080-Ti.456402.0.html ), not a 3080 Ti (that would be stupidly expensive); have you ever heard of a consumer GPU with 48 GB of HBM2 VRAM? That's obviously the successor to the V100. I believe this configuration is far more likely:

GA102 - 84 SMs / 5376 CUDA cores / 12GB GDDR6 / 384-bit bus - 40% faster than RTX 2080 Ti - RTX 3080 TI
GA103 - 60 SMs / 3840 CUDA cores / 10GB GDDR6 / 320-bit bus - 10% faster than RTX 2080 Ti - RTX 3080
GA104 - 48 SMs / 3072 CUDA cores / 8GB GDDR6 / 256-bit bus - 5% slower than RTX 2080 Ti -RTX 3070

Read more: https://www.tweaktown.com/news/7215...ecs-teases-an-absolute-monster-gpu/index.html

Rumors like these will just create unrealistic expectations and disappointment... For what it's worth, we will definitely see the GA100 chip at the GTC presentation in two weeks.
The original source for the image doesn't even believe it; they called it a rumor sourced from elsewhere.
 