
Deleted member 40102

User requested account closure
Banned
Feb 19, 2018
3,420
It's been two years since a really good upgrade at a 'reasonable' price (the 1080 Ti); since then all we've gotten is one ~1 TFLOP upgrade at almost double the price they used to be... I have been wondering since then: are we reaching a dead end of compute speed at this size, or is Nvidia just getting greedy? Do you think there is still room for large upgrades in the future?
 

LCGeek

Member
Oct 28, 2017
5,856
Why should they when they have 0 competition?

That, and profits.

Nvidia stopped being the company I liked ages ago, when they started nerfing their cards so that OC gains are minute compared to what they could be.

Doesn't excuse them, though. GPU and 3D acceleration makers have always been shit; even 3dfx wasn't all that great.
 

ss_lemonade

Member
Oct 27, 2017
6,646
Looking at benchmarks, I always thought the 2080 Ti was in line with what to expect when moving from a 980 Ti -> 1080 Ti -> 2080 Ti. The price jump though was insane lol
 

Deleted member 1635

User requested account closure
Banned
Oct 25, 2017
6,800
They learned that they could get away with outrageous pricing, especially starting with the 1080 generation.
 

7thFloor

Member
Oct 27, 2017
6,625
U.S.
First-gen RTX was definitely overpriced, but keep in mind they did improve RT-based render times by something like threefold.
 

Jimrpg

Member
Oct 26, 2017
3,280
It's been two years since a really good upgrade at a 'reasonable' price (the 1080 Ti); since then all we've gotten is one ~1 TFLOP upgrade at almost double the price they used to be... I have been wondering since then: are we reaching a dead end of compute speed at this size, or is Nvidia just getting greedy? Do you think there is still room for large upgrades in the future?

They've ruined the market at this point with their RTX technology (not interested) and by making twice as many cards as before to fill in the gaps. Have they clamped down on overclocking as well? My 1070 literally could not overclock at all (it may just be a bad one), but it seems like they've reduced overclocking headroom recently. I was easily able to get 30% more out of my 970.
 

capitalCORN

Banned
Oct 26, 2017
10,436
AMD is only competing in the lower and mid range. And within that range they offer much better products. That price range is not what the OP is discussing though.
True, but I wouldn't put it past Nvidia to obfuscate the market with weird shit like the 1660 Ti at the high end as well.
 

Flandy

Community Resettler
Member
Oct 25, 2017
3,445
Looking at benchmarks, I always thought the 2080 Ti was in line with what to expect when moving from a 980 Ti -> 1080 Ti -> 2080 Ti. The price jump though was insane lol
I disagree. Looking at UserBenchmark, there was a smaller jump between the 1080 Ti and 2080 Ti than between the 780 Ti -> 980 Ti or the 980 Ti -> 1080 Ti.
A massive price jump and an even smaller jump in performance than usual is pretty pathetic.
 

eathdemon

Member
Oct 27, 2017
9,621
Rumors say mid 2020; either way, I will be making my choice then: a new PC, assuming 7nm second-gen RTX cards, or a console. More likely I will be sticking with PC, though.
 

ItsTheShoes

Attempting to circumvent ban with an alt
Banned
Oct 27, 2017
334
If anyone wants to trade my 1080 for a 2080 straight up, please send me a DM.
 

Arkaign

Member
Nov 25, 2017
1,991
Nvidia decided that emerging tech generating huge profits, specifically hardware-accelerated AI/deep learning, was where to put the new transistor budget this gen. They had to dress it up a bit to make it more appealing to gamers, with obviously mixed-to-disappointing results thus far.

Each gen, with time for engineers to improve efficiency and new process tech to increase the number of transistors per SKU, you can get a certain predictable increase in raw performance, although you can swing high or low depending on other factors. Say you have to go with a large die and aggressive clocks one gen to maintain competitive footing. Then a process shrink comes along and you have less pressure from the competition. You can choose to do a more direct die shrink without a huge increase in transistor count: boom, you have a more modest performance leap but a cooler, less power-hungry lineup, and more dies per wafer. Alternatively, you can go big in the other direction, which makes things more expensive, lowers yields, and is tougher to power and cool, but maximizes potential performance.
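
To put rough numbers on the dies-per-wafer tradeoff, here's a back-of-the-envelope sketch using the standard dies-per-wafer approximation (the wafer size and die areas are illustrative assumptions, not actual Nvidia figures):

Code:
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # First-order approximation: usable wafer area divided by die area,
    # minus an edge-loss term; ignores scribe lines and defect yield.
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

# Illustrative 300 mm wafer: a big die vs. the same design after a shrink.
print(dies_per_wafer(300, 750))  # ~69 candidate dies at ~750 mm^2
print(dies_per_wafer(300, 300))  # ~197 candidate dies at ~300 mm^2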

A hypothetical 2060/2070/2080/2080 Ti with all transistors put towards traditional architecture would have meant a very large uplift in performance, but something like 40% of the die is tensor/RT stuff. They ARE expensive for what you get, especially considering that the RTX portions are useless in most consumer scenarios, but they actually are pretty huge in die size and transistor count. Pascals were pretty small and efficient at each level. Turing dies are enormous by comparison, just with almost all of the additional transistor budget spent on stuff that doesn't get used all that much.

It's actually kind of impressive that it wasn't more of a disaster than it was, but that's the basics of why we saw what we did with 10xx to 20xx.

The good news is that the next gen should see back to normal increases from new process tech and improvements in memory speed and architecture tuning, assuming an equal % of resources used for tensor and RT portions.

Alternatively, a hypothetical 30xx series that fully abandoned RT/tensor would see a monumental uplift in performance, but that has roughly a 0% chance of happening.

Personally, I feel that it was too early to dedicate die space to tensor/RT stuff. Even my 2080 Ti is only meh at RT applications, so it feels like a bit of a waste. But the gen that started implementation was always going to be the roughest one, so at least that's out of the way.

With 5 or 6nm EUV in 2020 and beyond, we should see a substantial improvement in performance across the board. Whether that means raytracing, or especially DLSS, amounts to anything worthwhile remains to be seen, however.
 

EatChildren

Wonder from Down Under
Member
Oct 27, 2017
7,029
I feel like for years and years people have talked up fabled enormous gains in GPU revisions that simply don't exist. With NVIDIA I feel it's a combination of being the market leader and thus price gouging whenever they can get away with it, and also warped consumer expectations romanticising performance leaps at affordable prices that probably aren't practical or realistic.

Rendering technology is also in a weird spot where market demands shape GPU focus. 4K is extremely demanding and this isn't going to change for a long time, yet there is a push to market cards for that segment. Hence why NVIDIA and console manufacturers are pushing the whole "4K upscaling" thing via checkerboarding and DLSS. Or, in short, not actually rendering games natively at 4K, but building technology that impressively upscales to create the illusion of 4K.
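
For a sense of scale, here's some quick pixel-count arithmetic (the 1440p internal resolution is just an illustrative assumption; checkerboarding shades roughly half the pixels per frame and reconstructs the rest):

Code:
native_4k = 3840 * 2160        # 8,294,400 pixels shaded per frame at native 4K
checkerboard = native_4k // 2  # ~4,147,200 shaded; the other half is reconstructed
internal_1440p = 2560 * 1440   # 3,686,400 shaded, then upscaled (DLSS-style)
print(native_4k, checkerboard, internal_1440p)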

We're also entering an era of emergent technology like ray tracing. I can understand people not wanting NVIDIA to focus on RT cores, but it's legitimately cutting edge and the future of the industry, so it's a bit of a tradeoff there. But again a lot of people don't seem to understand the performance cost, and just assume having RT cores equates to flipping a switch and RT games performing well. Instead we have a situation where this emergent tech is incredibly impressive, but still comes at a massive performance cost, and people feel cheated despite the reality of what is going on in the silicon.

That being said, 7nm Ampere will be the real test for how much wiggle room NVIDIA have.

Personally, I feel that it was too early to dedicate die space to tensor/RT stuff. Even my 2080 Ti is only meh at RT applications, so it feels like a bit of a waste. But the gen that started implementation was always going to be the roughest one, so at least that's out of the way.

It's probably not technically feasible, but a secondary set of cards committed 100% to tensor/RT stuff would be lovely. Similar to what NVIDIA tried (and failed) to do with PhysX.
 

leng jai

Member
Nov 2, 2017
15,116
There's literally nothing worth upgrading to from a 1070 that comes close to being reasonable. RTX tech is the excuse/reason for the massive price hike this generation, so if you're just looking for pure performance boosts you're not getting all that much. Obviously they have to start somewhere, and the first generation is almost never good, but I'm not going to pay the early adopter tax; it makes no sense financially.

To be honest, I thought my 1070 was overpriced, but the 2070 is even worse. The 2080 Ti is literally over $2,000 here; it's absurd. Hopefully whatever they release in 2020 is a lot better.
 

Paz

Member
Nov 1, 2017
2,148
Brisbane, Australia
I do miss the old days of significant GPU speed increases at the same price you paid 12-24 months prior. The 20XX series cards in Australia launched at something like 2x the price the 10XX series cards launched at. Paying $2,200 for the high-end video card at launch is considered normal now, I guess?
 

EVA UNIT 01

Member
Oct 27, 2017
6,729
CA
Like the 970, the 1080 Ti was a legend.
I'm on a 1080 now and will probably go for a 2080 Super, cause fuck me, right?
 
OP

Deleted member 40102

User requested account closure
Banned
Feb 19, 2018
3,420
I disagree. Looking at UserBenchmark, there was a smaller jump between the 1080 Ti and 2080 Ti than between the 780 Ti -> 980 Ti or the 980 Ti -> 1080 Ti.
A massive price jump and an even smaller jump in performance than usual is pretty pathetic.
Yeah, that's what I thought: a pretty small jump and a massive price spike.
 
OP

Deleted member 40102

User requested account closure
Banned
Feb 19, 2018
3,420
There's literally nothing worth upgrading to from a 1070 that comes close to being reasonable. RTX tech is the excuse/reason for the massive price hike this generation, so if you're just looking for pure performance boosts you're not getting all that much. Obviously they have to start somewhere, and the first generation is almost never good, but I'm not going to pay the early adopter tax; it makes no sense financially.

To be honest, I thought my 1070 was overpriced, but the 2070 is even worse. The 2080 Ti is literally over $2,000 here; it's absurd. Hopefully whatever they release in 2020 is a lot better.
Honestly, the old 1080 Ti was worth all the money. It used to be $650 or something?
Now it's $1,000+ if you're lucky.
 

Duxxy3

Member
Oct 27, 2017
21,671
USA
From $500 on up they don't have competition.

Also, it's insane that $500 cards are barely considered high end now.

edit: My higher-end cards are a 980 Ti and an RTX 2070, and I wish I hadn't bought the 2070 because the difference wasn't worth the cost.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
The 5700 XT and 2070 Super basically being 1080 Ti-level cards for $400 and $500 respectively isn't terrible, I guess, but given that they are mid-sized dies they should both be more like $300. I guess $400 for the 2070 Super would be OK given the RTX stuff.
 

leng jai

Member
Nov 2, 2017
15,116
From $500 on up they don't have competition.

Also, it's insane that $500 cards are barely considered high end now.

edit: My higher-end cards are a 980 Ti and an RTX 2070, and I wish I hadn't bought the 2070 because the difference wasn't worth the cost.

I'm looking at a 2070 Super to replace my 1070, which is a very similar jump to 980 Ti -> 2070, and it's just not worth it.
 

Smash-It Stan

Member
Oct 25, 2017
5,261
I upgraded from a 970 to a 1070 this year; it didn't really feel like a gigantic upgrade considering the half decade I'd had the 970 for, especially considering the price.
 

RestEerie

Banned
Aug 20, 2018
13,618
There were actual upgrades... just not the upgrade you want, OP.

Gotta be objective there, OP. Regardless of your stance on raytracing (and the Nvidia tax), it's not like Nvidia just boosted the clock, added more VRAM, and slapped a new cooler on the 1080 Ti and called it a new product. There was actual, objective technical progress. Again, whether these upgrades are being used in a mainstream manner is an entirely different debate.
 

Pottuvoi

Member
Oct 28, 2017
3,062
Turing brought some really good features for rasterization; sadly, they need quite a bit of code to get working.
Mesh shaders are something that I hope will completely replace the traditional vertex etc. pipelines.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Given that I only have a 1080p monitor (and I'm not buying a 9 ft HDMI cable for my 4K TV), I'm waiting on an affordable card that gives me 1.5 to 2 times the performance of my 970 and offers raytracing. I'm really hoping Ampere hits that mark.
 

z0m3le

Member
Oct 25, 2017
5,418
It's been two years since a really good upgrade at a 'reasonable' price (the 1080 Ti); since then all we've gotten is one ~1 TFLOP upgrade at almost double the price they used to be... I have been wondering since then: are we reaching a dead end of compute speed at this size, or is Nvidia just getting greedy? Do you think there is still room for large upgrades in the future?
The GTX 1080 Ti has no tensor or RT cores. Those take up room, and the die didn't physically shrink: they have been using 16nm and 12nm (TSMC's 12nm process node is physically still 16nm-class). 7nm+ from Samsung will supposedly be used in the first half of next year for Nvidia's Ampere architecture; it would be the biggest process node jump Nvidia has ever undergone.

Considering the RTX 2080 Ti was over 700 mm^2 and 7nm+ can shrink a die to just ~40% of that area, Nvidia is possibly going to blow us away with a huge jump next year.
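
As a rough check on that claim (taking ~750 mm^2 as the ballpark for the 2080 Ti die and the ~40% area figure above at face value; both are ballpark numbers, not confirmed specs):

Code:
turing_die_mm2 = 750  # RTX 2080 Ti (TU102) is a bit over 700 mm^2
area_scaling = 0.40   # the ~40% area claim for the 7nm-class node
shrunk = turing_die_mm2 * area_scaling
print(f"Same design after the shrink: ~{shrunk:.0f} mm^2")  # ~300 mm^2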
 

Finaika

Member
Dec 11, 2017
13,272
I disagree. Looking at UserBenchmark, there was a smaller jump between the 1080 Ti and 2080 Ti than between the 780 Ti -> 980 Ti or the 980 Ti -> 1080 Ti.
A massive price jump and an even smaller jump in performance than usual is pretty pathetic.
But ray tracing tho.
 

Keyouta

The Wise Ones
Member
Oct 25, 2017
4,193
Canada
I'm sticking with my 980 Ti for a few more years until I have a good reason to upgrade. It still runs games amazingly well at 1440p.