
Trace

Member
Oct 25, 2017
4,683
Canada
Waiting on upgrading from my 980ti, next year should be the year to do it.

Consoles will be pushing RayTracing harder and next year will be the first year for a second-gen RayTracing card from Nvidia which should hopefully offer solid performance improvements on that front.
 

Zed

Member
Oct 28, 2017
2,544
So glad I got a 1070 near launch. It still kicks ass for just about any game at 1080p.
 

pswii60

Member
Oct 27, 2017
26,646
The Milky Way
It's been 2 years since a really good upgrade at a 'reasonable' price (1080 Ti). Since then all we get is a 1tf upgrade at almost double the price of what they used to be... I have been wondering since then: are we reaching a dead end of compute speed at this size, or is Nvidia just getting greedy? Do you think there is still room for large upgrades in the future?
11.3tf to 13.4tf is 2.1tf.

And the price per TF always shoots up when you reach the upper limits (just look at Titan), not to mention the RTX stuff adding cost.
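To put those numbers in perspective, here's a rough price-per-TFLOP sketch (the TFLOP figures are from the post above; the prices are purely hypothetical placeholders, not actual launch prices):

```python
# Rough price-per-TFLOP comparison between two hypothetical cards.
# TFLOP figures come from the post above; the prices are made-up
# placeholders for illustration only, NOT real launch prices.
cards = {
    "last-gen flagship": {"tflops": 11.3, "price_usd": 700},   # assumed price
    "new flagship":      {"tflops": 13.4, "price_usd": 1200},  # assumed price
}

for name, c in cards.items():
    per_tf = c["price_usd"] / c["tflops"]
    print(f"{name}: {c['tflops']} TF at ${c['price_usd']} -> ${per_tf:.0f}/TF")

# With these placeholder numbers the newer card costs roughly 45% more per
# TFLOP despite only adding 2.1 TF of raw compute - the "crazy cost at the
# upper limits" the post describes.
```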
 
Oct 25, 2017
4,839
This is one of the worst GPU generations ever. The monopoly holder does not improve price/performance in any way and the other company is not even trying to compete and the emerging competition in 2020 won't be trying to compete either. The only hope we have is that Nvidia won't overprice their stuff too hard next year but with no competition there's no chance.

Anyone who bought a 1080 Ti when it launched hit the jackpot: they had the highest-end card for a year and still have the highest-end reasonably priced card two years later.
 
Oct 27, 2017
594
What do you really want from an upgrade though? I upgrade fairly frequently and I'm alright with the 20/30% increases.
Is it that you're expecting to go from 60fps low to 144fps max settings?
We go through this every generation, "next year Nvidia will blow us away". It never happens which is fine.
 

EatChildren

Wonder from Down Under
Member
Oct 27, 2017
7,026
The 1080 Ti really is a beast. I toyed with the idea of grabbing an RTX, but there's no reason to. Not while knowing NVIDIA's next batch will be better yet again, with improved ray tracing.

Gaming at 1440p with a 144Hz GSYNC monitor, the 1080 Ti has been easily one of the best PC purchases ever. They're made for each other.
 

leng jai

Member
Nov 2, 2017
15,114
What do you really want from an upgrade though? I upgrade fairly frequently and I'm alright with the 20/30% increases.
Is it that you're expecting to go from 60fps low to 144fps max settings?
We go through this every generation, "next year Nvidia will blow us away". It never happens which is fine.

So to get a 20-30% upgrade over my 1070 I have to get a 2060 Super, which is at least $600AUD here. In what world does that make any sense? Why even bother.
 
Oct 25, 2017
4,839
What do you really want from an upgrade though? I upgrade fairly frequently and I'm alright with the 20/30% increases.
Is it that you're expecting to go from 60fps low to 144fps max settings?
We go through this every generation, "next year Nvidia will blow us away". It never happens which is fine.
What people want from an upgrade is an actual upgrade. The RTX generation provided no price-to-performance improvements at all. The 1080 Ti equivalent is an RTX 2080 which is the same price!

So your post is also wrong: there is no "20/30% increase", it's a 0% increase.
 

leng jai

Member
Nov 2, 2017
15,114
If I'm upgrading my video card I want at least a 65-75% boost in performance for the same price I paid, say, 3 years ago. Right now it's barely 50%.

Who goes through the expense and hassle of changing their video card for 20% performance?
 

Finaika

Member
Dec 11, 2017
13,250
What people want from an upgrade is an actual upgrade. The RTX generation provided no price-to-performance improvements at all. The 1080 Ti equivalent is an RTX 2080 which is the same price!

So your post is also wrong: there is no "20/30% increase", it's a 0% increase.
1080Ti has more VRAM than the RTX2080 too.
 

Darkstorne

Member
Oct 26, 2017
6,805
England
A hypothetical 2060/2070/2080/2080ti with all transistors put towards traditional architecture would have meant a very large uplift in performance, but something like 40% of the die is tensor/RT stuff. They ARE expensive for what you get, especially considering that the RTX portions are useless in most consumer scenarios, but they actually are pretty huge die size and transistor counts.

It's actually kind of impressive that it wasn't more of a disaster than it was, but that's the basics of why we saw what we did with 10xx to 20xx.

The good news is that the next gen should see back to normal increases from new process tech and improvements in memory speed and architecture tuning, assuming an equal % of resources used for tensor and RT portions.
This guy gets it ^

Up until the 20xx series they had been able to focus 100% of die space on traditional performance. With the 20xx series, and the decision to include tensor cores for RTX, they only had 60% of the card space for traditional performance, hence a much smaller uplift in performance.

30xx series should see us going back to expected performance gains again though, where a 3070 more or less matches a 2080ti, provided that 60/40 split remains the same for RTX. We could also be seeing some pretty big leaps for RTX performance though, given it's such a new technology with so much room to grow.

The only scenario where we see a similarly weak performance upgrade with the 30xx series is if tensor cores for RTX go from a 40% share of the cards to over 50%. And while that day is likely to come eventually, I'm not sure it's worth doing yet.
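A back-of-the-envelope sketch of that die-split argument (assuming, as a big simplification, that raster performance scales linearly with the share of the die devoted to it; the 40% figure is the one claimed above, which a later post disputes):

```python
# Toy model of the die-split argument: compare the actual card (~40% of the
# transistor budget spent on tensor/RT) against a hypothetical version of the
# same die with everything spent on traditional raster hardware.
# Assumes raster performance scales linearly with die share - a deliberate
# oversimplification used only to illustrate the trade-off.

rt_share = 0.40                      # tensor/RT share claimed in the post above
raster_share_actual = 1 - rt_share   # 0.60 of the die doing traditional work

hypothetical_gain = 1.0 / raster_share_actual
print(f"All-raster version of the same die: ~{hypothetical_gain:.2f}x the raster budget")
# -> ~1.67x: the "very large uplift" the post imagines, forgone in exchange
#    for the RT/tensor hardware.
```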
 

Polk

Avenger
Oct 26, 2017
4,203
The 1080 Ti, like the 8800 GT before it, was a "stupid" move from Nvidia's point of view. Sure, they dominated the market, but it was too good a card for the price to make upgrading viable for most people.
 

Tokyo_Funk

Banned
Dec 10, 2018
10,053
Went from a 1080Ti to a 2080Ti OC 11Gb and to me RayTracing has been phenomenal. I can't play Control, Metro or Battlefield without it. That, and the creative drivers for the 3D modelling/texturing programs I use, have improved render times. The holy grail of rendering has been reached and only a few appreciate it.
 
Oct 27, 2017
594
So to get a 20-30% upgrade over my 1070 I have to get a 2060 Super, which is at least $600AUD here. In what world does that make any sense? Why even bother.
I mean I just upgraded my 1080 to a 2070 super, also living in Australia. So I'm not the one to ask. I agree that the pricing is stuffed, especially down here.
I think there is more to an upgrade than just raw performance. If that was all these cards offered I probably wouldn't have upgraded this time around.
 

Lakeside

Member
Oct 25, 2017
9,209
A hypothetical 2060/2070/2080/2080ti with all transistors put towards traditional architecture would have meant a very large uplift in performance, but something like 40% of the die is tensor/RT stuff. They ARE expensive for what you get, especially considering that the RTX portions are useless in most consumer scenarios, but they actually are pretty huge die size and transistor counts. Pascals were pretty small and efficient at each level. Turing dies are enormous by comparison, just with almost all of the additional die shrink transistor budget spent on stuff that doesn't get used all that much.

Would love to see your source that backs up the 40% claim on tensor and RTX. I know it was initially thought to be up to 30%, but high-resolution images were put out there several months ago that debunked this. Some analyses of these images put it in the 8-10% range.

 

Dark1x

Digital Foundry
Verified
Oct 26, 2017
3,530
You can really tell who's been PC gaming for a long time vs those who have not.

The Turing cards are a significant upgrade in so many ways but, like GeForce 3, this is just the beginning.
 

JahIthBer

Member
Jan 27, 2018
10,371
Rumours say Ampere should be a good leap & a price cut on the high end models, but as long as AMD doesn't compete, who knows.
 

Bricktop

Attempted to circumvent ban with an alt account
Banned
Oct 27, 2017
2,847
They've ruined the market at this point with their RTX (not interested) technology and by making twice as many cards as before to try to fill in gaps. Have they clamped down on overclocking as well? My 1070 literally could not overclock at all (may just be a bad one), but it seems like they've reduced overclocking capability recently. I was able to easily get 30% more out of my 970.

They didn't ruin anything. They are going to drag you guys along to the future with the rest of us, and ray tracing is that future. Many of us have been waiting on it for decades and the 20 series was the first step. It had to start somewhere, and now even the next consoles are fully committed, which just makes it that much more of a reality.

People can complain about the performance gains, they can whine about the focus on RT cores, but it had to happen sooner or later and Nvidia was the only one in a secure enough position to take the leap.

In 5-10 years when ray tracing is the standard, all of these complaint posts about it are going to age like milk. They are small-minded and short-sighted. Big deal, they went one gen without the massive gains everyone is used to, and btw, they also introduced consumer-level tech that will change the way games are rendered forever.
 

Hasney

One Winged Slayer
The Fallen
Oct 25, 2017
18,548
Nvidia decided that emerging tech that is giving huge profits, specifically hardware accelerated AI/deep learning, was where to put the new budget for adding transistors this gen. They had to dress it up a bit to make it more appealing to gamers, with obviously mixed to disappointing results thus far.

Each gen, with time for engineers to improve efficiency and new process tech to increase the number of transistors per SKU, you can get a certain predictable increase in raw performance, although you can swing high or low depending on other factors. Say you have to go with a large die and aggressive clocks one gen to maintain competitive footing. Then a process shrink comes along and you have less pressure from the competition. You can choose to do a more direct die shrink without a huge increase in transistor count. Boom, you have a more modest performance leap but a cooler/less power-hungry lineup, and more dies per wafer. Alternatively you can go big in the other direction, which makes things more expensive, lowers yields, and is tougher to power and cool, but maximizes potential performance.

A hypothetical 2060/2070/2080/2080ti with all transistors put towards traditional architecture would have meant a very large uplift in performance, but something like 40% of the die is tensor/RT stuff. They ARE expensive for what you get, especially considering that the RTX portions are useless in most consumer scenarios, but they actually are pretty huge die size and transistor counts. Pascals were pretty small and efficient at each level. Turing dies are enormous by comparison, just with almost all of the additional die shrink transistor budget spent on stuff that doesn't get used all that much.

It's actually kind of impressive that it wasn't more of a disaster than it was, but that's the basics of why we saw what we did with 10xx to 20xx.

The good news is that the next gen should see back to normal increases from new process tech and improvements in memory speed and architecture tuning, assuming an equal % of resources used for tensor and RT portions.

Alternatively, a hypothetical 30xx series that also fully abandoned RT/Tensor would see monumental uplift in performance, but that is probably at roughly 0% chance of happening.

Personally, I feel that it was too early to dedicate die space to tensor RT stuff. Even my 2080ti is only meh at RT applications, so it feels like a bit of a waste. But, the gen that started implementation would always be the roughest one, so at least that's out of the way.

5 or 6nm EUV and 2020+ we should see a substantial improvement in performance across the board. Whether or not that means that raytracing or especially DLSS amount to anything worthwhile remains to be seen however.

Yup, agreed with most of this. While the 2000 series was not going to be the upgrade for me, I'm glad they did it to get Raytracing out there. Those tensor cores are not cheap and I don't think just performance numbers are good enough in this instance.

Roll on next year though
 

ss_lemonade

Member
Oct 27, 2017
6,641
You can really tell who's been PC gaming for a long time vs those who have not.

The Turing cards are a significant upgrade in so many ways but, like GeForce 3, this is just the beginning.
I started late. Our first real PC was a prebuilt one with a geforce 4 MX, which we swapped out for a 4 Ti 4200 and that felt like a generational leap lol
 

Jimrpg

Member
Oct 26, 2017
3,280
They didn't ruin anything. They are going to drag you guys along to the future with the rest of us, and ray tracing is that future. Many of us have been waiting on it for decades and the 20 series was the first step. It had to start somewhere, and now even the next consoles are fully committed, which just makes it that much more of a reality.

People can complain about the performance gains, they can whine about the focus on RT cores, but it had to happen sooner or later and Nvidia was the only one in a secure enough position to take the leap.

In 5-10 years when ray tracing is the standard, all of these complaint posts about it are going to age like milk. They are small-minded and short-sighted. Big deal, they went one gen without the massive gains everyone is used to, and btw, they also introduced consumer-level tech that will change the way games are rendered forever.

Exactly - it's the future, not now. It's been a year; how many titles are RTX enabled? And with a substantial improvement, not just a puddle on the ground? Who wants to spend 25% more on a card for the same performance people got last gen?

This was a price hike with RTX that barely offered anything for people who invested for a year. Maybe 5% of people wanted RTX; the other 95% just wanted better value.

Also, I hear RTX performance will be substantially improved with the next set of cards - wouldn't it have been better if they had started then?
 

Hasney

One Winged Slayer
The Fallen
Oct 25, 2017
18,548
Also, I hear RTX performance will be substantially improved with the next set of cards - wouldn't it have been better if they had started then?

No, because you have to start somewhere, and another generation of R&D costs would have just been added to this next generation, while the software would be starting from square one as well.

The 2000 series was a needed one, even if it was a skip year if that makes sense.
 

Bricktop

Attempted to circumvent ban with an alt account
Banned
Oct 27, 2017
2,847
Exactly - it's the future, not now. It's been a year; how many titles are RTX enabled? And with a substantial improvement, not just a puddle on the ground? Who wants to spend 25% more on a card for the same performance people got last gen?

This was a price hike with RTX that barely offered anything for people who invested for a year. Maybe 5% of people wanted RTX; the other 95% just wanted better value.

Also, I hear RTX performance will be substantially improved with the next set of cards - wouldn't it have been better if they had started then?

The first step in any new technology is painful. You don't just wake up one day and have perfectly running self-driving cars - just ask Uber. You can't get to B without A. The 20 series RTX cards were A in terms of the tech reaching consumers. The future doesn't just happen; otherwise why would we do anything now when we could just wait until a perfect version of what we want exists later? Performance will be better 3 gens from now than it is today - why not just wait and start then?

Luckily for you, the 95% of the people who just wanted better value will also be able to enjoy the fruits of labor that the other 5% of us were willing to take a chance on. You're welcome.
 

capitalCORN

Banned
Oct 26, 2017
10,436
Even with 3d acceleration we had duds like the Virge. People need to chill out. Hell, Nvidia's original TNT was basically a glorified GL Quake accelerator.
 

Foxashel

Banned
Jul 18, 2019
710
I have a GTX 1080 (non Ti) and I play on a 4k TV. I rarely have my resolution set to 4k, or I scale down. I would love to upgrade, but the price to performance gains right now just don't seem to make much sense. I should have gotten a Ti, but they weren't out yet. I want to die.
 
OP

Deleted member 40102

User requested account closure
Banned
Feb 19, 2018
3,420
You can really tell who's been PC gaming for a long time vs those who have not.

The Turing cards are a significant upgrade in so many ways but, like GeForce 3, this is just the beginning.
You got me, I've only been on PC for 2 years.

But I was always watching graphics card prices, and I stand by what I said: the RTX upgrade doesn't hold up well with the huge price increase, especially if you're someone who couldn't give two shits about RTX.
 
OP

Deleted member 40102

User requested account closure
Banned
Feb 19, 2018
3,420
Rumours say Ampere should be a good leap & a price cut on the high end models, but as long as AMD doesn't compete, who knows.
Oh, you bet Nvidia sees the newborn next gen as big competition. I have a feeling they're gonna do everything in their power to have the best price possible for high-end RTX, especially for 2020, then probably they will get back to their old habits when next-gen console sales slow down.
 

icecold1983

Banned
Nov 3, 2017
4,243
No competition. We need someone to keep Nvidia in check. The 5700 series invalidates all the RTX GPUs below the 2080 Ti; we just need a high-end model to finish the job.
 

DonMigs85

Banned
Oct 28, 2017
2,770
Pray that AMD, and I guess Intel soon, really give them a run for their money. But 7nm Ampere GPUs should have a healthy boost over Turing. Then we wait for 5nm.
 

BigTnaples

Member
Oct 30, 2017
1,752
Well, Control (and other games) with all RTX features on looks a generational leap better than on any card or console without RT. So I'm more than happy.
 

Dark1x

Digital Foundry
Verified
Oct 26, 2017
3,530
You got me, I've only been on PC for 2 years.

But I was always watching graphics card prices, and I stand by what I said: the RTX upgrade doesn't hold up well with the huge price increase, especially if you're someone who couldn't give two shits about RTX.
Well, here's the thing.

1) RT is the future of rendering. It has to start somewhere. It was never going to be a convenient time but, personally, I think the end of a console generation is THE BEST time for it as most games aren't so demanding as to require that much more power than a 1080ti can deliver. There are a few exceptions but we're good right now - next-gen will bring new, more demanding games where larger leaps are more necessary.

2) This ties into my comment - PC hardware in the last decade has become all about performance boosts but little else. It's just about increasing what we already have. That also means that newcomers to the world of PC just sort of feel 'that's how it should be' when, in reality, it wasn't like this in the past. Each new paradigm shift in graphics requires sacrifice. As I said, programmable shaders were a key feature of GeForce 3 and it was critical to the development of graphics - but the GF3 wasn't a super fast card and few games launched with those features during its heyday. It needed to happen, however. Same deal with hardware T&L on GeForce 256 and many many other cards from different companies. Important features were added that were critical to the future of graphics but you didn't always get massive performance boosts. That's what is happening here.

Basically, what I'm saying is that you need to look beyond simple performance boosts and consider the big picture. Is it expensive? Yes but the answer to that is - wait. This was an important leap and it needed to happen. This was the absolute best time for it too.

This is why the knee-jerk reactions from people do bother me - that attitude hurts the advance of graphics for a boost that we don't REALLY need at this exact moment. I understand why people want that typical upgrade cycle but this is better for the future of graphics and next-generation games.

So, if it's too expensive and you're disappointed - again, just wait. Pick up a used 1080ti if you haven't already and be happy - it can run all games just fine at high resolutions.

No competition. We need someone to keep Nvidia in check. The 5700 series invalidates all the RTX GPUs below the 2080 Ti; we just need a high-end model to finish the job.
Ha, what? The 5700 series most certainly does not.
 
Nov 8, 2017
13,077
Ha, what? The 5700 series most certainly does not.

I assume he's operating under the logic that the RTX features aren't worthwhile, so on a purely raster-performance-per-dollar level the 5700 series is superior to the RTX series while being "close enough" to the 2070s / 2080, but not the 2080ti, which is clearly a performance tier higher.

I don't agree with that assessment but it isn't an uncommon sentiment I see in internet tech circles right now.
 

Iztok

Member
Oct 27, 2017
6,130
The 1080 Ti was a performance outlier; that's why you feel like this.

It's still an amazing card today.

I'm on a 980Ti and still can't justify an upgrade. I'm hoping 3080Ti changes that.
 

sweetmini

Member
Jun 12, 2019
3,921
What do you really want from an upgrade though? I upgrade fairly frequently and I'm alright with the 20/30% increases.
Is it that you're expecting to go from 60fps low to 144fps max settings?
We go through this every generation, "next year Nvidia will blow us away". It never happens which is fine.

I guess in the back of people's minds are the leaps that were the local bus, the 3dfx Voodoo Graphics, or hardware T&L (the biggest leaps I can remember). RTX is somewhat on that magnitude of change, but it would only be visible if games came with mandatory raytraced effects (like some games were barely playable if you didn't have a local bus or a 3dfx, or had to use CPU-based transform and lighting).
 
Oct 25, 2017
4,839
Same deal with hardware T&L on GeForce 256 and many many other cards from different companies. Important features were added that were critical to the future of graphics but you didn't always get massive performance boosts. That's what is happening here.
I actually remember that I couldn't play a Spider-Man movie tie-in game because it required a T&L graphics card and the game would not start without one. Meanwhile my big brother's PC did have one so I had to play it whenever he was away.

It makes me wonder if we'll see RT Only games in a few years. But we need AMD to release their RT cards first.
 
OP

Deleted member 40102

User requested account closure
Banned
Feb 19, 2018
3,420
Well, here's the thing.

1) RT is the future of rendering. It has to start somewhere. It was never going to be a convenient time but, personally, I think the end of a console generation is THE BEST time for it as most games aren't so demanding as to require that much more power than a 1080ti can deliver. There are a few exceptions but we're good right now - next-gen will bring new, more demanding games where larger leaps are more necessary.

2) This ties into my comment - PC hardware in the last decade has become all about performance boosts but little else. It's just about increasing what we already have. That also means that newcomers to the world of PC just sort of feel 'that's how it should be' when, in reality, it wasn't like this in the past. Each new paradigm shift in graphics requires sacrifice. As I said, programmable shaders were a key feature of GeForce 3 and it was critical to the development of graphics - but the GF3 wasn't a super fast card and few games launched with those features during its heyday. It needed to happen, however. Same deal with hardware T&L on GeForce 256 and many many other cards from different companies. Important features were added that were critical to the future of graphics but you didn't always get massive performance boosts. That's what is happening here.

Basically, what I'm saying is that you need to look beyond simple performance boosts and consider the big picture. Is it expensive? Yes but the answer to that is - wait. This was an important leap and it needed to happen. This was the absolute best time for it too.

This is why the knee-jerk reactions from people do bother me - that attitude hurts the advance of graphics for a boost that we don't REALLY need at this exact moment. I understand why people want that typical upgrade cycle but this is better for the future of graphics and next-generation games.

So, if it's too expensive and you're disappointed - again, just wait. Pick up a used 1080ti if you haven't already and be happy - it can run all games just fine at high resolutions.
So would you say the 2020 cards are the ones that are going to be the bigger leap?
 

BigTnaples

Member
Oct 30, 2017
1,752
Well, here's the thing.

1) RT is the future of rendering. It has to start somewhere. It was never going to be a convenient time but, personally, I think the end of a console generation is THE BEST time for it as most games aren't so demanding as to require that much more power than a 1080ti can deliver. There are a few exceptions but we're good right now - next-gen will bring new, more demanding games where larger leaps are more necessary.

2) This ties into my comment - PC hardware in the last decade has become all about performance boosts but little else. It's just about increasing what we already have. That also means that newcomers to the world of PC just sort of feel 'that's how it should be' when, in reality, it wasn't like this in the past. Each new paradigm shift in graphics requires sacrifice. As I said, programmable shaders were a key feature of GeForce 3 and it was critical to the development of graphics - but the GF3 wasn't a super fast card and few games launched with those features during its heyday. It needed to happen, however. Same deal with hardware T&L on GeForce 256 and many many other cards from different companies. Important features were added that were critical to the future of graphics but you didn't always get massive performance boosts. That's what is happening here.

Basically, what I'm saying is that you need to look beyond simple performance boosts and consider the big picture. Is it expensive? Yes but the answer to that is - wait. This was an important leap and it needed to happen. This was the absolute best time for it too.

This is why the knee-jerk reactions from people do bother me - that attitude hurts the advance of graphics for a boost that we don't REALLY need at this exact moment. I understand why people want that typical upgrade cycle but this is better for the future of graphics and next-generation games.

So, if it's too expensive and you're disappointed - again, just wait. Pick up a used 1080ti if you haven't already and be happy - it can run all games just fine at high resolutions.


Ha, what? The 5700 series most certainly does not.


This. So much.
 

Lazlow

Member
Oct 27, 2017
1,137
I'm sitting on a 970 and it holds up really well, but I only play at 1080p. Considered getting a 1070 but I'm also rocking a 2600K, so not sure it's really worth it; might just see this setup out until a complete rebuild.
 

DonMigs85

Banned
Oct 28, 2017
2,770
I'm sitting on a 970 and it holds up really well, but I only play at 1080p. Considered getting a 1070 but I'm also rocking a 2600K, so not sure it's really worth it; might just see this setup out until a complete rebuild.
Yeah I would probably wait for Ampere next year or AMD's next cards, and maybe either a Ryzen 4XXX or Intel's next architecture and socket
 

Dark-VIII

Member
Oct 27, 2017
166
Planning to replace my 1080 with a new card next year with the release of the 7nm cards; hopefully the upgrade will be worth it.
 
Oct 29, 2017
13,470
The price increase was absurd, but the performance increase was not worse than 5XX to 6XX and 6XX to 7XX series. 7XX to 9XX and 9XX to 10XX were great though.