
Hace

Member
Sep 21, 2018
894
Since when has new tech ever launched with day 1 games? Were you around when the initial DX8 bump mapping and stuff hit with the GeForce 3? It took literally years but was a huge improvement. What about when the initial DX9 stuff hit with the Radeon 9700? Or full programmability with the 8800 GTX? Or the modern implementation of tessellation with the GTX 480?
It's hard enough to convince devs to adopt new technologies quickly; expecting them at launch just isn't reasonable and has never happened historically. You can't compare a launch that introduces new tech to something like Pascal, which was basically a refresh of Maxwell with no actual new tech to speak of (and Maxwell itself was really just a performance/efficiency fix for Kepler...).


Fermi had problems with actual support, yeah, but it was also an improvement simply as a video card, and didn't just stagnate the product line in a totally insane way.
 

Exentryk

Member
Oct 25, 2017
3,236
Some performance info on Control from Remedy

9.2 ms additional render time @ 1080p on a 2080 Ti. I'd say 60 fps is extremely unlikely given this info.
Nice to have some numbers. This is one of the games I'm planning on buying, and the gameplay vid of it looked like real life at times. So the 2080 Ti might be able to do 40+ fps at 1080p with ray tracing on? Hopefully they also use DLSS so we can at least get to 1440p. With a G-Sync monitor (or HDMI 2.1 if it releases by then), 40+ fps might be good enough.
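Rough frame-time math behind that estimate, assuming the quoted 9.2 ms is a flat per-frame cost added on top of whatever the game renders without RT (the base frame rates below are hypothetical examples, not measured Control numbers):

```python
# Frame-budget sketch: treat the 9.2 ms RT figure as a fixed additive cost.
# The base frame rates are hypothetical, not measured Control numbers.

RT_OVERHEAD_MS = 9.2  # additional render time @ 1080p on a 2080 Ti (from the quote above)

def fps_with_rt(base_fps: float) -> float:
    """Resulting fps once the RT overhead is added to the base frame time."""
    base_frame_ms = 1000.0 / base_fps
    return 1000.0 / (base_frame_ms + RT_OVERHEAD_MS)

print(round(fps_with_rt(100.0), 1))  # 52.1 -> even a 100 fps base lands near 50 fps
print(round(fps_with_rt(60.0), 1))   # 38.7 -> a 60 fps base drops to roughly 40 fps
```

So unless the base frame rate is well above 100 fps, a locked 60 with RT looks out of reach, which is where the 40+ fps ballpark comes from.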
 

icecold1983

Banned
Nov 3, 2017
4,243

Comparing all these overclocked Turing GPUs to Founders Edition Pascals is so misleading. The Founders Edition cards lowered clocks by a couple hundred MHz due to insufficient cooling. The results narrow quite a bit when you remove that bottleneck.
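A rough sketch of that clock-deficit argument, assuming performance scales approximately linearly with sustained core clock (a simplification, and the clock figures below are hypothetical examples, not measured values):

```python
# Illustrative only: estimate the performance lost when an FE card can't hold
# its rated boost clock. Assumes roughly linear scaling with core clock.

def perf_deficit(sustained_mhz: float, rated_mhz: float) -> float:
    """Fraction of performance lost at a lower sustained clock."""
    return 1.0 - sustained_mhz / rated_mhz

# Hypothetical example: a Pascal FE dropping ~200 MHz under its blower cooler.
print(f"{perf_deficit(1650, 1850):.1%}")  # 10.8% -- same ballpark as the Turing-vs-Pascal gaps discussed below
```

Under that rough assumption, a couple hundred MHz of thermal throttling is on the same order as the 7-8% gaps shown in the charts, which is the point being made.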
 

Deleted member 34239

User requested account closure
Banned
Nov 24, 2017
1,154

Talk about disingenuous charts. Why is the 1080 Founders Edition being used when the 2070 Founders Edition cards don't have blower-style coolers? Testing from other publications, like Gamers Nexus for example, has already shown that the gap between these cards is nearly indiscernible. This is the worst kind of journalism and is very misleading. If this is an Nvidia mandate then shame on them, but if it's a ComputerBase decision then it's even worse. Useless charts, smh.
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
Talk about disingenuous charts. Why is the 1080 Founders Edition being used when the 2070 Founders Edition cards don't have blower-style coolers? Testing from other publications, like Gamers Nexus for example, has already shown that the gap between these cards is nearly indiscernible. This is the worst kind of journalism and is very misleading. If this is an Nvidia mandate then shame on them, but if it's a ComputerBase decision then it's even worse. Useless charts, smh.

They didn't review the 2070 FE though
This is the card they reviewed (lower boost clock than FE)

 

Deleted member 34239

User requested account closure
Banned
Nov 24, 2017
1,154
They didn't review the 2070 FE though
This is the card they reviewed (lower boost clock than FE)
So what you're saying is that their review is even worse? I'm not exactly sure why they would use that card in their review. EVGA's 500 has a significantly better cooler than the one seen in the picture. Regardless, their review is very flawed. Next time I see a review from computerbase.de, I'll file it in the trash can where it belongs.
 
Last edited:

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
So what you're saying is that their review is even worse? Thanks for the heads up. Next time I see a review from computerbase.de, I'll file it in the trash can where it belongs.

OK you do that.

For everyone else, there's absolutely nothing wrong with that 2070 review from one of the best hardware sites around.
It's blower vs blower and shows an 8% gap @ 1440p between a 1080 FE and the Asus 2070 Turbo.
For reference, in the 20-game bench from Hardware Unboxed above, the gap @ 1440p was 7%.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
OK you do that.

For everyone else, there's absolutely nothing wrong with that 2070 review from one of the best hardware sites around.
It's blower vs blower and shows an 8% gap @ 1440p between a 1080 FE and the Asus 2070 Turbo.
For reference, in the 20-game bench from Hardware Unboxed above, the gap @ 1440p was 7%.

For some folks that 7% gap plus RTX is worth $100-$200 extra; for some folks it isn't. I'd say in most cases, if you already have a 1080 or Vega 64, it isn't worth selling your card for ~$300 and spending another $300 for that 7% and RTX. If you have something older, it's not a bad option at all.
 

Deleted member 34239

User requested account closure
Banned
Nov 24, 2017
1,154
OK you do that.

For everyone else, there's absolutely nothing wrong with that 2070 review from one of the best hardware sites around.
It's blower vs blower and shows an 8% gap @ 1440p between a 1080 FE and the Asus 2070 Turbo.
For reference, in the 20-game bench from Hardware Unboxed above, the gap @ 1440p was 7%.
So you speak for everybody now? My bad..... Anyway, my issue with the review stems from the fact that if Nvidia deemed it necessary to steer clear of a blower design on their FE version, then it stands to reason that their conclusion is that a blower design does not offer adequate performance to show the chip in its best light. Furthermore, a large majority of the 2070 cards sold will have a cooler with two or more fans. Why then would you review a blower card? What is the benefit of such a review to the general consumer when both the average GTX 1080 and RTX 2070 offer far better performance than what is listed in their numbers? It makes no sense.

Anyway, in the end, the RTX 2070 offers GTX 1080 performance at GTX 1080 Ti prices. It's a terrible buy, but not as terrible as the 2080.
 

Lakeside

Member
Oct 25, 2017
9,231
So you speak for everybody now? My bad..... Anyway, my issue with the review stems from the fact that if Nvidia deemed it necessary to steer clear of a blower design on their FE version, then it stands to reason that their conclusion is that a blower design does not offer adequate performance to show the chip in its best light. Furthermore, a large majority of the 2070 cards sold will have a cooler with two or more fans. Why then would you review a blower card? What is the benefit of such a review to the general consumer when both the average GTX 1080 and RTX 2070 offer far better performance than what is listed in their numbers? It makes no sense.

Anyway, in the end, the RTX 2070 offers GTX 1080 performance at GTX 1080 Ti prices. It's a terrible buy, but not as terrible as the 2080.

Not to mention that the Nvidia FE blowers are pretty good, especially compared to the junk blowers AIB cards have used.
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
So you speak for everybody now? My bad..... Anyway, my issue with the review stems from the fact that if Nvidia deemed it necessary to steer clear of a blower design on their FE version, then it stands to reason that their conclusion is that a blower design does not offer adequate performance to show the chip in its best light. Furthermore, a large majority of the 2070 cards sold will have a cooler with two or more fans. Why then would you review a blower card? What is the benefit of such a review to the general consumer when both the average GTX 1080 and RTX 2070 offer far better performance than what is listed in their numbers? It makes no sense.

Anyway, in the end, the RTX 2070 offers GTX 1080 performance at GTX 1080 Ti prices. It's a terrible buy, but not as terrible as the 2080.

I don't.
The second part of my post was intended for everyone else.

Hardware sites review the products they're sent.
There were a lot of people claiming that most RTX reviews were kinda flawed because of the dual-axial-fan FE vs last-gen FE blower comparison, so in a way that kind of blower vs blower review is interesting imo.
And, as the Hardware Unboxed review shows, whether it's blower vs blower or dual-axial fans, the gap in performance is similar, so it's a moot point anyway.
 

Paxton25

Member
May 9, 2018
1,899
Bit off topic, but I've got a 1060 3GB card and wanted to upgrade to these new cards, but after seeing the reviews I'm not willing to drop the cash for the RTXs. What card would people suggest for good 4K gaming? 1080 Ti, I presume...
 

Lakeside

Member
Oct 25, 2017
9,231
Bit off topic, but I've got a 1060 3GB card and wanted to upgrade to these new cards, but after seeing the reviews I'm not willing to drop the cash for the RTXs. What card would people suggest for good 4K gaming? 1080 Ti, I presume...

Short of a 2080 Ti, it's about your best option. Don't expect consistent 4K60 though; we're just starting to get close.
 

low-G

Member
Oct 25, 2017
8,144
Damn, still no word on my Step Up from EVGA. I really hope I don't have to wait months.

Is there something you're failing to get the performance you want out of? I personally intend to wait until the last minute to decide if I want to go 2080Ti (hopefully some RT stuff out by mid-Dec!), but there's nothing rasterized that my 2080 can't do that I need right now.
 

Kyle Cross

Member
Oct 25, 2017
8,447
Is there something you're failing to get the performance you want out of? I personally intend to wait until the last minute to decide if I want to go 2080Ti (hopefully some RT stuff out by mid-Dec!), but there's nothing rasterized that my 2080 can't do that I need right now.
The 2080 is already struggling to hold a locked 60 at 4K semi-Ultra. For $300 more I might as well get a significantly better card.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,240
Dark Space
If I lived in Australia, you'd catch me on the next season of Locked Up Abroad for smuggling computer parts.
 

Durante

Dark Souls Man
Member
Oct 24, 2017
5,074
So what you're saying is that their review is even worse? I'm not exactly sure why they would use that card in their review. EVGA's 500 has a significantly better cooler than the one seen in the picture. Regardless, their review is very flawed. Next time I see a review from computerbase.de, I'll file it in the trash can where it belongs.
Don't be silly.

Computerbase has excellent reviews -- some of the best on the net currently, actually, given the decline of several other sites -- and comparing a blower-style card to a blower-style card is valid. Furthermore, reviewing a lower-cost option is just as valuable as reviewing a high-end model, it just needs to be represented accurately. Which it is.
 

low-G

Member
Oct 25, 2017
8,144
The 2080 is already struggling to hold a locked 60 at 4K semi-Ultra. For $300 more I might as well get a significantly better card.

If you're shooting for 4K 60+ I can see that for sure. I'm in the same situation where $300 will basically buy an extra generation of power (I'm sure I won't buy the same leap for only $300 whenever the next gen comes out).
 

Deleted member 15632

User requested account closure
Banned
Oct 27, 2017
314
So wait, pairing a 2080 with an i7 8600K CPU seems good?

I want to grab a 2080 Ti, but paired with my current mobo/CPU setup it will throttle the hell out of it...
 

icecold1983

Banned
Nov 3, 2017
4,243
ComputerBase article on DLSS and RT

Nvidia only plans on DLSS for 4K, and it's not known if it will work with DSR or if you need an actual 4K monitor. That really lowers its usability IMO. Nvidia says it may at some point add a 1440p option.

The article also confirms that the Star Wars demo ran at native 1080p and DLSS upscaled it to 4K.

And finally, it says DICE has announced that in the final build of the game the RT effects will be reduced from what was shown in the Gamescom demo.
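For a sense of scale on that 1080p to 4K claim, here's the plain pixel arithmetic (nothing from the article, just resolution math):

```python
# Pixel-count arithmetic for DLSS rendering at 1080p and outputting 4K.

def pixels(width: int, height: int) -> int:
    return width * height

native_1080p = pixels(1920, 1080)  # 2,073,600 pixels actually rendered
output_4k    = pixels(3840, 2160)  # 8,294,400 pixels presented

print(output_4k / native_1080p)           # 4.0 -- DLSS fills in 3 of every 4 output pixels
print(f"{native_1080p / output_4k:.0%}")  # 25% of the output is rendered natively
```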
 
Last edited:

VictoryLap77

Member
Oct 25, 2017
100
I currently have a 1070 paired with an i5 6600K OC'd to 4.4 GHz. Would it be worth it to consider a 2080, a 1080 Ti, or neither before upgrading my CPU? I'm aiming for high fps at 1440p.
 

Bricktop

Attempted to circumvent ban with an alt account
Banned
Oct 27, 2017
2,847

Vash63

Member
Oct 28, 2017
1,681
ComputerBase article on DLSS and RT

Nvidia only plans on DLSS for 4K, and it's not known if it will work with DSR or if you need an actual 4K monitor. That really lowers its usability IMO. Nvidia says it may at some point add a 1440p option.

The article also confirms that the Star Wars demo ran at native 1080p and DLSS upscaled it to 4K.

And finally, it says DICE has announced that in the final build of the game the RT effects will be reduced from what was shown in the Gamescom demo.

Whaaa? 4K only? That seems pretty crazy to lock something like this to a single resolution... hope we get more info soon.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,935
Berlin, 'SCHLAND
ComputerBase article on DLSS and RT

Nvidia only plans on DLSS for 4K, and it's not known if it will work with DSR or if you need an actual 4K monitor. That really lowers its usability IMO. Nvidia says it may at some point add a 1440p option.

The article also confirms that the Star Wars demo ran at native 1080p and DLSS upscaled it to 4K.

And finally, it says DICE has announced that in the final build of the game the RT effects will be reduced from what was shown in the Gamescom demo.
Did ComputerBase make any mention of where NV said that? The German text mentions that NV has essentially made it known to be the case, but I have not read that anywhere and there is no link that I can see. Also, the DLSS in the Star Wars demo works at multiple resolutions, not just 3840x2160.
 

icecold1983

Banned
Nov 3, 2017
4,243
Did ComputerBase make any mention of where NV said that? The German text mentions that NV has essentially made it known to be the case, but I have not read that anywhere and there is no link that I can see. Also, the DLSS in the Star Wars demo works at multiple resolutions, not just 3840x2160.

Nah, no mention. I'm guessing it was their own personal correspondence with Nvidia. Doesn't the Star Wars demo only work at Nvidia's preset resolutions if you want to use DLSS? ComputerBase says the algorithm has to be retrained for every separate resolution.
 

Deleted member 22585

User requested account closure
Banned
Oct 28, 2017
4,519
EU
ComputerBase article on DLSS and RT

Nvidia only plans on DLSS for 4K, and it's not known if it will work with DSR or if you need an actual 4K monitor. That really lowers its usability IMO. Nvidia says it may at some point add a 1440p option.

The article also confirms that the Star Wars demo ran at native 1080p and DLSS upscaled it to 4K.

And finally, it says DICE has announced that in the final build of the game the RT effects will be reduced from what was shown in the Gamescom demo.

Oh wow. If true, then I'm happy that I didn't buy one of the new cards yet. I'll gladly ride out my 1080 Ti until the next-gen cards arrive.
 

Serious Sam

Banned
Oct 27, 2017
4,354
Sounds like DLSS will be just another momentary Nvidia gimmick created to help their marketing division and move hardware. It will last about a year or two, get implemented in a couple of games, and then be forgotten. You need an RTX card, a 4K monitor, and game developers to implement it in each game. These are some awfully steep requirements for what is basically an upscaling method.
 

icecold1983

Banned
Nov 3, 2017
4,243
Sounds like DLSS will be just another momentary Nvidia gimmick created to help their marketing division and move hardware. It will last about a year or two, get implemented in a couple of games, and then be forgotten. You need an RTX card, a 4K monitor, and game developers to implement it in each game. These are some awfully steep requirements for what is basically an upscaling method.
IMO this silicon is completely designed for non-gaming markets. Nvidia has such a lead over AMD that they focused all their efforts on what must be a potentially more lucrative market. This is probably the best gaming use they could find for the silicon. Just my speculative opinion, and I'd much rather be wrong, but I don't see these features really being very good for the gaming consumer.
 

Deleted member 22585

User requested account closure
Banned
Oct 28, 2017
4,519
EU
Sounds like DLSS will be just another momentary Nvidia gimmick created to help their marketing division and move hardware. It will last about a year or two, get implemented in a couple of games, and then be forgotten. You need an RTX card, a 4K monitor, and game developers to implement it in each game. These are some awfully steep requirements for what is basically an upscaling method.

Oh man, I really hope that you're wrong, because it has so much potential.
But given Nvidia's track record with their features, it's difficult to be optimistic. Most stuff appeared in a couple of games and then vanished.
 

low-G

Member
Oct 25, 2017
8,144
ComputerBase article on DLSS and RT

Nvidia only plans on DLSS for 4K, and it's not known if it will work with DSR or if you need an actual 4K monitor. That really lowers its usability IMO. Nvidia says it may at some point add a 1440p option.

The article also confirms that the Star Wars demo ran at native 1080p and DLSS upscaled it to 4K.

And finally, it says DICE has announced that in the final build of the game the RT effects will be reduced from what was shown in the Gamescom demo.

I could see the initial rollout of data being for 4K, because 4K 60 fps is likely the priority for people, with 1440p xxx fps being the 2nd priority. I could see it being bound to several resolutions. As I said a long time ago in some thread, it didn't make sense for there to be some arbitrary minimum required resolution, otherwise you really are extrapolating something from nothing. I'd expect DSR will work fine with it. All remains to be seen, of course.

DICE said they will have scalable RT effects one way or another (either a slider or presets). I don't think they're going to just eliminate the effects they already had running fine (at least on a 2080 Ti).
 

low-G

Member
Oct 25, 2017
8,144
Are there any good reviews for the EVGA 2080 Ti XC Ultra Gaming card?

Just curious what you want to know that isn't covered in a general review, specific temps? The XC Ultra series definitely runs a bit quieter and cooler than Nvidia's FE. I often see an 8°C and ~4 dB difference on the 2080 (non-Ti) side.