
Vash63

Member
Oct 28, 2017
1,681
I'm looking to upgrade my GPU but I'm still undecided. I'm going the 1440p 144Hz route and trying to find out whether it's worth forking out an extra 300€ for a Ti compared to a regular 2080. Is there anything more to it besides an extra 10-20 fps?

It's basically a straight 30-40% spec boost, except in ray tracing, where it's only about 25% faster.

It's a pretty big boost in performance, but it's not any different architecture-wise, just more of everything. It's also a lot more money, though, so it really comes down to budget.
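For anyone wanting to sanity-check those percentages, here's a quick back-of-the-envelope comparison in Python. The spec figures are the reference-card numbers as I remember them (CUDA cores, boost clock, memory bandwidth, and NVIDIA's quoted Giga Rays), so treat them as approximate rather than official:

```python
# Rough spec comparison behind the "30-40% boost, ~25% in RT" claim.
# Figures are reference-card numbers from memory; double-check against
# NVIDIA's spec sheets before relying on them.
specs = {
    "RTX 2080":    {"cuda_cores": 2944, "boost_mhz": 1710, "mem_bw_gbs": 448, "giga_rays": 8},
    "RTX 2080 Ti": {"cuda_cores": 4352, "boost_mhz": 1545, "mem_bw_gbs": 616, "giga_rays": 10},
}

a, b = specs["RTX 2080"], specs["RTX 2080 Ti"]

# Shader throughput scales roughly with cores x clock.
compute_gain   = (b["cuda_cores"] * b["boost_mhz"]) / (a["cuda_cores"] * a["boost_mhz"]) - 1
bandwidth_gain = b["mem_bw_gbs"] / a["mem_bw_gbs"] - 1
rt_gain        = b["giga_rays"] / a["giga_rays"] - 1

print(f"compute     +{compute_gain:.0%}")    # ~ +34%
print(f"bandwidth   +{bandwidth_gain:.0%}")  # ~ +38%
print(f"ray tracing +{rt_gain:.0%}")         # +25%
```

So the "30-40% everywhere, ~25% in RT" framing lines up with the raw specs; whether that's worth the extra 300€ is still a budget call.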
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
It is impossible to always get arbitrary images correct to arbitrary precision. AI algorithms make reasonable guesses regarding the complete image on the basis of statistical regularities. Those statistical regularities are not going to be perfectly predictive and in some edge cases may be misleading for the same reasons that election polling results are not perfectly predictive and are sometimes misleading.

Now, it may be that something like this ends up being "good enough" that it eventually becomes the default rendering method for many applications. But the way some proponents talk about it, it seems like they think it is actually magic. It's not magic. It's generalization from a sample to a population based on statistical inference.

As with many people before you who have made incorrect assumptions about DLSS and NVIDIA's DNN models, the inferencing model does not rely solely on completely arbitrary information, which is why each model needs to be pre-trained on a per-game basis. It is a combination of artificial and human intelligence that effectively produces the final result; it is not a simple statistical inference model.

I understand that the technology is still very new, but please do some research before speaking about it; otherwise you're just going to spread misinformation.
 

MrBob

Member
Oct 25, 2017
6,668
But your card was charged, yes?
I think Best Buy just authorizes the card; you pay in full right away if you use PayPal, though. I have a purchase from Sept 7 still in progress. I'll wait it out since I was able to use a birthday coupon, and I doubt I'll be able to find one cheaper anytime soon.

From what I've read, Nvidia hasn't shipped their 2080 Ti yet, so who knows when it will come.
 
Last edited:
Oct 27, 2017
9,418
As with many people before you who have made incorrect assumptions about DLSS and NVIDIA's DNN models, the inferencing model does not rely solely on completely arbitrary information, which is why each model needs to be pre-trained on a per-game basis. It is a combination of artificial and human intelligence that effectively produces the final result; it is not a simple statistical inference model.

I understand that the technology is still very new, but please do some research before speaking about it; otherwise you're just going to spread misinformation.

But that's what he said: "It's generalization from a sample." No one knows exactly what it will produce outside of specific curated environments, well, other than Nvidia, and it sounds like they are still ironing it out as well. I honestly think that until we see real-world examples, everything is speculation one way or the other.
 

the_wart

Member
Oct 25, 2017
2,261
As with many people before you who have made incorrect assumptions about DLSS and NVIDIA's DNN models, the inferencing model does not rely solely on completely arbitrary information, which is why each model needs to be pre-trained on a per-game basis. It is a combination of artificial and human intelligence that effectively produces the final result; it is not a simple statistical inference model.

I understand that the technology is still very new, but please do some research before speaking about it; otherwise you're just going to spread misinformation.

I already knew all that, none of it contradicts what I said, and yes it is a statistical inference model just like every other machine learning algorithm. I never said it was a simple one.

In theory I suppose they could make the algorithm perform "perfectly" by restricting the renderer to output only images that the algorithm could perfectly predict, but that would be a very silly thing to do.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
But that's what he said: "It's generalization from a sample." No one knows exactly what it will produce outside of specific curated environments, well, other than Nvidia, and it sounds like they are still ironing it out as well. I honestly think that until we see real-world examples, everything is speculation one way or the other.

But that's not what I said. I would advise you reread the post that you quoted.

I already knew all that, none of it contradicts what I said, and yes it is a statistical inference model just like every other machine learning algorithm. I never said it was a simple one.

In theory I suppose they could make the algorithm perform "perfectly" by restricting the renderer to output only images that the algorithm could perfectly predict, but that would be a very silly thing to do.

But the end result is not merely the consequence of statistical inferencing, so to suggest as much is just misinformation. Using human visual analysis to correct inferencing errors is not statistical inferencing.

And the backpropagation training aims to get as close to a 'perfect' reconstruction as possible, correcting any errors that arise during training. The corrections aren't based on statistics but on human visual interpretation.
 

the_wart

Member
Oct 25, 2017
2,261
But the end result is not merely the consequence of statistical inferencing, so to suggest as much is just misinformation. Using human visual analysis to correct inferencing errors is not statistical inferencing.

And the backpropagation training aims to get as close to a 'perfect' reconstruction as possible, correcting any errors that arise during training. The corrections aren't based on statistics but on human visual interpretation.

Unless you're referring to something else, the fact that they train the network using backprop doesn't have anything to do with anything. It's just a method for fitting a model.

A statistical model that has been hand-tweaked (or even hand-designed) by humans is still a statistical model. Whether it's an off-the-shelf algorithm or a bespoke human-tweaked algorithm, what's going on is still generalization from a sample of images and pieces of images to unknown and fundamentally uncertain "complete" images. This is fundamentally, definitionally, a problem of statistical inference. The involvement of a human doesn't change that.

Hell, the secret to DLSS could be a tiny man living inside the RTX tensor cores frantically hand-painting images extrapolated from the low-res images the rest of the card is feeding him. It's still statistical inference, just performed by the deep neural network in the guy's head instead of the one in the silicon.

The only way DLSS could be perfect is if a low-res image perfectly predicted the higher-res image. In other words, the higher-res image could not actually contain any new information, so nothing would actually be unknown. Equivalently, for every low-res image there would have to be only one possible corresponding high-res image. This is not possible for the real world, though I guess it is conceivable for a 3D-rendered world; however, that would require limiting the amount of visual information the 3D-rendered world actually contains.
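A tiny numpy sketch of that last point (the 2x box-filter downsample and the toy patches are just my own illustration): two different high-res patches collapse to the exact same low-res patch, so the low-res data alone cannot tell an upscaler which original it came from.

```python
import numpy as np

def downsample_2x(img):
    """Average each 2x2 block (a simple box filter) to halve resolution."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Two different 4x4 "high-res" patches...
a = np.array([[0, 4, 0, 4],
              [4, 0, 4, 0],
              [0, 4, 0, 4],
              [4, 0, 4, 0]], dtype=float)   # fine checkerboard detail

b = np.full((4, 4), 2.0)                    # flat grey, no detail at all

# ...that produce exactly the same 2x2 "low-res" patch.
print(downsample_2x(a))   # [[2. 2.] [2. 2.]]
print(downsample_2x(b))   # [[2. 2.] [2. 2.]]
print(np.array_equal(downsample_2x(a), downsample_2x(b)))  # True
```

Everything the upscaler adds beyond that flat 2x2 patch has to be inferred from what similar images usually look like, which is the statistical-inference problem in a nutshell.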
 

Bricktop

Attempted to circumvent ban with an alt account
Banned
Oct 27, 2017
2,847
I don't know how DLSS is going to end up, but one thing that has bugged me about the criticism is the argument that everything Nvidia has shown has been "on rails" type content and we don't know how well it will hold up when things change. But doesn't the FFXV demo do exactly that? There is a fight sequence in that demo that changes every time you run it, and the image quality stays the same. We definitely need more demos, or games, but if it's capable of keeping that part of the demo looking the same regardless of what happens in that scene, it's certainly capable of doing it for any other game.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
Unless you're referring to something else,

I am. I'm referring to the DLSS methodology as a whole, which is what matters, not just the statistical inferencing part of the process. My point is that DLSS as a technology is not just a statistical model; it involves a lot of steps before it even works in a shipped product.

As a technology, it is possible for the AI to eventually reach a point where the result is perfectly predictive in the sense that it is indistinguishable from the ideal counterpart. Statistically, it will not be perfectly predictive, but as I've said many, many times before, the technology is not just about statistics, and that is not what is going to matter to consumers.
 

the_wart

Member
Oct 25, 2017
2,261
I am. I'm referring to the DLSS methodology as a whole, which is what matters, not just the statistical inferencing part of the process. My point is that DLSS as a technology is not just a statistical model; it involves a lot of steps before it even works in a shipped product.

As a technology, it is possible for the AI to eventually reach a point where the result is perfectly predictive in the sense that it is indistinguishable from the ideal counterpart. Statistically, it will not be perfectly predictive, but as I've said many, many times before, the technology is not just about statistics, and that is not what is going to matter to consumers.

Well then we're talking about fundamentally different things, so I guess I'll leave it at a few final remarks: a) "perfect" is a very different thing than "close enough for consumer purposes", so I would not bandy that word about unless you want to get into Arguments About It on the Internet, b) we really don't know what the limits of this kind of technology are, so you are counting your chickens etc., and c) you have an overly reductive idea of what a statistical model is.
 

Kyle Cross

Member
Oct 25, 2017
8,407
I'm second-guessing my 2080 purchase. I realized it's silly to drop $850 on a card when I can spend another $400 on top of that for a much better one that'll stave off an upgrade even longer.

Question is, can I even return it to Newegg?
 

the_wart

Member
Oct 25, 2017
2,261
I don't know how DLSS is going to end up, but one thing that has bugged me about the criticism is the argument that everything Nvidia has shown has been "on rails" type content and we don't know how well it will hold up when things change. But doesn't the FFXV demo do exactly that? There is a fight sequence in that demo that changes every time you run it, and the image quality stays the same. We definitely need more demos, or games, but if it's capable of keeping that part of the demo looking the same regardless of what happens in that scene, it's certainly capable of doing it for any other game.

The variety of visual scenes across an entire game is certainly going to dwarf that of even a variable demo (I assume? I haven't checked out the demo). So to me it sounds like a promising proof-of-concept that the tech can handle meaningful variability, and the remaining question is whether it's feasible to scale the data generation process and engineering man-hours to cover an entire open-world game. So I guess what I'd want to know is, how much did it cost to get DLSS working on just the demo and how much can the process be streamlined?
 

Xclash

Member
Oct 25, 2017
852
I'm second-guessing my 2080 purchase. I realized it's silly to drop $850 on a card when I can spend another $400 on top of that for a much better one that'll stave off an upgrade even longer.

Question is, can I even return it to Newegg?

You have to double-check the item page under the Warranty & Returns tab.
 

myzhi

Member
Oct 27, 2017
1,650
I'm second-guessing my 2080 purchase. I realized it's silly to drop $850 on a card when I can spend another $400 on top of that for a much better one that'll stave off an upgrade even longer.

Question is, can I even return it to Newegg?
Unless it's DOA, unopened, or the wrong item, Newegg will charge you a restocking fee.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
Well then we're talking about fundamentally different things, so I guess I'll leave it at a few final remarks: a) "perfect" is a very different thing than "close enough for consumer purposes", so I would not bandy that word about unless you want to get into Arguments About It on the Internet, b) we really don't know what the limits of this kind of technology are, so you are counting your chickens etc., and c) you have an overly reductive idea of what a statistical model is.

To clarify, I agree with you that the result will never be statistically "perfectly predictive" 100% of the time, but when the result is indistinguishable from the ground truth, it's basically a semantic argument to say that it's not perfectly predictive, or that it's impossible to match ideal results.

And I don't think you understood the significance of humans being involved in the process. The error adjustments made during backpropagation aren't based on precise known data points but on subjective ranges within which the algorithm is constrained. There are no absolute correct values to fit the model to, so there's enough wiggle room to account for the variability that is inevitable with AI inferencing while still producing a result that is humanly indistinguishable from the ideal counterpart.

I guess my point here is... even acknowledging that DLSS has inherent flaws with statistical inferencing, it doesn't really preclude DLSS from producing results that are, for all intents and purposes, perfect to the naked eye.
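For what it's worth, here is a deliberately generic sketch of the "train by backpropagation against a high-quality reference" idea being argued about. This is not NVIDIA's pipeline; the tiny network, the L1 loss, and the random tensors standing in for frame pairs are placeholders purely to show where the error correction happens.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy upscaler: learn to turn a low-res frame into a 2x frame by
# backpropagating the error against a high-quality reference render.
class TinyUpscaler(nn.Module):
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, low_res):
        up = F.interpolate(low_res, scale_factor=2, mode="bilinear",
                           align_corners=False)
        return up + self.body(up)  # predict a residual on top of a naive upscale

model = TinyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    low_res   = torch.rand(4, 3, 64, 64)    # placeholder for low-res game frames
    reference = torch.rand(4, 3, 128, 128)  # placeholder for high-quality reference renders
    out = model(low_res)
    loss = F.l1_loss(out, reference)        # distance from the "perfect" reconstruction
    opt.zero_grad()
    loss.backward()                         # backprop: corrections flow from the error
    opt.step()
```

In a real setup each low-res frame would be paired with a matching high-quality render of the same scene; the disagreement above is about how much human judgment shapes those targets and the loss, not about the backprop mechanics themselves.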
 

Kyle Cross

Member
Oct 25, 2017
8,407
Yes. You have to contact them to get an RMA before you can send it back.
Sigh. $400 is a lot of money to spend on top of the $850, plus there's the restocking fee, and we don't know when the Tis will be back in stock or how the tariffs will affect them. Maybe I should just stick with the 2080. It just hurts to have spent this kind of money and still not get 4K/60 in a lot of games.
 

myzhi

Member
Oct 27, 2017
1,650
Sigh. $400 is a lot of money to spend on top of the $850, plus there's the restocking fee, and we don't know when the Tis will be back in stock or how the tariffs will affect them. Maybe I should just stick with the 2080. It just hurts to have spent this kind of money and still not get 4K/60 in a lot of games.
Yeah, it's not worth returning with a restocking fee. I am kinda surprised you went with a 2080 when you want to play at 4K@60fps. Even before reviews, most speculated that the 2080 would be about on par with the 1080 Ti.
 

Vash63

Member
Oct 28, 2017
1,681
The variety of visual scenes across an entire game is certainly going to dwarf that of even a variable demo (I assume? I haven't checked out the demo). So to me it sounds like a promising proof-of-concept that the tech can handle meaningful variability, and the remaining question is whether it's feasible to scale the data generation process and engineering man-hours to cover an entire open-world game. So I guess what I'd want to know is, how much did it cost to get DLSS working on just the demo and how much can the process be streamlined?

Nvidia didn't build DLSS just to run the FF XV and Infiltrator demos; those are just the first ones to be shipped to reviewers. They have something like 20 games confirmed to use it, and the FF XV demo shows that it can work without being on rails.

As an aside, given that Nvidia stipulated that it only works in engines that support temporal AA, I suspect there's a strong temporal component where it recognizes objects from previous frames and infers what texture data should be used in the next ones; I don't think it's just blind AI. Simply having motion vectors and previous frames is a huge advantage for this kind of thing, similar to what many video codecs do.
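To illustrate what that temporal component could look like in general terms, here's a minimal numpy sketch of TAA/codec-style reprojection. This is not anything NVIDIA has documented about DLSS; the function name and the nearest-neighbour warp are my own simplification of the general idea.

```python
import numpy as np

def reproject_previous_frame(prev_frame, motion_vectors):
    """Warp the previous frame into the current one using per-pixel motion
    vectors (dx, dy in pixels), the way TAA and video codecs reuse history.
    Nearest-neighbour sampling keeps the sketch short; real implementations
    filter and validate the samples."""
    h, w, _ = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(xs - motion_vectors[..., 0], 0, w - 1).round().astype(int)
    src_y = np.clip(ys - motion_vectors[..., 1], 0, h - 1).round().astype(int)
    return prev_frame[src_y, src_x]

# Toy frames: history gets warped, then blended with the freshly rendered
# current frame instead of everything being guessed from scratch.
prev = np.random.rand(8, 8, 3)
mv = np.zeros((8, 8, 2))
mv[..., 0] = 1.0                          # everything moved one pixel to the right
history = reproject_previous_frame(prev, mv)
current = np.random.rand(8, 8, 3)
blended = 0.9 * history + 0.1 * current   # typical TAA-style accumulation
```

The point is just that once you have per-pixel motion vectors, most of the current frame can be reconstructed from history rather than inferred from a single low-res image, which is plausibly why the temporal-AA requirement exists.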
 

Kyle Cross

Member
Oct 25, 2017
8,407
Yeah, it's not worth returning with a restocking fee. I am kinda surprised you went with a 2080 when you want to play at 4K@60fps. Even before reviews, most speculated that the 2080 would be about on par with the 1080 Ti.
I usually turn off some oftentimes resource-hungry settings like Depth of Field and Motion Blur due to personal preference, so I was hopeful. I'm crossing my fingers that DLSS will be the savior. While early looks seem to suggest that it is no replacement for true 4K, if it can present 1800p quality in a 4K picture, then that'll at least bypass my TV's upscaling, which makes native 1800p blurry.
 

the_wart

Member
Oct 25, 2017
2,261
I guess my point here is... even acknowledging that DLSS has inherent flaws with statistical inferencing, it doesn't really preclude DLSS from producing results that are, for all intents and purposes, perfect to the naked eye.

I think that's an optimistic but reasonable take; my main concern with these types of discussions is that people can easily get the idea that ML is doing the impossible, i.e., perfectly recovering information that just doesn't exist in the data. In the domain of image processing, the gap between this and what is actually happening is shrinking every day. But in other domains, like predicting human health and behavior, the distinction is incredibly important, and if people don't grok that, they end up with totally nutty beliefs about what AI is capable of.

Nvidia didn't build DLSS just to run the FF XV and Infiltrator demos; those are just the first ones to be shipped to reviewers. They have something like 20 games confirmed to use it, and the FF XV demo shows that it can work without being on rails.

Of course, but as they say, the proof is in the pudding, and so far most of the pudding has been locked in the display case. To be clear, I am extremely interested in this technology and I will be thrilled if it works half as well as Nvidia claims.

Edit: To be even clearer, I wasn't suggesting that DLSS is vaporware and that implementations in other games don't exist. I was talking about the quality of the average implementation, in terms of image quality and noticeable artifacts. We don't know first-hand yet whether the level of quality they reach in, say, FFXV proper will match that of the demos.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
I think that's an optimistic but reasonable take; my main concern with these types of discussions is that people can easily get the idea that ML is doing the impossible, i.e., perfectly recovering information that just doesn't exist in the data. In the domain of image processing, the gap between this and what is actually happening is shrinking every day. But in other domains, like predicting human health and behavior, the distinction is incredibly important, and if people don't grok that, they end up with totally nutty beliefs about what AI is capable of.

Oh, I completely agree with you here. This technology cannot be relied on for more vital branches of science, and it's important for people to understand that. Context is definitely key here, but that's the case with many things, like developers' insistence on referring to workflows as "physically based" when in reality the term is pretty loosely defined.
 

the_wart

Member
Oct 25, 2017
2,261
Oh, I completely agree with you here. This technology cannot be relied on for more vital branches of science, and it's important for people to understand that. Context is definitely key here, but that's the case with many things, like developers' insistence on referring to workflows as "physically based" when in reality the term is pretty loosely defined.

I hereby declare this internet argument... RESOLVED!

 

l0rd_farquaad

Banned
Sep 19, 2018
31
Sigh. $400 is a lot of money to spend on top of the $850, plus there's the restocking fee, and we don't know when the Tis will be back in stock or how the tariffs will affect them. Maybe I should just stick with the 2080. It just hurts to have spent this kind of money and still not get 4K/60 in a lot of games.
I don't see why you wouldn't want 1440p/144Hz? The 2080 is perfect for that.
 

low-G

Member
Oct 25, 2017
8,144
I'm second-guessing my 2080 purchase. I realized it's silly to drop $850 on a card when I can spend another $400 on top of that for a much better one that'll stave off an upgrade even longer.

Question is, can I even return it to Newegg?

You bought an EVGA card; they have the Step-Up program. You have 90-ish days if you want to upgrade via EVGA.
 

MrBob

Member
Oct 25, 2017
6,668
You can step up to the 2080 Ti now too, I believe, but it will take a while.

https://www.evga.com/support/stepup/

I'd recommend using the Step-Up program too if you want the 2080 Ti instead.

Also, it seems like 2080 Tis are finally shipping from Nvidia; hopefully this means Best Buy starts shipping too.
 
Last edited:

Kyle Cross

Member
Oct 25, 2017
8,407
I don't see why you wouldn't want 1440p/144Hz? The 2080 is perfect for that.
I have a 4K display, so sub-4K resolutions look bleh. And I consider super high framerates a waste.
You bought an EVGA card; they have the Step-Up program. You have 90-ish days if you want to upgrade via EVGA.
Sadly I don't think I can, as you have to register the product, right? For some weird reason, when I try, their website says the serial and part number don't match. I did it multiple times, even switching 0's for O's. I'm at my wits' end.
 

low-G

Member
Oct 25, 2017
8,144
Sadly I don't think I can, as you have to register the product, right? For some weird reason, when I try, their website says the serial and part number don't match. I did it multiple times, even switching 0's for O's. I'm at my wits' end.

Remember, these numbers are on both the box and the GPU. I'd contact EVGA about the serial-number problem regardless.

After all, you're going to want your warranty, too.
 

MrBob

Member
Oct 25, 2017
6,668
This seems crazy to me... no video card can hold 60 fps as a minimum frame rate at 1440p in Assassin's Creed Odyssey at very high settings with an 8700K at 5GHz:

[embedded benchmark video]

I wonder if the lower performance is linked to the multiple Ubisoft DRM layers on PC.
 

Linus815

Member
Oct 29, 2017
19,692
I don't have the means to watch that video right now, but is that just the benchmark?
I've played the game for 5 hours now on a 2080 at 1440p on mostly max settings and I have not dropped under 60 fps. But the benchmark does show a dip to 30 fps.

AC games were always very demanding, even when it was just a smaller map or one city... the fact that such a gigantic map tanks the fps this much doesn't surprise me. Ultimately it's not much worse than Origins while looking considerably better. I think it's a bit silly to think that DRM would cause more than a 0-5% performance drop; if it truly were the culprit, I'd wager it'd be extremely dumb of Ubisoft to keep it in the game, as they'd be limiting the audience that can actually play the damn game.
 

MrBob

Member
Oct 25, 2017
6,668
Maybe the benchmark is bugged currently? Hearing you keep 60 fps as a minimum makes me happy. What CPU do you have? I don't like any dips below 60 fps since I'll be using a 4K TV that doesn't have variable refresh rate to fall back on.
 
Oct 25, 2017
11,574
Maybe the benchmark is bugged currently? Hearing you keep 60 fps as a minimum makes me happy. What CPU do you have? I don't like any dips below 60 fps since I'll be using a 4K TV that doesn't have variable refresh rate to fall back on.

It's a Ubisoft open-world game with Dynasty Warriors-scale battles and fancy water tech. You can't realistically expect it to never dip below 60, haha.

Red Dead 2 is what I'm worried about though. Might be the GTA IV of 2019.
 

piratepwnsninja

Lead Game Designer
Verified
Oct 25, 2017
3,811
well now what do we have on our front porch

[attached photo of the newly delivered card]
She better deliver what I need in my VR games or she's going right back to Amazon.


Lucky.

My order for the exact same card from Amazon, placed the day preorders went up, hasn't shipped.
 

zerocalories

Member
Oct 28, 2017
3,231
California
2080 Ti owners, what are your OCs? Just got mine today.

Edit: also, has MSI Afterburner been updated yet? I don't see an update, and it's not showing my card's fan controls at all.
 
Last edited:

Kyle Cross

Member
Oct 25, 2017
8,407
I just found out I voided my EVGA 2080 warranty because I removed a sticker from it. FUCK. I saw the sticker and was like, "I don't want that on something that's going to get hot inside my PC," and only just now read it. It broke into pieces (by design, as I now realize), so I can't just put it back.

This was my first GPU upgrade; I had no idea about this...