Yeah haha, that's the worst bit - everything just processing, money out of the account.
I'm looking to upgrade my GPU but I'm still undecided. I'm going the 1440p/144Hz route and trying to figure out whether it's worth forking out an extra 300€ for a 2080 Ti over a regular 2080 - is there anything more to it besides an extra 10-20 fps?
It is impossible to always get arbitrary images correct to arbitrary precision. AI algorithms make reasonable guesses regarding the complete image on the basis of statistical regularities. Those statistical regularities are not going to be perfectly predictive and in some edge cases may be misleading for the same reasons that election polling results are not perfectly predictive and are sometimes misleading.
Now, it may be that something like this ends up being "good enough" that it eventually becomes the default rendering method for many applications. But the way some proponents talk about it, it seems like they think it is actually magic. It's not magic. It's generalization from a sample to a population based on statistical inference.
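To make "generalization from a sample" concrete, here's a toy sketch - nothing like NVIDIA's actual network, just a least-squares stand-in with made-up data - of how an upscaler can be fit from examples and then guess at unseen inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up "sample": N flattened 2x2 low-res neighborhoods and the true
# high-res pixel each one was downsampled from. In a real system these
# would come from paired low-res/high-res renders.
N = 10_000
patches = rng.random((N, 4))
true_pixels = patches @ np.array([0.3, 0.2, 0.3, 0.2]) + rng.normal(0, 0.01, N)

# Fit weights that minimize squared error over the sample - the
# "statistical regularities" part.
w, *_ = np.linalg.lstsq(patches, true_pixels, rcond=None)

# Inference on an unseen patch: an educated guess from those regularities.
# Edge cases the sample didn't cover are exactly where it can mislead.
new_patch = rng.random(4)
print("predicted pixel:", new_patch @ w)
```

A real DNN is vastly more expressive than this linear model, but the underlying move is the same: estimate from examples, then extrapolate.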
I think Best Buy just authorizes with a credit card. You pay in full right away if you use PayPal, though. I have a purchase from Sept 7 still in progress. I'll wait it out since I was able to use a bday coupon, and I doubt I'll be able to find one cheaper anytime soon.
As with many people before you who have made incorrect assumptions about DLSS and NVIDIA's DNN models, the inferencing model does not solely rely on completely arbitrary information, which is why each model needs to be pre-trained on a per-game basis. It is a combination of artificial and human intelligence that effectively produces the final result; it is not a simple statistical inference model.
I understand that the technology is still very new, but please do some research before speaking about new technology, especially if you're otherwise going to spread misinformation.
But that's what he said: "It's generalization from a sample." No one knows exactly what it will produce outside of specific curated environments - well, other than Nvidia, and it sounds like they are still ironing it out as well. I honestly think that until we see real-world examples, everything is speculation one way or the other.
I already knew all that; none of it contradicts what I said, and yes, it is a statistical inference model, just like every other machine learning algorithm. I never said it was a simple one.
In theory I suppose they could make the algorithm perform "perfectly" by restricting the renderer to output only images that the algorithm could perfectly predict, but that would be a very silly thing to do.
But the end result is not merely the consequence of statistical inferencing, so to suggest as much is just misinformation. Using human visual analysis to correct inferencing errors is not statistical inferencing.
And the backpropagation training aims to get as close to a 'perfect' reconstruction as possible, correcting any errors that arise during training. The corrections aren't based on statistics, but on human visual interpretation.
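In case it helps anyone following along: the training loop being described looks roughly like this in any modern framework. This is a generic super-resolution sketch in PyTorch with random stand-in tensors, not NVIDIA's actual DLSS code - and the choice of loss function is where judgments about what "looks right" get baked in:

```python
import torch
import torch.nn as nn

# Tiny stand-in network: upscale a low-res frame 2x, then refine it.
model = nn.Sequential(
    nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
    nn.Conv2d(3, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(32, 3, kernel_size=3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()  # distance from the "perfect" reference frame

# Stand-in data: low-res frames paired with high-res ground truth.
low_res = torch.rand(8, 3, 64, 64)
ground_truth = torch.rand(8, 3, 128, 128)

for step in range(100):
    reconstruction = model(low_res)
    loss = loss_fn(reconstruction, ground_truth)  # reconstruction error
    optimizer.zero_grad()
    loss.backward()   # backpropagate the error through the network
    optimizer.step()  # nudge the weights toward the reference
```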
I am. I'm referring to the DLSS methodology as a whole, which is what matters, not just the statistical inferencing part of the process. My point is that DLSS as a technology is not just a statistical model; it involves a lot of steps before it even works in a shipped product.
As a technology, it is possible for the AI to eventually reach a point where the result is perfectly predictive in that the result is indistinguishable from the ideal counterpart. Statistically, it will not be perfectly predictive, but as I've said many, many times before, the technology is not just about statistics and that is not what is going to matter to consumers.
I don't know how DLSS is going to end up, but one thing that has bugged me about the criticism is the argument that everything Nvidia has shown has been "on rails" type content and we don't know how well it will hold up when things change. But doesn't the FFXV demo do exactly that? There is a fight sequence in that demo that changes every time you run it, and the image quality stays the same. We definitely need some more demos, or games, but if it's capable of keeping that part of the demo looking the same regardless of what happens in that scene it's certainly capable of doing it for any other game.
I'm second-guessing my 2080 purchase. I realized it's silly to drop $850 on a card when I can spend another $400 on top of that for a much better one that'll stave off an upgrade even longer.
Question is, can I even return it to Newegg?
Unless it's DOA, unopened, or the wrong item, Newegg will charge you a restocking fee.
But they will take it back? All I can find on my order is to ask for a replacement.
Well then we're talking about fundamentally different things, so I guess I'll leave it at a few final remarks: a) "perfect" is a very different thing than "close enough for consumer purposes," so I would not bandy that word about unless you want to get into Arguments About It on the Internet; b) we really don't know what the limits of this kind of technology are, so you are counting your chickens, etc.; and c) you have an overly reductive idea of what a statistical model is.
Yes. You have to contact them to get an RMA before you can send it back.
Sigh. $400 is a lot of money to spend on top of the $850, plus there's the restocking fee, and we don't know when the Ti's will be back in stock or how the tariffs will affect them. Maybe I should just stick with the 2080. It just hurts to have spent this kind of money and still not get 4K/60 in a lot of games.
Yeah, it's not worth returning with a restocking fee. I'm kinda surprised you went with a 2080 when you want to play at 4K@60fps. Even before the reviews, most speculated that the 2080 would be about on par with the 1080 Ti.
The variety of visual scenes across an entire game is certainly going to dwarf that of even a variable demo (I assume? I haven't checked out the demo). So to me it sounds like a promising proof-of-concept that the tech can handle meaningful variability, and the remaining question is whether it's feasible to scale the data generation process and engineering man-hours to cover an entire open-world game. So I guess what I'd want to know is, how much did it cost to get DLSS working on just the demo and how much can the process be streamlined?
I usually turn off some resource-hungry settings like Depth of Field and Motion Blur out of personal preference, so I was hopeful. I'm crossing my fingers that DLSS will be the savior. While early looks suggest it's no replacement for true 4K, if it can present 1800p quality in a 4K picture, that'll at least bypass my TV's upscaling, which makes native 1800p blurry.
I guess my point here is... even acknowledging that DLSS has inherent flaws with statistical inferencing, that doesn't really preclude DLSS from producing results that are, for all intents and purposes, perfect to the naked eye.
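One way to put a number on "perfect to the naked eye" is to compare a reconstruction against its native-resolution reference with something like PSNR. Here's a NumPy sketch with made-up frames; perceptual metrics like SSIM track the eye better, but very high PSNR - often cited as roughly 40 dB and up - is generally hard to distinguish visually:

```python
import numpy as np

def psnr(reference: np.ndarray, reconstruction: np.ndarray, peak: float = 1.0) -> float:
    """Peak signal-to-noise ratio in decibels."""
    mse = np.mean((reference - reconstruction) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak**2 / mse)

# Made-up frames in [0, 1]: a reference and a slightly perturbed reconstruction.
rng = np.random.default_rng(0)
ref = rng.random((128, 128, 3))
recon = np.clip(ref + rng.normal(0, 0.005, ref.shape), 0, 1)
print(f"PSNR: {psnr(ref, recon):.1f} dB")  # ~46 dB for this noise level
```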
Nvidia didn't build DLSS just to run the FF XV and Infiltrator demos, those are just the first ones to be shipped to reviewers. They have something like 20 games confirmed to use it, and the FF XV demo shows that it can work regardless of being on rails.
I think that's an optimistic but reasonable take; my main concern with these types of discussions is that people can easily get the idea that ML is doing the impossible, i.e., perfectly recovering information that just doesn't exist in the data. In the domain of image processing, the gap between this and what is actually happening is shrinking every day. But in other domains, like predicting human health and behavior, the distinction is incredibly important, and if people don't grok that, they end up with totally nutty beliefs about what AI is capable of.
Oh I completely agree with you here. This technology cannot be relied on in more vital branches of science, and it's important for people to understand that. Context is definitely key here, but that's the case with many things - like developers' insistence on referring to workflows as "physically based" when in reality the term is pretty loosely defined.
Windows update should be out now with ray tracing support:
https://blogs.nvidia.com/blog/2018/10/02/real-time-ray-tracing-rtx-windows-10-october-update/
Now we just wait on games.
I don't see why you wouldn't want 1440p/144Hz? The 2080 is perfect for that.
I have a 4k display, so sub-4k resolutions look bleh. And super high framerates I consider a waste.
You bought an EVGA, they have the Step-Up program. You have 90-ish days if you want to upgrade via EVGA.
Sadly I don't think I can, as you have to register the product, right? For some weird reason, when I try, their website says the serial and part numbers don't match. I tried multiple times, even switching 0's for O's. I'm at my wits' end.
Thanks for the blog post, Nvidia.
Maybe the benchmark is currently bugged? Hearing you keep 60 fps as a minimum makes me happy. What CPU do you have? I don't like any dips below 60 fps since I'll be using a 4K TV that doesn't have variable refresh rate to fall back on.
well now what do we have on our front porch
She better deliver what I need in my VR games or she's going right back to Amazon.