What problems did you have with BFV? I can also run that >1080p 60fps absolutely maxed...
Hmm? Why is that? No, I'm at 60. Matches what GamersNexus shows...
Even a 2060 can do 1080p 60 on ultra + RTX high...
Oh sweet, thanks for the info about the code!

> Yes, the game code came on the receipt.
The card they had available was the EVGA 2060. Do you see a price match somewhere? I'd love to knock another 20 bucks off my credit card bill, lol.
I still haven't been able to use it, though, because my PSU did not have the correct cables. I ordered a modular PSU from Newegg and it should be here tomorrow, so I'll finally get to try this card that is just sitting here.
I, too, have been knocking down settings more than I would like with my 970. Apex Legends really wasn't getting me the FPS I wanted, and it was really the first time I noticed the age of my card. It didn't run badly, but it was not getting me 60 fps consistently in that game.
I probably could have splurged on a 2080, but it seems like the 2060 is plenty for 1080p 144Hz. Hell, it is apparently even great at 1440p and can do some games in 4K.
Oh sweet, thanks for the info about the code!
As for a price match, I know Amazon has that specific card cheaper by like $20. Bestbuy.com appears to have it at $380 now as well!
YES. This as well. Digital Foundry is usually very good at showing any artifacts from TAA/upscaling methods when objects are in motion, so I hope this continues when they look at DLSS-supported games.

> This is a great suggestion. Yeah, we should really be comparing like-for-like *performance*. Not necessarily exclusively, but since DLSS 4K runs at a lower frame rate than 1440p native, we should compare IQ between 1800p or whatever, like you say.
> One other thing I haven't seen yet, that we need to see, is that we should *also* be comparing IQ while the character is moving, or at the very least while on-screen elements are moving. Only showing TAA examples on relatively static scenes really hides a lot of the issues TAA has, and isn't that representative of gameplay. I know that it's more difficult to get similar shots if stuff's moving, but it's arguably only fair to compare BOTH static and in-motion shots when comparing IQ.
> Like in FF15, with its poor TAA implementation, in-motion stuff looks terrible, and DLSS looks way better IMO. I've seen precisely zero screenshots of what Metro Exodus's TAA implementation looks like on stuff in motion. I can't imagine it's anything like as bad as FF15, but like I said, you can't adequately judge TAA on a static scene, unless it's a game where you spend most of the time looking at static scenes.
I can't watch right now, and maybe I'm misunderstanding, but are you saying your GPU utilization is at 60% or you're getting 60fps on your rig?
He's running a 2080 at 1080p @ 60fps. Of course the GPU should easily be able to run that, and thus why it's only at 60% usage. I already told him to use DSR and downscale since he has extra GPU power.
Ah, it's all good!

> Oh ok, that is what I paid for it. I just wanted to make sure I wasn't missing something. It came out to $411 with tax.
> If I hook it up and want more, I hope I can return it and get a 2080.
Damn it, there goes my excitement for DLSS
I have the Acer Predator. Stupidly pricey, but very nice.

> I was curious if DLSS had an effect on HDR, thanks for the info. Now that modern games are thankfully offering a resolution scale option, I wonder if DLSS is even necessary.
plagiarize, do you have the Asus or Acer 4K monitor?
Right now at least in Metro Exodus, they don't work together.
The FFXV benchmark results page has an entry for the GTX 1660 Ti in the 1440p high-settings chart; it matches the GTX 1070:
Anything less than the RTX 2070/GTX 1080/Vega 64 wouldn't be a good enough upgrade, I think, and even those 3 wouldn't be a big upgrade either.

> I currently have a GTX 980 Ti 6GB and I'm looking to upgrade. What would you guys recommend my next graphics card be? Any replies are appreciated.
Wait for reviews and pricing on the Nvidia RTX 3080. The 2000 series isn't worth it at your power level.

> I currently have a GTX 980 Ti 6GB and I'm looking to upgrade. What would you guys recommend my next graphics card be? Any replies are appreciated.
I currently have a GTX 980 Ti 6GB and I'm looking to upgrade. What would you guys recommend my next graphics card be? Any replies are appreciated.
same here. i had a 1070 and now have a 2080. i don't think either a 2060 or 2070 would have been a worthwhile upgrade. i would've got a 2080 ti but yeah i ain't paying £1,100+ lol.

> I went from a 1070 (a bit better than your 980 Ti) to a 2080 and I really appreciate the difference. So if you're going to upgrade, go with at least a 2080 or 2080 Ti IMO.
Seems like a crazy amount of work is required for this, and the results are of questionable quality. I don't think DLSS will ever become something that is mainstream and integrated into every game. It will probably always remain something for select high-profile titles, assuming, of course, Nvidia doesn't abandon it in a couple of years just like they abandoned many of their proprietary anti-aliasing methods.

> NVIDIA DLSS: Your Questions, Answered
> https://www.nvidia.com/en-us/geforce/news/nvidia-dlss-your-questions-answered/
> TL;DR taken from reddit:
> - DLSS is not enabled for every resolution because, if the game is already running at a high frame rate, you don't get much performance gain from DLSS: DLSS needs a fixed amount of GPU time per frame to run the neural network, so if the GPU processes the frame faster than the DLSS operations take, there is no performance boost.
> - Blurry frames at lower resolutions are the product of having less source data, which makes it challenging for DLSS to detect the input frame and predict the final frame. NVIDIA is working to add more training data to improve image quality.
> - Battlefield V update: focusing testing and training to improve 1080p and ultrawide with DLSS.
> - Metro Exodus update: a game update is coming to improve DLSS sharpness and overall image quality across all resolutions. Also training more sections of the game and looking into other reported issues such as HDR.
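That fixed-per-frame-cost point is easy to see with a toy model. Here is a minimal sketch, where the 3 ms network cost and all the frame rates are made-up illustrative numbers, not measured DLSS figures:

```python
def dlss_fps(internal_fps: float, dlss_ms: float) -> float:
    """Final frame rate if DLSS adds a fixed cost (in ms) on top of
    the time needed to render each frame at the lower internal res."""
    return 1000.0 / (1000.0 / internal_fps + dlss_ms)

# Heavy scene: say 30 fps native, 55 fps at the lower internal resolution.
print(round(dlss_fps(55, 3.0), 1))   # 47.2 -> a clear win over 30 fps native

# Light scene: say 144 fps native, 200 fps at the internal resolution.
print(round(dlss_fps(200, 3.0), 1))  # 125.0 -> slower than just rendering native
```

The fixed cost barely matters when frames take 30+ ms anyway, but at high frame rates it can eat the entire gain, which is presumably why it isn't enabled for every resolution.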
Poor Nvidia, never doing anything right.

> Seems like a crazy amount of work is required for this, and the results are of questionable quality. I don't think DLSS will ever become something that is mainstream and integrated into every game. It will probably always remain something for select high-profile titles, assuming, of course, Nvidia doesn't abandon it in a couple of years just like they abandoned many of their proprietary anti-aliasing methods.
First RTX On OctaneBench results by Otoy. Up to 3x higher scores depending on the scene.
On vs Off
RTX 2070: 561 vs 184 OB
https://twitter.com/otoy/status/1096674322666541056
RTX 2080: 630 vs 217 OB
https://twitter.com/otoy/status/1096659916645818368
RTX 2080 Ti: 869 vs 308 OB
https://twitter.com/otoy/status/1096654214778875904
Titan RTX: 919 vs 322 OB
https://www.facebook.com/photo.php?fbid=2252344881483340&id=100001235494825
All numbers are preliminary as they are based on an experimental future 2019.2 or later release.
Jules Urbach also writes: "This speed is scene specific, and most scenes don't get more than 1.25x RTX with path tracing. That might change with more tuning (we never thought we'd get near 3x in any real world scene a few months back). I would say it's safe for 1080 Ti users for a while, as it will take many releases before we get RTX finalized for RNDR."
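For what it's worth, the "up to 3x" headline checks out against the per-card scores posted above:

```python
# (RTX on, RTX off) OctaneBench scores from the posts above
scores = {
    "RTX 2070":    (561, 184),
    "RTX 2080":    (630, 217),
    "RTX 2080 Ti": (869, 308),
    "Titan RTX":   (919, 322),
}
for gpu, (on, off) in scores.items():
    print(f"{gpu}: {on / off:.2f}x")
# RTX 2070: 3.05x, RTX 2080: 2.90x, RTX 2080 Ti: 2.82x, Titan RTX: 2.85x
```

So for this particular scene every card lands in the roughly 2.8-3x range Otoy quotes.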
Now the best thing. If you have an RTX card you can test it yourself.
Download it here:
https://render.otoy.com/forum/viewtopic.php?f=7&t=70705
Requirements:
- Windows (64-bit) 7 (SP1), 8 or 10 (1803).
- NVIDIA graphics card (CM 3.0 or higher, CM 7.5 is required for RTX support).
- NVIDIA driver 416.94 (GeForce) or 416.78 (Quadro) or newer (except 418 series).
Limitations:
- Benchmark result uploads have been disabled.
- This build will expire in 120 days (Sunday, June 16th 2019).
I got this working on 417.71, Win 10 1803 (GTX 1080 Ti).

FAQs
Q: How do I know my GPU is using hardware for ray tracing acceleration?
A: At this stage only Turing cards support accelerated ray tracing, if you have the right driver you should see `RTX √` next to your GPU's name.
Q: I have an NVIDIA RTX series GPU but it is not displayed as having RTX support.
A: Make sure you have one of the drivers stated above or newer.
Q: Does this work with GTX cards?
A: It will work but you won't get any speedup measurements as RTX is not supported.
Q: Does this leverage NV Link?
A: No, the scene in this benchmark will not make use of peer-to-peer memory using NV Link. Future OctaneBench versions will specifically measure NV Link and out of core speeds, as well as denoising speed.
Q: Will you release Linux or OSX builds?
A: We can release them as soon as NVIDIA provides us with suitable drivers; no ETA for that at the moment.
Q: When will a version of Octane with full RTX support be released?
A: RTX support is currently planned as an experimental feature in Octane 2019 (with a first integration coming in 2019.2).
Q: What performance boost should I expect?
A: That depends heavily on the scene. We've experienced speedups of up to 5x in best-case scenarios; this will be lower with scenes with heavier shading or smaller geometry sizes. For this specific benchmark scene, which has 1.7M triangles, we have seen figures ranging from 2.5x to 3x.
Q: Does this new technology affect the quality of the final render?
A: Although it is not directly related to the final quality of your render, using hardware acceleration for ray tracing translates into faster render times so you can render more samples and get a cleaner image in the same amount of time as before.
Q: Do I need to install the Windows 10 October 2018 Update for this to work?
A: No, however the required drivers may require you upgrade to version 1803 (April 2018 Update).
anifex3D said:

> WOW! This is awesome: 2.84 times faster with the 4 Titan RTXs (1290.63 OB RTX OFF, 3659.67 OB RTX ON), that is insane. That is like having 15.5 1080 Tis, and the whole system is only pulling 1140 watts of power and easily staying cool on air.
2080 at least. You'll get +40-50% performance on average if you won't be CPU limited.

> I currently have a GTX 980 Ti 6GB and I'm looking to upgrade. What would you guys recommend my next graphics card be? Any replies are appreciated.
Coming from a GTX 980 Ti, with an RTX 2060 he will get 20-30% more performance. With a 2080 Ti he will get 100-120% more performance.

> 2080 at least. You'll get +40-50% performance on average if you won't be CPU limited.
The 1060 has almost the same price/performance as the 2060. There has been no progress at all. This GPU generation is screwed up on multiple levels.

> And for roughly the same price I paid for the 1070 years ago. What the hell is going on with the GPU market? We've made literally zero price/performance progress in three years?
An RTX 2060 with an OC Scanner overclock offers 20-30% more performance in every game, so it is already something. In most games, 25-30% more.

> Anything less than the RTX 2070/GTX 1080/Vega 64 wouldn't be a good enough upgrade, I think, and even those 3 wouldn't be a big upgrade either.
> The GTX 980 Ti is close to the GTX 1070 level of performance, so the RTX 2060/GTX 1070 Ti/Vega 56 would be just a small upgrade over that.
The 1060 has almost the same price/performance as the 2060. There has been no progress at all. This GPU generation is screwed up on multiple levels.
There's no reason why DLSS should look worse, so yeah, whatever training they've done on Metro so far, it isn't nearly enough. Glad to hear that they recognize it isn't where it needs to be.

> NVIDIA DLSS: Your Questions, Answered
> https://www.nvidia.com/en-us/geforce/news/nvidia-dlss-your-questions-answered/
> TL;DR taken from reddit:
> - DLSS is not enabled for every resolution because, if the game is already running at a high frame rate, you don't get much performance gain from DLSS: DLSS needs a fixed amount of GPU time per frame to run the neural network, so if the GPU processes the frame faster than the DLSS operations take, there is no performance boost.
> - Blurry frames at lower resolutions are the product of having less source data, which makes it challenging for DLSS to detect the input frame and predict the final frame. NVIDIA is working to add more training data to improve image quality.
> - Battlefield V update: focusing testing and training to improve 1080p and ultrawide with DLSS.
> - Metro Exodus update: a game update is coming to improve DLSS sharpness and overall image quality across all resolutions. Also training more sections of the game and looking into other reported issues such as HDR.
Seems like a crazy amount of work is required for this, and the results are of questionable quality. I don't think DLSS will ever become something that is mainstream and integrated into every game. It will probably always remain something for select high-profile titles, assuming, of course, Nvidia doesn't abandon it in a couple of years just like they abandoned many of their proprietary anti-aliasing methods.
How does Nvidia release updated DLSS data? Does it come with driver updates or through game patches?
When you launch a game supporting DLSS, the driver should get the latest data, according to this answer from the link posted before.

> How does Nvidia release updated DLSS data? Does it come with driver updates or through game patches?
> We are constantly working to improve image quality. Recently we updated the core of DLSS so that you get the latest model updates the moment you launch your game. So make sure you have our latest Game Ready Driver (418.91 or higher) installed.
The 1660 Ti's rumored MSRP for the US is $280; the rumored euro price is 300€. I've never seen the 1070 below 350€; actually, years ago that's how much the 1060 cost here, so at least here there's clear price/performance progress (1070 performance for possibly less than the 1060's launch price). But I really have no idea about UK prices in general.

> And for roughly the same price I paid for the 1070 years ago. What the hell is going on with the GPU market? We've made literally zero price/performance progress in three years?
And for roughly the same price I paid for the 1070 years ago. What the hell is going on with the GPU market? We've made literally zero price/performance progress in three years?
An RTX 2060 with an OC Scanner overclock offers 20-30% more performance in every game, so it is already something. In most games, 25-30% more.
My GTX 980 Ti scores 5400 in Time Spy. My RTX 2060 scores 7970.
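Taking those two Time Spy scores at face value, the gap is actually bigger than 25-30% (synthetic scores don't always match in-game differences, of course):

```python
gtx_980ti = 5400  # Time Spy scores quoted above
rtx_2060 = 7970
gain = (rtx_2060 / gtx_980ti - 1) * 100
print(f"{gain:.0f}%")  # ~48% higher score
```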
Well, here most of the games get a 25-30% difference.

> Isn't Time Spy a DX12-based test? In Firestrike (DX11), a well-OC'd 980 Ti can hit 20k graphics, which is pretty much right where a 2060 is, IIRC.
> Here's a 1450MHz 980 Ti (a pretty easy OC for that card) vs a 2050MHz 2060. Pretty similar results across a range of titles, with the 980 Ti actually besting the 2060 a fair amount. They pretty much trade blows. I wouldn't upgrade from a similar-performing 6GB card to another 6GB card, but that's just me. I guess if you really want DXR, but DXR performance at 1080p on the 2060 isn't that great. The minimum upgrade from a 980 Ti at this point, for me, would probably be a 2080, a Radeon VII, or a used 1080 Ti.
> https://youtu.be/KFjWi0_g2Rs
Well, here most of the games get a 25-30% difference.
https://youtu.be/aIHrdc38hQU
Firestrike is an old test and doesn't make sense anymore for new GPUs.
An RTX 2080 Ti gets 22-23k, but beats 980 Ti performance by 100% or more.
It is :)

> Dumb question, but is RTX compatible with DSR?
> I ordered a 2080 Ti and intend to use it on a 1080p plasma, DSR'd from 1440p and 4K. So I hope it is.
980 Ti = 1070, and the 2060 is about 10% faster than the 1070; this would be a pointless upgrade. Hence why I've said "at least a 2080". And the 2080 Ti is just really bad value.

> Coming from a GTX 980 Ti, with an RTX 2060 he will get 20-30% more performance. With a 2080 Ti he will get 100-120% more performance.
10% faster than a 1070 Ti. An RTX 2060 with an OC Scanner overclock performs on par with a GTX 1080, so 25-30% more performance in games.

> 980 Ti = 1070, and the 2060 is about 10% faster than the 1070; this would be a pointless upgrade. Hence why I've said "at least a 2080". And the 2080 Ti is just really bad value.
No.
"OC Scanner" works on 1080 as well as 2060.
Nope. The 980 Ti is also a much better overclocker than the 1070/2060.
Why do people always talk about overclocking as if it's only possible on one GPU and not all of them?

> It is the same chip as the 2070, cut down a bit. Overclocked, it performs like a stock 2070.