I think timing of next gen is a factor for price too.
RTX came out with a few years of headroom, but now the ~$500 consoles will have it.
I'm going into my fourth year with the 1070. I'd really like to be able to think about a 3070 this year.
I'm hoping, but they might just price their lower cards more competitively and leave the high end pretty much where it is. That would be a bit short-sighted: having a much more powerful GPU than the consoles at a somewhat reasonable cost could be an incentive for people to upgrade or take up PC gaming instead of investing in a next-gen console for less than half the price. Especially with all the talk about porting to PC.
Maybe that will just happen in the years that follow as the cards get better.
Are we expecting there to be limited supply? There usually is, isn't there :/
Jeez. Makes my 8 GB 2060 SUPER that I just got this year look like a chump purchase. 😰
This. Geez.
What I hope for in terms of price is the following:
3060 - 399€
3070 - 499€
3080 - 699€
3080 Ti - 999€
Also, 16GB for the 3070 and 20GB for the 3080 won't happen. It will be something around 8-10GB for the 3070 and 10-12GB for the 3080.
you guys do realize you don't have to have the latest and greatest.
4K games at Ultra?
Also possible, but I kind of expect Nvidia to charge less in the lower and mid price segments to challenge AMD, and to keep the higher price tag for high-end GPUs since AMD has no real answer to those.
I think RE2 is completely broken, because it claims I use an alleged 12 GB on my 1070 and there's nothing affecting my performance at all.
RE2 doesn't actually need 12GB.
Same. Got my 1070 for $375, can we get the xx70 cards back down in price too please.
I'm going into my fourth year with the 1070. I'd really like to be able to think about a 3070 this year.
3080 Ti + my C9 OLED is going to be absolute insanity.
That sweet 4k 120hz VRR goodness. Oh yes.
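For anyone curious where that 4K 120Hz figure sits bandwidth-wise, here's a rough back-of-the-envelope sketch (my own illustrative math, not from the thread). Counting only active pixels at 10-bit RGB, you land near 30 Gbps, which is beyond HDMI 2.0's 18 Gbps and is why the C9's HDMI 2.1 ports (48 Gbps) matter; real links also carry blanking and encoding overhead, so actual requirements are somewhat higher.

```python
# Rough, illustrative math: uncompressed bandwidth of the active pixels
# for a given mode. Ignores blanking intervals and link-encoding overhead,
# so real link-rate needs are higher than this.
def active_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K @ 120 Hz with 10-bit RGB (30 bits per pixel)
bw = active_bandwidth_gbps(3840, 2160, 120, 30)
print(f"{bw:.2f} Gbps")  # ~29.86 Gbps of active pixel data
```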
Watch out for the fans.
A man of good taste.
Sounds like a plan.
Quite happy with my PG27UQ. My OLED is a B6, but it could definitely use an upgrade to a CX...
Ouch.
Please explain how 20gig is absurd to a complete idiot like me since I remember RE2 needing like 12 (for textures? I cant really recall). Seems like 20 would be needed with the upcoming consoles arriving?
I hope they add a fucking shitload of RT cores. The ray-tracing performance on the 20xx series is unacceptable for what we get out of it. If they slowly drip-feed in RT cores it'll be such a shit thing to do, but I can see them doing it.
The Titan is the premium card; an RTX 3080 Ti would probably be $800 at most.
You are aware of the 2080 Ti's current price, are you not?
I think it's definitely about how games used the technology. BF5 was a terrible example of using ray tracing, but several games have come out since then that showed good performance. Also, DLSS has been showing very impressive results with recent titles.
LMAO! Ouch.
Most games don't use as much VRAM as their often-flawed counters might report. The games that can use all the VRAM you have will usually use a significant portion of it as a cache for minor performance improvements. Meanwhile, older GPUs with far less VRAM are not going away all of a sudden, and the required spec for games does not shift to the 30xx series overnight. Even something as massively detailed as Red Dead Redemption 2 has a minimum requirement of a GTX 770 with 2 GB VRAM and a recommendation of a GTX 1060 / RX 480 with 4-6 GB.
I wonder if anybody actually played RDR2 on 2 GB of VRAM, it seems a dreadful experience lol
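If you want to see what a game actually allocates rather than trusting its in-game counter, one option is to read the driver's own numbers, e.g. from `nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits`. A minimal sketch of parsing that output (the sample reading below is made up for illustration):

```python
# Sketch: parse nvidia-smi's "memory.used, memory.total" CSV output
# (MiB values, no header or units) into one dict per GPU.
def parse_vram_csv(csv_text):
    gpus = []
    for line in csv_text.strip().splitlines():
        used, total = (int(x.strip()) for x in line.split(","))
        gpus.append({"used_mib": used, "total_mib": total})
    return gpus

# Hypothetical single-GPU reading; feed it real nvidia-smi output instead.
sample_output = "3270, 8192"
print(parse_vram_csv(sample_output))  # [{'used_mib': 3270, 'total_mib': 8192}]
```

Comparing this driver-level figure across scenes is a quick way to see the caching behavior described above: usage often grows toward whatever the card has, without the game strictly needing it.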