In all high profile AAA games - sure.
> But as we know you then need eye tracking. Also the periphery is still responsible for contrast and motion detection, so you can't neglect it altogether.

Sure, you need eye-tracking. But every 2nd-gen headset will have it, including console headsets, so that won't be a limiter.
It's been that short? It feels like people have been talking about it forever already. I think it's advancing pretty well then for the short time it's had.
It's the next-next-big thing. The next generation of consoles will not be up to the task of ray tracing in premier AAA games. Even on PC, it won't be broadly accessible for several years. Maybe it starts becoming the standard bragging right of PC players in like 2023, and doesn't make it to consoles until either a mid-cycle hardware refresh or, more likely, the next-next-generation of consoles.
One interesting disruptor in this could be cloud streaming. I wouldn't be surprised to see Stadia bragging about having raytracing in some high-profile games where it is unavailable on console. And that will be accessible to many more people than those who can afford an RTX 2080 Ti or 3080 Ti.
It's not just RT, but the tensor cores as well, which are used for denoising.

The real hurdle right now, imo, is the dedicated silicon on the GPU. The RT hardware takes up a sizable chunk of the chip, and it does nothing for non-RT tasks. AMD is behind in RT design, and in GPU design in general. But if they can produce a unified GPU, it could still upset Nvidia while pushing RT toward mass adoption.
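Denoising is what makes one-sample-per-pixel ray tracing usable at all: the raw traced frame is mostly noise, and the denoiser recovers a clean image from it. As a toy illustration (plain Python with made-up numbers, nothing like the actual tensor-core AI denoisers), the crudest temporal denoiser is just an exponential moving average of noisy per-pixel samples across frames:

```python
import random

def noisy_sample(true_value, noise=0.5, rng=random):
    # One ray per pixel per frame: the true radiance plus heavy noise.
    return true_value + rng.uniform(-noise, noise)

def temporal_denoise(true_value, frames=200, alpha=0.1, seed=42):
    # Blend each new 1-sample estimate into a running history with an
    # exponential moving average, so the noise averages out over time.
    # This is the crudest form of temporal accumulation.
    rng = random.Random(seed)
    accum = noisy_sample(true_value, rng=rng)
    for _ in range(frames - 1):
        accum = (1 - alpha) * accum + alpha * noisy_sample(true_value, rng=rng)
    return accum

# A single sample can be off by up to 0.5; the accumulated estimate
# ends up far closer to the true radiance.
estimate = temporal_denoise(1.0)
```

Real denoisers are far more sophisticated, and they also have to handle disocclusion and fast motion, which is exactly where purely temporal schemes struggle.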
PCs already have SSDs, and they deal with more assets to load, and load times still exist.
So long as graphics demands keep escalating, instant loading is a dream.
The only alternative is to load up on insane amounts of RAM and keep entire games suspended, but that still requires an initial load, and it also depends on how many suspend points you want.
Correct, for consoles it's a long way off.

In terms of brute-force tech, PCs are only just starting to get PCIe 4.0 via AMD motherboards, which could legitimately be a lot faster than existing SSD loading.
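To put rough numbers on the brute-force side (all bandwidth figures below are illustrative assumptions, not benchmarks): raw drive speed only helps until something else, like CPU decompression, becomes the bottleneck, which is one plausible reason NVMe and SATA load times look so similar today. A back-of-the-envelope sketch:

```python
def load_time_seconds(asset_gb, drive_gbps, decompress_gbps=None):
    # Naive estimate: time to read the assets at the drive's speed,
    # optionally capped by a slower CPU decompression stage.
    read_time = asset_gb / drive_gbps
    if decompress_gbps is not None:
        return max(read_time, asset_gb / decompress_gbps)
    return read_time

# Illustrative peak bandwidths (GB/s) for a hypothetical 20 GB level:
sata  = load_time_seconds(20, 0.55)  # roughly 36 s
pcie3 = load_time_seconds(20, 3.5)   # roughly 6 s
pcie4 = load_time_seconds(20, 7.0)   # roughly 3 s
# But if decompression tops out near 1 GB/s on the CPU, every drive
# faster than that converges on the same ~20 s load.
capped = load_time_seconds(20, 7.0, decompress_gbps=1.0)
```

So PCIe 4.0 only pays off for load times if the rest of the pipeline (decompression, asset processing) is sped up to match, which is where console-specific hardware could matter.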
In terms of system design, the PS5 (and probably the Xbox too) might have API features and customized hardware for quickly loading the data a game needs to start, something PC devs haven't had before. Similar to how the current consoles have low-power operation modes that aren't available on PC.
Finally, this is another area where I think the cloud could be very disruptive. If Stadia is deploying ramstates as is widely speculated, it could put consoles and PCs to shame in terms of load times. They're talking sub-5 seconds to start any game, which is unheard of.
I agree, PS5 and Xbox Scarlett will have very limited ray tracing support. You can't expect too much when a $1200 video card can't really handle fully ray-traced global illumination at higher resolutions. I imagine PS5 will be limited to reflections or something similar for most AAA games.

It'll be the new standard in another console generation, I imagine. It's far too expensive and inefficient in its current iteration, so 5-10 years down the line I imagine it'll be widely adopted.
> I agree, PS5 and Xbox Scarlett will have very limited ray tracing support. You can't expect too much when a $1200 video card can't really handle fully ray-traced global illumination at higher resolutions. I imagine PS5 will be limited to reflections or something similar for most AAA games.

Oh, I very much doubt the PS5 and equivalent Xbox will have any ray tracing; I'm thinking more PS6.
> You could probably get a full raytraced pipeline running quite well in VR sooner, since you're only going to need to sample rays around the fovea, making it more than an order of magnitude more performant. It would be like skipping 5+ years of GPU advancements.

Don't temporal denoising/anti-aliasing solutions tend to fall apart in VR?
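The "sample rays around the fovea" claim is easy to sanity-check with arithmetic. A sketch with made-up parameters (a foveal region covering 5% of the panel's pixels traced at full rate, the periphery sampled at 2%), not how any real engine schedules rays:

```python
def foveated_ray_budget(width, height, fovea_fraction=0.05,
                        periphery_rate=0.02):
    # Full-rate rays inside a small foveal region, sparse rays outside.
    # Both fractions are illustrative assumptions, not measured values.
    total = width * height
    fovea_rays = int(total * fovea_fraction)
    periphery_rays = int((total - fovea_rays) * periphery_rate)
    return fovea_rays + periphery_rays

full_rate = 2160 * 2160                    # hypothetical per-eye panel
foveated = foveated_ray_budget(2160, 2160)
# With these parameters the foveated budget comes out more than an
# order of magnitude below tracing every pixel, matching the claim.
savings = full_rate / foveated
```

The catch, as the question above points out, is that the sparse periphery then leans even harder on temporal accumulation to look stable, and eye-tracking latency eats into the savings.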
This. It's what will make next-gen look a lot better than current-gen.
Precisely. Considering both Sony and Microsoft have touted this feature for their next machines, why would we be thinking anything different?
> Don't temporal denoising/anti-aliasing solutions tend to fall apart in VR?

Those work in conjunction. Raytracing itself works on a per-pixel level, after all.
> I got my 2070 in yesterday (don't worry, it was only $350), and so far I'm not exactly blown away by RT. It gives Tomb Raider slightly softer shadows. Lighting seems a bit different in Metro. Quake 2 looks better, but holy crap does it perform badly.

But remember, this is early stuff. Just look at the early DX11 titles versus now. GPU performance will also improve significantly.
Developers have done such a great job over the years of imitating RT that I'm not convinced that RT is something worth pursuing at this point.
The speeds are only on paper and in benchmarks. In real life you won't get faster loading than with existing NVMe drives. There's basically no difference in loading times right now between the highest-end NVMe drives and a SATA SSD.
> It's not going to be the jump from 2D to 3D some think it will be; it's going to be like the widespread use of PBR or normal maps. An incredibly useful tool that makes game development more versatile, but not something every 3D game will use.

The difference between this gen and last gen due to PBR getting widespread use is already insane.
I think something like Cyberpunk 2077 is gonna give people a rude awakening about how far off raytracing-as-the-norm is. I can't imagine it running decently at all. Another console gen, at least, is necessary.
> On Metro Exodus I can get 60fps at 1080p/High settings with RT turned on with a 2070. It's really not as computationally expensive as some people make out.

I think the killer for people who think RT is too expensive is that 1080p/30fps is too low for their tastes.
RT will be in most 30fps AAA games on the next set of consoles, imo, because their GPUs will be even more powerful than my 2070.
> I only have a 980 Ti, but is that RTX on/off thing something you can actually do in games to see the differences instantly? For me, that would go a long way in selling it, because a lot of times if I'm just watching raytraced gaming footage normally, I don't really notice the benefits.

Quake 2 has it, but that's because Q2's rasterizer is old and runs at like 300fps on modern hardware. The on/off toggle for modern games might be too expensive.