
Raytracing = Next Gen

  • Yes

  • No



spad3

Member
Oct 30, 2017
7,125
California
NVIDIA announced the RTX line of cards today, showcasing some amazing tech that will eventually become industry standard over the coming years. From their presentation, this one specific moment stood out the most:

(image: slide from NVIDIA's RTX presentation. Images: NVIDIA.)

Aside from the above images, the reels that they showed for SotTR, Metro (!!), and BFV were pretty astounding as well and don't feel like current-gen games, even though they're all current-gen games, thanks to this new tech.

Seeing the drastic changes in lighting, shadows, reflections, and the damn-near-lifelike representation of these 3D objects (and light refraction through the various glass shapes), can we consider this to be the first step into "next-gen"? (Next-gen being defined here as a full-on generational leap in visuals.)

Also, can we expect the "next-gen" consoles to have similar architecture standardized within their custom GPUs (a supposed new AMD architecture, thanks to the rumor mills)?
 

electricblue

Member
Oct 27, 2017
2,991
The examples I saw didn't look terribly impressive or important, but it's early days. I expect it will do some cool shit some day, if AAA games exist in the future.
 

Crossing Eden

Member
Oct 26, 2017
53,406
Any argument that real-time ray tracing isn't a huge step forward in rendering, for a medium that previously literally couldn't do it outside of in-engine tech demos, is one based on ignorance. It would be like trying to argue against real-time full GI.
 

Nostremitus

Member
Nov 15, 2017
7,777
Alabama
Well, it's a step back to fixed shaders instead of programmable shaders, so any games that don't need Ray Tracing, or would use it only minimally, will have a lot of unused silicon just sitting idle.

But for the games that will use it, i.e. most AAA games, it will free up a lot of performance by offloading lighting and shadows from the compute units to the fixed-function RT(?) units.

You know how much performance you lose when setting lighting and shadows to ultra? With fixed-function units doing the work, you should get ABOVE current ultra settings with the performance impact of setting them to low or off, if Nvidia did it right.
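The claim above boils down to simple frame-time arithmetic: if lighting and shadows run concurrently on separate hardware instead of sharing the shader cores, their cost stops adding to the frame time. A minimal sketch, with all numbers invented purely for illustration:

```python
# Hypothetical frame-time arithmetic for offloading lighting/shadows to
# dedicated RT units. All numbers are made up for illustration only.

def frame_time_ms(shading_ms, lighting_ms, offloaded):
    """Estimated frame time when lighting either shares the shader cores
    (serial) or runs concurrently on separate fixed-function units."""
    if offloaded:
        # Concurrent hardware: the frame takes as long as the slower workload.
        return max(shading_ms, lighting_ms)
    # Everything on the shader cores: the costs add up.
    return shading_ms + lighting_ms

# 10 ms of general shading plus 6 ms of "ultra" lighting:
print(frame_time_ms(10.0, 6.0, offloaded=False))  # 16.0 ms on shaders alone
print(frame_time_ms(10.0, 6.0, offloaded=True))   # 10.0 ms with concurrent RT units
```

This is the best case, of course; in practice the two workloads contend for memory bandwidth and the overlap is never perfect.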
 

Derrick01

Banned
Oct 25, 2017
7,289
The examples shown were impressive but I'm still doubtful games coming out this year will utilize it properly. It feels like something we'll only get a taste of for the next couple of years.

But I'm glad it has uses outside of reflections. The reflections themselves are neat I guess but probably not something I'd notice while playing a game. I was much more impressed with the shadows example in Tomb Raider and the lighting example in Metro. We had a taste of proper GI earlier this year with Kingdom Come on PC and it looked so good it was virtually a gen ahead of the PS4/Xb1 versions, which didn't have it at all. It gave interiors life so to speak while the console versions looked completely flat. That stuff is more exciting to me.

Good example of what I mean

 

Nostremitus

Member
Nov 15, 2017
7,777
Alabama
The examples shown were impressive but I'm still doubtful games coming out this year will utilize it properly. It feels like something we'll only get a taste of for the next couple of years.

But I'm glad it has uses outside of reflections. The reflections themselves are neat I guess but probably not something I'd notice while playing a game. I was much more impressed with the shadows example in Tomb Raider and the lighting example in Metro. We had a taste of proper GI earlier this year with Kingdom Come on PC and it looked so good it was virtually a gen ahead of the PS4/Xb1 versions, which didn't have it at all. It gave interiors life so to speak while the console versions looked completely flat. That stuff is more exciting to me.
Using reflections to see around any corner without devs having to prebake any of it would be awesome, but that kind of use, building the game around having it, won't happen until all current GPUs have been rendered obsolete.


What I'd like to know is if they'll release cheaper RT only companion cards to work with older cards like they did with PhysX.
 
Feb 10, 2018
17,534
No.
Next gen to me is when there are new consoles (a successor, not an iteration like the mid-gens),
where every visual aspect is improved, not just one.
 

Nzyme32

Member
Oct 28, 2017
5,245
The examples shown were impressive but I'm still doubtful games coming out this year will utilize it properly. It feels like something we'll only get a taste of for the next couple of years.

But I'm glad it has uses outside of reflections. The reflections themselves are neat I guess but probably not something I'd notice while playing a game. I was much more impressed with the shadows example in Tomb Raider and the lighting example in Metro. We had a taste of proper GI earlier this year with Kingdom Come on PC and it looked so good it was virtually a gen ahead of the PS4/Xb1 versions, which didn't have it at all. It gave interiors life so to speak while the console versions looked completely flat. That stuff is more exciting to me.

Good example of what I mean



This is tessellation all over again in a way (although ray tracing is genuinely a much, much bigger deal in the right circumstances). Back with the GTX 460 and up, tessellation was the buzz.
You'd have benchmarks like Unigine Heaven that really blew it out, then you'd have all the supporting games that did a mixed job of meh to nice. Regardless, it's all baby steps as implementations improve and the cards themselves gain enough punch to do it justice.
 

Deleted member 34714

User requested account closure
Banned
Nov 28, 2017
1,617
The current implementation is only a partial, "hybrid" one. All the demos show minimal examples with massive fps loss. This won't be a thing for a while.
 

low-G

Member
Oct 25, 2017
8,144
The current hybrid implementation doesn't seem impressive so far because it seems only applicable to a single feature of a game and in a limited capacity.

BFV: low-LOD reflections
Tomb Raider: indoor area-light shadows
Metro: single-bounce GI

This may not be the limit of what 20x0 RTX can do, but that's not impressive to me. I'd at least want more than one feature in a game, and I expect more than single-bounce GI (as we already have single-bounce GI, albeit at voxel resolution).
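For context on what "single-bounce GI" is estimating: at each shading point, ray-traced GI evaluates a hemisphere integral of incoming light weighted by the cosine of the incidence angle. A toy Monte Carlo sketch (a uniform sky instead of a real scene, so the exact answer is known and the estimate can be checked):

```python
import math
import random

# Toy illustration of the hemisphere integral that ray-traced GI estimates
# per shading point: irradiance E = integral of L * cos(theta) over the
# hemisphere. For a uniform sky of radiance L the exact answer is L * pi,
# which lets us sanity-check the Monte Carlo estimate.

def irradiance_mc(samples=200_000, radiance=1.0, seed=7):
    rng = random.Random(seed)
    pdf = 1.0 / (2.0 * math.pi)   # uniform hemisphere sampling
    total = 0.0
    for _ in range(samples):
        # z uniform in [0, 1) gives directions uniform over the hemisphere,
        # and z is exactly cos(theta) for a unit normal along +z.
        cos_theta = rng.random()
        total += radiance * cos_theta / pdf
    return total / samples

print(irradiance_mc())  # lands near pi (~3.14)
```

A real renderer replaces the uniform sky with radiance gathered by tracing a bounce ray into the scene, which is exactly the per-pixel work the RT units accelerate.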
 
spad3 (OP)

Member
Oct 30, 2017
7,125
California
No.
Next gen to me is when there are new consoles (a successor, not an iteration like the mid-gens),
where every visual aspect is improved, not just one.

See, the thing is, Raytracing actually DOES improve every visual aspect to a certain degree when paired with AI mapping. Lighting is such a core aspect of rendering and real-time playback that by changing the way light is essentially "perceived", you're changing every possible core aspect.

But yeah I can see how people 'mark' generations based off of consoles.

Using reflections to see around any corner without devs having to prebake any of it would be awesome, but that kind of use, building the game around having it, won't happen until all current GPUs have been rendered obsolete.


What I'd like to know is if they'll release cheaper RT only companion cards to work with older cards like they did with PhysX.

Fairly likely; it'd be a good business move as well, to stay competitive price-wise, considering Intel is going to be hopping into the game too.
 

devSin

Member
Oct 27, 2017
6,197
Better fake lighting is still fake lighting.

But real-time ray tracing is a "next gen" technique by definition. It will probably be a next-gen technique next generation too. :P
 

Wollan

Mostly Positive
Member
Oct 25, 2017
8,816
Norway but living in France
Motion Matching animation as seen in The Last of Us 2 gameplay demo is what I consider next-gen (even though it's current-gen hardware), and something that will actually be of immediate practical value to consumers. Helping to solve the really hard problems of totally seamless animation blending, it could end up being one of the holy grails of interaction.

(gif: The Last of Us 2 motion-matching demo. Gif credit to nib95.)

RTX hybrid ray-tracing effects are very subtle, extremely computationally expensive, and premature from a consumer standpoint (though I appreciate the R&D and developer benefits; I would love to play around with it). We're a decade away from proper use and more widespread adoption (even considering ML reconstruction and de-noising).
Of course, real-time ray-tracing itself is the holy grail of graphics, but RTX is NOT delivering on that, and vendors won't anytime soon. It's a tiny little step on a long road.
 

jelly

Banned
Oct 26, 2017
33,841
Thought it looked a bit crap to be honest, but I guess new techniques take a while to shine properly.
 

Nostremitus

Member
Nov 15, 2017
7,777
Alabama
See, the thing is, Raytracing actually DOES improve every visual aspect to a certain degree when paired with AI mapping. Lighting is such a core aspect of rendering and real-time playback that by changing the way light is essentially "perceived", you're changing every possible core aspect.

But yeah I can see how people 'mark' generations based off of consoles.



Fairly likely, it'd be a good business move as well to stay competitive price-wise considering Intel is going to be hopping in the game as well.
What gives me hope is that, unlike something like PhysX, Ray Tracing has to be added after the scene is rendered in order to be able to play off the scene.

This also gives me hope that the supplemental computing device that Nintendo patented for the Switch could be an RTX and NGX implementation built into a dock. That would free up all the power the Switch uses for lighting and shadows when in docked mode, and in a closed system NGX would have the best effect, having learned behavior for specific games running at specific resolutions with specific settings, packed in patches.
 

Phellps

Member
Oct 25, 2017
10,816
Yeah, definitely. Though I don't think we'll see it being widely implemented for now.
 

Derrick01

Banned
Oct 25, 2017
7,289
This is tessellation all over again in a way (although ray tracing is genuinely a much, much bigger deal in the right circumstances). Back with the GTX 460 and up, tessellation was the buzz.
You'd have benchmarks like Unigine Heaven that really blew it out, then you'd have all the supporting games that did a mixed job of meh to nice. Regardless, it's all baby steps as implementations improve and the cards themselves gain enough punch to do it justice.

Yeah I had a 470 and really the tessellation and other DX11 stuff wasn't that useful because it murdered my performance in games. I went from maxing games out at 60fps to 30-40 even with turning settings down a notch or two. It was more of a sneak peek at what was coming in a few more years when it was standard in most games. I think it will be the same with this, like how the games advertised only had it working in 1 area at a time. 3+ years from now it'll probably be built into every game and in multiple ways.
 

Deleted member 1120

user requested account closure
Banned
Oct 25, 2017
1,511
What gives me hope is that, unlike something like PhysX, Ray Tracing has to be added after the scene is rendered in order to be able to play off the scene.

This also gives me hope that the supplemental computing device that Nintendo patented for the Switch could be an RTX and NGX implementation built into a dock. That would free up all the power the Switch uses for lighting and shadows when in docked mode, and in a closed system NGX would have the best effect, having learned behavior for specific games running at specific resolutions with specific settings, packed in patches.
Would the USB-C port have enough bandwidth to handle something like that?
 
spad3 (OP)

Member
Oct 30, 2017
7,125
California
Motion matching animation as seen in The Last of Us 2 gameplay demo is what I consider next-gen (even though it's current-gen hardware). Solving the hard problems of totally seamless animation blending.

(gif: The Last of Us 2 motion-matching demo. Gif credit to nib95.)

Yeah, animation blending is definitely another core aspect that needs improvement. Integrating AI and machine learning for smoothly transitioning animation frames is definitely very "next-gen." Surprisingly enough, the lighting in this specific demo was amazing as well. RT tech for something like this would help it look even more natural and lifelike.
 

devSin

Member
Oct 27, 2017
6,197
I think it will be the same with this, like how the games advertised only had it working in 1 area at a time. 3+ years from now it'll probably be built into every game and in multiple ways.
It's not going to be built into any game until it's supported on consoles. That may happen with the next gen, but I honestly doubt it.

The potential to fundamentally alter the way games are created is there, but the reality is still far off IMO. The existing techniques developed to fake more accurate lighting are fairly robust, and they don't require special hardware that prices out all but the enthusiast market.

I'm thinking 5-10 years, rather than 3-5.
 

SlothmanAllen

Banned
Oct 28, 2017
1,834
While I think some of the tech nVidia demoed looked cool, the price they attached to it doesn't justify what is basically an exclusive feature set with a limited number of games. On top of that, they didn't bother to tell us how much better these new cards will run existing and upcoming games. So I think it looks cool, but not next-gen. As Wollan pointed out above, The Last of Us 2 looks better than anything demoed at the nVidia event.
 

Nostremitus

Member
Nov 15, 2017
7,777
Alabama
Would the USB-C port have enough bandwidth to handle something like that?
Depends on how it works. If it's added as a post-process, then the dock would apply it in real time as the signal passes through, without needing to send anything back to the Switch. It would just be a matter of knowing what the surfaces are made of, and how much lag it adds.

Or the supplemental device could be powerful enough to do most of the processing, with the Switch sending data to it and handling only the processes that wouldn't be as affected by the latency, similar to cloud-assisted processing.

If it's not able to be done either way then no, probably not.

But I can hope.
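The bandwidth question upthread can at least be bounded with napkin math. A sketch under stated assumptions (raw, uncompressed color frames going out to a hypothetical dock and back; a real pipeline would also need G-buffer data and could use compression, so treat this as a rough lower bound):

```python
# Napkin math for the USB-C question: raw bandwidth to ship uncompressed
# frames out to a hypothetical external dock and composited frames back.
# Real traffic (G-buffers, compression, protocol overhead) would differ.

def link_gbps(width, height, bytes_per_pixel, fps, streams=2):
    """Raw gigabits per second for `streams` uncompressed video streams."""
    bits_per_second = width * height * bytes_per_pixel * 8 * fps * streams
    return bits_per_second / 1e9

# 1080p60 at 4 bytes per pixel, one stream out and one back:
print(round(link_gbps(1920, 1080, 4, 60), 2))  # 7.96 Gbps
```

That lands just under the 10 Gbps of USB 3.1 Gen 2 signaling, which is why the question is a close call rather than an obvious yes or no.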
 

1-D_FE

Member
Oct 27, 2017
8,268
I don't expect the new consoles to have it because I don't believe it's worth the high cost right now. Do I feel this is the natural evolution? Of course. But I also feel Nvidia is hiding the frame-rate hit, and this probably should have waited until it wasn't such a high trade-off. Nvidia loves to crow. The fact that they're either completely refusing to mention frame-rate, or using unimpressive single-frame render times to confuse people who don't take two seconds to do the conversion, troubles me a lot. Those are the actions of a company really trying to hide something.
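The "conversion" being referred to is just frame time to frame rate: a per-frame render time quoted in milliseconds maps to frames per second as fps = 1000 / ms.

```python
# Frame time (ms) to frame rate (fps): fps = 1000 / ms.
# This is the two-second conversion the post refers to.

def ms_to_fps(frame_ms):
    return 1000.0 / frame_ms

print(round(ms_to_fps(16.7), 1))  # 59.9 -- a quoted "16.7 ms" means roughly 60 fps
print(round(ms_to_fps(45.0), 1))  # 22.2 -- a large per-frame time means a low frame rate
```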
 

Nooblet

Member
Oct 25, 2017
13,637
That RTX on and off example in the OP is kind of pissing me off, as in my opinion it's a misleading piece of material. What's actually happened there is that RTX off is without Global Illumination and RTX on is with GI. Yes, I understand that RTX is being used to compute the GI in that scene, but that's not what the image implies. Instead it implies that you cannot have GI without RTX, when on the contrary we've seen real-time GI (of some form) being done without ray tracing in recent years. It should say that without RTX you cannot have GI as accurate as when you are using RTX.

Of course, then they run into the issue of not being able to clearly show a substantial visible difference between non-RTX GI and GI done using RTX, but at least it'd be an accurate representation.
 

Crayon

Member
Oct 26, 2017
15,580
I could see this being the defining next gen graphical flourish. I'm more interested in what happens when the CPU target moves up dramatically.
 

Carn

Member
Oct 27, 2017
11,924
The Netherlands
Dunno. From what we've seen so far, I would pitch it more in the 'look what GPU compute can do' league. As in: great tech, but it will take years to trickle down into our games in a significant manner. But when it does, it will be glorious.
 
spad3 (OP)

Member
Oct 30, 2017
7,125
California
What gives me hope is that, unlike something like PhysX, Ray Tracing has to be added after the scene is rendered in order to be able to play off the scene.

This also gives me hope that the supplemental computing device that Nintendo patented for the Switch could be an RTX and NGX implementation built into a dock. That would free up all the power the Switch uses for lighting and shadows when in docked mode, and in a closed system NGX would have the best effect, having learned behavior for specific games running at specific resolutions with specific settings, packed in patches.

(images: ray-tracing illustrations, including a simple-vs-complex scene comparison)


Real-time ray-tracing can be turned on and off "like a light switch", according to Jensen.

And yeah a beefed up dock for the Switch would be kinda like using an external GPU to push for more demanding visuals like how Microsoft's Surface Book and the Razer Core work.
 

Soundchaser

Member
Oct 25, 2017
1,613
I don't expect the new consoles to have it because I don't believe it's worth the high cost right now. Do I feel this is the natural evolution? Of course. But I also feel Nvidia is hiding the frame-rate hit, and this probably should have waited until it wasn't such a high trade-off. Nvidia loves to crow. The fact that they're either completely refusing to mention frame-rate, or using unimpressive single-frame render times to confuse people who don't take two seconds to do the conversion, troubles me a lot. Those are the actions of a company really trying to hide something.
You sound like a FUD mouthpiece. If you find any actual evidence of Nvidia hiding something, you are more than welcome to present it.
 
Nov 8, 2017
13,124
Some of the demos looked very good, but the importance is more subtle than just "ooh, shiny new tech". If it could be made mainstream, it would drastically cut down on how long developers spend making their scenes look right with the sort of "dirty hack" methods for shadows/AO/reflections/etc. that they've had to use in lieu of ray tracing for years. I think it's very important in that sense.
 

Razor Mom

Member
Jan 2, 2018
2,549
United Kingdom
That RTX on and off example in the OP is kind of pissing me off, as in my opinion it's a misleading piece of material. What's actually happened there is that RTX off is without Global Illumination and RTX on is with GI. Yes, I understand that RTX is being used to compute the GI in that scene, but that's not what the image implies. Instead it implies that you cannot have GI without RTX, when on the contrary we've seen real-time GI (of some form) being done without ray tracing in recent years. It should say that without RTX you cannot have GI as accurate as when you are using RTX.

Of course, then they run into the issue of not being able to clearly show a substantial visible difference between non-RTX GI and GI done using RTX, but at least it'd be an accurate representation.
This. Arguably, real-time GI is the biggest missing component overall, as far as what we're capable of producing in games goes, but if you don't need it to be real-time (stationary lighting environments), then baked GI can look phenomenal.
 

Nooblet

Member
Oct 25, 2017
13,637
This is tessellation all over again in a way (although ray tracing is genuinely a much, much bigger deal in the right circumstances). Back with the GTX 460 and up, tessellation was the buzz.
You'd have benchmarks like Unigine Heaven that really blew it out, then you'd have all the supporting games that did a mixed job of meh to nice. Regardless, it's all baby steps as implementations improve and the cards themselves gain enough punch to do it justice.
It's a larger leap than tessellation. Additionally, tessellation is still not super common because, at the end of the day, turning it on eats away performance that devs would rather have than tessellated objects. However, unlike tessellation, this is not really going to affect performance when turned on in a game, because it's a discrete unit in the GPU, separate from the rest of the GPU that carries out the more traditional aspects of rendering. Over time the implementations, and the hardware's ability to carry them out, will improve of course, but it's going to be a different sort of case, because we won't necessarily be able to turn it off for a boost in overall framerate. In fact, we will instead see a boost in performance, as more of the GPU will be freed up when the RTX units take over some of the things that would've otherwise had a performance impact were they rendered in a more traditional rasterised manner.
 
spad3 (OP)

Member
Oct 30, 2017
7,125
California
What else is there that can be considered next-gen technology?

Mentioned above: AI and Machine Learning for Animation Blending
Improved Object Collision Detection
Improved Pre-Loading for Pop-In Reduction
Physics Engines (variable depending on the type of game)

There's a lot that can be considered "next-gen." It doesn't necessarily have to be new tech; it could be new improvements on old tech that make it new again (like Raytracing).
 

Ozgiliath

Alt-Account
Member
Aug 13, 2018
653
This is tessellation all over again in a way (although ray tracing is genuinely a much, much bigger deal in the right circumstances). Back with the GTX 460 and up, tessellation was the buzz.
You'd have benchmarks like Unigine Heaven that really blew it out, then you'd have all the supporting games that did a mixed job of meh to nice. Regardless, it's all baby steps as implementations improve and the cards themselves gain enough punch to do it justice.

These days I don't care anymore about all the buzzwords and shit.

Remember the tessellation in Crysis 2?

It was fucking bullshit, even with those videos they'd shown.

 

Nooblet

Member
Oct 25, 2017
13,637
This. Arguably, real-time GI is the biggest missing component in the overall as far as what we're capable of producing in games goes, but if you don't need it to be real time (stationary lighting environments) then baked GI can look phenomenal.
Things like SVOGI are pretty real-time though. Quantum Break's GI, for example, was real-time. Yes, all the implementations of real-time GI that we've seen so far have some elements that were still baked (though still more or less "real time", as the lighting and scene could be manipulated and cause a change, compared to something like AC Unity, which is totally static from start to finish)... but the comparison should show that (in the RTX-off image) instead, because it isn't like we've been playing games without GI all these years and have never seen what it looks like.
 

Deleted member 43077

User requested account closure
Banned
May 9, 2018
5,741
The only demo they showed where it was actually noticeable/nice was BFV.

I'm curious what it'll look like in, like, 4 years' time.
 

GMM

Banned
Oct 27, 2017
5,484
Absolutely, ray tracing is a huge deal when it comes to making believable game worlds, since everything can react in a believable way. A big reason games look really good these days is how good developers have gotten at developing techniques for faking things like lighting, shadows, and reflections. The faked stuff looks great, but it lacks the dynamic nature proper ray tracing brings.

RTX/DXR is a huge deal for the computer graphics industry, since it can radically change how VFX productions are made, and it will have a big impact on video games over the next decade as RTX/DXR-capable GPUs find their way into regular consumer devices.

The verdict on how efficient RTX is remains up in the air, but what they have shown has been incredibly impressive, and I look forward to getting my RTX 2080 Ti next month so I can test its rendering capabilities.