Digital Foundry || Minecraft DXR on Xbox Series X: Next-Gen Ray Tracing Analysis!

JaseC

Member
Oct 25, 2017
5,543
Western Australia
Obviously, this is just one point of data, and I'm sure the devs could've squeezed out more consistent performance if given more time, but, uh, safe to say the XSX's RTRT performance is not ~4x that of the 2080 Ti. ;)

Rasterisation performance comparable to a 2080 and RTRT performance somewhere around the 2060 sounds just about right for an AMD APU in a $400-$500 console. AMD wasn't going to pull a rabbit out of its hat with its first run at ray tracing.
 
Last edited:

Bluforce

Member
Oct 27, 2017
256
I apologize for my noobness.
I don't know Minecraft very well, but I don't see all this "beautiful stuff" that some of you are seeing.

To get those kinds of light effects, do we really need ray-tracing tech?
 
Oct 27, 2017
2,310
Obviously, this is just one point of data, and I'm sure the devs could've squeezed out more consistent performance if given more time, but, uh, safe to say the XSX's RTRT performance is not ~4x that of the 2080 Ti. ;)
But... the math says so! ~Do the math~ ;)

But yeah, if it were 4x 2080TI at RT, it would perform much better even at an unoptimized stage.
 

DSP

Member
Oct 25, 2017
4,883
The RTX 2060 is a very good card for the price; it is just a bit gimped by its VRAM, and RTX takes a whole bunch of VRAM. Apparently Asus is making an 8GB version, so it will be interesting to see how it compares to the 6GB version.

This performance is about what I expected compared with nvidia's implementation. It is going to get better though and AAA games will use it smarter to manage the performance hit.
 
OP
OP
ILikeFeet

ILikeFeet

Member
Oct 25, 2017
30,060
I apologize for my noobness.
I don't know Minecraft very well, but I don't see all this "beautiful stuff" that some of you are seeing.

To get those kinds of light effects, do we really need ray-tracing tech?
sure, if you want to place sixty billion lights by hand and spend hours baking gigabytes of low resolution shadow and diffuse maps (which isn't doable because of the procedural world)
 

JudgmentJay

Member
Nov 14, 2017
1,602
Texas
Obviously, this is just one point of data, and I'm sure the devs could've squeezed out more consistent performance if given more time, but, uh, safe to say the XSX's RTRT performance is not ~4x that of the 2080 Ti. ;)
Immediately what I thought when I saw this video. That poster was very sure of himself.
 

dmix90

Member
Oct 25, 2017
778
Russian Federation
Well, yeah... I'm sorry, but I'll repeat my initial reaction to Lockhart: fuck Lockhart.

Here's a case, right from the start, where it won't be able to keep up with the ambition unless it's running at, like, PS2 resolution.

Yes, it's not a fully optimized example yet, but it's still valid. Can we please get a good solid baseline without excessive branching for a change? Kill Lockhart while it's not too late.
 

SummitRidge

Member
Oct 28, 2017
122
The unoptimized rasterization performance of the XBSeX has already been shown to be comparable to a 2080 Super. So yes, I expect at least 2080 Super levels of ray-tracing performance, or else the RT side of things is going to be a big bottleneck.

2060 levels of RT perf is disappointing if true. I'd rather turn off RT and run at 4K60 at that point instead of running at 1080p30 with RT.

I mean, Dictator might say it's 2080 non-Super level for raster, but we have Gears 5 running at 4K at a locked 60 on settings beyond PC Ultra for the demo, and then Mike Rayner says they have it running over 100fps now, when a 2080 Ti only does 60fps at 4K Ultra in the benchmark, and only around 80 in the section shown in gameplay in the demo, with lower settings. And that's right out of the mouth of the technical lead at The Coalition, so yeah, the XSX is faster than the 2080 Ti and DF got it wrong on that one. If it were 2080-tier for raster you'd be looking at 4K at ~40fps on Ultra, not 4K100+ on settings beyond that, and that ain't what we got.

As for the RT performance, we'll see, I guess. Minecraft RT looks OK so far.

Also, Dictator/DF claiming that Minecraft RT runs way over 60fps at all times on a 2080 Ti is DIRECTLY at odds with every other preview that said their 2080 Tis struggled to hit and hold 60 at 1080p with it. So at the very least, in beta software using the same code, the XSX is roughly matching the 2080 Ti per everyone else except Dictator's reports about Minecraft RTX performance. And one dude did that demo in a week, apparently, as opposed to an entire team optimizing Minecraft RTX with Nvidia's help for months? Come the hell on, son.
 
Last edited:

SummitRidge

Member
Oct 28, 2017
122
User warned: hostility
But... the math says so! ~Do the math~ ;)

But yeah, if it were 4x 2080TI at RT, it would perform much better even at an unoptimized stage.
If you want to talk shit about someone, do it to their face. To date not one actual developer has come out and corrected the formula I used to calculate the number of intersection tests you can do per second, and until one does, you can shut the fuck up, sir. In fact, nobody has provided a different formula for calculating that information at all. Saying "you're wrong" but not proving it means literally nothing, and until someone can provide a formula that matches the known data, my formula stands. My formula straight up matches the fucking data both companies advertised, and if you think it's wrong you're welcome to provide a better one; otherwise you can kindly fuck off with that passive-aggressive bullshit and talking shit about people behind their backs.

Every preview we have of Minecraft RTX says it runs at 1080p at 30-60 at best on a 2080 Ti, and they spent months and had an entire team working on it.

The XSX is apparently delivering the same, except one dude did the tech demo in a month, without a team of engineers spending months helping him like the Minecraft RTX team has had. So show me how Minecraft RTX ran exactly one month into development, or, uh, you're simply wrong here, sir. It's not an apples-to-apples comparison, and the data we have about Minecraft RTX says it doesn't run better on a 2080 Ti anyway. Dictator saying it does is not the same thing as Dictator providing proof that it does, and we have footage of it running at sub-60 at 1080p on the 2080 Ti, and none of it running over 60 at all times.
 
Last edited:

thevid

Member
Oct 25, 2017
536
...and one dude did that demo in a week apparently, as opposed to an entire team optimizing minecraft RTX with nvidia's help for months? come the hell on son.
Wouldn't this XSX build be based on the RTX version, which utilizes DXR (as opposed to something like Quake 2 RTX which utilizes Vulkan)? I'm sure there's room for optimizations but I doubt it's a from scratch version by one person.
 
Oct 27, 2017
2,310
If you want to talk shit about someone, do it to their face. To date not one actual developer has come out and corrected the formula I used to calculate the number of intersection tests you can do per second, and until one does, you can shut the fuck up, sir. In fact, nobody has provided a different formula for calculating that information at all. Saying "you're wrong" but not proving it means literally nothing, and until someone can provide a formula that matches the known data, my formula stands. My formula straight up matches the fucking data both companies advertised, and if you think it's wrong you're welcome to provide a better one; otherwise you can kindly fuck off with that passive-aggressive bullshit and talking shit about people behind their backs.
Calm down. I'm sorry, I didn't mean to offend you that much; in my eyes it was just a funny little reference, but yeah, I can see how it's offensive. However, we should wait and see to determine RT performance correctly; that 4x 2080 Ti claim is just ridiculous, as you can probably see here now. We also don't know things like the quality and depth of the BVH structure. You always go by the industry standard, but that could have changed between the time Nvidia released their metrics and now, especially since it's using a different version of the API, DXR 1.1 instead of DXR 1.0, which could also have changed things up.

Please just be patient and wait before you continue making your claims.
 
Last edited:

SummitRidge

Member
Oct 28, 2017
122
Wouldn't this XSX build be based on the RTX version, which utilizes DXR (as opposed to something like Quake 2 RTX which utilizes Vulkan)? I'm sure there's plenty of optimizations that could be done, but I doubt it's a from scratch version by one person.
One person spent a month doing the work, and I doubt very much that Nvidia allowed Microsoft to use their code to make a demo for an AMD-powered product. Unless you can provide proof that they took already-running code that was perfectly optimized for AMD's architecture, it's still not an apples-to-apples comparison: months of work by a whole team working directly with Nvidia engineers, versus one dude porting it in his spare time as a tech demo.
Either way, saying "welp, guess the XSX is crap at RT because it's not beating a 2080 Ti in this test" is misleading, because it's one dude vs. an entire team, and an extremely early tech demo vs. software that's about to ship.
 

Bluforce

Member
Oct 27, 2017
256
It will get better. A question for you. Would you rather play Minecraft at 1080p looking like this or at 4K looking its standard way?
Honestly, I like the original graphics of Minecraft more than the ray-traced version.
So I'd probably go for 4K without ray tracing in this case.

In general, I play on a 144hz monitor, so I always like to have more fps than more graphics.
 

SummitRidge

Member
Oct 28, 2017
122
Calm down. I'm sorry, I didn't mean to offend you that much; in my eyes it was just a funny little reference, but yeah, I can see how it's offensive. However, we should wait and see to determine RT performance correctly; that 4x 2080 Ti claim is just ridiculous, as you can probably see here now. We also don't know things like the quality and depth of the BVH structure. You always go by the industry standard, but that could have changed between the time Nvidia released their metrics and now, especially since it's using a different version of the API, DXR 1.1 instead of DXR 1.0, which could also have changed things up.

Please just be patient and wait before you continue making your claims.
Fair enough. I was talking purely theoretical numbers that aren't real-world in any way, because that's all we can actually do right now. My formula for figuring out the raw peak intersection tests is accurate; converting that into rays/sec may not be, though the math works too perfectly on Nvidia GPUs to be incorrect for them, IMO. How any of that translates to ACTUAL RT performance is another matter. Nvidia quotes 10 gigarays/sec; real world, how low is it? Is it 5? Is it 3? Is it 1? The real-world figure is going to change everything, as will real-world performance on the XSX. We don't know yet, and using Minecraft RT to compare is flawed until both have actually shipped, because it's far too early and far too unoptimized, with literally one dude working on it, to compare against a team's months-long effort backed by Nvidia directly.
 
Last edited:

thevid

Member
Oct 25, 2017
536
One person spent a month doing the work, and I doubt very much that Nvidia allowed Microsoft to use their code to make a demo for an AMD-powered product. Unless you can provide proof that they took already-running code that was perfectly optimized for AMD's architecture, it's still not an apples-to-apples comparison: months of work by a whole team working directly with Nvidia engineers, versus one dude porting it in his spare time as a tech demo.
Either way, saying "welp, guess the XSX is crap at RT because it's not beating a 2080 Ti in this test" is misleading, because it's one dude vs. an entire team, and an extremely early tech demo vs. software that's about to ship.
Shouldn't it be the same API though? DXR? Which is owned by Microsoft. And Minecraft is also owned by Microsoft. I don't think Nvidia has much say in what Microsoft can do with its code. What leverage do you think Nvidia has here? It feels like you are conflating DXR and RTX. I'm not sure how a version of Minecraft, a Microsoft game, utilizing the DXR API, a Microsoft API, is being controlled by Nvidia.
 

SummitRidge

Member
Oct 28, 2017
122
Shouldn't it be the same API though? DXR? Which is owned by Microsoft. And Minecraft is also owned by Microsoft. I don't think Nvidia has much say in what Microsoft can do with its code. What leverage do you think Nvidia has here? It feels like you are conflating DXR and RTX. I'm not sure how a version of Minecraft, a Microsoft game, utilizing the DXR API, a Microsoft API, is being controlled by Nvidia.
Nvidia is bankrolling Minecraft RTX; it's not called Minecraft DXR, it's called Minecraft RTX specifically. Nvidia engineers are also working with that team directly to get it released. So yeah, MS owns DXR, but they don't have any legal right to use someone else's code that someone else funded.
 

SummitRidge

Member
Oct 28, 2017
122
Nope, it runs way higher than 60fps on the 2080Ti according to DF and everyone who saw it and experienced it.

“These are RTX 2080 Ti-based systems and we’re getting roughly 60fps from this,” Nvidia said.

Nvidia themselves claimed the 2080 Ti was roughly 60 on the demo at 1080p; Rock Paper Shotgun noted it wasn't smooth, and others have as well. I'd be glad to dig them out. Digital Foundry can claim whatever they like, but until there is a video proving it, I'll take the footage we have of it at sub-60fps, and Nvidia's statement, as fact. They HOPE it runs at 1080p60 on a 2060 when it's released, but currently it does not, sir; it's 1080p, sub-60, on a 2080 Ti.
 

Muhammad

Member
Mar 6, 2018
152
Nvidia themselves claimed the 2080 Ti was roughly 60 on the demo at 1080p; Rock Paper Shotgun noted it wasn't smooth, and others have as well. I'd be glad to dig them out. Digital Foundry can claim whatever they like, but until there is a video proving it, I'll take the footage we have of it at sub-60fps, and Nvidia's statement, as fact. They HOPE it runs at 1080p60 on a 2060 when it's released, but currently it does not, sir; it's 1080p, sub-60, on a 2080 Ti.
You do whatever you want; the conversation is no longer logical with you. The 2080 Ti is getting mostly 60fps even according to your link, while others like DF noted way higher than 60fps. The Xbox is running between 30 and 60fps, which is way lower, so it can't ever be 4x the RT performance according to your flawed math.

Oh and the 2060 will run this at 1080p, according to your link, and the 2080Ti will be able to push 4K:
Performance-wise, Nvidia told us it’s expecting its bottom end RTX graphics card, the Nvidia GeForce RTX 2060 6GB, to be able to play Minecraft RTX at 1080p, 60 frames per second with all raytracing features enabled.
For higher resolutions you will require a higher-end GeForce RTX 20 Series video. The top-end Nvidia GeForce RTX 2080 Ti and Nvidia GeForce RTX 2080 Super 8GB are well placed to handle Minecraft RTX at 4K screen resolution.
 

Monster Zero

Member
Nov 5, 2017
5,229
Southern California




Nvidia themselves claimed the 2080 Ti was roughly 60 on the demo at 1080p; Rock Paper Shotgun noted it wasn't smooth, and others have as well. I'd be glad to dig them out. Digital Foundry can claim whatever they like, but until there is a video proving it, I'll take the footage we have of it at sub-60fps, and Nvidia's statement, as fact. They HOPE it runs at 1080p60 on a 2060 when it's released, but currently it does not, sir; it's 1080p, sub-60, on a 2080 Ti.

SummitRidge, you're making our 2080 Tis angry. Gonna be looking for you in the inevitable DF Gears 5 Series X vs PC comparison video thread.
 

SummitRidge

Member
Oct 28, 2017
122
You do whatever you want; the conversation is no longer logical with you. The 2080 Ti is getting mostly 60fps even according to your link, while others like DF noted way higher than 60fps. The Xbox is running between 30 and 60fps, which is way lower, so it can't ever be 4x the RT performance according to your flawed math.
If my math is flawed, show me the correct formula; otherwise you're just deliberately trying to start a fight. You and nobody else have provided a single legitimate counterpoint or other math to contradict mine. Show me how to calculate intersection tests per second on a GPU. Show me the industry-standard BVH depth level. Show me how to take intersections/sec and convert that into a theoretical peak rays/second. Until you can do that, you're just trying to pick a fight, and you're definitely being hostile to me, so kindly please stop. Saying I'm wrong without providing proof, better math, or a direct correction of my math is literally nothing.
 
Oct 27, 2017
2,310
but currently, it does not sir, its 1080p, sub 60 on a 2080Ti
That's not true; currently we don't know how well it runs. If anything, we know that it ran at roughly 60 FPS in October 2019. A lot of optimization could have been done since then. Plus, only the author stated it ran at 1080p, not Nvidia themselves. The question is how reliable that claim is.
 

Muhammad

Member
Mar 6, 2018
152
If my math is flawed, show me the correct formula; otherwise you're just deliberately trying to start a fight. You and nobody else have provided a single legitimate counterpoint or other math to contradict mine. Show me how to calculate intersection tests per second on a GPU. Show me the industry-standard BVH depth level.
Show me the link to your stated BVH standard depth. Show me a link showing that Nvidia used this in their gigaray formula, please. Otherwise your math is just pure conjecture on your part.
 

SummitRidge

Member
Oct 28, 2017
122
That's not true; currently we don't know how well it runs. If anything, we know that it ran at roughly 60 FPS in October 2019. A lot of optimization could have been done since then. Plus, only the author stated it ran at 1080p, not Nvidia themselves. The question is how reliable that claim is.
I could pixel-count it if you really want me to, but there are other sources claiming sub-60fps at 1080p for the demo. It may be better now, but then how many months of dev have gone into that? Was it more than four weeks and more than one engineer? If the answer is yes, then it's not an apples-to-apples comparison, is it?
 

SummitRidge

Member
Oct 28, 2017
122
Show me the link to your stated BVH standard depth. Show me a link showing that Nvidia used this in their gigaray formula, please. Otherwise your math is just pure conjecture on your part.
If my formula is not accurate, please tell me what the correct formula is; otherwise you cannot prove it isn't accurate. You're trying to pick a fight and you're pissing me off. Please stop being hostile to me, sir.
 

SummitRidge

Member
Oct 28, 2017
122
Where are people getting this 4x RTX 2080 Ti nonsense from?
How many raw intersection tests per second can you do on a Turing GPU? Do you know the formula to calculate it? In terms of raw intersects/sec, on paper, purely theoretical, the XSX is ~3.6x faster than a reference-clocked 2080 Ti.

RT core count times clock speed gets you the raw theoretical peak intersections/sec figure. Do the math for the XSX and you get 208 times 1825MHz = 379.6 billion per second; your own article says 380 billion/sec, so the formula is accurate. For an Nvidia GPU you do the same: for a 2080 Ti it's 68 times 1545MHz for 105.06 billion intersects/second. Real world this has no relevance, but purely on paper, which is all you can do right now, that's what the math says. I'd be happy to be corrected if you have a better formula for calculating the raw peak intersections/second the GPUs can do.
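For anyone who wants to check the arithmetic being argued over, here is the raw on-paper formula from this post in a few lines of Python. The unit counts and clocks (208 @ 1825MHz, 68 @ 1545MHz) are the figures quoted in-thread; whether this number predicts anything about real-world RT performance is exactly what's disputed.

```python
# Sanity check of the "RT unit count * clock speed" formula debated above.
# Figures are the ones quoted in the thread, not independently verified here.

def peak_intersections_per_sec(units: int, clock_mhz: float) -> float:
    """Raw theoretical peak intersection tests per second."""
    return units * clock_mhz * 1e6

xsx = peak_intersections_per_sec(208, 1825)  # Xbox Series X, per the thread
ti = peak_intersections_per_sec(68, 1545)    # RTX 2080 Ti, reference boost clock

print(f"XSX:     {xsx / 1e9:.1f} billion tests/sec")  # 379.6
print(f"2080 Ti: {ti / 1e9:.2f} billion tests/sec")   # 105.06
print(f"Ratio:   {xsx / ti:.2f}x")                    # 3.61
```

Note the ratio comes out to ~3.6x, not 4x, even taking the formula at face value.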
 
Oct 27, 2017
2,310
could pixel count it if you really want me to, but there are other sources claiming sub 60fps at 1080p for the demo. it may be better now, but then how many months of dev has gone into that? was it more than 4 weeks and more than 1 engineer? if the answer is yes then its not an apples to apples comparison now is it?
That is indeed true. However, it's important to remember that a lot of the work on the Minecraft RTX version can probably be carried over to the Xbox Series X version, since both run on DXR. Architecture-specific optimization still has to be done, but in theory you can port many optimizations between PC and console, which is AFAIK one of the strong points of DXR.

But yeah, we have to wait and see for sure. Still, path tracing on a console at 1080p and more than 30 FPS is impressive as hell.
 

Muhammad

Member
Mar 6, 2018
152
If my formula is not accurate, please tell me what the correct formula is; otherwise you cannot prove it isn't accurate. You're trying to pick a fight and you're pissing me off. Please stop being hostile to me, sir.
Please provide the required links to the BVH depth number and to the Nvidia reference formula (and that they used this number precisely). Either put up or don't brag about your accurate math, because it's not accurate until you provide the links for your claims.
 

SummitRidge

Member
Oct 28, 2017
122
It's that time of the console cycle, a time when facts don't matter and console tech is basically magic.
Again, to date not one person has provided a different formula that also arrives at Microsoft's officially confirmed 380 billion intersections/second figure. My formula arrives PRECISELY at that figure: RT core count times clock speed.
 

thevid

Member
Oct 25, 2017
536
nvidia is bankrolling minecraft RTX, its not called minecraft DXR its called minecraft RTX specifically. Nvidia engineers are also working with that team directly to get it released. So yeah, MS owns DXR but they dont have any legal ownership or right to use someone elses code that someone else funded.
I think you are going to be surprised when Minecraft RTX works on AMD cards that support DXR. Because it's not Minecraft RTX, it's Minecraft Bedrock Edition.

"It’ll be playable on Windows 10 with devices that are capable of DirectX R, such as with an NVIDIA GeForce RTX GPU (and we plan to expand it to future platforms that support DirectX R raytracing)."
 

Alexandros

Member
Oct 26, 2017
8,607
Again, to date not one person has provided a different formula that also arrives at Microsoft's officially confirmed 380 billion intersections/second figure. My formula arrives PRECISELY at that figure: RT core count times clock speed.
So you are absolutely certain that XSX will have 4x 2080 Ti ray tracing performance?
 

SummitRidge

Member
Oct 28, 2017
122
So you are absolutely certain that XSX will have 4x 2080 Ti ray tracing performance?
No, I'm absolutely certain that on paper it can do ~3.6x as many intersections per second. In the real world all bets are off, but my formula was never going to give you real-world performance, just like the flops formula doesn't tell you squat about real-world performance; it's all just raw peak on-paper figures. The math my formula uses arrives at exactly the XSX's confirmed intersections/second figure (380 billion), and the formula can also accurately arrive at the RT performance figure Nvidia quotes for every single one of its GPUs. People keep saying "you're wrong", or being hostile and trying to stir shit up, but to date none of them have provided a different formula that accurately arrives at the figures from either company, let alone both.
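The intersections-to-rays conversion step alluded to here can also be sketched, with a big caveat: the "tests per ray" divisor below is an assumed number chosen so the result lines up with Nvidia's quoted 10 gigarays/sec for the 2080 Ti; it is not a published industry standard, which is the crux of the dispute in this thread.

```python
# Hypothetical sketch of the disputed intersections/sec -> rays/sec step.
# TESTS_PER_RAY is an assumption (picked here to fit Nvidia's marketing
# number), not a documented BVH traversal depth.

TESTS_PER_RAY = 10.5  # assumed average intersection tests per ray

def rays_per_sec(intersections_per_sec: float) -> float:
    """Theoretical peak rays/sec given a peak intersection-test rate."""
    return intersections_per_sec / TESTS_PER_RAY

peak_tests_2080ti = 68 * 1545e6  # ~105.06 billion tests/sec
print(f"{rays_per_sec(peak_tests_2080ti) / 1e9:.1f} gigarays/sec")  # ~10.0
```

Changing the assumed divisor changes the "gigarays" figure proportionally, so any cross-architecture comparison built on it inherits that assumption.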
 

eonden

Member
Oct 25, 2017
6,445
Real world this has no relevance, but purely on paper, which is all you can do right now, that's what the math says. I'd be happy to be corrected if you have a better formula for calculating the raw peak intersections/second the GPUs can do.
Why are we comparing on-paper numbers when there is a big chance they have no real correlation with real-world performance (as with teraflops)? For all we know we are comparing apples and oranges (edit: because the two implementations behave differently).
I say this mainly because I doubt AMD has a better ray-tracing solution than Nvidia, considering where the ray-tracing technology came from.
 

plagiarize

Don't touch your face!
Moderator
Oct 25, 2017
11,393
Cape Cod, MA
I'm just very glad that both new consoles (and upcoming amd gpus) support this so people can stop pretending it isn't great due to platform preference etc.

Whatever the performance difference may be. RT will play a key part in next gen engines etc.
 

SummitRidge

Member
Oct 28, 2017
122
Why are we comparing on paper stuff when there is a big chance that there is no real correlation with real world (like in the case of teraflops)? For all we know we are comparing apples and oranges.
On paper is about all we can do right now. And DF comparing a version of Minecraft one dude did in his spare time over a month vs. an entire team spending many months and many millions is about as apples-and-oranges as it gets too, but nobody's bitching about that.
 
Last edited: