Digital Foundry || Minecraft DXR on Xbox Series X: Next-Gen Ray Tracing Analysis!

DukeBlueBall

Member
Oct 27, 2017
3,976
Seattle, WA

Ah yes, that famously believable game, Minecraft. As I said, I’ll reserve judgement on this until I can do a more thorough side by side comparison, but even if the ray tracing version does end up looking much better than current Minecraft shaders, I’m still a bit concerned over how well it will actually run. My demo was running on an RTX 2080 Ti, for example (a graphics card that, lest we forget, costs over a grand at time of writing), and even that produced some noticeable stutters and juddery camera movements when it was playing at 1920×1080. I was, admittedly, recording the game at the same time for our RPS vid buds using Nvidia ShadowPlay, but still: it’s not exactly reassuring that the most powerful graphics card on the planet is struggling with it.

“These are RTX 2080 Ti-based systems and we’re getting roughly 60fps from this,” Nvidia said. “But the goal now is to always make sure every game runs well on the RTX 2060. That’s always the plan.”
https://assets.rockpapershotgun.com/images/2019/10/Minecraft-RTX-forest-off.png
I can't find other mentions of performance on the RTX 2080 Ti besides Nvidia aiming to hit 1080p/60fps on the RTX 2060.

So most recent apples to most recent apples:

Minecraft RTX runs at 1080p <=60fps on RTX 2080 TI
Minecraft RTX runs at 1080p 30-60fps on XSX

Dictator help needed! :p
 
Last edited:

SummitRidge

Member
Oct 28, 2017
122



I can't find other mentions of perf on RTX 2080TI, besides Nvidia aiming to hit 1080p60fps on RTX 2060.

So apples to apples:

Minecraft RTX runs at 1080p <=60fps on RTX 2080 TI
Minecraft RTX runs at 1080p 30-60fps on XSX
yep, except one dude did the XSX demo in a month by himself, while the Nvidia demos had a full team spending months and millions of dollars... so yeah...
 

Csr

Member
Nov 6, 2017
381
So significantly faster than 2080ti in both raster and rt...
Those expectations are beyond unrealistic. You are setting yourselves up for disappointment by interpreting facts in the most optimistic way possible.
 
OP

ILikeFeet

Member
Oct 25, 2017
30,066
Come Thursday, we'll know more about DXR and will probably see Minecraft again, albeit on PC. Since the last time we saw Minecraft DXR was back last year, there are probably a lot of improvements that the XSX version benefited from.

let's just wait and see
 

JahIthBer

Member
Jan 27, 2018
5,785
1080p in Minecraft, yeah next gen isn't going to be 4K/60fps with RT like people think.
yep, except one dude did the XSX demo in a month by himself, while the Nvidia demos had a full team spending months and millions of dollars... so yeah...
I'm pretty sure the DXR build is based on the RTX build, which we haven't seen in months either, so I'm going to guess it's been cancelled in favour of this universal build that should work on AMD too when RDNA2 comes to PC.
 
Last edited:
OP

ILikeFeet

Member
Oct 25, 2017
30,066
1080p in Minecraft, yeah next gen isn't going to be 4K/60fps with RT like people think.
Minecraft DXR is full path tracing, not hybrid ray tracing. If people are calling hybrid RT expensive, they'll faint at the cost of path tracing. Hence why only games like Minecraft and Quake are using it.
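To put rough numbers on that cost gap, here's a back-of-envelope sketch. The per-pixel ray counts (2 rays/pixel for a hybrid reflections-plus-shadows pass; 2 samples × 4 bounces × 2 rays for the path tracer) are illustrative assumptions, not figures measured from either game:

```python
# Back-of-envelope ray budgets: full path tracing vs hybrid RT at 1080p.
# The per-pixel ray counts are illustrative assumptions, not measured data.

def rays_per_frame(width: int, height: int, rays_per_pixel: int) -> int:
    """Total rays traced for one frame."""
    return width * height * rays_per_pixel

W, H = 1920, 1080

# Hybrid RT: rasterize the scene, trace rays only for select effects
# (say ~1 ray/pixel for reflections plus ~1 for shadows).
hybrid = rays_per_frame(W, H, 2)

# Path tracing: all lighting comes from rays, e.g. 2 samples/pixel,
# up to 4 bounces each, with a shadow ray per bounce (2 * 4 * 2 = 16).
path_traced = rays_per_frame(W, H, 2 * 4 * 2)

print(f"hybrid RT:    {hybrid / 1e6:.1f}M rays/frame")       # 4.1M
print(f"path traced:  {path_traced / 1e6:.1f}M rays/frame")  # 33.2M
print(f"ratio:        {path_traced / hybrid:.0f}x")          # 8x
```

Real budgets vary wildly with denoising and sampling strategy, but the multiplier is the point: path tracing scales with samples × bounces per pixel, while hybrid RT scales only with the handful of effects you actually ray trace.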
 

JahIthBer

Member
Jan 27, 2018
5,785
Do you think every PC has ray tracing?

Raytracing on these consoles is going to be a special feature not used by many games. Some will use it for a specific effect here and there, some will not use it at all. Some will use it for their lighting.

But even those games that use it for their lighting, like this Minecraft demo are also going to offer an alternative lighting model for everyone on PC who doesn't have a raytracing graphics card.
I dunno, the Metro devs said they want to go full ray tracing, I think it's going to be quite mainstream. PC gamers will need to suck it up & get an RTX GPU sooner or later. AMD users who sadly bought the 5700/XT have my sympathy; they got kinda scammed & will need to buy the 6700/XT or whatever if they want to keep up.
Minecraft DXR is full path tracing, not hybrid ray tracing. If people are calling hybrid RT expensive, they'll faint at the cost of path tracing. Hence why only games like Minecraft and Quake are using it.
Yeah, the stuff Quake 2 & MC are doing is quite demanding, but still, doing say Control or Metro's RT GI, coupled with next-gen visuals, I can't see that being possible without 20+ TFLOPS at 4K/60fps.
 
Last edited:
OP

ILikeFeet

Member
Oct 25, 2017
30,066
I dunno, the Metro devs said they want to go full ray tracing, I think it's going to be quite mainstream. PC gamers will need to suck it up & get an RTX GPU sooner or later. AMD users who sadly bought the 5700/XT have my sympathy; they got kinda scammed & will need to buy the 6700/XT or whatever if they want to keep up.
I wouldn't say AMD buyers got scammed. Everyone knew these cards had a short shelf life once the next-gen systems were announced to have ray tracing. Say what you want about RTX cards, but at least they can run future games.
 

JahIthBer

Member
Jan 27, 2018
5,785
I wouldn't say AMD buyers got scammed. Everyone knew these cards had a short shelf life once the next-gen systems were announced to have ray tracing. Say what you want about RTX cards, but at least they can run future games.
I don't think a lot of AMD users understood that RT would become so mainstream, but AMD themselves sure did.
 

DSP

Member
Oct 25, 2017
4,884
Those PCs without RT will have to upgrade. Have you ever been through a generational transition? The entry point goes up by a lot. I expect something like the 2060 to become min spec in 2021 AAA games.

When the Xbox 360 came out, all the new games asked for Shader Model 3.0 support. ATI's X-series cards at the time didn't have it despite being pretty new, so they became obsolete after like a year on the market.

When this gen started, every game started to ask for DX11 support. The generation had lasted so long that most cards had already been supporting DX11 for three years, but most didn't have the performance to actually run the new games anyway, so they were still obsolete. A card like the GTX 400 series was completely crushing games a year earlier, then suddenly became unusable in 2014 because the VRAM was too low.

It is just the norm. Of course what you have right now won't give you 1440p 100fps in new games, and there will be a lot of new features that become requirements which some cards straight up can't support. The main concern is the owners of the Navi cards; those are not going to age well imo.

I just hope the uninformed don't start with the "lazy ports" whining, because you are not going to get the performance you are used to, even with newer cards. It's a complete reset. Especially for 100+ fps: the CPUs for that probably won't exist for a fair bit. This has been really annoying since RTX cards launched, with people downplaying what RT does because the performance is lower than they wanted. Duh? Nothing is free. This is a brand new leap in visuals; it will come with a high cost.
 
Last edited:

Gitaroo

Member
Nov 3, 2017
1,779
That’s insane actually. Imagine what we can achieve with full optimization?
Different scenarios: it's done through DirectX's DXR, which works with any hardware that has the raw processing power. Nvidia optimized it to work with their own Turing architecture, which is designed to offload that work.
 

JaseC

Member
Oct 25, 2017
5,548
Western Australia
Considering it's running at a locked 4k60 on SX (which means 60 is the minimum framerate) I doubt it would run at only 100fps @1080p or that they would have any problems at all to get to 1080p 120.

Also, the X already runs the game at 4k60fps, dynamic but always relatively close to 4k. It's not that outlandish that a GPU more than twice as strong would push less than double the framerate, considering it's pushing native resolution at all times and higher settings.
A point I hadn't considered is that while some aspects of the game are beyond Ultra on the XSX, the few settings that can be set to Insane on PC, or at least the more taxing ones, are below that. The PC benchmarks I was looking at while um'ing and ah'ing had the game maxed out, inclusive of Insane, so I can see 4k100 on the XSX being true if the game is running at Ultra+ and not Insane+.
 

rokkerkory

Member
Jun 14, 2018
5,017
Excited at the potential to patch some kind of RT goodness into existing games. Can't believe the once halo-tier RT capabilities are actually coming to consoles. Crazy.
 
OP

ILikeFeet

Member
Oct 25, 2017
30,066
I have a hunch that Gears running at 4K/100 is using some stuff like VRS and dynamic res or upscaling to maintain that.
 

Terbinator

Member
Oct 29, 2017
1,663
I have a hunch that Gears running at 4K/100 is using some stuff like VRS and dynamic res or upscaling to maintain that.
I think it has already been clarified that above Ultra doesn't necessarily mean the extreme settings from PC, which are absolutely ludicrous on performance.

4K 60/100 Ultra+ is probably doable. Obviously the RT enhancements aren't on PC yet, either.
 

Walken

Member
Nov 25, 2019
1,107
Looks like Raytracing is just way too demanding. Do we really expect these new consoles to do Cyberpunk 2077 at decent frame rates with RT on?
 
OP

ILikeFeet

Member
Oct 25, 2017
30,066
Looks like Raytracing is just way too demanding. Do we really expect these new consoles to do Cyberpunk 2077 at decent frame rates with RT on?
This isn't the same thing. Minecraft is using path tracing, which is 100% ray tracing. Cyberpunk is using the cheaper, more efficient hybrid RT.

I think this has already been clarified that above Ultra doesn't necessarily mean the extreme setting from PC which are absolutely ludicrous on performance.

4K 60/100 Ultra+ is probably doable. Obviously the RT enhancements aren't on PC yet, either.
there's no RT (in the traditional sense) in Gears 5, just SSGI, which runs on compute
 

plagiarize

Don't touch your face!
Moderator
Oct 25, 2017
11,393
Cape Cod, MA
A point I hadn't considered is that while some aspects of the game are beyond Ultra on the XSX, the few settings that can be set to Insane on PC, or at least the more taxing ones, are below that. The PC benchmarks I was looking at while um'ing and ah'ing had the game maxed out, inclusive of Insane, so I can see 4k100 on the XSX being true if the game is running at Ultra+ and not Insane+.
Maybe people will listen to you. Lol. Not sure why so many folks forgot about Insane or how much of a hit those settings are. I run at a mix of mostly ultra with one or two things (volumetric) on high and only screen space reflections on insane. My cutscenes stay north of 60 at 4K (with drs on) on a 2080 Ti and a CPU quite possibly less powerful than what the Series X uses.
 

Muhammad

Member
Mar 6, 2018
152
208 times 1825 gives you 379.6 Tflops by your formula, divide by 3.47 like nvidia did and uh? 109.34TF? so where's the discrepancy?
Who said the Xbox has 208 RT cores? As explained above, if the RT cores reside in the TMUs, then every 4 TMUs share 1 RT core, possibly even fewer. In fact, the TF numbers Microsoft quoted strengthen that idea. The Series X will have fewer cores than you imagine.

And even if it weren't, the 2080Ti doesn't do over 60fps in any video we presently have for it, and Nvidia themselves are straight up on record saying it's sub-60fps on a 2080Ti.
NVIDIA said they will run 4K on the 2080Ti and the 2060 will run 1080p60; don't dance around the subject, please.

show me how minecraft ran on a 2080Ti with 1 dude working on it by himself for 4 weeks. until you can do that, you can't provide an apples to apples comparison using raytraced minecraft, because the size of the team, the amount of man hours, money, and optimization done is lightyears apart
The Series X is running already-optimized DXR code, which really runs on any GPU; even Pascal GPUs can use it immediately. If optimizations occur, they will be to reduce the load on certain aspects to allow for better fps, or to allow for resolution scaling, etc.
 

Betelgeuse

Member
Nov 2, 2017
1,814
I don't want to play games at 1080p again. The cost of RT seems too high.
It's a legit concern and I think literally the one area where next-gen machines won't deliver. It's a shame because we're going to be stuck with this RT performance for 5+ years. But whaddyagonnado? It's the best AMD can do.

I hope devs can achieve a happy medium: RT at an intermediate resolution like 1440p, plus good ML-based upscaling. Good implementations of DLSS, and heck, even good checkerboard implementations, can do a lot to beautify sub-4K resolutions.
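The pixel math behind that hope is straightforward. A quick sketch, treating shading cost as simply proportional to pixels rendered, which is admittedly a naive proxy:

```python
# Rough shading-work comparison: internal render resolutions vs native 4K.
# Cost-proportional-to-pixels is a simplification; reconstruction adds some
# overhead back, but the raw pixel counts show the headroom.

resolutions = {
    "native 4K": (3840, 2160),
    "1440p":     (2560, 1440),
    "1080p":     (1920, 1080),
}

native_pixels = 3840 * 2160
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>9}: {pixels / 1e6:5.2f} MP ({pixels / native_pixels:6.1%} of 4K)")
```

Rendering at 1440p shades well under half the pixels of native 4K (~44%), which is exactly the budget a good reconstruction pass like DLSS or checkerboarding hands back to RT.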
 
Last edited:

JaseC

Member
Oct 25, 2017
5,548
Western Australia
Maybe people will listen to you. Lol. Not sure why so many folks forgot about Insane or how much of a hit those settings are. I run at a mix of mostly ultra with one or two things (volumetric) on high and only screen space reflections on insane. My cutscenes stay north of 60 at 4K (with drs on) on a 2080 Ti and a CPU quite possibly less powerful than what the Series X uses.
Nobody ever reads my PC performance thread OPs, so I wouldn't get your hopes up. :p
 

SummitRidge

Member
Oct 28, 2017
122
Who said Xbox has 208 RT cores? As explained above if the RT cores reside in the TMU, then each 4 TMUs has 1 RT core, possibly even less. In fact the quoted TF numbers from Microsoft strengthens that idea. The Series X will have fewer cores than you imagine.


NVIDIA said they will run 4K on the 2080Ti, the 2060 will run 1080p60, don't dance around the subject please.


The Series X is running already optimzied DXR code, which runs on any GPU really, even Pascal GPUs can use it immeditely, If optimizations will occur it will be to reduce the load on certain aspects to allow for better fps, or to allow for resolution scaling .. etc.
nvidia said they would, but so far nobody has proven anything other than that the 2080Ti can't even hold 60 at 1080p in Minecraft DXR, sir.
And it has 208 RT cores because their patent says each TMU is arranged exactly like this.


Figure 10 here clearly shows that each texture processor (TMU) has the intersection engine in it; the rest of the patent details how it all works.
And we know that AMD GPU designs contain 4 TMUs per compute unit. Furthermore, unless each RT core is doing multiple intersection tests at once (which is impossible based on what this patent says about how they're doing intersection testing), it MUST have 208 RT cores, or it CANNOT do the peak 380 billion intersection tests/second that Microsoft straight up confirmed.
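For what it's worth, the arithmetic in that argument is easy to check. A quick sketch assuming, per the patent reading above, 52 CUs, 4 TMUs per CU, and one intersection test per RT core per clock:

```python
# Sanity-check the claimed Series X peak intersection rate, assuming one
# intersection test per RT core per clock (per the patent reading above).
CUS = 52           # Series X active compute units
TMUS_PER_CU = 4    # TMUs per CU in AMD designs
CLOCK_HZ = 1825e6  # 1825MHz GPU clock

rt_cores = CUS * TMUS_PER_CU      # 208, if there is 1 RT core per TMU
peak_tests = rt_cores * CLOCK_HZ  # intersection tests per second

print(rt_cores)          # 208
print(peak_tests / 1e9)  # 379.6, i.e. Microsoft's "380 billion"
```

Note that 52 units doing 4 tests/clock yields the identical peak, so the headline figure alone can't distinguish between the two layouts.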
 
Last edited:

Nikokuno

Member
Jul 22, 2019
57
It's a legit concern and I think literally the one area where next-gen machines won't deliver. It's a shame because we're going to be stuck with this RT performance for 5+ years. But whaddyagonnado? It's the best AMD can do.

I hope devs can achieve a happy medium - RT with a more immediate resolution like 1440p, plus good ML-based upscaling. Good implementations of DLSS, and heck, even good checkerboard implementations, can do a lot to beautify sub-4K resolutions.
I mean, RT is very new in gaming, and with solutions like DLSS and a proper implementation you can play at 4K.

And with both Nvidia on PC and AMD on consoles, developers will get even more tools to make RT and all its variations (GI/reflections/shadows) a reality at higher resolutions.
Path tracing is extremely taxing, and I wouldn't be mad if I had to test that kind of lighting on more modest games, honestly.

It'll be an interesting journey to witness. I mean it's quite exciting.
 

Muhammad

Member
Mar 6, 2018
152
nvidia said they would, but so far nobody has proven anything other than the 2080Ti cant even hold 60 at 1080p in minecraft DXR sir.
NO RTX game, and I do mean no RTX game whatsoever, ran 1080p60 on a 2080Ti; it's either 1440p60 or 4K60. Period. If NVIDIA says they are targeting 4K, then they will achieve it.

Figure 10 here clearly shows that each texture processor (TMU) has the intersection engine in it, the rest of the patent details how it all works.
and we know that AMD gpu designs contain 4 TMUs per compute unit. furthermore, unless each RT core is doing multiple intersection tests at once (which is impossible based on what this patent says about how they're doing intersection testing), then it MUST have 208 RT cores or it CANNOT do the peak 380 billion intersection tests/second that microsoft straight up confirmed.
Maybe read this:

Not really. There is no sign of each RDNA2 TMU containing a dedicated BVH unit inside it.
Since TMUs are usually designed in quads, logic points to there actually being just one BVH unit per TMU quad.
 

ArchedThunder

Uncle Beerus
Member
Oct 25, 2017
10,559
Impressive stuff, especially considering it was just one engineer and 4 weeks. Hope they can get it to a locked 60fps at 1080p and then have a 30fps mode at a higher resolution, even if that's just 1440p or something. Image reconstruction could also be hugely beneficial for this.
If they are able to get it to 60fps at a resolution higher than 1080p then mad props to them.
 

SummitRidge

Member
Oct 28, 2017
122
NO RTX game, and I do mean no RTX game whatsoever, ran 1080p60 on a 2080Ti, it's either 1440p60 or 4K60. Period. If NVIDIA says they are targeting 4K then they will achieve it.


Maybe read this:
In other words, you didn't even read the patent. Quoting someone else who's wrong doesn't make you right, sir; read the patent. The way AMD is doing intersect testing means the Xbox Series X at 1825MHz MUST have 208 RT cores to do 380 billion intersection tests per second (peak). Go ahead and read the patent, then explain how they can do multiple intersections at once on each RT core so you can hit 380 billion intersections per second with fewer than 208 RT cores at the known 1825MHz clockspeed. And Control ran sub-60 for weeks or months on all max settings at 1080p with DLSS off (DLSS is sub-native rendering, it doesn't apply here).

That's because this data is based on the same assumption about the number of RT cores in each of said GPUs. Again, this means nothing really.
And if you read the rest of his reply to me, he admits that I can arrive at the correct figures because I know the number of RT cores in the GPU, and he says it means nothing, aka it's purely on paper and not useful for measuring real-world performance.

Again, read the patent and then explain to me how you get 380 billion intersection tests per second on the Xbox Series X at 1825MHz if it has anything other than exactly 208 RT cores.

You keep declaring that I'm wrong but refuse to actually explain to me, yourself, in your own words, why I'm wrong. How many RT cores does the Xbox Series X have? And how does it do 380 billion intersect tests per second with them at 1825MHz? The answer is simple: it has 208 RT cores, each of which can do 1 intersect test per cycle (peak), which at 1825MHz gives you 380 billion intersect tests per second. You cannot do 2, 3, 4, etc. intersect tests per cycle on the RT core. Again, read the patent, or ask anyone who knows how the RT cores actually work.

And if the things CAN do, say, 4 intersect tests per cycle each, and it has 52 RT cores, well then the ray tracing performance is exactly the same, since the RT cores on RDNA2 are only doing ray sorting and BVH intersect testing; traversal is in shaders. Once we knew the intersect tests/second figure, the RT performance was set in stone no matter how you divide those intersects up on the GPU. But nah, you can only do 1 op/clock on the RT cores, so there are 208 of them. Guess nobody bothered to think for 5 seconds before saying it can't possibly have 208 RT cores.
 
Last edited:
Nov 8, 2017
4,986
I thought we had since learned not to buy cards that have fewer features than the upcoming consoles
We didn't have confirmation that the next-gen consoles would have RT for quite a while. But I sure saw a lot of comments about how useless RTX was, and how even if consoles had it, the first-gen Nvidia cards would be bad at it anyway, so no point.
 

eonden

Member
Oct 25, 2017
6,446
A point I hadn't considered is that while some aspects of the game are beyond Ultra on the XSX, the few settings that can be set to Insane on PC, or at least the more taxing ones, are below that. The PC benchmarks I was looking at while um'ing and ah'ing had the game maxed out, inclusive of Insane, so I can see 4k100 on the XSX being true if the game is running at Ultra+ and not Insane+.
That would make more sense. The Insane settings are, well... insanely resource-intensive, and what I understood as "beyond Ultra".
 

Muhammad

Member
Mar 6, 2018
152
And control ran sub 60 for weeks or months on all max settings at 1080p with DLSS off (DLSS is sub native rendering, it doesn't apply here)
That's complete BS, Control runs fine 1440p60 on a 2080Ti.

Here to put this fantasy about the Series X being so much faster than a 2080 to an end: DF was shown the Series X running the internal benchmark at Ultra settings and 4K vs a PC with an RTX 2080 running the exact same thing, and performance was the same.


the way AMD is doing intersect testing means the xbox series x at 1825mhz MUST have 208 RT cores to do 380 billion intersection tests per second (peak).
No, Mark Cerny said there is one intersection engine inside each CU.
 

Monster Zero

Member
Nov 5, 2017
5,234
Southern California
That's complete BS, Control runs fine 1440p60 on a 2080Ti.

Here to put this fantasy about the Series X being so much faster than a 2080 to an end: DF was shown the Series X running the internal benchmark at Ultra Settings and 4K, vs a PC with RTX 2080 running the exact same thing, and performance was the same.




No Mark Cerny said there is one intersection engine inside each CU.
I'm glad we all can finally move on past the misinformation being spread.
 

SummitRidge

Member
Oct 28, 2017
122
That's complete BS, Control runs fine 1440p60 on a 2080Ti.

Here to put this fantasy about the Series X being so much faster than a 2080 to an end: DF was shown the Series X running the internal benchmark at Ultra Settings and 4K, vs a PC with RTX 2080 running the exact same thing, and performance was the same.




No Mark Cerny said there is one intersection engine inside each CU.
Now show me how it ran after two weeks of development on the RTX 2080. Apples to apples, sir. Show me how a 2080 runs it with all the extra settings the XSX can use. You keep using apples-to-oranges comparisons, and DF didn't even provide any video or proof of what they were saying at all. A Threadripper 2950X machine is an oddball config, and I personally think DF are lying, or they're mistaken. Until they can provide video evidence or show an official statement from the developers, everything they're saying is pure conjecture too. They didn't pixel-count the footage to see if dynamic res was being used, they didn't throw a framerate counter on it, nothing. The developers plainly stated it runs over 100fps at settings beyond PC Ultra. So far DF has not provided video evidence that it does not. Until that happens, they're no more right or wrong than I am. Digital Foundry are not infallible; like I said, I personally have gotten them to correct outright lies in one of their articles last year, when they recommended the slower, more expensive GTX 1060 over the faster, cheaper RX 580 at the time and said flat-out untrue things in the article despite their own charts proving them wrong.

If they can't provide frame-counted evidence to back up their claim, as a site primarily known for doing exactly that kind of frame analysis, then the statement is worthless on its face. For the entire video they compared it to the Xbox One X, then showed a 2080Ti running it at 30fps on the Insane preset, and basically never did a direct XSX-vs-PC-Ultra comparison shot for shot, which was the whole point of the original tech demonstration in the first place. So what the hell are they even doing? They said "it's almost certainly using dynamic resolution on XSX"; OK, well then pixel-count every frame of the video until you find evidence of it, provide the proof, and tell us what percentage of frames in the entire demonstration are below native and how far below native they are. Do your jobs, basically.
 
Last edited:

Muhammad

Member
Mar 6, 2018
152
and DF didnt even provide any video or proof of what they were saying at all
So now we are doubting DF! WOW!

show me how a 2080 runs it with all the extra settings XSX can use
There are no extra settings the Xbox uses; it's using the same Ultra settings as the PC version in that benchmark comparison.

They didn't pixel count the footage to see if dynamic res was being used,
Microsoft showed them the settings used in every configuration.

they didnt throw a framerate counter on it, nothing
It's an internal benchmark, it gives you average fps in the end!!
The developers plainly stated it runs over 100fps on settings beyond PC ultra
No, they DID NOT... that was your interpretation of their statement; an internal benchmark trumps any claims, man.

when they recommended the slower, more expensive GTX 1060 over the faster, cheaper RX 580 at the time and said flat out untrue things in the article despite their own charts proving them wrong.
If you have an axe to grind with them, please do it somewhere else. DF is the most credible tech source for games on the internet.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
2,740
Berlin, 'SCHLAND
The XSX CPU ran worse than the Threadripper, and the GPU performed similarly to the RTX 2080 in the Gears 5 bench. Sorry, no lies there. Same settings: Ultra, with nothing extra.
 

napata

Member
Nov 2, 2017
794
DF didn't even provide any video or proof of what they were saying at all. A Threadripper 2950X machine is an oddball config, and I personally think DF are lying, or they're mistaken.
Wow, this is embarrassing. Accusing DF of lying because what they said contradicts you is just sad.

The developers plainly stated it runs over 100fps on settings beyond PC ultra.
I'm pretty sure they said that Gears 5 was running at 100fps without specifying resolution or settings.
 

SummitRidge

Member
Oct 28, 2017
122
User Banned (2 Weeks): Hostility and conspiracy theorizing across multiple posts; account in junior phase
XSX CPU ran worse than the thread ripper and GPU similar to the RTX 2080 in Gears 5 bench. Sorry, no lies there. Same settings. Ultra with nothing extra.
Right, now pixel-count the actual demo presentation we have a video of and tell us how many of those frames are sub-native and how far below native they are, then run an RTX 2080 on PC on Ultra with dynamic res enabled and do that exact same comparison. And even then that favors the 2080, because the public demo video, as you clearly point out, uses settings way beyond PC Ultra and has effects the PC doesn't even have at all. You didn't even do your jobs on the actual publicly available video: you point out a few effects but make no statement as to what resolution it runs at, nor the percentage of sub-native frames, did no framerate analysis, none of that. And you didn't even compare PC Ultra to XSX directly shot for shot; you spent most of the video comparing it to the Xbox One X, or a 2080Ti on the Insane preset at 30fps.

Now also show us how it ran on an RTX 2080 two weeks into development, because that's as long as the XSX version has been developed for.
Do an apples-to-apples comparison, show actual proof, and quit lying. This ain't the first time I've caught your site in outright lies, sir. And yeah, I can provide proof of that if anyone doubts me.
 
OP

ILikeFeet

Member
Oct 25, 2017
30,066
The developers plainly stated it runs over 100fps on settings beyond PC Ultra. So far DF has not provided video evidence that it does not.
That was multiplayer. Watch the video at the end.

Right, now pixel-count the actual demo presentation we have a video of and tell us how many of those frames are sub-native and how far below native they are, then run an RTX 2080 on PC on Ultra with dynamic res enabled and do that exact same comparison. And even then that favors the 2080, because the public demo video, as you clearly point out, uses settings way beyond PC Ultra and has effects the PC doesn't even have at all.
So now the devs are lying?