You forgot to add TSR being used under performance mode, that's why I wanted to post the screenshot lol
Oh shoot fr 💀 mb, thanks I'll add it!
Someone explain to me how the game looks great if it's 720p? Also, it hardly ever dips below 60 in Balanced; there were a few places where it did, but those were specific spots.
Game can't even hold a stable 60fps at 720p. PS5 Pro ain't gonna fix bad optimization
I never expected to read about a game needing to run at ~720p to hit 60fps on PS5, ever lmao.
You are not going to believe this: I'm grabbing a 40 Series GPU and not leaving the choice between image quality and performance in developers' hands anymore, since they clearly can't figure out either.
At least then, if my game runs like ass due to poor optimization, it will look sharp while doing so; or vice versa, running smooth while looking less sharp.
Are people really asking for new hardware because some AA game runs poorly? A game on an engine that devs are still figuring out, even.
I mean... I'm seeing really low resolutions and compromises across the board. Games with high pixel counts and steady framerates are few and far between. Sure, upscaling tech can make low pixel counts look better, but it also introduces artifacts, and it's far from ideal.
Indeed. While 792p sounds low as fuck, I'm with DF's line of thinking: resolution numbers don't really matter that much anymore with these great upscaling methods.
Pixel quality > pixel count.
4090 looking pretty good now, huh. Take the L, people who said it was unoptimised on PC and were acting like it was native 4K/60 on PS5.
People need to start accepting that next-gen games are not going to hit native 4K easily on either PC or consoles. Games hitting native 4K on something like the One X was more understandable when the base games were designed to run on the Xbox One, which was honestly a really fucking weak console. DDR3 for VRAM even in 2013 was hilarious.
A 4090 can run Cyberpunk 2077 with ray tracing at native 4K at similar performance to Remnant 2 on the same card, except Remnant 2 does not have a complex open world or any ray tracing.
The issue is with Remnant 2 or UE5, not with games in general being so demanding that you need the fastest GPU on the market to get decent results.
Looks quite clean on XSX
Wonder how many people here joining the dogpile have actually played it.
It doesn't look like it should be so demanding, I agree. However, it's likely something under the hood (the fancy new LOD system) that is making it so performance-hungry.
I guess we will see with more UE5 games hitting the market.
It will take a while for people to understand what upscaling is. It's easy to see a number like 720p and assume it will look like shit, but in reality the final image won't look anywhere close to the internal resolution. It's even funnier for a DF video, because you can just, like, watch the actual footage from the game and realize it's a pretty silly concern.
Of course, whether the game looks good enough to warrant such aggressive upscaling is another conversation, but honestly, if the end result is good, who gives a shit.
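To make the internal-vs-output distinction concrete, here is a toy sketch of a temporal upscaler. It is illustrative only, not TSR's actual kernel, and every identifier in it is invented:

```hlsl
// Toy temporal upscale, evaluated once per OUTPUT pixel (e.g. 2560x1440),
// while the renderer only produced a low-res, jittered frame (e.g. 1280x720).
Texture2D    InputColor;     // this frame's low-res render, sub-pixel jittered
Texture2D    HistoryColor;   // full-output-res accumulation of prior frames
SamplerState BilinearSampler;

float3 TemporalUpscalePixel(float2 OutputUV, float2 JitterUV, float BlendWeight)
{
    // The jitter shifts which sub-pixel position the low-res frame sampled,
    // so over several frames the history "sees" many positions per output pixel.
    float3 Current = InputColor.SampleLevel(BilinearSampler, OutputUV + JitterUV, 0).rgb;
    float3 History = HistoryColor.SampleLevel(BilinearSampler, OutputUV, 0).rgb;

    // A small BlendWeight means most of the output comes from accumulated
    // history, which is why the result doesn't look like the internal res.
    return lerp(History, Current, BlendWeight);
}
```

Real implementations additionally reproject the history with motion vectors and clamp or reject it on disocclusion, which is where the artifacts mentioned above come from.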
Consoles don't have the same quality of reconstruction techniques as DLSS/XeSS; FSR 2 is only really decent at 1440p->4K, let alone fucking FSR 1 like FF16 uses. And even DLSS starts to falter a bit when you reconstruct from lower resolutions, even at the same multiplier (e.g. 720p->1440p isn't quite as clear as 1080p->4K).
I'm aware, though this is using TSR on console, not FSR. In the games where I've tried it on PC (which is just Fortnite, Satisfactory and Layers of Fears, tbf), TSR reconstructing to 1080p from 60% resolution is surprisingly good, and this is reconstructing to 1440p from a higher internal resolution; I really can't imagine it being an issue.
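To put rough numbers on those ratios (my arithmetic, not from the video): 1440p->4K renders 2560x1440 ≈ 3.7 MP and outputs 3840x2160 ≈ 8.3 MP, a 1.5x-per-axis upscale. 720p->1440p and 1080p->4K are both 2.0x per axis, so the upscaler is generating 75% of the output pixels in each case; the difference is that 1080p gives it 2.07 MP of real samples per frame to work with versus 0.92 MP at 720p, which is why the same multiplier holds up better from the higher base resolution.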
My only gripe with this game is that it strangely doesn't support DualShock 4 or DualSense controllers on PC, and after forcing compatibility through Steam it defaults to Xbox prompts in game, so for now I have to deal with Xbox prompts until someone makes a mod to change them to PlayStation prompts.
Dev said in the OT that he is personally working on this and it should be out soon. No haptics though, unfortunately.
Well, it's a good thing it's being added. Thanks for the update!
Why does it seem like some on this forum have a double standard when it comes to DLSS 2 versus consoles using their own reconstruction methods to achieve a similar effect? This isn't the first time I've noticed it.
When I see analysis of PC games that use DLSS, I don't see folks freaking out at the internal resolution. In fact, I remember folks going crazy that DLSS 2 could make something sub-HD look better than native HD.
Oliver says the consoles' approach to reconstruction (UE5's version) looks better than FSR 2.0 on PC at the same 1440p output resolution. So why have some lost their minds throwing out "720p!!!"?
Game apparently looks 1440p. 🤷🏾♂️
I think FF16 did this as well lol (needing ~720p to hold 60fps).
792p….
Nope, no mid-gen refresh needed! I see no reason for them.
I don't think the hardware is the problem.
There's literally a PC thread about this game's dev saying that the game was created to run with DLSS, and people have been complaining about it loudly all over the internet, including Era.
Who cares what the internal resolution is as long as the final image looks good?
Shhh, numbers low! Game bad!
Interesting tidbit found regarding why TSR has been used:

[April 10th] TSR feedback thread (forums.unrealengine.com): "Hi, my name is Guillaume Abadie, principal graphics programmer at Epic Games, author of Temporal Super Resolution in UE5. The goal of TSR is to be a temporal upscaler that answers the anti-aliasing need of displaying unprecedented amounts of Nanite detail on screen, while accommodating..."

Question in the feedback thread:
Also, one last thing: it would be great if we could switch the Unreal Engine anti-aliasing pipeline to tensor cores, Xe cores, or AMD matrix cores, for performance reasons. (A console command would allow devs to let players decide to tap into unused GPU components when DLSS, XeSS, etc. are not available.)
TSR could run on tensor cores instead of the main graphics compute units producing the game's main render frames.
Answer by the principal graphics programmer at Epic Games, author of Temporal Super Resolution in UE5:
What we can do specifically on PS5 and XSX is, conveniently, go really deep into hardware details, since most of their AMD GPU architecture is public (https://www.amd.com/system/files/TechDocs/rdna2-shader-instruction-set-architecture.pdf), and imagine and experiment with crazy ideas. And this is what TSR did in 5.1: it heavily exploits the performance characteristics of RDNA's 16-bit instructions, which can have huge performance benefits. In UE5/Main for 5.3, a shader permutation was added (https://github.com/EpicGames/UnrealEngine/commit/c83036de30e8ffb03abe9f9040fed899ecc94422) to finally tap into these instructions as exposed in standard HLSL in Shader Model 6.2 (16 Bit Scalar Types · microsoft/DirectXShaderCompiler Wiki · GitHub); on an AMD 5700 XT, for instance, the performance savings in TSR match what these consoles get from the same optimisation.
We can't do miracles using specifically marketed hardware features in the most efficient manner when only a subset of those capabilities is exposed to us at the moment. But we can still do some surprising stuff on existing hardware that the public wrongly assumes is incapable of doing particular things. And we are able to do it with just standard features and a better understanding of how the GPU works, thanks, for instance, to the AMD PDF linked above.
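As a concrete illustration of what those 16-bit instructions look like from the HLSL side, here is a minimal sketch using Shader Model 6.2's 16-bit scalar types; the preprocessor define is an invented stand-in, not necessarily what the linked commit uses:

```hlsl
// Compile with dxc -T cs_6_2 -enable-16bit-types. On RDNA, float16_t maps
// to native FP16 instructions, and two 16-bit values pack into one 32-bit
// VGPR, which is where the register-pressure and throughput savings come from.
#if COMPILER_SUPPORTS_16BIT_TYPES   // illustrative define, not Epic's actual name
    typedef float16_t tsr_half;
#else
    typedef float     tsr_half;     // fall back to full precision elsewhere
#endif

// Example: a blend that runs entirely in 16-bit when the hardware has it.
tsr_half BlendSample(tsr_half History, tsr_half Current, tsr_half Weight)
{
    return History * (tsr_half(1.0) - Weight) + Current * Weight;
}
```

The win on RDNA comes from packed FP16 math and halved register pressure, which lines up with the console-level savings described above.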
UE5.3's main improvements were outlined in the roadmap (https://portal.productboard.com/epi...-roadmap/c/1161-temporal-super-resolution-tsr); a sketch of the manual setup this replaces follows the list:
- New functionality has been introduced for more automated control of the screen percentage in games when dynamic resolution is disabled (primarily on PC platforms). The default heuristic sets the screen percentage based on the optimal TSR convergence rate for the number of pixels displayed. This should mean end users need less experimentation to find the best settings for their device.
- The editor has several new UX features to aid in understanding and controlling the behavior of the screen percentage in editor and PIE viewports.
- We've added initial support for 16-bit operations on desktop GPUs where backed by hardware support, including on Radeon RX 5000 series (RDNA) cards and newer.
- We've fixed an inconsistency between TAA and TSR regarding Lumen screen-space traces of emissive surfaces.
- There have been numerous optimizations and bug fixes.
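For reference, this is roughly the manual setup that the new automated screen-percentage control is meant to replace: a hedged sketch using two long-standing console variables in Config/DefaultEngine.ini. (The cvar behind the new 5.3 heuristic isn't named in the notes above, so it is not guessed at here.)

```ini
[SystemSettings]
; 4 = Temporal Super Resolution (TSR)
r.AntiAliasingMethod=4
; 55% per axis of a 2560x1440 output = 1408x792 internal,
; the "792p" figure quoted elsewhere in this thread
r.ScreenPercentage=55
```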
Interesting read and sounds like good news on the console and AMD side.
792p….
Nope, no mid-gen refresh needed! I see no reason for them.
Are people really asking for new hardware because some AA game runs poorly? A game on an engine that devs are still figuring out, even.
This is my reaction. People cry for devs to upgrade their tech and use new engines to keep up with "AAA" graphics, and then are surprised when the game performs poorly. Guess the only sensible thing is to release better hardware so these shitty games can run properly!!
It never worked like that. Not a single time in the history of computers.
If there was a more powerful console to be released, gamedevs would put in more geometry, more shadows, more light bounces, fewer optimizations, and so on.
30fps or 60fps, 800p or 4K, it's all about design: what the gamedev wanted to achieve, what the publisher was ready to pay for, and how competently the production ran.
And guess what, there are a number of examples where games never received a patch or optimization and the newer console helped clean things up: dynamic res staying on the higher side, the framerate finally being stable, or both in some instances. I don't need a lecture about how things work; we've all been around this block for the last couple of generations.