
barbarash22

Member
Oct 19, 2019
600
I just wish more games would use a more stylized look that doesn't require as much computing power, so they could have better performance and image quality.
 
OP

Deleted member 14089

Oct 27, 2017
6,264
The game doesn't look like 720p when you're playing it.
720p for Xbox Series X / PlayStation 5?
Someone explain to me how the game looks great if it's 720p? It also hardly ever dips below 60 in balanced; there are a few places where it did, but it was specific to those.
Game can't even hold a stable 60fps at 720p. A PS5 Pro ain't gonna fix bad optimization.
I never expected to ever read about a game needing to run at ~720p to hit 60fps on PS5 lmao.

FYI: if you're thinking that raw 720p is output to the screen, I forgot to add that the performance mode also uses TSR to upsample to 1440p, so the cleaner image you may be seeing is real.
720p refers to the internal resolution. I'll pay more attention to this if I make a thread in the future 😂.
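To spell out the arithmetic behind those numbers, here's a minimal sketch; the 1280x720 internal figure is just the nominal value discussed in this thread, not the game's exact dynamic resolution:

#include <cstdio>

int main() {
    // Nominal internal render resolution in performance mode (assumed for illustration).
    const int inW = 1280, inH = 720;
    // Output resolution TSR upsamples to.
    const int outW = 2560, outH = 1440;

    const double scalePerAxis = static_cast<double>(outH) / inH;   // 2.0x per axis
    const double screenPct    = 100.0 * inH / outH;                // 50% in UE terms
    const double pixelRatio   = (static_cast<double>(outW) * outH) /
                                (static_cast<double>(inW) * inH);  // 4.0x

    // At a 4x pixel ratio, only about a quarter of the output pixels get a fresh
    // sample each frame; TSR fills in the rest from accumulated history.
    std::printf("scale per axis: %.1fx, screen percentage: %.0f%%, pixel ratio: %.1fx\n",
                scalePerAxis, screenPct, pixelRatio);
    return 0;
}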

Sorry for any misunderstanding for those who did not have time to watch the video.
 

Dupr Dog

Alt-Account
Banned
Dec 16, 2022
654
The reasons for the PS5 Pro to exist will continue to pile up over the next year, just in time for the console to be announced.
 

War95

Banned
Feb 17, 2021
4,463
I'm grabbing a 40 Series GPU and not leaving the choice between IQ and performance in a developer's hands anymore, since they clearly can't figure out either.
At least then, if my game runs like ass due to poor optimization, it will look sharp while doing so, or vice versa: run smooth while looking less sharp.
You are not going to believe this
 

LossAversion

The Merchant of ERA
Member
Oct 28, 2017
10,756
are people really asking for new hardware because some aa game runs poorly? a game on an engine that devs are still figuring out even
I mean... I'm seeing really low resolutions and compromises across the board. The games with high pixel counts and steady framerates are few and far between. Sure, upscaling tech can make low pixel counts look better, but it also introduces artifacts and it's far from ideal.

At this point, we aren't getting 60fps games without massive compromises in most cases. Pro consoles could brute force their way to better results. You can blame these results on poor optimization but there is a lot of that going around in that case. And more power will lessen the impact of poor optimization.
 

JahIthBer

Member
Jan 27, 2018
10,399
User banned (1 day): Platform wars
4090 looking pretty good now, huh. Take the L, people who said it was unoptimised on PC and were acting like it was native 4K/60 on PS5.
People need to start accepting that next-gen games are not going to be hitting native 4K easily on either PC or consoles. Games hitting native 4K on something like the One X made more sense when the base games were designed to run on the Xbox One, which was honestly a really fucking weak console. DDR3 for VRAM even in 2013 was hilarious.
 

jett

Community Resettler
Member
Oct 25, 2017
44,687
While 729p sounds low as fuck, I'm on DF's train of thought that resolution numbers don't really matter that much anymore with these great upscaling methods.

Pixel quality > pixel count.
indeed

raw pixel count is something that doesn't matter much these days
 

laxu

Member
Nov 26, 2017
2,783
4090 looking pretty good now, huh. Take the L, people who said it was unoptimised on PC and were acting like it was native 4K/60 on PS5.
People need to start accepting that next-gen games are not going to be hitting native 4K easily on either PC or consoles. Games hitting native 4K on something like the One X made more sense when the base games were designed to run on the Xbox One, which was honestly a really fucking weak console. DDR3 for VRAM even in 2013 was hilarious.

A 4090 can run Cyberpunk 2077 with raytracing at native 4K with performance similar to what Remnant 2 gets on it. Except Remnant 2 does not have a complex open world or any raytracing.

The issue is with Remnant 2 or UE5, not with games in general being so demanding that you need the fastest GPU on the market to get decent results.
 

Afrikan

Member
Oct 28, 2017
17,069
Why does it seem like some on this forum have a double standard when it comes to DLSS 2 and Consoles using their own versions of reconstruction to achieve a similar effect? This isn't the first time I've noticed this.

When I see analysis of PC games that use DLSS, I don't see folks freaking out at the internal resolution. In fact I remember folks going crazy that DLSS 2 could make something subHD look better than something that was native HD.

Oliver says the consoles approach to reconstruction (UE5's version) looks better than FSR2.0 on PC at the same output resolution of 1440p. So why have some lost their minds throwing out "720p!!!".

Game apparently looks 1440p. 🤷🏾‍♂️
 

RoboPlato

Member
Oct 25, 2017
6,822
Pretty impressed by how well TSR cleans up the image. Pleasing final IQ even from a pretty low res.
 

JahIthBer

Member
Jan 27, 2018
10,399
A 4090 can run Cyberpunk 2077 with raytracing at native 4K with performance similar to what Remnant 2 gets on it. Except Remnant 2 does not have a complex open world or any raytracing.

The issue is with Remnant 2 or UE5, not with games in general being so demanding that you need the fastest GPU on the market to get decent results.
It doesn't look like it should be so demanding, I agree. However, it's likely something under the hood (the fancy new LOD system) that is making it so performance hungry.
I guess we will see with more UE5 games hitting the market.
 

JJD

Member
Oct 25, 2017
4,526
It doesn't look like it should be so demanding, I agree. However, it's likely something under the hood (the fancy new LOD system) that is making it so performance hungry.
I guess we will see with more UE5 games hitting the market.

If the devs cave and downgrade LOD and other features on PC to make the game less demanding, I hope they leave the option of running the game on the current settings for those of us with higher-end PCs. I'm playing the game maxed out at 4K at above 100 FPS with DLSS and frame generation. Looks great and it's super smooth.

Ideally they patch in more settings so everyone can find a good graphics/performance ratio. The game is kinda anemic when it comes to graphics options.
 

Spoit

Member
Oct 28, 2017
4,052
It will take a bit for people to understand what upscaling is. It's easy to see a number like 720p and assume it will look like shit, but in reality the final image won't look anywhere close to the internal resolution. It's even funnier for a DF video, because you can just, like, watch the actual video that has footage from the game and realize it's a pretty silly concern.

Of course, it's another conversation whether the game looks good enough to warrant the use of such aggressive upscaling, but honestly, if the end result is good, who gives a shit.

Why does it seem like some on this forum have a double standard when it comes to DLSS 2 and Consoles using their own versions of reconstruction to achieve a similar effect? This isn't the first time I've noticed this.

When I see analysis of PC games that use DLSS, I don't see folks freaking out at the internal resolution. In fact I remember folks going crazy that DLSS 2 could make something subHD look better than something that was native HD.

Oliver says the consoles approach to reconstruction (UE5's version) looks better than FSR2.0 on PC at the same output resolution of 1440p. So why have some lost their minds throwing out "720p!!!".

Game apparently looks 1440p. 🤷🏾‍♂️
Consoles don't have the same quality of reconstruction techniques as DLSS/XeSS; FSR 2 is only really decent at doing 1440p->4K, much less fucking FSR 1 like FF16 uses. And even DLSS starts to falter a bit when you start reconstructing from lower resolutions, even if it's the same multiplier (e.g. 720p->1440p isn't quite as clear as 1080p->4K).
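A rough way to see why the same multiplier isn't the same problem (my own back-of-the-envelope numbers, not figures from DF):

#include <cstdio>

int main() {
    // Two upscales with the same 2x-per-axis factor (4x the pixels):
    // 720p -> 1440p versus 1080p -> 4K.
    const long long in720p  = 1280LL * 720;    //   921,600 fresh samples per frame
    const long long in1080p = 1920LL * 1080;   // 2,073,600 fresh samples per frame

    // Both reconstructions have to fill in 4x the pixels, but the 1080p -> 4K
    // path starts from more than twice as much raw information per frame,
    // which is one reason 720p -> 1440p tends to come out softer.
    std::printf("720p input:  %lld px\n1080p input: %lld px\nratio: %.2fx\n",
                in720p, in1080p, static_cast<double>(in1080p) / in720p);
    return 0;
}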
 

Xion_Stellar

Member
Oct 25, 2017
3,317
My only gripe with this game is that it strangely doesn't support the DualShock 4 or DualSense controller on PC, and after forcing compatibility through Steam it defaults to Xbox prompts in-game. So for now I have to deal with Xbox prompts until someone makes a mod to change them to PlayStation prompts.
 

Afrikan

Member
Oct 28, 2017
17,069
Consoles don't have the same quality of reconstruction techniques as DLSS/XeSS; FSR 2 is only really decent at doing 1440p->4K, much less fucking FSR 1 like FF16 uses. And even DLSS starts to falter a bit when you start reconstructing from lower resolutions, even if it's the same multiplier (e.g. 720p->1440p isn't quite as clear as 1080p->4K).

I did use the word "similar" for a reason lol. And in this example, it is UE5's reconstruction technique which seems to do a good job with internal resolutions below 1080p. I believe the Matrix demo was at or below 1080p?

Some who have played the game say it looks good (not 720p). If it achieves 60fps most of the time and looks good, I think that's something that shouldn't be overreacted to in a negative way (not you, but some earlier in the thread).
 

Lant_War

Classic Anus Game
Banned
Jul 14, 2018
23,601
Consoles don't have the same quality of reconstruction techniques as DLSS/XeSS; FSR 2 is only really decent at doing 1440p->4K, much less fucking FSR 1 like FF16 uses. And even DLSS starts to falter a bit when you start reconstructing from lower resolutions, even if it's the same multiplier (e.g. 720p->1440p isn't quite as clear as 1080p->4K).
I'm aware, though this is using TSR on console and not FSR. In the games I've tried it in on PC (which is just Fortnite, Satisfactory and Layers of Fears, tbf), TSR reconstructing to 1080p from 60% res is surprisingly good, and this is reconstructing from a higher res to 1440p; I really can't imagine it being an issue.

I think the main issue is that a lot of the talking points around upscalers (especially when DLSS 2 first arrived) were that you could get just as good if not better image quality while also saving performance. This is very rarely the case for FSR 2 (basically only when the game has some really shitty TAA), but on a relative scale, a small degradation of image quality in exchange for much better performance is a tradeoff that's probably still worth it in most cases.
 

Orioto

Member
Oct 26, 2017
4,716
Paris
Some places in this are really impressive in detail and have that UE5 demo feel, even without Lumen. That's a good start!
 

JJD

Member
Oct 25, 2017
4,526
My only gripe with this game is that it strangely doesn't support the DualShock 4 or DualSense controller on PC, and after forcing compatibility through Steam it defaults to Xbox prompts in-game. So for now I have to deal with Xbox prompts until someone makes a mod to change them to PlayStation prompts.
Dev said in the OT that he is personally working on this and it should be out soon. No haptics though unfortunately.
 

RedSparrows

Prophet of Regret
Member
Feb 22, 2019
6,537
Don't worry guys, the next consoles will hit [insert arbitrary moving target here] and all will be well. GET HYPED GO CRAZY THE REVOLUTION IS COMING etc etc etc...

Game is great.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,283
Dark Space
Why does it seem like some on this forum have a double standard when it comes to DLSS 2 and Consoles using their own versions of reconstruction to achieve a similar effect? This isn't the first time I've noticed this.

When I see analysis of PC games that use DLSS, I don't see folks freaking out at the internal resolution. In fact I remember folks going crazy that DLSS 2 could make something subHD look better than something that was native HD.

Oliver says the consoles approach to reconstruction (UE5's version) looks better than FSR2.0 on PC at the same output resolution of 1440p. So why have some lost their minds throwing out "720p!!!".

Game apparently looks 1440p. 🤷🏾‍♂️

I don't think PC gamers are the ones bashing the internal resolution of an upscaled console game, just as I don't see the console gamers in PC threads praising DLSS as the savior and black magic. There is no overlap.

So the question really is, why are the technologically illiterate showing their asses on the subject?
 

Rosol

Member
Oct 29, 2017
1,397
Why does it seem like some on this forum have a double standard when it comes to DLSS 2 and Consoles using their own versions of reconstruction to achieve a similar effect? This isn't the first time I've noticed this.

When I see analysis of PC games that use DLSS, I don't see folks freaking out at the internal resolution. In fact I remember folks going crazy that DLSS 2 could make something subHD look better than something that was native HD.

Oliver says the consoles approach to reconstruction (UE5's version) looks better than FSR2.0 on PC at the same output resolution of 1440p. So why have some lost their minds throwing out "720p!!!".

Game apparently looks 1440p. 🤷🏾‍♂️

I was going to say the same thing; lots of drive-by posts looking at the rendering res and calling it a failure. Some of it comes from the fact that when you play on PC you always have the option to test the features on and off, so you can visibly see the advantages DLSS 2.0+ offers over FSR 2.0, which vary from game to game. I think a lot of people assume console reconstruction is always inferior (or in some cases think it isn't there at all), when it now seems the TSR solution is quite competitive with DLSS 2, specifically in reducing the blurring that plagues FSR. I also think some people have a bias toward wanting their platform to have an edge.

The TSR in UE5 is really great for everyone on consoles or without a recent NVIDIA card, if you ask me, though there are a lot of parts to integrating it in UE5, so it won't always be the same for each game; a good implementation isn't just an on/off switch, and I hear it is a bit more of a performance hit too. I think a lot of it was NVIDIA's marketing; they convinced enough people that you need their cores to do good reconstruction, which still plagues the discourse today.
 

Ales34

Member
Apr 15, 2018
6,455
Why does it seem like some on this forum have a double standard when it comes to DLSS 2 and Consoles using their own versions of reconstruction to achieve a similar effect? This isn't the first time I've noticed this.

When I see analysis of PC games that use DLSS, I don't see folks freaking out at the internal resolution. In fact I remember folks going crazy that DLSS 2 could make something subHD look better than something that was native HD.
There's literally a PC thread about this game's dev saying that the game was created to run with DLSS, and people have been complaining about it loudly all over the internet, including era.
 

Tendo

Member
Oct 26, 2017
10,474
Who cares what the internal resolution is as long as the final image looks good?
Shhh numbers low! Game bad!


I had no idea about the resolution stuff until this thread. The game looks fine. This is gonna end up my GOTY, and I hope this numbers talk doesn't prevent people from giving it a shot.
 

DocScroll

Member
May 25, 2021
440
UE5 with all the bells and whistles just isn't ready for the tech we have in consoles and PCs, at least with all of the main features being pushed. I feel like the push for next-gen features has overestimated the consoles' capabilities to some degree.

I feel bad for everyone involved, as it's such a tremendous task, and it seems Remnant 2 mostly achieves it, with a number of caveats.
 

Mr.Deadshot

Member
Oct 27, 2017
20,285
I still don't understand what happened this gen. Games don't look much better than last gen. This in particular doesn't look like anything special; there are better-looking PS4 games. And still, these games struggle to reach 1080p/60fps now? And THIS should be the reason why we need another mid-gen refresh? Yeah, fuck that.
 
OP

Deleted member 14089

Oct 27, 2017
6,264
Interesting tidbit found regarding why TSR has been used;

[April 10th]

forums.unrealengine.com

TSR feedback thread

Hi, My name is Guillaume Abadie, principal graphics programmer at Epic Games, author of Temporal Super Resolution in UE5. The goal of the TSR is to be a temporal upscaler that answer the anti-aliasing need of displaying unprecedented Nanite amount of details on screen, while accommodating...

Question in the feedback thread:

Also one last thing. It would be great if we could switch the Unreal Engine anti-aliasing pipeline to tensor cores, Xe cores, AMD matrix, for performance reasons. (A console command would allow devs to let players decide to tap into unused GPU components if DLSS, XeSS, etc. are not available.)
TSR could run on tensor cores instead of the main graphics compute units producing the game's main render frames.

Answer by principal graphics programmer at Epic Games, author of Temporal Super Resolution in UE5.

What we can do specifically on PS5 and XSX is that, conveniently, most of their AMD GPU is public ( https://www.amd.com/system/files/TechDocs/rdna2-shader-instruction-set-architecture.pdf ), so we can go really deep into hardware details and imagine and experiment with crazy ideas. And this is what TSR did in 5.1: it heavily exploits the performance characteristics of RDNA's 16-bit instructions, which can have huge performance benefits. In UE5/Main for 5.3, we added a shader permutation ( https://github.com/EpicGames/UnrealEngine/commit/c83036de30e8ffb03abe9f9040fed899ecc94422 ) to finally tap into these instructions exposed in standard HLSL in Shader Model 6.2 ( 16 Bit Scalar Types · microsoft/DirectXShaderCompiler Wiki · GitHub ), and for instance on an AMD 5700 XT the performance savings in TSR are identical to how these consoles are optimised too.

We can't do miracles using specifically marketed hardware features in the most efficient manner with what is only a subset of those capabilities exposed to us at the moment. But we can still do some surprising stuff on existing hardware that is wrongly assumed by the public to be incapable of doing certain things. And we are able to do it just with standard features and a better understanding of how the GPU works, thanks for instance to that AMD PDF I linked above.
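For anyone wondering what "tapping into" those 16-bit instructions looks like from the PC side, the gate is the Shader Model 6.2 native 16-bit ops capability; a minimal D3D12 check might look roughly like this (my own illustration, not code from UE5, and it assumes you already have a valid device):

#include <windows.h>
#include <d3d12.h>

// Sketch: ask the driver whether native 16-bit shader ops (the SM 6.2 feature
// the quote above refers to) are supported on this GPU.
bool SupportsNative16BitOps(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS4 opts4 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS4,
                                           &opts4, sizeof(opts4)))) {
        return false; // Older runtime/driver: treat as unsupported.
    }
    return opts4.Native16BitShaderOpsSupported != FALSE;
}

On the shader side, the 16-bit types only become available when compiling for Shader Model 6.2 with DXC's -enable-16bit-types flag, which is what the HLSL link in the quote is about.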


UE5.3 main improvements were outlined in the roadmap; https://portal.productboard.com/epi...-roadmap/c/1161-temporal-super-resolution-tsr

  • New functionality has been introduced for more automated control of the screen percentage in games when dynamic resolution is disabled (primarily on PC platforms). The default heuristic sets the screen percentage based on the optimal TSR convergence rate for the number of pixels displayed. This should result in end users requiring less experimentation in order to find the best settings for their device.
  • The editor has several new UX features to aid in understanding and controlling the behavior of the screen percentage in editor and PIE viewports.
  • We've added initial support for 16-bit operations on desktop GPUs where backed by hardware support, including on Radeon RX 5000 series (RDNA) cards and newer.
  • We've fixed an inconsistency between TAA and TSR regarding Lumen screen-space traces of emissive surfaces.
  • There have been numerous optimizations and bug fixes.
 

bitcloudrzr

Member
May 31, 2018
14,223
What we can do specifically on PS5 and XSX is that, conveniently, most of their AMD GPU is public ( https://www.amd.com/system/files/TechDocs/rdna2-shader-instruction-set-architecture.pdf ), so we can go really deep into hardware details and imagine and experiment with crazy ideas. And this is what TSR did in 5.1: it heavily exploits the performance characteristics of RDNA's 16-bit instructions, which can have huge performance benefits. In UE5/Main for 5.3, we added a shader permutation ( https://github.com/EpicGames/UnrealEngine/commit/c83036de30e8ffb03abe9f9040fed899ecc94422 ) to finally tap into these instructions exposed in standard HLSL in Shader Model 6.2 ( 16 Bit Scalar Types · microsoft/DirectXShaderCompiler Wiki · GitHub ), and for instance on an AMD 5700 XT the performance savings in TSR are identical to how these consoles are optimised too.
Interesting read and sounds like good news on the console and AMD side.
 

Oski

Member
Jun 15, 2023
557
France
792p….
Nope, no mid-gen refresh needed! I see no reason for them..
It has never worked like that. Not a single time in the history of computers.

If a more powerful console were released, devs would put in more geometry, more shadows, more light bounces, fewer optimizations, and so on.

30fps or 60fps, 800p or 4K, it's all about design: what the devs wanted to achieve, what the publisher was ready to pay for, and how competently the production ran.
 

Love Machine

Member
Oct 29, 2017
4,252
Tokyo, Japan
are people really asking for new hardware because some aa game runs poorly? a game on an engine that devs are still figuring out even
This is my reaction. People cry for devs to upgrade their tech and use new engines to keep up with "AAA" graphics and then are surprised when the game performs poorly. Guess the only sensible thing is to release better hardware so these shitty games can run properly!!

It's whatever, honestly. I'll play on PS5 in balanced mode. Would be nice if we got a better idea of loading times.
 

RedSparrows

Prophet of Regret
Member
Feb 22, 2019
6,537
It has never worked like that. Not a single time in the history of computers.

If a more powerful console were released, devs would put in more geometry, more shadows, more light bounces, fewer optimizations, and so on.

30fps or 60fps, 800p or 4K, it's all about design: what the devs wanted to achieve, what the publisher was ready to pay for, and how competently the production ran.

I don't get how this isn't 'got' more often.
 

Zep

Member
Jul 12, 2021
1,467
It has never worked like that. Not a single time in the history of computers.

If a more powerful console were released, devs would put in more geometry, more shadows, more light bounces, fewer optimizations, and so on.

30fps or 60fps, 800p or 4K, it's all about design: what the devs wanted to achieve, what the publisher was ready to pay for, and how competently the production ran.
And guess what, there are a number of examples where games never receive a patch or optimization and the newer console helps clean them up, whether that's dynamic res staying on the higher side, the framerate finally being stable, or both in some instances. I don't need a lecture about how things work; we've all been around this block for the last couple of generations.

Also, like last generation, the refreshes gave some devs a reason to go back to their older titles and give them new life.
 
Last edited:
Oct 30, 2017
8,973
It's not the prettiest game by any metric, but the IQ looks fine and it's smooth. It looks pretty sharp, albeit with some weird artifacts. I'm actually playing it, unlike most of these kneejerk reactions in here.
 

Gitaroo

Member
Nov 3, 2017
8,075
Interesting tidbit found regarding why TSR has been used;

[April 10th]

forums.unrealengine.com

TSR feedback thread

Hi, My name is Guillaume Abadie, principal graphics programmer at Epic Games, author of Temporal Super Resolution in UE5. The goal of the TSR is to be a temporal upscaler that answer the anti-aliasing need of displaying unprecedented Nanite amount of details on screen, while accommodating...

Question in feedback thread;



Answer by principal graphics programmer at Epic Games, author of Temporal Super Resolution in UE5.






UE5.3 main improvements were outlined in the roadmap; https://portal.productboard.com/epi...-roadmap/c/1161-temporal-super-resolution-tsr

Cool beans. Wish Ghostwire: Tokyo had TSR support on console rather than FSR 1, which is a joke. Curious why they didn't target 4K instead of 1440p. These next-gen image reconstruction techniques should be able to reconstruct from 1/4 of the target res, which is 1080p if the target is 4K. Scaling from 1080p to 4K would probably give better results than 1260p to 1440p.