We don't play games here, we play numbers.

Looks quite clean on XSX
Wonder how many people here joining the dogpile have actually played it.
Who cares what the internal resolution is as long as the final image looks good?
Yeah no, I'm not waiting around for no Pro consoles anymore.
This is a disturbing trend I've noticed in several current-gen-only console games this year, and it's getting frequent enough that I can't handwave it away and pretend it's not a thing anymore.
It's clear we have hit the point where, like John Linneman from DF said, we are either going to lose 60FPS modes on consoles and go back to 30FPS only, or we are going to get 60FPS modes that sacrifice so much to hit that target that they're not worth it, and in the end can't even maintain that framerate anyway.
Remnant 2: 792p in "Balanced" mode
Jedi Survivor: 648p in "Performance" mode
Final Fantasy 16: 720p in "Framerate" mode
These games all drop the internal resolution far below 1080p and then upscale in order to give you 60FPS. The problem is, no upscaler works well at such low pixel counts, not even DLSS. So what you are left with is a blurry, aliased image and, worst of all, a framerate that is highly unstable and can't even deliver the 60FPS all of those sacrifices were made for in the first place.
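To put those numbers in perspective, here's a quick back-of-the-envelope calculation (my own arithmetic, assuming 16:9 frames and a 3840x2160 output) of how few pixels those modes actually render:

```python
# Pixel counts for 16:9 internal resolutions versus a 4K output target.
TARGET_4K = 3840 * 2160  # ~8.3 million pixels

for height in (648, 720, 792, 1080):
    width = height * 16 // 9          # 16:9 aspect ratio
    pixels = width * height
    print(f"{height}p: {pixels / 1e6:.2f} MP, "
          f"{pixels / TARGET_4K:.0%} of 4K "
          f"(~{TARGET_4K / pixels:.1f}x upscale by pixel count)")
```

At 648p the upscaler is reconstructing over 90% of the final 4K image, well past the roughly 2x-per-axis range where DLSS and FSR tend to hold up.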
This has happened so many times this year alone. As someone who games on a 4K TV, this kind of upscaling never looks good on such a display. The worst part is I can't even shrug and say "it looks ugly, but at least it runs smooth", because none of them ever do.
If this is what developers consider respectable Image Quality and smooth Performance, then I'm done.
I'm grabbing a 40 Series GPU and not leaving the choice between IQ and performance in developers' hands anymore, since they clearly can't figure out either one.
At least then, if my game runs like ass due to poor optimization, it will look sharp while doing so, or vice versa: running smooth while looking less sharp.
Are they? I would challenge that after my fair share of time with DLSS2 on PC and other great reconstructed games on consoles.

"The final image looks good" is a very subjective opinion. Resolution numbers are an objective metric.
Super Pixels

While 729p sounds low as fuck, I'm on the same train of thought as DF: resolution numbers don't really matter that much anymore with these great upscaling methods.
Pixel quality > pixel count.
I've been buying all of my games on PS5 because for the last couple of years (the cross-gen period, unsurprisingly) it has been so consistent with great image quality and framerates.

Makes sense what you're saying, but I'm not sure it works out to compare a $400 GPU to a $400 console. The XSX and PS5 are equivalent to what... a 20 Series GPU? Maybe a teeny bit more?
PC is always going to be the place to be for best performance in almost all circumstances.
https://media.giphy.com/media/l2JhqbmCzyMjk4yA0/giphy.gif
Before you do something drastic, google "microstutter" or search Era for threads about stuttering PC games.
I have a very beefy PC with a 4090, and too many new games still run like shit or stutter with no real way to fix it.
It's so bad that you can find many threads about the problem on Era, and some PC players even prefer to play on consoles to avoid the stuttering.
I think what's happening with this game and FFXVI is something that was mooted in the DF overview for the latter: so far, most performance targets this gen have been 60fps with 30fps 'graphics' modes because the majority of games have been cross-gen, but we're starting to get 30fps performance targets laden with 'next gen' visual features, plus a 'performance' mode that butchers the resolution to get there. The story here is less 'omg PS5 and XSX are so weak they can't run this game at 1440p 60fps' and more that the developer designed the game around the 30fps mode and hacked off whatever they needed to reach 60fps.
A stable performance matters more than the internal res. For example, the PS5 version of Dead Space Remake is 936p in performance mode and 1296p in quality mode, but uses FSR 2.1.2, so it looks great and runs smoothly.

Maybe it's too early to tell, but based on this + the Matrix demo, I wonder if expectations for UE5 titles on these machines should be:
- 1440p/30fps for Quality Mode
- 900p/60fps for Performance Mode
I'd be ok with that as long as they can iron out the performance at these resolutions...
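For what it's worth, the trade between those two hypothetical modes can be sketched with simple arithmetic (16:9 resolutions assumed):

```python
# Frame-time budget and internal pixel throughput implied by each mode.
modes = {
    "Quality (1440p / 30fps)": (2560, 1440, 30),
    "Performance (900p / 60fps)": (1600, 900, 60),
}

for name, (w, h, fps) in modes.items():
    budget_ms = 1000 / fps           # time available per frame
    throughput = w * h * fps / 1e6   # internal pixels shaded per second
    print(f"{name}: {budget_ms:.1f} ms/frame, ~{throughput:.0f}M pixels/s")
```

Interestingly, 900p/60 actually shades fewer pixels per second than 1440p/30, so the performance mode's real cost is the halved per-frame budget for everything that doesn't scale with resolution.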
The trade off would be fine if the game at least performed well almost all the time.

Agree. The 60 FPS pressure is real, but it's kind of "toxic." Many of these games would be better off not including these modes outside of VRR.
Thanks for that example. I'm usually quality>performance, but Dead Space's quality mode was atrocious, almost unplayable to me. Performance looked amazing, not much difference.

A stable performance matters more than the internal res. For example the PS5 version of Dead Space Remake is 936p performance mode and 1296p quality, but uses FSR 2.1.2 so it looks great and runs smoothly.
The trade off would be fine if the game at least performed well almost all the time.
It helps that Epic was embarrassed into spending a lot of resources on building tools to help devs deal with shader compilation.

I've been buying all of my games on PS5 because for the last couple of years (the cross-gen period, unsurprisingly) it has been so consistent with great image quality and framerates.
This year has been anything but consistent, which is the #1 thing I valued in console gaming.
Games coming out running at sub-1080p while still struggling to hit 60FPS.
It's clear that if I want the things I'm looking for (good image quality, stable framerates), then upgrading to a beefy PC is the only choice going forward.
Remnant 2 just helped push me to the decision of upgrading my PC, luckily I've found a great deal on an RTX 4080.
Don't get me wrong, things aren't perfect on PC either. But at least there I can brute-force issues like resolution and framerate away with hardware, rather than begging developers to please patch the Frankenstein thing they released.
I know about the stutter struggle that has been plaguing PC for awhile.
But Imma be honest with you: I would rather deal with micro-stutters than play another full-priced game that not only looks like a blurry, aliased mess but also runs like complete ass (side note: why the hell can't we toggle off motion blur in Remnant 2 on console?)
Developers will slowly and eventually figure out how to eliminate stuttering (Remnant 2 is UE5 and it is stutter free, which is a great sign).
But performance on the console side (outside of great 1st party devs) will only get worse as graphics get more complex and there is only so much you can do on fixed hardware.
Man, I'd love to view this game in Unreal Editor and take a look at what's chewing up so much performance. 1080p on medium settings not being able to hit 60 on a 4060 with the game looking like it does is crazy.

Note: although this video was primarily focused on consoles (I hope Alex will publish the PC side of things soon), there have been titles that hit similarly low internal resolutions before. I don't think an overall conclusion about UE5 on consoles can be made.
In the meantime, you can get the gist of some struggles/performance achieved on the PC platform. Given that UE is a middleware engine, I think it shows some interesting parallels.
The recommended hardware was an i5-10600K / Ryzen 5 3600 and an RTX 2060 / RX 5700.
The video quality is quite low, but it's the most cohesive perspective there is right now.
Your PC isn't ready for this Unreal Engine 5 game! Remnant 2 PC Performance (www.youtube.com)
Remnant II is one of the first fully released Unreal Engine 5 games so I was incredibly curious to see how it would perform on a variety of hardware.
He tested with:

CPU:
- i5 9600K
- 7800X3D

GPU:
- RTX 2060 6GB
- RTX 3060 12GB
- RTX 4060 8GB
- RTX 3080
- RTX 4090

You can observe that the game is only performant at 1080p low + DLSS Performance. The story continues as you go up the stack, showing that the game is relatively hard on the hardware. Moreover, at the lowest end of the hardware, stuttering is very prominent.
Now I don't want to throw the developers under the bus or anything, but for a UE5 title this is of course not the best showing out of the gate. Although the game is larger than Layers of Fear (also a UE5 title), that game fares a lot better across hardware w.r.t. scaling.
Either way, continue :p. There's going to be a separate thread about this anyway.
Some of the recent games had RT in performance mode as well, which tanks performance; removing that would be ideal. It almost feels like they did not expect performance mode to be as popular as it is, or maybe the demand for a smooth mode at that. We will see how this all turns out going forward, but I hope it is a trend that stays and improves.

Thanks for that example. I'm usually quality>performance but Dead Space's quality mode was atrocious, almost unplayable to me. Performance looked amazing, not much difference.
A stable performance matters more than the internal res. For example the PS5 version of Dead Space Remake is 936p performance mode and 1296p quality, but uses FSR 2.1.2 so it looks great and runs smoothly.
The trade off would be fine if the game at least performed well almost all the time.
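As a sanity check on those Dead Space numbers, the per-axis scale factors work out close to FSR 2's standard presets. This is my own arithmetic, assuming a 3840x2160 output; the preset factors are AMD's published FSR 2 quality modes:

```python
# Per-axis upscale factors for Dead Space Remake's PS5 modes vs FSR 2 presets.
presets = {"Quality": 1.5, "Balanced": 1.7,
           "Performance": 2.0, "Ultra Performance": 3.0}

for mode, internal_height in (("performance", 936), ("quality", 1296)):
    factor = 2160 / internal_height
    nearest = min(presets, key=lambda p: abs(presets[p] - factor))
    print(f"{mode} mode: {factor:.2f}x per axis (nearest preset: {nearest})")
```

So 1296p to 4K sits right at the Balanced preset, while 936p to 4K is a bit past Performance, which is still within the range where FSR 2 usually resolves a clean image.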
Just assume DLSS will be required for most games. Hell, I have a 3070 and still played Cyberpunk at 1080p with DLSS on. If there is any RT at all, it eats performance.

Man, I'd love to view this game in Unreal Editor and take a look at what's chewing up so much performance. 1080p on medium settings not being able to hit 60 on a 4060 with the game looking like it does is crazy.
Games already take longer than this to make. Some even come in at 5-6 years; I'd imagine this would just make things worse, no?

Also, stop the generational consoles. Just do a new model every 2-3 years. We already do it with phones, CPUs and GPUs.
Yeah, like FF16 needs to update its FSR to a later version, since it is using 12 and the current is 2.2.2. Aside from that, I understand that tradeoffs are needed for solid fps; also, stop putting RT in performance mode.

Completely agree with that.
Didn't know about the resolutions for Dead Space, but they seem close to what I posted above for UE5, so I guess this is what we should expect in general this gen?
Target for quality mode: 1440p.
Target for performance mode: 900p.
Sometimes a bit below that, sometimes a bit above.
I'm ok with that as long as performance is solid.
Fingers crossed that the stuttering issue gets solved sooner rather than later 👍

Oh, if you want shiny and high-res graphics, PC is your salvation. If you can live with the microstutters, then you've nothing to fear (outside of the huge price for the hardware). Microstutters aside, I freaking love my PC games in ultrawide, on max details and at 120+fps thanks to DLSS and frame generation.
Also, microstutters are a big thing with Unreal Engine 4, but 5 should be way, way better as far as I've heard.
Looks quite clean on XSX
Wonder how many people here joining the dogpile have actually played it.
How much is the cowboy skin?

Playing the game? People can watch the game on Twitch or YouTube and objectively see for themselves how bad the game looks/performs.
No one has time to play games anymore, but sure as hell you will hear my opinion on it.
I mean, look at how bad Remnant 2 looks:
/s
Both versions are exactly the same, the little PS5 differences sound like small bugs that will surely be patched.
Man, people love to say "i want shorter games with worse graphics made by people who are paid more to work less and i'm not kidding" and then lose their minds when AA developers exist. I know this is a strawman but at the same time I just know this applies to some of you here. This game uses the modern feature set of unreal engine, minus lumen. It's heavy for a reason, and the reason you don't think the look of it justifies the performance/resolution is, well, the quote above. Either you want developers to slave over crafting the absolute perfect models, animations, and lighting, or you actually do "want shorter games with worse graphics made by people who are paid more to work less".
Game has no crossplay, go where your buds are at.
Talk is cheap.

Man, people love to say "i want shorter games with worse graphics made by people who are paid more to work less and i'm not kidding" and then lose their minds when AA developers exist.
If Insomniac does this, I'm gonna riot.
They do not, but they did put RT reflections in a dedicated RT performance mode in Rift Apart, and it performed well.
Forspoken ran at like 800p in performance mode, if I remember correctly.

I never expected to read about a game needing to run at ~720p to hit 60fps on PS5, ever, lmao.
Just a reminder (for everyone) that Nanite isn't really about extreme levels of detail. It's primarily an LOD function designed to eliminate pop-in. It does, however, facilitate using high-detail meshes, and makes it a lot easier to just drop them in and have them look good at any distance.

I think it looks pretty good in the video; it's definitely taking advantage of Nanite, with all of the geometrical detail it's packing.
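To illustrate the LOD point: the traditional approach Nanite replaces is a handful of discrete meshes swapped by camera distance, roughly like this toy sketch (the thresholds and function are made up for illustration):

```python
# A toy version of discrete LOD selection, the scheme Nanite does away with.
def select_lod(camera_distance: float,
               thresholds=(10.0, 30.0, 80.0)) -> int:
    """Return a mesh LOD index; higher index = coarser mesh."""
    for lod, threshold in enumerate(thresholds):
        if camera_distance < threshold:
            return lod
    return len(thresholds)  # coarsest mesh beyond the last threshold

# The hard switch at each threshold is what reads as "pop-in" on screen;
# Nanite instead streams triangle clusters at a continuous level of detail.
print(select_lod(5.0), select_lod(50.0), select_lod(200.0))
```

The visible jump when an object crosses a threshold is exactly the pop-in Nanite's cluster streaming is designed to eliminate.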
Forspoken has been patched up on PS5. It hits 60 more consistently at higher resolutions, with better-looking AO now.

Forspoken ran at like 800p in performance mode if I remember correctly.
Yea... it's still hitting 800p at times.

Forspoken has been patched up on PS5. It hits 60 more consistently at higher resolutions with better looking AO now.
Also curious why the OP left out the fact that the 720p internal resolution is used with UE5's built-in alternative to DLSS/FSR, when they included it for both the Balanced and Quality mode bullet points.
Looks fine to me.
That's different from running at 800p most of the time.
You forgot to add that TSR is being used under performance mode; that's why I wanted to post the screenshot lol.

Ah, I left out the image since one can watch the video themselves to see how the image resolves. Compared to the native 1440p on PC, it looks quite good!
I did note that TSR is used though :).