Next-gen PS5 and next Xbox speculation launch thread |OT5| - It's in RDNA

What do you think could be the memory setup of your preferred console, or one of the new consoles?


Total voters: 1,379

anexanhume

Member
Oct 25, 2017
4,248
FUCK LG. The real story is that OLED capacity will grow tenfold by 2024.

By next year we will have the first printed OLED panels, and TCL should be among the first, alongside Japan Display.

Cheaper, more reliable, faster, and with more competition. LG Display won't have a stranglehold on the market, which means lower prices for everyone.


That means an 88-inch OLED for sub-$5k in a wallpaper-like format.


Yeah, fuck LG for popularizing a new display technology and challenging Samsung's dominance at the top-end of the market?
 

sncvsrtoip

Member
Apr 18, 2019
544
John does not have the hardware :D Neither do I (yet).
But we will probably do something like this - the reason we did Journey is because I think the game deserves it and because I needed something smaller (the game is just 2 hours long) to do before I hit a small vacation since I am hanging around with my mother :D
Good to hear. I think it's also worth checking an 1800 MHz overclocked 5700 in Fire Strike.
 

sncvsrtoip

Member
Apr 18, 2019
544
GameStop exec Frank Hamlin "File sizes for games are indeed growing. Just recently, it was reported that Cyberpunk 2077 will be 80 GB on PS4. One of 2018's biggest and most impressive games, Red Dead Redemption 2, is over 100 GB. With the kind of 8K graphics that Project Scarlett and the PS5 can offer, the file size for a game like Red Dead Redemption 2 could balloon to 400 GB, Hamlin estimated." link
8K confirmed ;d
 

Snakeeee

Member
Jan 20, 2019
1,272
GameStop exec Frank Hamlin "File sizes for games are indeed growing. Just recently, it was reported that Cyberpunk 2077 will be 80 GB on PS4. One of 2018's biggest and most impressive games, Red Dead Redemption 2, is over 100 GB. With the kind of 8K graphics that Project Scarlett and the PS5 can offer, the file size for a game like Red Dead Redemption 2 could balloon to 400 GB, Hamlin estimated." link
8K confirmed ;d
Companies need to start finding new ways of compressing data, or Blu-ray will no longer help people with terrible internet.
 

BreakAtmo

Member
Nov 12, 2017
3,421
GameStop exec Frank Hamlin "File sizes for games are indeed growing. Just recently, it was reported that Cyberpunk 2077 will be 80 GB on PS4. One of 2018's biggest and most impressive games, Red Dead Redemption 2, is over 100 GB. With the kind of 8K graphics that Project Scarlett and the PS5 can offer, the file size for a game like Red Dead Redemption 2 could balloon to 400 GB, Hamlin estimated." link
8K confirmed ;d
Between 8K being nothing more than an output resolution for these consoles, the fact that resolution and file size are unrelated, and the various ways we know file size growth can be held back on the new consoles, this is one of the dumber things I've read in a while.
 

dgrdsv

Member
Oct 25, 2017
2,490
Msk / SPb, Russia
I think the best we could hope for is somewhere in between GTX 1080 and 1080 Ti performance.

With raw specs like the 1080 (8.8 TF) and the equivalent performance of a 1080 Ti thanks to RDNA.

I don’t want to compare with RTX/Turing GPUs because I don’t believe AMD will have comparable raytracing HW, even for Scarlett.
Console GPUs should be comparable to Turing feature-wise, with the only exception being their lack of tensor cores and whatever FP16/INT matrix math acceleration Turing may get from them.

Performance of said features is a totally different matter, of course, but I don't expect it to be as bad as, say, running RT on Pascal via pure compute. So generally it's likely more accurate to compare NG GPUs with Turing than with Pascal.
 
Oct 27, 2017
2,927
Somewhere South
Companies need to start finding new ways of compressing data, or Blu-ray will no longer help people with terrible internet.
There's some interesting research on NN compression: they train a network on the original textures, creating a model that can replicate them to some arbitrarily defined fidelity threshold when fed a simple input seed, plus a very compressible residual that, when merged with the model's replica, brings the fidelity up to par with the original texture.

They then ship the model + compressed residuals, and the textures are reconstructed on installation. You're obviously trading space for processing time, as this takes quite a bit. The more textures you're compressing, the better the compression ratio, since model size doesn't scale linearly with the number of textures.
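
A toy sketch of the scheme in Python/NumPy, just to make the moving parts concrete (a low-rank SVD stands in for the trained network, and the rank/quantization step are made-up knobs, not numbers from the actual research):

```python
# Toy version of the scheme above: a "model" that approximately replicates
# a texture, plus a coarsely quantized residual that restores fidelity.
# The SVD is a stand-in for a trained network; rank/step are made-up knobs.
import numpy as np

rng = np.random.default_rng(0)
texture = rng.random((64, 64)).astype(np.float32)  # stand-in for a real texture

# "Train the model": a low-rank approximation playing the role of the NN.
u, s, vt = np.linalg.svd(texture, full_matrices=False)
rank = 8
replica = (u[:, :rank] * s[:rank]) @ vt[:rank]     # what the model reproduces

# The residual (original minus replica) is what you ship alongside the
# model; quantized coarsely, it compresses far better than the raw texture.
step = 1.0 / 32                                    # fidelity threshold knob
residual_q = np.round((texture - replica) / step).astype(np.int8)

# "Installation": merge the model replica with the dequantized residual.
reconstructed = replica + residual_q.astype(np.float32) * step
print("max error:", float(np.abs(reconstructed - texture).max()))  # <= step/2
```

The space win comes from the residual being small and smooth, while the model's cost is amortized across every texture it covers.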
 

Pheonix

Member
Dec 14, 2018
1,193
St Kitts
It's not checkerboarding at all; it just performs contrast-based sharpening after upscaling to 4K, and it looks surprisingly close to native 4K.
I have no idea whether it could be combined with checkerboarding, but technically I don't see why not; it's just a filter in the end. But isn't checkerboarding already sharpening the image anyway?
I am beginning to feel like some sort of next gen prophet lol. Pretty much everything I said back before we knew anything has been coming true. I honestly don't know how most people missed it, considering it was all just right there for us to see.

I believe CBR is better than RIS and DLSS. And all are just ways of running faux 4K. CBR still maintains a native 4K buffer; it's basically like rendering a full 4K image in 2 frames instead of one. And I can only imagine that with improved hardware and better AA it will be far better on the PS5 than it was on the PS4 Pro. I've been saying this non-stop, and while a lot of people just don't want to hear it, the simple truth of the matter is that native 4K is a gluttonous waste of resources for what is in effect minimal IQ gain over a great 1440p image. Sony knows it, Nvidia knows it, and now AMD is showing that they know it too.

It also ties in nicely with everything that a GPU actually is. People seem to forget that GPUs are really all about smoke and mirrors. They have found brilliant ways to "simulate" otherwise extremely processor-heavy tasks and reduce their performance hit (bump mapping, anyone?). Why anyone thinks the actual rendered resolution would be exempt from this is beyond me.
John does not have the hardware :D Neither do I (yet).
But we will probably do something like this - the reason we did Journey is because I think the game deserves it and because I needed something smaller (the game is just 2 hours long) to do before I hit a small vacation since I am hanging around with my mother :D
Yes... once again DF coming through to do God's work. Take your time man, don't mind my Journey shaming lol, knowing you guys will at least do this is good enough for me. And please, while you guys are at it, can you address things like the whole GCN vs RDNA TFs thing, and things like DLSS, CBR and RIS and what their implications for next gen consoles would be?
 

Blizzje

Member
Jul 7, 2018
13
The benchmarks for the RX 5700 are looking fantastic, often keeping up with or even beating the 2060 Super and regular 2070. Is this the baseline we can expect for the PS5 in terms of the card that can be fitted? Or will it likely be downclocked? Would be awesome if this is the minimum we can expect.
 

More Butter

Member
Jun 12, 2018
861
Next gen is already looking promising. I think we are going to get a couple of balanced and capable machines. I hope they’re both very close in power so we can avoid some of the dick wagging. Obviously some dicks will wag regardless of what happens but maybe more uniform consoles will keep some of the dicks where they belong.
 

RoboPlato

Member
Oct 25, 2017
2,372
The benchmarks for the RX 5700 are looking fantastic, often keeping up with or even beating the 2060 Super and regular 2070. Is this the baseline we can expect for the PS5 in terms of the card that can be fitted? Or will it likely be downclocked? Would be awesome if this is the minimum we can expect.
I think we’ll get something between the 5700 and the XT with added raytracing support and a few other key customizations/optimizations. Should be very nice, especially if they have a lot of bandwidth to work with.
 

Noctis114

Member
Jan 25, 2019
719
Thinking about the next gen consoles offering performance equal to an RTX 2070 with added console optimisation is mouth-watering to picture.

I'm convinced that next gen will have games that look as visually amazing as the Unity Heretic demo rendered in checkerboard 4K.


Let's not forget that this demo was running at [email protected] on an RX 5700 XT, I believe, back at the E3 conference; with dynamic resolution and AMD's new image sharpening technique we could bump that up to [email protected]
 

Lokimaster

Member
May 12, 2019
136
This video could almost be used as a next gen tech showcase:


Although it is exclusively UE4.

Lol, there is not going to be a video game out there, especially from 3rd party devs, that looks ANYTHING like that. Not happening.
 

Metalane

Member
Jun 30, 2019
184
Massachusetts, USA
Yes, that's what I meant. I thought you meant something else. It looks like we're getting a 12-14 GCN-teraflop-equivalent GPU, not RDNA. That's what me and some others hoped for back in the day. Whether the TF figure actually matters or not. ;)

Anyways, with that, a huge CPU improvement, and an SSD, I can't even begin to imagine how the games will look next gen. Particularly first party titles, which will push the hardware more. I can't wait to see what a studio like Guerrilla will bring to the table. I'm expecting to be at least as amazed as I was when I saw Killzone: Shadow Fall.
I was actually just watching the reveal trailer for KZ: Shadowfall last night. I love the chills we get when we see these next gen experiences!
 

Metalane

Member
Jun 30, 2019
184
Massachusetts, USA
FUCK LG. The real story is that OLED capacity will grow tenfold by 2024.

By next year we will have the first printed OLED panels, and TCL should be among the first, alongside Japan Display.

Cheaper, more reliable, faster, and with more competition. LG Display won't have a stranglehold on the market, which means lower prices for everyone.


That means an 88-inch OLED for sub-$5k in a wallpaper-like format.


I've never personally owned an OLED TV. Is it that much better than an LED?
 

Metalane

Member
Jun 30, 2019
184
Massachusetts, USA
Lol, there is not going to be a video game out there, especially from 3rd party devs, that looks ANYTHING like that. Not happening.
That’s what they say every gen...

EDIT: Here are some examples of what faces already look like this gen -






It won’t let me post the images directly here for some reason.
 
Last edited:

Metalane

Member
Jun 30, 2019
184
Massachusetts, USA
Thinking about the next gen consoles offering performance equal to an RTX 2070 with added console optimisation is mouth-watering to picture.

I'm convinced that next gen will have games that look as visually amazing as the Unity Heretic demo rendered in checkerboard 4K.


Let's not forget that this demo was running at [email protected] on an RX 5700 XT, I believe, back at the E3 conference; with dynamic resolution and AMD's new image sharpening technique we could bump that up to [email protected]
Ya, I wouldn’t be surprised if the first ND game next gen surpassed this!
 

Colbert

Member
Oct 27, 2017
3,837
Germany
I've never personally owned an OLED TV. Is it that much better than an LED?
Look into the tests that are available at rtings.com or on TV-centric channels on YouTube and you will learn where OLED is better than LED. It depends on what you wanna do with your TV and the environment it's in.

For example, this review highlights the pros and cons of an OLED quite well:
If you want the absolute best TV, check out the LG B8. This is an outstanding 4K OLED TV that delivers exceptional picture quality and has perfect dark room performance and great wide viewing angles. It also performs well in bright rooms, although it can't overcome really bright glare if you have a lot of windows.

This TV has excellent low input lag and nearly instantaneous response time, so motion looks crystal clear with almost no blur. This, however, can create stutter, especially when watching 24p movies on Blu-ray; this can be bothersome to some people. If stutter bothers you, the TV has a black frame insertion feature and a motion interpolation feature that can help improve this.

Unfortunately, just like all OLED TVs, it has the risk of temporary image retention and permanent burn-in when displaying static images for extended periods of time. However, with varied enough usage, we don't expect this to be an issue for most people. Overall, this television and the other LG OLEDs, including the LG C8 and LG E8, are outstanding TVs that should please most people.
 
Last edited:

Metalane

Member
Jun 30, 2019
184
Massachusetts, USA
Look into the tests that are available at rtings.com or on TV-centric channels on YouTube and you will learn where OLED is better than LED. It depends on what you wanna do with your TV and the environment it's in.

For example, this review highlights the pros and cons of an OLED quite well:
Thanks! I was always cautious of OLED because don't they slowly "die" over time?
 

dgrdsv

Member
Oct 25, 2017
2,490
Msk / SPb, Russia
If we're purely fantasizing here, I'd say that CBR won't be widely used next gen, if at all. Resolution reconstruction will be handled by a mix of temporal accumulation (already used in many current gen games) and wide usage of variable rate shading in both texture and screen space. Current implementations of VRS are rather "lite" and naive, since they are essentially tacked onto console renderers without going through the whole pipeline to find what may in fact be shaded at a lower rate. A renderer built with VRS in mind throughout the whole pipeline will provide a lot more performance benefit than what we see right now in Wolf2 and CivVI, for example.
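
To illustrate with a crude, screen-space-only toy in Python/NumPy (the shade() function, the 8x8 tile size, and the contrast threshold are all made-up stand-ins; a real renderer would pick rates from the previous frame, material masks, etc.):

```python
# Crude screen-space VRS sketch: pick a 1x1 or 2x2 shading rate per 8x8
# tile based on local contrast. shade() and the threshold are stand-ins.
import numpy as np

H, W, TILE = 64, 64, 8

def shade(y, x):                        # stand-in for an expensive pixel shader
    return np.sin(0.3 * x) * np.cos(0.1 * y)

ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
prev_frame = shade(ys, xs)              # rate is chosen from the previous frame

out = np.empty((H, W))
for ty in range(0, H, TILE):
    for tx in range(0, W, TILE):
        if prev_frame[ty:ty+TILE, tx:tx+TILE].std() < 0.05:
            # Flat tile: shade one sample per 2x2 quad and replicate it.
            coarse = shade(ys[ty:ty+TILE:2, tx:tx+TILE:2],
                           xs[ty:ty+TILE:2, tx:tx+TILE:2])
            out[ty:ty+TILE, tx:tx+TILE] = coarse.repeat(2, 0).repeat(2, 1)
        else:
            # Detailed tile: full-rate shading.
            out[ty:ty+TILE, tx:tx+TILE] = shade(ys[ty:ty+TILE, tx:tx+TILE],
                                                xs[ty:ty+TILE, tx:tx+TILE])
```

The "lite" implementations essentially stop at a heuristic like this; building the whole pipeline around VRS means deciding up front which passes and materials can always take the coarse path.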

Sharpening always was and always will be a part of your regular TAA, whether it's coupled with resolution upscaling or not. "RIS" doesn't bring anything new into console space at all since it's nothing more than a "hack" provided by PC GPU drivers to be used in games which have sub-par (i.e. not sharp enough) TAA/post-AA implementations.

And as for DLSS and other AI-powered image enhancement techniques, I doubt that next gen consoles will have enough h/w power to handle such AI approaches at the speeds necessary even for 33.3 ms frames.
 

AegonSnake

Member
Oct 25, 2017
4,577
I was actually just watching the reveal trailer for KZ: Shadowfall last night. I love the chills we get when we see these next gen experiences!
Speculation with a sprinkle of hype!
I've never personally owned an OLED TV. Is it that much better than an LED?
Ya, I wouldn’t be surprised if the first ND game next gen surpassed this!
That’s what they say every gen...
lol, 5 posts in a row. You do know you can add multiple quotes in one post, right? You can also edit your post to insert more quotes.

Don't want to be a forum nazi, but double posts are typically frowned upon. I've never even seen 5 posts in a row before.
 

Metalane

Member
Jun 30, 2019
184
Massachusetts, USA
Even way better; nothing in particular is happening here, no interaction or AI.
The animations in this video are not at the level of the motion matching we saw in TLoU II.
Exactly. Unity's tech demos within the past few years haven't impressed me that much. God of War 2 or HZD 2 will likely match or surpass it, unless they choose to favor physics, asset streaming, AI, etc. In that case, I would prefer that to better visuals.
 

severianb

Member
Nov 9, 2017
706
I think we’ll get something between the 5700 and the XT with added raytracing support and a few other key customizations/optimizations. Should be very nice, especially if they have a lot of bandwidth to work with.
One thing I don't see mentioned a lot that could be called an "optimization" over the PC components is that the consoles use a monolithic chip and won't need the "Infinity Fabric" or PCIe lanes tying all the parts together. That's one of the reasons the 6TF X1X punches above its weight class.
 

Pheonix

Member
Dec 14, 2018
1,193
St Kitts
This is like saying that burgers are better than bunnies and space ships.
I can't tell if you are agreeing with me or disagreeing, because both could apply.

But I'll elaborate. And feel free to correct me if I am wrong.

All are techniques designed to achieve the same end product: taking a natively lower-rez image and making it appear higher-rez, all at a minimal performance hit to the GPU.
  • RIS sharpens an upscaled lower-rez native image.
  • Nvidia uses "pixel training" to generate "additional pixels" that give an image the "impression" of being higher-rez than its native resolution. It's targeted at games already struggling to hit higher frame rates (either because the game just demands that much or because you are using RT), because it has a fixed GPU load, so gains are only seen when a game is struggling already.
  • Sony's CBR renders out half of the targeted native rez per frame and uses pixel ID tagging and tracking to sync new and old pixels into a completed image.

So yes, they all are trying to do the same thing, albeit going about it differently.

Now, why I say CBR is better: it's because CBR is the only one of those methods that is built on top of the "targeted" native resolution. So unlike the other methods, which start at a lower rez's framebuffer before doing their thing, CBR starts at the framebuffer of the intended final rez. Which means that things like on-screen furniture (text, HUD, etc.) are also rendered at the native resolution.
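
Here's a toy of that two-frame idea in Python/NumPy (the shader is a stand-in, and a real resolve would reproject the old half along motion vectors and reject mismatches rather than just keeping stale pixels):

```python
# Toy checkerboard idea: the full-res target buffer is filled over two
# frames, each shading only half the pixels. The shader is a stand-in;
# a real resolve reprojects old pixels via motion vectors (the pixel ID
# tagging/tracking mentioned above) instead of simply keeping them.
import numpy as np

H, W = 8, 8
ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
frame = np.zeros((H, W), np.float32)      # native-res target buffer

def render_half(t, parity):
    """Shade only the pixels where (x + y) % 2 == parity."""
    mask = (ys + xs) % 2 == parity
    frame[mask] = np.sin(0.5 * xs[mask] + 0.1 * t)   # stand-in shader
    return mask

render_half(t=0, parity=0)    # even frame shades half the target buffer
render_half(t=1, parity=1)    # odd frame shades the other half
# "frame" is now a full native-res image built from two half-cost renders.
```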

over to you....
 
Last edited:
Jun 18, 2018
689
A renderer built with VRS in mind throughout the whole pipeline will provide a lot more performance benefit than what we see right now in Wolf2 and CivVI, for example.
The two examples you mentioned are using hardware-supported VRS; it's not just a different approach in software. AMD has a patent for VRS, but it's not a feature on the 5xxx cards. Here's hoping it turns up in the consoles and the next generation of RDNA hardware.

Sharpening always was and always will be a part of your regular TAA, whether it's coupled with resolution upscaling or not. "RIS" doesn't bring anything new into console space at all since it's nothing more than a "hack" provided by PC GPU drivers to be used in games which have sub-par (i.e. not sharp enough) TAA/post-AA implementations.
I'm not sure what you mean by "always was and always will". It should be used in conjunction with AA (especially TAA) to preserve finer details, but many games have TAA and no sharpening.
 

modiz

Member
Oct 8, 2018
4,506
Weird that RIS being referred to as a "hack" is treated as a bad thing, tbh. Console games are made in spite of the tech, not because of it. Developers are trying to make the best-looking games they can for the console budget, and that means using any "hack" you can get your hands on. RIS is a perfect fit for console games, same as checkerboard, because it lets developers push the game resolution as high as possible for the lowest performance cost, with a close-enough final result.

EDIT:
I will add that, for the same reason, I predict most developers won't utilize ray tracing as much as some here hope.
Ray tracing is too big of a cost for its payoff. Most developers will still opt for faked GI, SSR, etc. We might see ray tracing in certain scenes, but it will rarely become a main feature in games.
 
Last edited:

RoboPlato

Member
Oct 25, 2017
2,372
The two examples you mentioned are using hardware-supported VRS; it's not just a different approach in software. AMD has a patent for VRS, but it's not a feature on the 5xxx cards. Here's hoping it turns up in the consoles and the next generation of RDNA hardware.
I wouldn't be surprised if it made it into the consoles. They had a similar response about VRS to their raytracing one, something along the lines of: they know it's a highly requested feature, but they're only talking about the current 5700 GPUs right now.
 

FSavage

Member
Oct 30, 2017
459
Weird that IRS being referred to as a "hack" is treated as a bad thing, tbh. Console games are made in spite of the tech, not because of it. Developers are trying to make the best-looking games they can for the console budget, and that means using any "hack" you can get your hands on. IRS is a perfect fit for console games, same as checkerboard, because it lets developers push the game resolution as high as possible for the lowest performance cost, with a close-enough final result.
Sounds... taxing.
 

Thera

Member
Feb 28, 2019
726
Anyways, with that, a huge CPU improvement, and an SSD, I can't even begin to imagine how the games will look next gen. Particularly first party titles, which will push the hardware more. I can't wait to see what a studio like Guerrilla will bring to the table. I'm expecting to be at least as amazed as I was when I saw Killzone: Shadow Fall.
The draw distance, lighting effects, reflections, smoke, and fire. It was a slap in the face for a console game. Just rewatched it, and the big leap since this reveal is animation. Even the pre-rendered ones are really not great.
 

dgrdsv

Member
Oct 25, 2017
2,490
Msk / SPb, Russia
All are techniques designed to achieve the same end product.
This is incorrect.

RIS is just a post-process sharpening filter, with an option of running said filter after a rather basic upscale routine, which simply gives you a better end result than upscaling an already sharpened image. The upscaling in it is rather straightforward from what I can tell, and isn't remarkable in any way compared to other such upscalers, including those used by modern console h/w on display output (when you're outputting a 1440p game to a 4K TV, for example).
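
To make "just a sharpening filter" concrete, here's a toy contrast-adaptive sharpen in Python/NumPy; the neighbourhood probe and the weights are illustrative guesses, not AMD's actual CAS math:

```python
# Toy contrast-adaptive sharpen: an unsharp mask whose strength backs off
# where local contrast is already high, so edges don't ring. The weights
# are illustrative guesses, not AMD's actual CAS coefficients.
import numpy as np

def sharpen(img, amount=0.5):
    p = np.pad(img, 1, mode="edge")
    # 4-neighbour stack as a cheap local-contrast probe
    nbrs = np.stack([p[:-2, 1:-1], p[2:, 1:-1], p[1:-1, :-2], p[1:-1, 2:]])
    contrast = nbrs.max(0) - nbrs.min(0)        # 0 = flat, 1 = hard edge
    weight = amount * (1.0 - contrast)          # adapt: less sharpen on edges
    return np.clip(img + weight * (img - nbrs.mean(0)), 0.0, 1.0)

img = np.random.default_rng(1).random((32, 32)).astype(np.float32)
out = sharpen(img)                              # same shape, sharpened
```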

CBR is a resolution reconstruction technique which can be used instead of the naive upscaler mentioned above, but it requires quite a lot of renderer work and breaks a lot of things along the way. You can just as well run a sharpening filter on top of the CBR resolve if you feel that it's too soft.

DLSS is a combination of image reconstruction with antialiasing, as it tries to reconstruct a supersampled image from an aliased one. It's closer to CBR in the sense that it also tries to reconstruct a higher-resolution image from a lower one, but the approach is completely different, and it also provides AA while doing the reconstruction. You can run DLSS at the final output resolution, in which case it will try to reconstruct supersampled AA only, without upscaling the image. And again, you can just as well run a sharpening filter on top of any DLSSed image if you feel it's necessary from an IQ perspective.

So technically nothing stops you from, say, rendering in 4K CBR (1/2 of native 4K), running DLSS on that resolved image for some AI anti-aliasing, and then applying RIS on top of the resulting frame to make it a bit sharper. All three can be used simultaneously, as they essentially handle different parts of final frame rendering.

Now, why I say CBR is better: it's because CBR is the only one of those methods that is built on top of the "targeted" native resolution. So unlike the other methods, which start at a lower rez's framebuffer before doing their thing, CBR starts at the framebuffer of the intended final rez. Which means that things like on-screen furniture (text, HUD, etc.) are also rendered at the native resolution.
Not 100% sure what you mean, but all of these can be used before rendering HUD elements without affecting their rendering at the final display resolution. CBR isn't unique in this.

The two examples you mentioned are using hardware supported VRS - it's not just a different approach in software.
VRS is h/w; it can't be a different approach in s/w. CBR is h/w too, btw, as it's using MSAA h/w. TAA upscaling is pure s/w, though.

AMD has a patent for VRS, but it's not a feature on the 5xxx cards. Here's hoping it turns up in the consoles and the next generation of RDNA hardware.
I'm about 99% sure that it will.

I'm not sure what you mean by "always was and always will". It should be used in conjunction with AA (especially TAA) to preserve finer details, but many games have TAA and no sharpening.
All implementations of TAA which I've seen use some level of post-resolve sharpening, as TAA generally produces a very blurry image without it. It's possible, of course, that there are games which use TAA without any sharpening passes, but I personally haven't run into such implementations being presented anywhere.

Weird that RIS being referred to as a "hack" is treated as a bad thing, tbh.
It's not a "bad thing"; it's just not relevant for consoles, where you don't have a user driver option of running RIS on top of any game you'd like. And most console games already use sharpening as a part of their TAA/postprocessing routines; maybe these give worse or better sharpening than RIS does, but they are there already nevertheless. Considering that CAS is free to use, I'd imagine that some games will use it instead of their own sharpening filters down the line, similarly to how FXAA essentially killed off most in-house PPAA solutions at the end of the previous console generation.
 