Murfield

Member
Oct 27, 2017
1,425
I think we are approaching a point of diminishing returns on visual fidelity; I don't feel a need to go beyond 1080p and 4K textures.

What I would like to see is GPUs being used more for real-time rendering of things with computationally expensive physics. Stuff like hair, water, lighting.

We are probably several gens away from having water that acts like water though.
 

GhostofWar

Member
Apr 5, 2019
512
Just look at the AMD Radeon HD 7950 and 7970: those GPUs launched around 1 year and 9 months before the PS4 and Xbox One and were more powerful than them. As games were developed with the 8th-generation consoles as the new hardware target, they were able to better utilize PC hardware, so more has been done with the same hardware.

I don't think it's that cut and dried; it seems some games' low settings creep up over time, so old hardware could be struggling because of that.


It's probably not the reason all the time but it does happen.
 

Md Ray

Member
Oct 29, 2017
750
Chennai, India
'Will be reserved.' Just an educated guess, because consoles will, in all likelihood, still support sharing/recording features and may expand to bring new ones. I could be wrong, but let's see.
Hardly educated at all when you forgot to consider that a Zen CPU core is faster than a Jaguar core by a huge margin, and currently only 1 core is reserved on PS4/Xbox for OS functionality.
 

Firefly

Member
Jul 10, 2018
8,621
Hardly educated at all when you forgot to consider that a Zen CPU core is faster than a Jaguar core by a huge margin, and currently only 1 core is reserved on PS4/Xbox for OS functionality.
Actually, the seventh core was not fully available for games, as it was still split between OS and game usage; so technically "1.5" cores were still reserved for the OS on both Xbox One and PS4.
 

Firefly

Member
Jul 10, 2018
8,621
1 core on PS4. 1.5 cores on Xbox.

Also, 1.5 cores is still 1 core.
Incorrect. Even on the PS4 the seventh core was not fully available for games.
As with the current PlayStation 4, one core and a time-slice from another is reserved for the operating system.

At least in the words of developers who spoke about it, the unlocked performance on Xbox One was minor at best. (Yes, I know how Zen 2 performs, before you mention it again.)
"Yes, we are using it. [On benefits] Not a lot apparently but we are using it. You can only use 60 or 70% of it so that is not big of a difference. Essentially it won't make much of an impact," Swen stated to GamingBolt.

Let me rephrase, then: all 8 Zen 2 cores might not be available for games on PS5/Scarlett.
 

Deleted member 12790

User requested account closure
Banned
Oct 27, 2017
24,537
Actually, the seventh core was not fully available for games, as it was still split between OS and game usage; so technically "1.5" cores were still reserved for the OS on both Xbox One and PS4.

A core is either available to developers or not; there is no such thing as "1.5" cores, that is gibberish. When people talk about reserved cores, they are talking about a developer's ability to access a core to run a thread. It is the purpose of context switching inside any CPU to facilitate multitasking on a core; games themselves are not monolithic entities. A single game can be made up of lots of threads on a single core, which the context-switching piece of the kernel handles automatically. It doesn't matter if some of those tasks are things the operating system uses, because even the "game" itself can be a collection of threads full of automated tasks beyond the developer's control that are context-switched in and out anyway, like when you hand your program over to, say, a shared object that's part of some external library.

There is only one core reserved, as in inaccessible to programmers, not "1.5".
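To make that concrete, here's a minimal sketch (Linux-specific and purely illustrative, not console code) of several threads sharing one core while the kernel context-switches between them:

```cpp
// Four threads pinned to the same core. The kernel's context switcher
// interleaves them automatically; no thread "owns" the core, and the
// program never sees the switches happen.
#include <pthread.h>
#include <sched.h>
#include <cstdio>
#include <thread>
#include <vector>

// Ask the kernel to schedule the calling thread on `core` only.
void pin_to_core(int core) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
}

int main() {
    std::vector<std::thread> workers;
    for (int i = 0; i < 4; ++i) {
        workers.emplace_back([i] {
            pin_to_core(0);  // all four threads share core 0
            std::printf("thread %d ran on core 0\n", i);
        });
    }
    for (auto& t : workers) t.join();
}
```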
 

Firefly

Member
Jul 10, 2018
8,621
A core is either available to developers or not; there is no such thing as "1.5" cores, that is gibberish. When people talk about reserved cores, they are talking about a developer's ability to access a core to run a thread. It is the purpose of context switching inside any CPU to facilitate multitasking on a core; games themselves are not monolithic entities. A single game can be made up of lots of threads on a single core, which the context-switching piece of the kernel handles automatically. It doesn't matter if some of those tasks are things the operating system uses, because even the "game" itself can be a collection of threads full of automated tasks beyond the developer's control that are context-switched in and out anyway, like when you hand your program over to, say, a shared object that's part of some external library.

There is only one core reserved, as in inaccessible to programmers, not "1.5".
The quotation marks were supposed to convey that I did not mean 1.5 in literal terms.
I guess the devs at Larian don't know what they're talking about when they say they can't utilize 100% of the unlocked core.
 

Deleted member 12790

User requested account closure
Banned
Oct 27, 2017
24,537
I guess the devs at Larian don't know what they're talking about when they say they can't utilize 100% of the unlocked core.

...the devs at Larian said the exact same thing I did. That's the entire point of context switching: to divvy up a core among multiple threads to provide the illusion of multitasking. That's a fundamental part of a kernel; it's why kernels exist in the first place. A core is either reserved or not. 1.5 cores are not reserved; only 1 core is reserved, and the others are general-purpose cores that context-switch threads like any other. That the operating system is one of the threads running on those general-purpose cores is completely unremarkable, not out of the ordinary, and not at all relevant when talking about reserved cores. Reserved, when talking about a core, refers to one's ability to execute a thread on said core. It's a binary proposition. Can the developer access that core, push a thread to the context switcher, and have the kernel switch execution to it? Yes, they can; thus the core is not reserved.
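As a hedged sketch of that binary proposition (assuming a Linux-style affinity API; the 8-core layout is an assumption, not any console's actual interface), "reserved" boils down to whether a scheduling request for that core succeeds:

```cpp
// Can I schedule a thread on this core at all? On Linux,
// pthread_setaffinity_np fails if the requested core is not available
// to the process, which is analogous to a console OS refusing to run
// game threads on its reserved core.
#include <pthread.h>
#include <sched.h>
#include <cstdio>

bool core_accessible(int core) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    // A nonzero return means the kernel will not run us there.
    return pthread_setaffinity_np(pthread_self(), sizeof(set), &set) == 0;
}

int main() {
    // Hypothetical 8-core layout, checked one core at a time.
    for (int core = 0; core < 8; ++core)
        std::printf("core %d: %s\n", core,
                    core_accessible(core) ? "available" : "reserved");
}
```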
 

Firefly

Member
Jul 10, 2018
8,621
That the operating system is one of the threads running on those general-purpose cores is completely unremarkable, not out of the ordinary, and not at all relevant when talking about reserved cores.
It's relevant to the fact that a select number of threads are not accessible by the games. And that's what I said before: that the seventh core is split between OS and game resources.
 

Deleted member 12790

User requested account closure
Banned
Oct 27, 2017
24,537
It's relevant to the fact that a select number of threads are not accessible by the games.

lol what? This sentence is actual gibberish. The games are the threads. You're saying that running different threads on a core means different threads are sometimes running, which is... yeah? Of course? That's what context switchers do.

Not all threads executed on a core are under the explicit control of the programmer. Again, shared objects dynamically loaded at execution do this all the time. Your program will, without your explicit command, spawn countless other threads, which live and die on cores and get context-switched in and out by the kernel thousands of times per second.
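A quick Linux-only illustration (the /proc interface is standard Linux, nothing console-specific): listing /proc/self/task shows every kernel-visible thread in a process, including ones no programmer explicitly spawned:

```cpp
// List /proc/self/task to count the threads in this process. Runtimes
// and dynamically loaded shared objects routinely create threads the
// programmer never asked for, and they all show up here alongside the
// "game" threads.
#include <cstdio>
#include <filesystem>

int main() {
    int count = 0;
    for (const auto& entry :
         std::filesystem::directory_iterator("/proc/self/task")) {
        (void)entry;  // each directory entry is one thread's ID
        ++count;
    }
    std::printf("threads in this process: %d\n", count);
}
```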

That other programs execute on the same core doesn't mean that core is in any way restricted. That's just not what the word "restricted" means. If you're going to use this general definition, where "restricted" means any time you, the actual programmer, have your thread execution taken away by the context switcher for some external program, then literally every single processor since like 1990 is "restricted."
 

Maple

Member
Oct 27, 2017
11,721
Graphically you won't get the huge jumps. Yes, there will be major improvements, but the components are going to allow for astronomical jumps in other areas.

The Zen 2 CPUs, the 16+ GB of GDDR6, and the implementation of the SSDs are going to allow for far more complex worlds and simulations. AI, mob density, mob behavior, world size and complexity, how quickly all of it loads, dynamic systems, etc. It's going to be a massive jump over what we have now.
 

PHOENIXZERO

Member
Oct 29, 2017
12,066
I think we are approaching a point of diminishing returns on visual fidelity; I don't feel a need to go beyond 1080p and 4K textures.

What I would like to see is GPUs being used more for real-time rendering of things with computationally expensive physics. Stuff like hair, water, lighting.

We are probably several gens away from having water that acts like water though.
People said the exact same thing five years ago, but yeah, this time much of the increased GPU power is going to be devoted to hitting, or trying to hit, native 4K at 30fps, especially in certain genres.
 

Firefly

Member
Jul 10, 2018
8,621
That other programs execute on the same core doesn't mean that core is in any way restricted. That's just not what the word "restricted" means. If you're going to use this general definition, where "restricted" means any time you, the actual programmer, have your thread execution taken away by the context switcher for some external program, then literally every single processor since like 1990 is "restricted."
Would you explain how the word "reserved" is used in this context, for my understanding?
As with the current PlayStation 4, one core and a time-slice from another is reserved for the operating system.
 
Nov 8, 2017
957
The obvious advances will be with things like load times and overall UI performance improvements. With games? Next gen will be a smaller leap than we've ever seen. TDP and cost constraints will put us in the 2060 Super / 5700 XT range PC-wise. That's midrange now, and will be meh this time next year. Expect sub-native 4K and/or sub-60fps again next gen.
 
Dec 4, 2017
3,097
Next gen will be a smaller leap than we've ever seen. TDP and cost constraints will put us in the 2060 Super / 5700 XT range PC-wise. That's midrange now, and will be meh this time next year.
2060 Super / 5700 XT midrange? Unless I'm mistaken, the vast majority of gamers still play on GTX 10## or RX 5## series cards. The 2070/2080 cards are still "i hab teh big pipi" territory. The only time the 2070/80 might feel "mainstream performant" is when you switch on raytracing, and that's because it's an awful resource hog even with dedicated cores. I hope the next implementation is less sloppy in terms of compute utilization.
 

Md Ray

Member
Oct 29, 2017
750
Chennai, India
2060 Super / 5700 XT midrange? Unless I'm mistaken, the vast majority of gamers still play on GTX 10## or RX 5## series cards. The 2070/2080 cards are still "i hab teh big pipi" territory. The only time the 2070/80 might feel "mainstream performant" is when you switch on raytracing, and that's because it's an awful resource hog even with dedicated cores. I hope the next implementation is less sloppy in terms of compute utilization.
The RTX 2070 and Radeon RX 5700 XT are both mid-range graphics cards. It's to do with the die size: the 5700 XT, for example, has a ~250mm² die. The pricing is irrelevant; both are overpriced anyway. A 200-250mm² die has always been mid-range when it comes to graphics cards.
 
Oct 25, 2017
17,897
Graphically you won't get the huge jumps. Yes, there will be major improvements, but the components are going to allow for astronomical jumps in other areas.

The Zen 2 CPUs, the 16+ GB of GDDR6, and the implementation of the SSDs are going to allow for far more complex worlds and simulations. AI, mob density, mob behavior, world size and complexity, how quickly all of it loads, dynamic systems, etc. It's going to be a massive jump over what we have now.
I'm hyped for all that. That is all the stuff I will be examining closely.
 

icecold1983

Banned
Nov 3, 2017
4,243
The obvious advances will be with things like load times and overall UI performance improvements. With games? Next gen will be a smaller leap than we've ever seen. TDP and cost constraints will put us in the 2060 Super / 5700 XT range PC-wise. That's midrange now, and will be meh this time next year. Expect sub-native 4K and/or sub-60fps again next gen.

It isn't midrange. Keep in mind the top-end $1,200+ 2080 Ti is only 50% faster than these.
 

Astra Planeta

Member
Jan 26, 2018
668
I think we are approaching a point of diminishing returns on visual fidelity; I don't feel a need to go beyond 1080p and 4K textures.

What I would like to see is GPUs being used more for real-time rendering of things with computationally expensive physics. Stuff like hair, water, lighting.

We are probably several gens away from having water that acts like water though.

I mean, we are a long way from The Lion King CG quality in games. We have a ways to go still.

Raytracing will be a pretty big step in the right direction - hopefully both next-gen consoles have the proper hardware for it.
 

VariantX

Member
Oct 25, 2017
16,880
Columbia, SC
Thread started off discussing how next gen would look without getting into technical jargon, and now we're knee-deep in jargon xD

I think it's a lot harder to give a frame of reference without that jargon, because people who know their shit can give you a ballpark of what is potentially possible versus what exists now. Unless you want the thread to decline into Dragon Ball power-level comparisons.
 

TheMadTitan

Member
Oct 27, 2017
27,208
I think we are approaching a point of diminishing returns on visual fidelity; I don't feel a need to go beyond 1080p and 4K textures.

What I would like to see is GPUs being used more for real-time rendering of things with computationally expensive physics. Stuff like hair, water, lighting.

We are probably several gens away from having water that acts like water though.
Higher resolutions make for quick and easy anti-aliasing; it's why so many people (me included) downsample. I turn off AA, run my games at 1440p or 4K if possible, and then display them on my 1080p monitor. Jaggies are drastically reduced. Emulated games are another good example: assets haven't changed, but just increasing the rendering resolution to 1440p or 4K does wonders for image quality.

So even if nothing else changed, increasing resolutions beyond 1080p is great. Once everything is 4K native, then yeah, resolutions will have hit diminishing returns. Not going to get many jaggies after that. And if you do, the lowest-quality AA will take care of it.
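For anyone curious why that works, here's a minimal sketch (the grayscale Image type is a placeholder, not any engine's API): render at 2x resolution, then box-filter each 2x2 block down to one display pixel:

```cpp
// Downsampling as anti-aliasing: average each 2x2 block of high-res
// samples into one output pixel (a box filter).
#include <cstdint>
#include <vector>

struct Image {
    int width = 0, height = 0;
    std::vector<uint8_t> pixels;  // one byte per pixel, row-major
    uint8_t at(int x, int y) const { return pixels[y * width + x]; }
};

Image downsample2x(const Image& hi) {
    Image lo{hi.width / 2, hi.height / 2, {}};
    lo.pixels.resize(static_cast<size_t>(lo.width) * lo.height);
    for (int y = 0; y < lo.height; ++y) {
        for (int x = 0; x < lo.width; ++x) {
            // An edge that covered 1 of 1 sample now covers 0-4 of 4
            // samples, so hard staircase steps become in-between shades.
            int sum = hi.at(2 * x, 2 * y) + hi.at(2 * x + 1, 2 * y) +
                      hi.at(2 * x, 2 * y + 1) + hi.at(2 * x + 1, 2 * y + 1);
            lo.pixels[static_cast<size_t>(y) * lo.width + x] =
                static_cast<uint8_t>(sum / 4);
        }
    }
    return lo;
}
```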
 

GameAddict411

Member
Oct 26, 2017
8,513
Hmmm. Thanks. I did not know that.
Not entirely accurate. The FE has a very slight overclock, enough for regular cards with a better cooler to match the FE. From testing, my non-Founders Edition RTX 2080 Ti is matching or exceeding the average clock of the RTX 2080 Ti Founders Edition. It's not enough to put it in a different league. Not even the Titan RTX is, and that's only about 10% faster than the RTX 2080 Ti.
 

Isayas

Banned
Jun 10, 2018
2,729
Not entirely accurate. The FE has a very slight overclock, enough for regular cards with a better cooler to match the FE. From testing, my non-Founders Edition RTX 2080 Ti is matching or exceeding the average clock of the RTX 2080 Ti Founders Edition. It's not enough to put it in a different league. Not even the Titan RTX is, and that's only about 10% faster than the RTX 2080 Ti.

Oh ok. Thanks for clarifying.
 

JahIthBer

Member
Jan 27, 2018
10,376
Hardly educated at all when you forgot to consider that a Zen CPU core is faster than a Jaguar core by a huge margin, and currently only 1 core is reserved on PS4/Xbox for OS functionality.
I dunno about OS bloat, but 1 core will most likely be locked off anyway. When these parts are made, quite commonly a core or two won't work or won't perform well enough at its clock speeds, so instead of throwing away the whole SoC (in the PS5's case) because 1 CPU core didn't work, they will lock off 1 core in all of them. This was the case with the PS3 too: the Cell had its 8th SPE locked off because on some units it didn't work. It's a good way to save money, I suppose.
 

Deleted member 49535

User requested account closure
Banned
Nov 10, 2018
2,825
It's not just the graphics. Part of the reason levels/worlds this gen have been so small/claustrophobic is the underpowered Jaguar CPU both consoles shared (and the slow HDD strangling asset streaming).

Honestly, I'm more excited for larger, fully realized worlds than I am for the graphical improvement.
We've had small worlds this gen?