I still can't decide which one was worse: this trailer or AssCreed Unity's.
Provide evidence 2 Zen 2 cores are reserved on PS5 and Xbox Scarlett.

This. Resources aren't locked out of usage on Windows, while 2 out of 8 cores on next gen consoles will be reserved for OS-related tasks, sharing, recording and such. It's essentially a 6 core Zen 2 CPU.
'Will be reserved'. Just an educated guess, because consoles will, in all likelihood, still support sharing/recording features and may expand to bring new features. I can be wrong but let's see.
Just look at the AMD Radeon HD 7950 and 7970 GPUs: they launched around a year and nine months before the PS4 and Xbox One and were more powerful than them. As games were developed against the new hardware target of the 8th generation consoles, they were able to better utilize PC hardware, so more was done with the same hardware.
Hardly educated at all when you forgot to consider that a Zen CPU core is faster than a Jaguar core by a huge margin, and currently only 1 core is reserved on PS4/Xbox for OS functionality.
Actually the seventh core was not fully available for games, as the core is still split between OS and game usage, so technically "1.5" cores were still reserved for the OS on both Xbox One and PS4.
1 core on PS4. 1.5 cores on Xbox.
Incorrect. Even on the PS4 the seventh core was not fully available for games.
Also 1.5 cores is still 1 core
As with the current PlayStation 4, one core and a time-slice from another is reserved for the operating system.
"Yes, we are using it. [On benefits] Not a lot apparently but we are using it. You can only use 60 or 70% of it so that is not big of a difference. Essentially it won't make much of an impact," Swen stated to GamingBolt.
Actually the seventh core was not fully available for games as the core is still split between OS and game usage, so technically "1.5" cores were still reserved for the OS on both Xbox One and PS4.
A core is either available to developers or not; there is no such thing as "1.5" cores, that is gibberish. When people talk about reserved cores, they are speaking about a developer's ability to access a core to run a thread. It is the purpose of context switching inside of any CPU to facilitate multitasking on a core; games themselves are not monolithic entities. A single game can be made up of lots of threads on a single core, which the context-switching piece of the kernel handles automatically. It doesn't matter if some of those tasks are things the operating system uses, because even the "game" itself can be a collection of threads full of automated tasks beyond the developer's control that are context switched in and out anyway, like when you hand your program over to, say, a shared object that's part of some external library.

The quotation was supposed to convey that I did not mean 1.5 in literal terms.
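To make the point about context switching concrete, here is a hedged toy sketch (Python, with made-up subsystem names): a "game" is itself just a bundle of threads that the kernel's scheduler interleaves on whatever cores it is allowed to use.

```python
import threading

# Toy sketch only: a "game" as a collection of threads the OS scheduler
# context-switches freely. Core reservation on consoles is a platform OS
# policy, not something expressible in portable user code.
results = {}

def subsystem(name, iterations):
    # Each subsystem (audio, physics, AI...) is just another thread the
    # scheduler interleaves with everything else, OS tasks included.
    total = 0
    for i in range(iterations):
        total += i
    results[name] = total

threads = [threading.Thread(target=subsystem, args=(n, 1000))
           for n in ("audio", "physics", "ai")]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(results))  # all three "subsystems" ran to completion
```

Whether those threads share one core or eight, the program is correct either way; the scheduler decides where they actually run.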
There is only one core reserved, as in inaccessible to programmers, not "1.5".
I guess the devs at Larian don't know what they're talking about when they say they can't utilize 100% of the unlocked core.
That the operating system is one of the threads running on those general purpose cores is completely unremarkable, not out of the ordinary, and not at all relevant when talking about reserved cores.

It's relevant to the fact that a select number of threads are not accessible by the games. And that's what I said before, that the seventh core is split between OS and game resources.
People said the exact same thing five years ago, but yeah, this time much of the increased GPU power is going to be devoted to hitting, or trying to hit, native 4K at 30fps, especially in certain genres.
That other programs execute on the same core doesn't mean that core is in any way restricted. That's just not what the word restricted means. If you're going to use this general definition where restricted means any time you, the actual programmer, are having your thread execution taken away by the context switcher for some external program, then literally every single processor since like 1990 is "restricted."

Would you explain how the word reserved is used in this context, for my understanding?
As with the current PlayStation 4, one core and a time-slice from another is reserved for the operating system.
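"Reserved" there is a scheduling restriction imposed by the platform OS. A hedged, Linux-only sketch of the closest desktop analogue (these are ordinary desktop syscalls, not console APIs): a CPU affinity mask keeps the scheduler from ever running your threads on cores outside the mask.

```python
import os

# Hedged sketch, Linux-only (not a console API): an affinity mask is the
# closest user-visible analogue to an OS "reserving" cores -- the scheduler
# simply never runs this process outside the mask.
if hasattr(os, "sched_getaffinity"):
    allowed = os.sched_getaffinity(0)      # cores this process may use
    one_core = {min(allowed)}
    os.sched_setaffinity(0, one_core)      # pretend all other cores are reserved
    restricted = os.sched_getaffinity(0)
    os.sched_setaffinity(0, allowed)       # restore the original mask
    print(restricted == one_core)          # True
else:
    # Non-Linux platforms don't expose these calls.
    restricted = one_core = None
```

On a console the game never gets to set this itself; the platform OS hands it a fixed mask, which is what "one core and a time-slice of another" describes.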
2060 Super/5700X midrange? Unless I'm mistaken, the vast majority of gamers still play on GTX 10## or RX 5## series cards. The 2070/2080 cards are still "i hab teh big pipi" territory. The only time a 2070/80 might feel "mainstream performant" is when you switch on raytracing, and that's because it's an awful resource hog even with dedicated cores. I hope the next implementation is less sloppy in terms of compute utilization.
RTX 2070 and Radeon RX 5700 XT are both mid range graphics cards. It's to do with the die size: the 5700 XT, for example, has a ~250mm² die. The pricing is irrelevant, both are overpriced anyway. A 200-250mm² die has always been mid range when it comes to graphics cards.
Graphically you won't get the huge jumps. Yes, there will be major improvements, but the components are going to allow for astronomical jumps in other areas.

The Zen 2 CPUs, the 16+ GB of GDDR6, and the implementation of the SSDs are going to allow for far more complex worlds and simulations. AI, mob density, mob behavior, world size and complexity, how quickly all of it loads, dynamic systems, etc. It's going to be a massive jump over what we have now.

I'm hyped for all that. That is all the stuff I will be examining closely.
The obvious advances will be with things like load times and overall UI performance improvements. With games? Next gen will be a smaller leap than we've ever seen. TDP and cost restraints will put us in the 2060 Super, 5700X range PC wise. That's midrange now, and will be meh this time next year. Expect sub native 4K, and/or sub 60fps again next gen.
It isn't midrange. Keep in mind the top end $1200+ 2080 Ti is only 50% faster than these.

50% is a significant percentage.
The fastest GPU available being only 50% faster is a pretty good place to be. Worst case, next year the fastest GPU will be maybe twice as fast.
And the 5700XT most definitely is mid range. There's the upcoming 5800XT, 5900XT, and then the 5600 and 5500
2080, 2080 Super and 2080 Ti are high end. 2080 Ti and TITAN RTX are very high end.

Goddamn @ 2080 Ti and Titan RTX.
I think we are approaching a point of diminishing returns on visual fidelity; I don't feel a need to go beyond 1080p and 4K textures.
What I would like to see is GPUs being used more for real-time rendering of things with computationally expensive physics. Stuff like hair, water, lighting.
We are probably several gens away from having water that acts like water though.
I mean, we are a long way from Lion King CG quality in games. We have a ways to go still.
Raytracing will be a pretty big step in the right direction - hopefully both next consoles have the proper hardware for it.
2080, 2080 Super and 2080 Ti are high end. The 2080 Ti Founders Edition and TITAN RTX are very high end.
What are the differences between the 2080 Ti and the 2080 Ti Founders Edition?
Thread started off discussing how next gen would look without getting into technical jargon, and now we're knee deep in jargon xD
Higher resolutions make for quick and easy anti-aliasing; it's why so many people (me included) downsample. I turn off AA, run my games at 1440p or 4K if possible, and then display them on my 1080p monitor. Jaggies are drastically reduced. Emulated games are another good example: the assets haven't changed, but just increasing the rendering resolution to 1440p or 4K does wonders for image quality.
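The downsampling trick described above is just supersampling with a box filter. A hedged sketch with toy numbers (no real renderer, just a 2D grid standing in for pixels): render at a higher internal resolution, then average blocks down to the display resolution, turning hard stair-steps into intermediate shades.

```python
# Toy supersampling sketch: average factor x factor blocks of a 2D
# grayscale "image" down to the display resolution (box filter).
def downsample(img, factor):
    """Average factor x factor blocks of a 2D grayscale image."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [img[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard diagonal edge rendered at 4x4 internal resolution...
hi_res = [
    [0, 0, 0, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
]
# ...becomes soft intermediate values at 2x2 display resolution.
print(downsample(hi_res, 2))  # [[0.0, 0.75], [0.75, 1.0]]
```

The 0.75 pixels are the smoothed jaggies: the display pixel covers part edge, part background, so it gets a blended value instead of a hard step.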
Not entirely accurate. The FE has a very slight overclock, enough for regular cards to match the FE with a better cooler. From testing, my non-Founders Edition RTX 2080 Ti is matching or exceeding the average clock of the RTX 2080 Ti Founders Edition. It's not enough to put it in a different league. Not even the RTX Titan, which is about 10% faster than the RTX 2080 Ti, is.
I dunno about OS bloat; 1 core will most likely be locked off anyway, because when these parts are made, quite commonly a core or 2 won't work or won't perform well enough at its clock speeds. So instead of throwing away the whole SoC in the PS5's case because 1 CPU core didn't work, they will lock off 1 core in all of them. This was the case with the PS3 too: the Cell had its 8th SPE locked off because on some units it didn't work. It's a good way to save money, I suppose.
It's not just the graphics. Part of the reason levels/worlds this gen have been so small/claustrophobic is the underpowered Jaguar CPU both consoles shared (and the slow HDD strangling asset streaming). Honestly, I'm more excited for larger, fully realized worlds than I am the graphical improvement.

We've had small worlds this gen?
We haven't.