
When will the first 'next gen' console be revealed?

  • First half of 2019

    Votes: 593 15.6%
  • Second half of 2019 (let's say post-E3)

    Votes: 1,361 35.9%
  • First half of 2020

    Votes: 1,675 44.2%
  • 2021 :^)

    Votes: 161 4.2%

  • Total voters
    3,790
  • Poll closed.
Oct 25, 2017
17,934
But it will also be a colossal CPU jump next gen, which will do wonders for scene complexity, draw calls, AI, NPC count, physics, etc. Combine that with a baseline GPU improvement of 1.3 TF to 8 TF (and that's the worst case scenario, remember) as well as at least doubling RAM to 16GB (which likely means around triple the RAM available to devs, assuming the OS will use roughly the same, maybe another 1-2GB), and I still think that puts us on course for games that will look truly next gen, regardless of how that shapes up percentage-wise against previous gen-over-gen upgrades.
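To put rough numbers on the RAM point — a back-of-envelope sketch where the OS reserve figures are my assumptions for illustration, not confirmed specs:

```python
# Back-of-envelope for developer-available RAM.
# OS reserve figures are assumptions for illustration, not confirmed specs.
current_total_gb, current_os_gb = 8, 3.0   # current gen: ~3GB reserved is a common estimate
next_total_gb, next_os_gb = 16, 3.0        # next gen, assuming a similar OS footprint

current_avail = current_total_gb - current_os_gb   # ~5 GB for games today
next_avail = next_total_gb - next_os_gb            # ~13 GB for games next gen

print(f"{next_avail / current_avail:.1f}x dev-available RAM")  # ~2.6x
# If the OS grows by another 1-2GB, this drops to ~2.2-2.4x,
# so "around triple" is the optimistic end of the range.
```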

We've got to remember that God of War, Spider-Man, Uncharted 4, and Horizon Zero Dawn, all run on 1.8 TF machines with a god awful CPU, and look absolutely stunning. If an 8 TF machine means we'll still need dynamic res with checkerboarding until another mid gen refresh... honestly I don't think that's so bad at all.

EDIT: And I wouldn't be surprised if this is Microsoft's strategy with multiple boxes: Here's the 8 TF baseline box that upscales. Here's your 12 TF native 4K box, but you'll be paying a lot more for it...
There really is no argument against this. It happens gen after gen. The games end up being leagues ahead of what one would expect given the specs. It won't matter.

The complaining is unnecessary.
 

Deleted member 12635

User requested account closure
Banned
Oct 27, 2017
6,198
Germany
Obviously there is a lot more that goes into the design of a compute unit than the number of ALUs.
The competition just went down to 64 ALUs per compute unit with Turing, and performance/watt was retained.
Performance/mm² decreased a lot, but Turing also carries twice as many registers and larger caches, together with Tensor and Ray Tracing cores.
You could cut those down and improve perf/mm², at least for current applications.
Whatever the changes are, I hope they implement "Concurrent Execution of Floating Point and Integer Instructions".
 
Jan 21, 2019
2,903
There really is no argument against this. It happens gen after gen. The games end up being leagues ahead of what one would expect given the specs. It won't matter.

The complaining is unnecessary.

The problem is not the specs, the problem is the resolution. 4K is extremely power hungry. I'd be wetting my pants if the PS5 was 6TF at 1080p, but it is probably 8TF at 4K, which is not that impressive. Yeah, the games will look great, but I can't imagine the jump being crazy. Red Dead runs at 4K on the One X with 6TF, so the PS5 having 8TF in 2020 is suspicious. I trust the devs and the console makers to make great products; they just have to prove that 4K is not sucking the lifeblood out of next gen.
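Quick arithmetic behind the resolution worry (just pixel counts, nothing console-specific):

```python
# 4K shades four times the pixels of 1080p every frame.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_4k = 3840 * 2160      # 8,294,400
print(pixels_4k / pixels_1080p)  # 4.0

# So, very roughly, an 8TF GPU at native 4K has about the per-pixel
# budget of a 2TF GPU at 1080p, before any architectural improvements.
print(8 / (pixels_4k / pixels_1080p))  # 2.0 "TF-equivalent" at 1080p
```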
 

Deleted member 40133

User requested account closure
Banned
Feb 19, 2018
6,095
Those comments by Penello are very telling. He's no longer employed by them and has no skin in the game anymore, so he could hype people up with vague hints without getting in shit. But instead he's tempering expectations.
 

Papacheeks

Banned
Oct 27, 2017
5,620
Watertown, NY
He's so right. Every single console reveal I can recall has been met with frustration and concerns from hardware enthusiasts. And yet every console gen has produced phenomenal results with said hardware.

Thinking about it... while the Pro and X have 4-6 TF the games they run look equally gorgeous on the respective 1-2 TF machines. If 8 TF becomes the new baseline now instead of 1.3 TF... coupled with more RAM and a colossal CPU upgrade (the one thing we are guaranteed lol) there really isn't anything to worry about. Devs will still produce mindblowingly gorgeous games, and maybe checkerboarding will just still be a thing for performance profiles trying to hit higher frame rates.

This has been my understanding. People who are so focused on the TF count forget that developers have been able to give us gorgeous-looking games on base consoles that are five years old, if not older tech-wise.

Good point: the weak CPU of the consoles allowed 144fps to flourish, because game complexity was made for consoles and not for an i5 or i7. Does this mean that high end on PC will become even more expensive?

No, if anything it will get cheaper. Right now it seems the core war has started between CPUs, and that means cheaper prices for more cores and scalability. GPUs will start to get better when Navi comes out: more efficiency for less. And that goes hand in hand with the chips for consoles. Depending on how well consoles sell this new gen, you will see more 7nm being adopted for CPUs/GPUs. If Navi is going to be a replacement for Polaris but gives you RTX 2070 performance with a small wattage footprint, it will disrupt what Nvidia is doing. They will have to either bring out lower prices, or bring out 7nm cards that offer better performance per watt for the price.

2020 is where we will start to see better pricing across the whole board, especially since by then GDDR6 will be more widely produced, and in larger amounts if it is being used by consoles.

But consoles being made for next gen will, if anything, push Intel to do what AMD is doing: 7nm with a lot of cores, and a better version of hyperthreading or something equivalent to Infinity Fabric.
We were in this stagnant place a couple of years ago; part of that was because consoles were on older nodes, but the bigger issue was Intel wasn't pushing for more cores, except in their high-end products that started at $800 and up — CPUs that now get outperformed by CPUs that are $300.
But to answer your initial question, PC gaming will be cheaper, and I think with MS's push this gen it will become even more mainstream.
 

Sowrong

Banned
Oct 29, 2017
1,442
Those comments by Penello are very telling. He's no longer employed by them and has no skin in the game anymore, so he could hype people up with vague hints without getting in shit. But instead he's tempering expectations.
He showed he's not really a technical guy back in the GAF days; not sure why his opinion would mean much in regards to this.
 
Oct 26, 2017
6,151
United Kingdom
On the subject of the next-gen strategy discussion, it's somewhat amusing that so many in this thread have been bullish on the importance of hardware power and services over games in consumer purchasing decisions, whereas this thread clearly indicates otherwise.

And we're ERA, not even the casual gaming majority audience.
 
Oct 27, 2017
7,163
Somewhere South
TFlop flat-Earther :D

On the subject of the next-gen strategy discussion, it's somewhat amusing that so many in this thread have been bullish on the importance of hardware power and services over games in consumer purchasing decisions, whereas this thread clearly indicates otherwise.

As long as you have a reasonably priced box (and the PS3 casts some shadows even on that), software has always been and always will be king. I mean, Nintendo is still in the game, and the sole reason for that is that their software is desirable.
 

Thorrgal

Member
Oct 26, 2017
12,468
Don't know if this has been posted.

Edit: it seems it has been translated from Japanese, so there are some spelling mistakes.


https://www.reddit.com/r/PS5/comments/akqdr8/new_ps5_info_leaked/


Summary:

No official PS5 information from Sony soon (Sony will not participate in E3); more an "Apple"-style announcement than a reveal event. I'd even say an official announcement at PSX in the first quarter of 2020. Sony will be a little quiet this year.

- PS5 99% releasing November 2020
- Backward compatibility
- Physical games & PS Store
- PS Plus and PS Plus Premium (Premium: beta early access, private server creation, free 4K remasters)
- Specs: custom Ryzen CPU [email protected]; GPU on the Navi architecture at 10TF, Sony working with AMD on Navi; some raytracing, but that is not the focus
- 16GB GDDR6 + 4GB DDR4 for the OS, with a new memory controller for the OS RAM; 32GB development kits already exist
- 8K upscaling
- 2TB HDD plus some NAND flash
- 2020 - PSVR 2 alongside PS5: resolution boost, 90-120 Hz, 220° view, eye tracking, some foveated rendering, wireless, headphone integration, less motion sickness, no breakout box, far less cable management; AAA games focused on VR; price around $349
- DualShock 5: camera inside for VR, USB-C
- Price $399, initial loss of $40 to $60 per console

PS5 exclusive launch games known:

Gran Turismo Sport with better graphical fidelity, ray tracing, complete VR support
PUBG remastered in 4K, F2P using PS Plus, only with PS5
The Last of Us Part II PS4/PS5
Ghost of Tsushima PS4/PS5
First 6 months: +2 AAA games + PSVR 2 games

Non-exclusive PS5 games 2020:

Battlefield Bad Company 3, set in Vietnam, already in development for 2 years
Harry Potter
Cyberpunk
GTA 6: rumors of Miami and New York, the two largest cities so far; probably nothing related to PS4; holiday 2021

If that's not legit, I think it's nevertheless right on the money.

Perfect price, and the best specs you can expect for that price. It's even conservative on the loss they'll take.

Why do people think it's fake? I missed all the convo.

Never heard the Battlefield Bad Company 3 set-in-Vietnam rumor before; maybe that's been debunked?
 

MrKlaw

Member
Oct 25, 2017
33,235
It's rather unusual for all games to make full use of peak math throughput. Certainly doesn't happen often on current gen h/w.

I think the post wasn't making the assertion that it would be 100% utilised, more that the majority of games would get full benefit from 100% of the silicon — not having a significant amount being used for a specialised feature that many devs would not leverage.
 

Locuza

Member
Mar 6, 2018
380
My point about power efficiency was in relation to GDDR6 versus 5. My comment about bandwidth was GDDR6 versus HBM2, which is why I cited the stack count. If you normalized the speed, the efficiency would be even higher than 16%. The fact that it can run at twice the speed at lower voltages than GDDR5 is significant. It also has a granularity advantage. It's a game-changing difference between the technologies. If we were still stuck with GDDR5, we'd be wringing our hands about how even with a 384-bit bus, we still can't feed Vega/GCN derivatives with the bandwidth they obviously seem to need based on desktop parts.
I asked the German user dargo if he could run some scaling tests with his Vega 64 LCE.
He ran the HBM2 memory at three different speeds: 384 GB/s (100%), 448 GB/s (~117%) and 512 GB/s (~133%).
Now, the core clock suffered here and there because of the power budget, but across three games the difference between 384GB/s and 512GB/s (+33% more bandwidth) was fairly low.
The Witcher 3 ran 6% faster (104 vs. 98 FPS), Wolfenstein II ~7% and For Honor ~9%.
The difference between 448GB/s (100%) and 512GB/s (114%) is really low.
https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11739346#post11739346

You would get 384GB/s with a 384-bit GDDR5 interface @8Gbps, and 320GB/s if you went for 256-bit GDDR5 @10Gbps (1.55V).
Obviously GDDR6 makes it much easier to reach that bandwidth without a wide interface, and efficiency-wise you would take everything you could get, but the bandwidth improvement is far lower than some anticipate.
Without higher energy consumption for the memory you won't get much higher (>20%) bandwidth than before.

I think that without blowing up power consumption and production costs, many technical aspects will be far lower than some imagine. Or the price will need to go up, which the customer and/or the producer will have to pay.
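For reference, the bus-width figures above fall out of a one-line formula — a quick sketch (the 14 Gbps GDDR6 grade is my example, not from the post):

```python
# Peak memory bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate.
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(384, 8))    # 384.0 -- 384-bit GDDR5 @ 8 Gbps
print(bandwidth_gb_s(256, 10))   # 320.0 -- 256-bit GDDR5 @ 10 Gbps
print(bandwidth_gb_s(256, 14))   # 448.0 -- 256-bit GDDR6 @ 14 Gbps
```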
 
Oct 26, 2017
6,151
United Kingdom
As long as you have a reasonably priced box (and the PS3 casts some shadows even on that), software has always been and always will be king. I mean, Nintendo is still in the game, and the sole reason for that is that their software is desirable.

I'd argue the PS3 still proves the point, since global sales only blew up after Sony's aggressive cost reductions moved the PS3 into a more reasonable pricing bracket.

It's rather unusual for all games to make full use of peak math throughput. Certainly doesn't happen often on current gen h/w.

I think you know that wasn't my intended meaning, good sir.

I think the post wasn't making the assertion that it would be 100% utilised, more that the majority of games would get full benefit from 100% of the silicon — not having a significant amount being used for a specialised feature that many devs would not leverage.

Thank you.
 

Kyoufu

Member
Oct 26, 2017
16,582
The best thing to happen in this thread is 8TFers being recognised as the equivalent of Flat-Earthers lmao
 

Deleted member 5764

User requested account closure
Banned
Oct 25, 2017
6,574
Finally got around to watching that DF video and it was interesting! Doesn't feel like we learned a ton aside from knowing that both manufacturers have dev kits out there in some form by now. I don't see why people are so focused on the "prepare for disappointment" line. Reasonable folks in this thread have been saying that for quite some time. Literally every console that comes out has people fawning over the revolutionary tech that "could" be inside. The raw specs almost always disappoint on paper, but what devs accomplish with said hardware is always impressive to me.
 

NoTime

Member
Oct 30, 2017
250
I like the demo, but it is nowhere near as complex as a scene in a modern game; BF5 is a much better example. Also, shading is not optimized for raytracing: all modern engines have tons of specialized shaders, and that is a problem for the instruction cache. For raytracing they only need a few generalized shaders, which will improve performance, but currently we can't have multiple RT effects with software raytracing at the same time. It is a demo at 1080p, and the GDC demo on Volta was running with 1 ray per pixel; when you improve the quality with more rays per pixel, it becomes difficult for Volta.

Reading devs, they expect raytraced shadows and raytraced AO for next generation, because those are the easiest to do and shadow secondary rays are memory coherent (cache friendly).

With the PowerVR solution, raytraced shadows are two times faster than shadow maps, with better quality. Imo, ugly shadow map artifacts are the worst artifacts in real-time rendering:

https://www.imgtec.com/blog/ray-traced-shadows-vs-cascaded-shadow-maps/

Other artifacts not solved by raytracing: undersampling, motion blur quality, depth of field quality, and geometry complexity*.

* in Dreams you can do much better hair, round objects and better foliage easily compared to current rasterization

EDIT: Explanation about shaders and raytracing by a dev

I watched the whole DD presentation; it actually runs only 1 ray per 4 pixels, and after quite a few approximation and upscaling tricks gives a result equal to around 1 ray per pixel. Their ray budget for the demo is just 0.5 gigarays. Also, Tomasz (lead programmer on this thing) says that they actually need more raw shading power to get RT stuff better, not more rays per se.

Personally, I'm not sold on RT yet. Yeah, new tech is cool and all, but we have only one game with HW RT so far and it is really underwhelming. I know that it not only can make games more photorealistic and such, but will also make lighting artists' work easier. But so far, handmade "hacked" solutions look just as good: lighting in ND games, or reflections in the new Hitman 2. I won't be upset if we don't get any real RT next gen.
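Rough arithmetic on that ray budget (the resolution and frame rate below are my assumptions; the presentation only gives the 0.5 gigaray figure):

```python
# How far a 0.5 gigaray/s budget goes at "1 ray per 4 pixels".
width, height = 1920, 1080      # assumed 1080p output
rays_per_pixel = 1 / 4
fps = 30                        # assumed frame rate

primary_per_frame = width * height * rays_per_pixel    # ~518,400
primary_per_second = primary_per_frame * fps           # ~15.6M

budget = 0.5e9                                         # 0.5 gigarays/s
print(f"~{budget / primary_per_second:.0f}x headroom") # ~32x, left for bounces
# and extra samples -- loosely consistent with the claim that shading
# power, not raw ray count, is the bottleneck.
```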
 

anexanhume

Member
Oct 25, 2017
12,918
Maryland
I asked the German user dargo if he could run some scaling tests with his Vega 64 LCE.
He ran the HBM2 memory at three different speeds: 384 GB/s (100%), 448 GB/s (~117%) and 512 GB/s (~133%).
Now, the core clock suffered here and there because of the power budget, but across three games the difference between 384GB/s and 512GB/s (+33% more bandwidth) was fairly low.
The Witcher 3 ran 6% faster (104 vs. 98 FPS), Wolfenstein II ~7% and For Honor ~9%.
The difference between 448GB/s (100%) and 512GB/s (114%) is really low.
https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11739346#post11739346

You would get 384GB/s with a 384-bit GDDR5 interface @8Gbps, and 320GB/s if you went for 256-bit GDDR5 @10Gbps (1.55V).
Obviously GDDR6 makes it much easier to reach that bandwidth without a wide interface, and efficiency-wise you would take everything you could get, but the bandwidth improvement is far lower than some anticipate.
Without higher energy consumption for the memory you won't get much higher (>20%) bandwidth than before.

I think that without blowing up power consumption and production costs, many technical aspects will be far lower than some imagine. Or the price will need to go up, which the customer and/or the producer will have to pay.
That would only be good news then, because they could get away with a 256-bit bus, even with CPU bandwidth needs. We'll have to see how games' demands on memory change over time. TW3 will be, what, five years old by the time next gen hits?
 

Marble

Banned
Nov 27, 2017
3,819
A coworker today explained ray tracing to me and showed me this video.

I was quite impressed by it.

Do these graphics seem possible to y'all on the new consoles or are these too advanced for the rumored specs?

Sorry, I'm kind of a noob when it comes to this stuff.



I am still not sure what I am looking at. It's basically better (and real-time) reflections, right? That's nice and all, but do we really think we are gonna miss this when playing a fast paced shooter? Or is it more than that?
 
Oct 26, 2017
6,151
United Kingdom
The best thing to happen in this thread is 8TFers being recognised as the equivalent of Flat-Earthers lmao

Lol

Finally got around to watching that DF video and it was interesting! Doesn't feel like we learned a ton aside from knowing that both manufacturers have dev kits out there in some form by now. I don't see why people are so focused on the "prepare for disappointment" line. Reasonable folks in this thread have been saying that for quite some time. Literally every console that comes out has people fawning over the revolutionary tech that "could" be inside. The raw specs almost always disappoint on paper, but what devs accomplish with said hardware is always impressive to me.

Couldn't agree more.

I watched the whole DD presentation; it actually runs only 1 ray per 4 pixels, and after quite a few approximation and upscaling tricks gives a result equal to around 1 ray per pixel. Their ray budget for the demo is just 0.5 gigarays. Also, Tomasz (lead programmer on this thing) says that they actually need more raw shading power to get RT stuff better, not more rays per se.

Personally, I'm not sold on RT yet. Yeah, new tech is cool and all, but we have only one game with HW RT so far and it is really underwhelming. I know that it not only can make games more photorealistic and such, but will also make lighting artists' work easier. But so far, handmade "hacked" solutions look just as good: lighting in ND games, or reflections in the new Hitman 2. I won't be upset if we don't get any real RT next gen.

Agreed.
 

Deleted member 5764

User requested account closure
Banned
Oct 25, 2017
6,574
Those comments by Penello are very telling. He's no longer employed by them and has no skin in the game anymore, so he could hype people up with vague hints without getting in shit. But instead he's tempering expectations.

I don't think Albert wants to ruin any relationships with folks at Microsoft just for 5 minutes of internet fame in a next-gen speculation thread. He's also posted in this thread already with a similar goal: making people aware that consoles have to compromise to get to an acceptable price point. His last set of posts also did a great deal to explain why BoM isn't the only factor in deciding MSRP.

I am still not sure what I am looking at. It's basically better (and real-time) reflections, right? That's nice and all, but do we really think we are gonna miss this when playing a fast paced shooter? Or is it more than that?

That's what honestly confuses me about RT. I know that it's an exciting concept to talk about right now, but I just don't see it feeling revolutionary in the long run. If anything, I think we'd just get used to it and maybe be a bit confused when we go back to games that don't have it.
 
Oct 27, 2017
4,018
Florida
Ray tracing is a huge leap in realism for me. It's the little things like realistic reflections and real-time dynamic shadows that make a game come alive, and they can lead to some crazy gameplay elements. Imagine a Splinter Cell game where gameplay changes with each playthrough depending on where real-time shadows are cast. Or a Jason and the Argonauts game where you battle Medusa and have to rely on the reflection in your shield because you can't look at her directly. Some really cool shit will be done in the future for sure.
 
Oct 27, 2017
7,163
Somewhere South
That's what honestly confuses me about RT. I know that it's an exciting concept to talk about right now, but I just don't see it feeling revolutionary in the long run. If anything, I think we'd just get used to it and maybe be a bit confused when we go back to games that don't have it.

Biggest thing about RT is not how much better it will make stuff look, but that it's about as simple and straight-forward as it gets. No hacks, no shortcuts, just pure brute force. Place a light and you'll get accurate shadows, indirect lighting, AO etc, for it, without need to compute and bake GI, bake shadow maps, etc.

You can achieve 95% of the visuals you'll get from RT by going with some other technique, like SVOGI, for a fraction of the computational cost, but then you have to deal with implementation, with how things will interact with it, etc.
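A minimal sketch of the "place a light and shadows just work" point: one shadow ray per shaded point against a sphere occluder (toy scene; all names and numbers are made up for illustration):

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Standard quadratic ray/sphere intersection test; direction must be normalized."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (direction[0]*ox + direction[1]*oy + direction[2]*oz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4*c
    return disc >= 0 and (-b - math.sqrt(disc)) / 2 > 1e-4  # hit in front of origin

def in_shadow(point, light, center, radius):
    """Shoot one ray from the shaded point toward the light."""
    d = [light[i] - point[i] for i in range(3)]
    n = math.sqrt(sum(x*x for x in d))
    return ray_hits_sphere(point, [x / n for x in d], center, radius)

# Move the light and the shadow follows -- nothing to re-bake.
print(in_shadow((0, 0, 0), (0, 5, 0), (0, 2, 0), 0.5))  # True: sphere blocks the light
print(in_shadow((0, 0, 0), (5, 0, 0), (0, 2, 0), 0.5))  # False: light moved, path is clear
```

A real implementation would also cap the hit distance at the light and trace many such rays per pixel, but the workflow point stands: no shadow maps to bake or tune.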
 
Oct 26, 2017
6,151
United Kingdom
Ray tracing is a huge leap in realism for me. It's the little things like realistic reflections and real-time dynamic shadows that make a game come alive, and they can lead to some crazy gameplay elements. Imagine a Splinter Cell game where gameplay changes with each playthrough depending on where real-time shadows are cast. Or a Jason and the Argonauts game where you battle Medusa and have to rely on the reflection in your shield because you can't look at her directly. Some really cool shit will be done in the future for sure.

None of your examples require RT.

RT is purely about lighting and graphics, and primarily about the MASSIVE improvements in developer workflow during development.

In a way, it means you could easily see smaller developers effortlessly pushing out RDR2-level visuals, because of how drastically development iteration is reduced. You need only create the art assets, place the lighting in a scene, and tweak in realtime. No need for expensive baking and the iterative back-and-forth between art and tech to get a scene to approach a desired look.

You could even start seeing bigger, richer, more detailed game worlds as a result, with the dev-time savings focused primarily on content creation.

It is revolutionary technology, when you have the hardware to provide it fully. At the moment, however, especially with next-gen console discussion, I feel the hype is premature.
 

Deleted member 5764

User requested account closure
Banned
Oct 25, 2017
6,574
Biggest thing about RT is not how much better it will make stuff look, but that it's about as simple and straight-forward as it gets. No hacks, no shortcuts, just pure brute force. Place a light and you'll get accurate shadows, indirect lighting, AO etc, for it, without need to compute and bake GI, bake shadow maps, etc.

You can achieve 95% of the visuals you'll get from RT by going with some other technique, like SVOGI, for a fraction of the computational cost, but then you have to deal with implementation, with how things will interact with it, etc.

Thanks for this! That's the kind of explanation I hadn't seen before and it makes a ton of sense.
 

Deleted member 40133

User requested account closure
Banned
Feb 19, 2018
6,095
I don't think Albert wants to ruin any relationships with folks at Microsoft just for 5 minutes of internet fame in a next-gen speculation thread. He's also posted in this thread already with a similar goal. Make people aware that consoles have to compromise to get to an acceptable price point. His last set of posts also did a great deal to explain why BoM isn't the only factor in deciding MSRP.



That's what honestly confuses me about RT. I know that it's an exciting concept to talk about right now, but I just don't see it feeling revolutionary in the long run. If anything, I think we'd just get used to it and maybe be a bit confused when we go back to games that don't have it.

Penello is 100% not burning any bridges with his comments, and he definitely did not comment intending for forums to pick it up. All he is saying is to temper your expectations, which is something he would never have said as an employee of MS; history shows that.
 

chris 1515

Member
Oct 27, 2017
7,075
Barcelona Spain
I watched the whole DD presentation; it actually runs only 1 ray per 4 pixels, and after quite a few approximation and upscaling tricks gives a result equal to around 1 ray per pixel. Their ray budget for the demo is just 0.5 gigarays. Also, Tomasz (lead programmer on this thing) says that they actually need more raw shading power to get RT stuff better, not more rays per se.

Personally, I'm not sold on RT yet. Yeah, new tech is cool and all, but we have only one game with HW RT so far and it is really underwhelming. I know that it not only can make games more photorealistic and such, but will also make lighting artists' work easier. But so far, handmade "hacked" solutions look just as good: lighting in ND games, or reflections in the new Hitman 2. I won't be upset if we don't get any real RT next gen.

For raytracing to be as efficient for lighting artists as it is in offline rendering, you need at least the full secondary rays to be raytraced. There is another production problem I hope GPU makers will try to solve asap: in offline rendering, modelling is more efficient because artists can use the high-polygon models directly in rendering, with no creation of normal maps or UV unwrapping. I hope one day that will be solved too. And no more card hair, plus better foliage and better fur shading...

https://clearcut-tutorials.xsollasitebuilder.com/

It costs $9.99 to see the tutorial.

It is a realtime tutorial (no timelapse) for creating a fire hydrant of AAA quality, doing the high-polygon modelling and then all the work to get the low-poly model + normal maps. Doing the high-poly model alone goes faster; it takes 8h30 in total to do a fire hydrant model for a AAA game. I heard from a friend working in the industry that, depending on the model's complexity, this can reduce modelling time by 50 to 75%. He told me a character can take three weeks to be finalized.

In 2009 and 2010, Intel was thinking about solving the problem and a lot of research was done, but it was not pushed further after the Larrabee fiasco.

http://graphics.stanford.edu/~kayvonf/

Other people were also doing research on this in 2009/2010, not only for high geometry but to add some of the REYES advantages: crazy AA, high-quality motion blur and depth of field. Maybe the next step after raytracing. One of the papers imagined a new GPU architecture with 16x MSAA or SSAA, and 16 samples for motion blur and depth of field. It can work with current rasterization and raytracing but needs a modified GPU architecture. It was inspired by REYES and one of its properties, called decoupled sampling from visibility. They tried the magic number of 64x SSAA, with 64 samples for motion blur and depth of field, the minimum quality for offline rendering; it can go up to 256 samples.

http://people.csail.mit.edu/jrk/decoupledsampling/ds.pdf
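A toy version of the "N samples for motion blur" idea, in the spirit of stochastic/distributed sampling (the 1D scene and numbers are mine, not from the paper):

```python
import random

def coverage(px, samples=64):
    """Estimate how long a moving edge covers pixel px during the shutter.

    A 1D edge sweeps from x=0 to x=8 over the frame; each sample picks a
    random shutter time and tests visibility at that instant. More samples
    (16 -> 64 -> 256, the ranges mentioned above) = less noisy blur.
    """
    hits = sum(1 for _ in range(samples) if px < 8.0 * random.random())
    return hits / samples

for px in range(8):
    print(px, round(coverage(px), 2))  # ramps from ~1.0 down to ~0.1: a motion-blur gradient
```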
 

modiz

Member
Oct 8, 2018
17,905
Biggest thing about RT is not how much better it will make stuff look, but that it's about as simple and straight-forward as it gets. No hacks, no shortcuts, just pure brute force. Place a light and you'll get accurate shadows, indirect lighting, AO etc, for it, without need to compute and bake GI, bake shadow maps, etc.

You can achieve 95% of the visuals you'll get from RT by going with some other technique, like SVOGI, for a fraction of the computational cost, but then you have to deal with implementation, with how things will interact with it, etc.
In that case, why not just use ray tracing in dev kits, have developers render the lights in real time and do all the needed adjustments, and then switch off the RTX for the game itself so it runs at a steady performance?
 
Oct 25, 2017
17,934
The problem is not the specs, the problem is the resolution. 4K is extremely power hungry. I'd be wetting my pants if the PS5 was 6TF at 1080p, but it is probably 8TF at 4K, which is not that impressive. Yeah, the games will look great, but I can't imagine the jump being crazy. Red Dead runs at 4K on the One X with 6TF, so the PS5 having 8TF in 2020 is suspicious. I trust the devs and the console makers to make great products; they just have to prove that 4K is not sucking the lifeblood out of next gen.
I don't see that being an issue either.
 
Jan 21, 2019
2,903
Soooo, ray tracing excluded, what kind of rendering techniques can we expect on next-gen hardware? I just looked up SVOGI and it looks really cool. Anything else I can look up to hype myself beyond redemption?
 
Feb 10, 2018
17,534
Biggest thing about RT is not how much better it will make stuff look, but that it's about as simple and straight-forward as it gets. No hacks, no shortcuts, just pure brute force. Place a light and you'll get accurate shadows, indirect lighting, AO etc, for it, without need to compute and bake GI, bake shadow maps, etc.

You can achieve 95% of the visuals you'll get from RT by going with some other technique, like SVOGI, for a fraction of the computational cost, but then you have to deal with implementation, with how things will interact with it, etc.

So the benefit of RT is more about easing game development?
Because while RTX reflections look like very good reflections in terms of the overall visual look, it's not a drastic improvement.

After watching RTX Quake 2 gameplay, it seems that RT essentially behaves like real lighting: in Quake 2's case you turn RT on and the light sources behave like they would in real life, changing the look and colour of surfaces and producing reflections on reflective surfaces, where other methods do it by trickery, and reflections and certain textures have to be manually placed to produce a realistic look.

The best comparison I can think of is a character in a game with ragdoll physics: you can manipulate that character in thousands of different ways and get thousands of different results, whereas pre-canned animations have to be individually created.
So RT is like the ragdoll?
 

Deleted member 12635

User requested account closure
Banned
Oct 27, 2017
6,198
Germany
So the benefit of RT is more about easing game development?
Because while RTX reflections look like very good reflections in terms of the overall visual look, it's not a drastic improvement.

After watching RTX Quake 2 gameplay, it seems that RT essentially behaves like real lighting: in Quake 2's case you turn RT on and the light sources behave like they would in real life, changing the look and colour of surfaces and producing reflections on reflective surfaces, where other methods do it by trickery, and reflections and certain textures have to be manually placed to produce a realistic look.

The best comparison I can think of is a character in a game with ragdoll physics: you can manipulate that character in thousands of different ways and get thousands of different results, whereas pre-canned animations have to be individually created.
So RT is like the ragdoll?
RT produces a better, more accurate representation of reality without the need to rely on graphical workarounds and hacks like the methods mentioned above. This is why it is called "the holy grail of graphics". It does, though, require more performance than the methods we are used to.
 

Deleted member 38397

User requested account closure
Banned
Jan 15, 2018
838
I think whoever said the cheap next Xbox (Lockhart) is going to be 6TF (an Xbox One X with a better CPU), the PS5 8TF, and Xbox (Anaconda) 10TF is probably right on the money.
 

chris 1515

Member
Oct 27, 2017
7,075
Barcelona Spain
Biggest thing about RT is not how much better it will make stuff look, but that it's about as simple and straight-forward as it gets. No hacks, no shortcuts, just pure brute force. Place a light and you'll get accurate shadows, indirect lighting, AO etc, for it, without need to compute and bake GI, bake shadow maps, etc.

You can achieve 95% of the visuals you'll get from RT by going with some other technique, like SVOGI, for a fraction of the computational cost, but then you have to deal with implementation, with how things will interact with it, etc.

Good explanation, but SVOGI is not a low-computation rendering technique. It needs a lot of memory for the octree, and traversing the octree is not what a GPU does best; RT cores are perfect for traversing it. This is why they used voxel cone tracing in The Tomorrow Children.

Raytracing is simpler than the other hacks used now, but it is slower; if it weren't, we would have had raytracing in realtime rendering engines a long time ago. The problem with raytracing is that many parts of it aren't memory coherent or cache friendly. And I know the BVH helps, but it will always be slower than rasterization. If one day someone finds a way to make a BVH small enough to fit entirely in a few dozen megabytes of ESRAM-type GPU cache, it would solve all the problems. But that is probably a non-existent solution.

https://www.cs.utah.edu/~shirley/irt/RT06_Course_LOD.pdf

Slide 10

Rasterization

• Advantages:
– Uses graphics hardware / GPUs (fast, growing faster than Moore's Law)
– 1-2 orders of magnitude faster than ray tracing

• Disadvantages:
– Local illumination
– Performance ~ linear in # triangles

Slide 26

Ray tracing

• Well studied for 25+ years
• 1-2 orders of magnitude slower than rasterization
• But: asymptotic performance ~ logarithmic
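To make the slides' asymptotics concrete, a 1D stand-in: testing a point against a million "primitives" linearly versus via a sorted, BVH-like structure (entirely illustrative; names and scene are mine):

```python
import bisect

intervals = [(i, i + 0.5) for i in range(1_000_000)]  # 1M sorted "primitives"

def hit_linear(x):
    """Brute force: touch every primitive, O(N) -- the 'linear in # triangles' cost."""
    return any(lo <= x <= hi for lo, hi in intervals)

def hit_tree(x):
    """Binary search on sorted bounds, O(log N) -- the BVH-style descent."""
    i = bisect.bisect_right(intervals, (x, float("inf"))) - 1
    return i >= 0 and intervals[i][0] <= x <= intervals[i][1]

print(hit_linear(12.25), hit_tree(12.25))   # True True   (inside interval 12)
print(hit_linear(12.75), hit_tree(12.75))   # False False (in the gap)
# hit_linear scans up to 1,000,000 entries; hit_tree does ~20 comparisons.
```

The catch, as the post above points out, is that the logarithmic walk jumps around memory, which is exactly the cache-unfriendliness problem.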
 
Feb 10, 2018
17,534
RT produces a better representation of reality without the need to rely on graphical workarounds and hacks like the methods mentioned above. This is why it is called "the holy grail of graphics".

It's not really the holy grail of graphics; it just seems a vastly more efficient way of doing things.
To me the holy grail of graphics is a 5,000 TFLOP GPU, 500TB of RAM and a CPU 20 gens ahead of the best we have now.
 

Carn

Member
Oct 27, 2017
11,990
The Netherlands
I believe that their next proper microarchitecture will be considerably different and, quite possibly, revolutionary. Their patents kinda point at something with much more flexibility. If I'm correct (and I might very well NOT be), each SP (or whatever they'll call them) will be able to behave a bit like the RT cores, so, in theory, the entire GPU could be leveraged for RT acceleration instead of using fixed-function units.

I don't expect it any time before 2021, though.

Larrabee (or Cell) lives!
 