
IMACOMPUTA

Member
Oct 27, 2017
2,529
as·sump·tion
/əˈsəm(p)SH(ə)n/
noun
plural noun: assumptions
  1. a thing that is accepted as true or as certain to happen, without proof.



So where's the proof? I pointed out that an absence of evidence argument is not a valid proof and cannot be used to refute the author's assumption and you lot lost your heads. We had people talking about breaking space and time and some other shit because it seems nobody knows what the word assumption means.
This next post made possible by your logic:

PS5 is actually going to be called PS9.
Aliens are among us.
Scalebound is a PS9 exclusive.
I'm Gabe Newell.
I posted this from the PS9.

Prove me wrong.


EDIT:
While you do raise a fair argument, some of his other colleagues from DF who post on here have opted not to participate in the speculation, which is wise. If you don't know, it's best not to comment. As it stands, a select group are privy to the PS5's hardware details. It's just like the next-gen speculation thread... people are predicting and asserting specs/services based on a foundation of assumptions.

COME ON, MAN!
 
Last edited:

Sqrt

Member
Oct 26, 2017
5,880
Nah, Komachi has been tracking a custom SoC linked to the PS5 since January. This is an old tweet, but there are newer ones tracking the revisions to the chip. Just because you don't know doesn't mean the information doesn't exist.

With regard to the RT implementation and your lack of understanding: if you followed the conversation, it's clear that I was simply describing why the author's assumption would have to be correct if the RT implementation is hardware-based.


I know that info. According to that, the GPU is around 8 TFLOPs. Are you sure you want to go that way?
 

Deleted member 18161

user requested account closure
Banned
Oct 27, 2017
4,805
While you do raise a fair argument, some of his other colleagues from DF who post on here have opted not to participate in the speculation, which is wise. If you don't know, it's best not to comment. As it stands, a select group are privy to the PS5's hardware details. It's just like the next-gen speculation thread... people are predicting and asserting specs/services based on a foundation of assumptions.

Even though I now use PC as my main platform (having just spent £530 on an RTX 2070, no less!), I always find it funny, the insane levels of defensiveness that go on with big-time PC gamers when new console hardware is officially announced. All we heard in the months leading up to the PS4/XB1 release, after the spec reveals, was how they were "low-end desktop GPUs with notebook CPUs!!", yet here we are with games that look as great as Ryse, The Order, FH4, GT Sport, Gears 4, Horizon, Spider-Man and God of War on base consoles (most of them running at native 1080p, no less).

I still believe that artistic talent and budget are far more important than specs (within reason). Whoever spends the most money usually has the best looking games regardless of hardware.
 

fiendcode

Member
Oct 26, 2017
24,909
Was RROD caused by AMD's chip design or MS's system design? (Spoiler: It was a system design flaw and MS has stated as much.)

Also, we have 6 years of the current console generation to look to for data. That clearly supersedes whatever happened 14 years ago.
The GameCube's GPU was prone to overheating in launch models. That's the only "AMD" chip I can think of with any history of an unusual failure rate in the console sphere.

I don't think Nvidia has any, though, if we're comparing.
 

JahIthBer

Member
Jan 27, 2018
10,376
Even though I now use PC as my main platform (having just spent £530 on an RTX 2070, no less!), I always find it funny, the insane levels of defensiveness that go on with big-time PC gamers when new console hardware is officially announced. All we heard in the months leading up to the PS4/XB1 release, after the spec reveals, was how they were "low-end desktop GPUs with notebook CPUs!!", yet here we are with games that look as great as Ryse, The Order, FH4, GT Sport, Gears 4, Horizon, Spider-Man and God of War on base consoles (most of them running at native 1080p, no less).

I still believe that artistic talent and budget are far more important than specs (within reason). Whoever spends the most money usually has the best looking games regardless of hardware.
Problem with this line of thinking is that Navi is not PS5-exclusive, so if Navi is super powerful & has ray tracing like the hype is saying, it benefits PC gamers too, & Nvidia will have to answer for their ray-traced GPUs being so expensive. I would like this to be the case, but Nvidia wouldn't overprice their GPUs if they knew AMD was going to counter them with cheap ray-traced Navi GPUs.
 

TooBusyLookinGud

Graphics Engineer
Verified
Oct 27, 2017
7,937
California
I think it's safe to assume that AMD's ray tracing technology, which will be incorporated in a late-2020 console, has a high chance of being more efficient than what we find in the current Turing architecture from Nvidia, which came out in late 2018. By the time the PS5 comes out, it will have been more than 2 years since the release of the RTX series. Now, if we assume this, we must also assume that Nvidia, on their side, will have had 2 years to improve their technology. Basically, I think that the PS5 will have better ray tracing performance compared to what we have nowadays, but Nvidia products from 2020 will beat it hands down.
What? How did you come to this conclusion? I'd assume that since Nvidia has the lead on this tech, they would have the benefit of introducing a more efficient solution. They've already made great improvements in this space - BFV for example.
 

Sqrt

Member
Oct 26, 2017
5,880
Oh, you do? Cool, you should know what an engineering sample is and how to find the revisions that have been made since the data was initially published. Go look that up, then we can talk.
You are either arguing in bad faith or have absolutely no idea how the concept of burden of proof works. I'm gonna give you the benefit of the doubt and say it's the second...

Better yet, I will argue like you. I assume you are wrong on all your assumptions. Prove me wrong. The information on why you are wrong is out there. Go look it up. Have a good day.

Welcome to my ignore list.
 

IMACOMPUTA

Member
Oct 27, 2017
2,529
It seems the goalposts have moved, but:

How does one prove that someone has no proof?
I'd say the lack of proof is proof.

How dumb does this all sound?
 

JahIthBer

Member
Jan 27, 2018
10,376
Except the Ryzens are great?
Not to dunk on AMD or anything, but I would say Ryzen has merely almost caught up to 2015 Intel, & Intel hasn't been progressing much due to internal issues; they are struggling to get to 7nm, for one.
I don't think AMD should be celebrated for catching up to Intel half a decade later, but they deserve praise for their prices.
 

kraftdinner

Alt account
Banned
Mar 8, 2019
254
What? How did you come to this conclusion? I'd assume that since Nvidia has the lead on this tech, they would have the benefit of introducing a more efficient solution. They've already made great improvements in this space - BFV for example.

I think you misunderstood what I said. I believe Nvidia will still have an edge in 2020, hands down. They have the lead on ray tracing and I think they will keep it. I'm merely saying that I believe AMD's technology from 2020 will be more efficient than Nvidia's technology from 2018.

Basically, I think:

2020 Nvidia Ray Tracing technology >> 2020 AMD Ray Tracing technology
2020 AMD Ray Tracing technology > 2018 Nvidia Ray Tracing technology

Considering the history of both companies and their tendencies, it certainly isn't far-fetched to think so.
 

Deleted member 34239

User requested account closure
Banned
Nov 24, 2017
1,154
You are either arguing in bad faith or have absolutely no idea how the concept of burden of proof works. I'm gonna give you the benefit of the doubt and say it's the second...

Better yet, I will argue like you. I assume you are wrong on all your assumptions. Prove me wrong. The information on why you are wrong is out there. Go look it up. Have a good day.

Welcome to my ignore list.
First you claimed:
You make no sense. There's absolutely no information to give the ballpark of PS5 GPU performance, let alone its RT implementation.
I provided the quickest link I could find. You then shifted the goalposts to:
I know that info. According to that, the GPU is around 8 TFLOPs. Are you sure you want to go that way?
Within the first sentence of your response, you let me know that either your initial statement was a lie or your subsequent statement is a lie. Now you want to talk about burden of proof and arguing in good faith? Really, man, really...
 

Sqrt

Member
Oct 26, 2017
5,880
Not to dunk on AMD or anything, but I would say Ryzen has merely almost caught up to 2015 Intel, & Intel hasn't been progressing much due to internal issues; they are struggling to get to 7nm, for one.
I don't think AMD should be celebrated for catching up to Intel half a decade later, but they deserve praise for their prices.
Ryzen is good tech. The slight difference in single-threaded performance is not worth giving up the price advantage and the possibility of an APU, especially when the consoles won't use high clocks anyway.
 

JahIthBer

Member
Jan 27, 2018
10,376
Ryzen is good tech. The slight difference in single-threaded performance is not worth giving up the price advantage and the possibility of an APU, especially when the consoles won't use high clocks anyway.
I'm not really saying consoles should use Intel, but Ryzen does lose to 2015's Skylake in many scenarios even with a 4-core advantage; if Intel had kept competing, Ryzen would look like Bulldozer again.
Seriously, I dunno what the hell Intel is doing.
 

Sqrt

Member
Oct 26, 2017
5,880
I'm not really saying consoles should use Intel, but Ryzen does lose to 2015's Skylake in many scenarios even with a 4-core advantage; if Intel had kept competing, Ryzen would look like Bulldozer again.
Seriously, I dunno what the hell Intel is doing.
You can say that Intel hasn't moved much since 2015 either. No reason to upgrade from my Broadwell system, for one.
 

JahIthBer

Member
Jan 27, 2018
10,376
You can say that Intel hasn't moved much since 2015 either. No reason to upgrade from my Broadwell system, for one.
That's what I am saying: Intel has been stuck in 2015. By 2019 Intel should have something that makes Skylake look like a joke instead of "oh, we clocked it to 5GHz this time".
 

TooBusyLookinGud

Graphics Engineer
Verified
Oct 27, 2017
7,937
California
I think you misunderstood what I said. I believe Nvidia will still have an edge in 2020, hands down. They have the lead on ray tracing and I think they will keep it. I'm merely saying that I believe AMD's technology from 2020 will be more efficient than Nvidia's technology from 2018.

Basically, I think:

2020 Nvidia Ray Tracing technology >> 2020 AMD Ray Tracing technology
2020 AMD Ray Tracing technology > 2018 Nvidia Ray Tracing technology

Considering the history of both companies and their tendencies, it certainly isn't far-fetched to think so.
Thanks for clearing that up.
 

Deleted member 18161

user requested account closure
Banned
Oct 27, 2017
4,805
Problem with this line of thinking is that Navi is not PS5-exclusive, so if Navi is super powerful & has ray tracing like the hype is saying, it benefits PC gamers too, & Nvidia will have to answer for their ray-traced GPUs being so expensive. I would like this to be the case, but Nvidia wouldn't overprice their GPUs if they knew AMD was going to counter them with cheap ray-traced Navi GPUs.

I was more just talking in general. It wasn't aimed at Dictator in particular. Sorry I should have made that clear.
 

JahIthBer

Member
Jan 27, 2018
10,376
I was more just talking in general. It wasn't aimed at Dictator in particular. Sorry I should have made that clear.
Oh right, I guess you are right, the PS4 did indeed punch above its weight & surprise everyone. I am worried PS5 targeting 4K will lower its potential to surprise us with a next-gen leap though. 8 million pixels is so much.
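(For reference, a quick back-of-the-envelope on that "8 million pixels" figure; this is just plain arithmetic, nothing console-specific.)

```python
# 4K has exactly 4x the pixel count of 1080p, so a native-4K target absorbs
# a big slice of whatever raw GPU gain the new hardware brings.
pixels_1080p = 1920 * 1080          # 2,073,600
pixels_4k = 3840 * 2160             # 8,294,400
print(pixels_4k)                    # ~8.3 million pixels per frame
print(pixels_4k / pixels_1080p)     # 4.0x the work per frame vs 1080p
```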
 

Sqrt

Member
Oct 26, 2017
5,880
That's what I am saying: Intel has been stuck in 2015. By 2019 Intel should have something that makes Skylake look like a joke instead of "oh, we clocked it to 5GHz this time".
At least you can say that AMD is almost even with Intel. Sadly, we can't say that when it comes to AMD compared to Nvidia.
 

laser

Member
Feb 17, 2018
310
That's what I am saying: Intel has been stuck in 2015. By 2019 Intel should have something that makes Skylake look like a joke instead of "oh, we clocked it to 5GHz this time".
Processor design and manufacturing is hard. AMD lost a generation with Bulldozer so I don't see why it's so inconceivable that Intel's doing the same thing and allowing AMD to catch back up.
 

Deleted member 18161

user requested account closure
Banned
Oct 27, 2017
4,805
If I had to wager a guess, I'd say we'll probably see something like player character shadows as well as close objects being ray traced, and maybe some cool-looking reflections, to be able to point at them and say "See? See? Raytracing!!1!" (Though apparently you don't even need that and everyone buys into the hype.)

I mean, even with GPUs with dedicated hardware, it's not like you can do that much more while running at the ever-important 4K resolution.

That's kind of what ray tracing is at the moment though, even on an RTX 2080 Ti, right? (Metro for lighting, BFV for reflections, Shadow of the Tomb Raider for shadows.) Developers pick one facet of rendering, or sometimes two (?), and use RT to enhance it.

I could be wrong, but I think we're over a decade away from developers using RT for every single part of a real-time game's rendering.
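A minimal sketch of that "one facet" hybrid approach, purely illustrative (the function names and the pixel stride are assumptions, not any engine's actual API): the frame is still rasterized, and ray tracing is budgeted for a single effect such as reflections at a reduced ray count.

```python
# Hybrid rendering sketch: rasterize everything, then spend the RT budget
# on one effect (reflections here) at reduced resolution.

def rasterize_frame(width, height):
    # Stand-in for the conventional raster pass (G-buffer, lighting, etc.).
    return [[(0.1, 0.1, 0.1) for _ in range(width)] for _ in range(height)]

def trace_reflection(x, y):
    # Stand-in for one reflection ray per shaded pixel; in a real renderer
    # this is where BVH traversal / RT hardware would be invoked.
    return (0.05, 0.05, 0.05)

def render(width=320, height=180, rt_pixel_stride=2):
    frame = rasterize_frame(width, height)
    # Only every Nth pixel gets a reflection ray (quarter-res RT is common);
    # the result would normally be denoised/upscaled before compositing.
    for y in range(0, height, rt_pixel_stride):
        for x in range(0, width, rt_pixel_stride):
            base = frame[y][x]
            refl = trace_reflection(x, y)
            frame[y][x] = tuple(b + r for b, r in zip(base, refl))
    return frame

frame = render()
```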
 

Pargon

Member
Oct 27, 2017
11,991
Not to dunk on AMD or anything, but I would say Ryzen has merely almost caught up to 2015 Intel, & Intel hasn't been progressing much due to internal issues; they are struggling to get to 7nm, for one.
I don't think AMD should be celebrated for catching up to Intel half a decade later, but they deserve praise for their prices.
Uh, sure. "Almost caught up" by offering twice as many cores and >2x performance at the same price point as Intel when they launched in 2017 (R7-1700 vs i5-7600) while offering more PCIe lanes and also supporting ECC memory.
If the only thing you care about is IPC and clockspeed as it relates to gaming, I suppose it's 'underwhelming' that AMD are now merely a competitive option rather than being stuck half a decade behind as they were.

While it would have been nice if they had bested Intel on all fronts, and if Intel had not hit a wall, all things considered, I don't think you can call Ryzen disappointing - especially when it looks like they will be doubling core counts again this year with Zen 2, which should still be supported by existing AM4 motherboards.

Hopefully there will be advances that let both of them push past this IPC wall that CPUs seem to have hit. It sounds like there's some potentially interesting technology that will have the CPU split single-threaded code to run across many cores, but there's no way of knowing how far off that is, or if it will prove to be viable.
 

TheMadTitan

Member
Oct 27, 2017
27,206
If the consoles use ray tracing, I'm going to need to upgrade my GPU. Sure, ray tracing isn't efficient now, but it will be.
 

JahIthBer

Member
Jan 27, 2018
10,376
Uh, sure. "Almost caught up" by offering twice as many cores and >2x performance at the same price point as Intel when they launched in 2017 (R7-1700 vs i5-7600) while offering more PCIe lanes and also supporting ECC memory.
If the only thing you care about is IPC and clockspeed as it relates to gaming, I suppose it's 'underwhelming' that AMD are now merely a competitive option rather than being stuck half a decade behind as they were.

While it would have been nice if they had bested Intel on all fronts, and if Intel had not hit a wall, all things considered, I don't think you can call Ryzen disappointing - especially when it looks like they will be doubling core counts again this year with Zen 2, which should still be supported by existing AM4 motherboards.

Hopefully there will be advances that let both of them push past this IPC wall that CPUs seem to have hit. It sounds like there's some potentially interesting technology that will have the CPU split single-threaded code to run across many cores, but there's no way of knowing how far off that is, or if it will prove to be viable.
Well, I am mainly referring to gaming performance, & I did say AMD are good for their prices. I'm mainly thinking about how we got Sandy Bridge in early 2011 & how good that was; 8 years later it seems we haven't improved so much, and Intel especially hasn't in the last 4 years. AMD has been great for competition. Intel trying to sell us 4 cores for a high price in, like, 2017? Thanks for stopping that, AMD.
 

Ploid 6.0

Member
Oct 25, 2017
12,440
If the consoles use ray tracing, I'm going to need to upgrade my GPU. Sure, ray tracing isn't efficient now, but it will be.
I'll be waiting 2 or 3 graphics card cycles after ray tracing starts picking up. The 20-series cards will seem ancient once ray tracing is used frequently. Looking forward to the non-Nvidia ray tracing though, the one that everyone will be able to use, just like I did when I knew I wouldn't ever get a G-Sync monitor, or an Nvidia card unless Nvidia supported VRR, and here I am using my 144Hz FreeSync monitor on my GTX card.
 

TheMadTitan

Member
Oct 27, 2017
27,206
I'll be waiting 2 or 3 graphics card cycles after ray tracing starts picking up. The 20-series cards will seem ancient once ray tracing is used frequently. Looking forward to the non-Nvidia ray tracing though, the one that everyone will be able to use, just like I did when I knew I wouldn't ever get a G-Sync monitor, or an Nvidia card unless Nvidia supported VRR, and here I am using my 144Hz FreeSync monitor on my GTX card.
For sure. I have a 1080, so this card will be good even when ray tracing picks up, but jumping in during the 21 or 22 series makes the most sense.
 

ArnoldJRimmer

Banned
Aug 22, 2018
1,322
Even though I now use PC as my main platform (having just spent £530 on an RTX 2070, no less!), I always find it funny, the insane levels of defensiveness that go on with big-time PC gamers when new console hardware is officially announced. All we heard in the months leading up to the PS4/XB1 release, after the spec reveals, was how they were "low-end desktop GPUs with notebook CPUs!!", yet here we are with games that look as great as Ryse, The Order, FH4, GT Sport, Gears 4, Horizon, Spider-Man and God of War on base consoles (most of them running at native 1080p, no less).

I still believe that artistic talent and budget are far more important than specs (within reason). Whoever spends the most money usually has the best looking games regardless of hardware.

What a decidedly weird and biased look at those days, assuming you are talking about the board that shall not be named.

The PC voices were the only ones with even an ounce of reality attached to them. Console gamers were all claiming secret sauce would make the PS4 beat 780 Tis in SLI and that PC gamers would need 1000 CPUs to match.

Then a few months after that there were posts of the 750 Ti beating the PS4's performance in several games. It didn't last, of course, due to the small video buffer, but still.

Same story again, different gen.
 

Pargon

Member
Oct 27, 2017
11,991
Well, I am mainly referring to gaming performance, & I did say AMD are good for their prices. I'm mainly thinking about how we got Sandy Bridge in early 2011 & how good that was; 8 years later it seems we haven't improved so much, and Intel especially hasn't in the last 4 years. AMD has been great for competition. Intel trying to sell us 4 cores for a high price in, like, 2017? Thanks for stopping that, AMD.
Sandy Bridge was something special alright. Not only was it a huge leap in performance over anything else at the time, it also overclocked from ~3.5GHz to ~4.5GHz on all cores with ease - and higher if you got lucky or didn't mind pushing the voltage up beyond "safe" levels.
A lot of Intel's improvements since then have really been from them gradually increasing the stock clocks to what was previously achieved via overclocking - so if you overclocked, performance seemed fairly stagnant.
But their small IPC and clock gains over time have started to add up, and some games do finally benefit from >4 cores/threads. Sandy Bridge is not holding up in today's games if you want to stay above 60 FPS, and especially not if you're targeting 90+.

In a stress test of the CPU I get about 55 FPS with bad 1% frame times in Deus Ex: Mankind Divided on my i5-2500K system at 4.5GHz, and ~90 FPS with much more consistent frame times on my R7-1700X system running at ~3.9GHz.
That game really seems to favor Intel CPUs based on The Tech Report's testing (though I question its relevance to actual gameplay if they're using the game's own benchmark), so the gap should be even wider on something like a 9900K.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,201
Dark Space
There's no way AMD just pulled a competitive hardware based RT solution out of a magic hat within 8 months, after being completely blindsided by Nvidia's RTX hardware solution. It's impossible. R&D just doesn't work that way.

Even though I now use PC as my main platform (having just spent £530 on an RTX 2070, no less!), I always find it funny, the insane levels of defensiveness that go on with big-time PC gamers when new console hardware is officially announced. All we heard in the months leading up to the PS4/XB1 release, after the spec reveals, was how they were "low-end desktop GPUs with notebook CPUs!!", yet here we are with games that look as great as Ryse, The Order, FH4, GT Sport, Gears 4, Horizon, Spider-Man and God of War on base consoles (most of them running at native 1080p, no less).

I still believe that artistic talent and budget are far more important than specs (within reason). Whoever spends the most money usually has the best looking games regardless of hardware.
Don't enter technology discussions if your sensitivities can't handle the heat.
 

Sqrt

Member
Oct 26, 2017
5,880
What the thread should be about.
The answer to that is easy, though. If you want x86, which both MS and Sony absolutely would, you need to go AMD or Intel. Intel had no competitive GPU for a long time, and their new GPU tech is an unknown. There was no other option. Done, close the thread. :P
 

Inuhanyou

Banned
Oct 25, 2017
14,214
New Jersey
The answer to that is easy, though. If you want x86, which both MS and Sony absolutely would, you need to go AMD or Intel. Intel had no competitive GPU for a long time, and their new GPU tech is an unknown. There was no other option. Done, close the thread. :P

Pretty much. It helps that the chips are cheap and have a good power-to-performance ratio.
 

Arthands

Banned
Oct 26, 2017
8,039
I honestly doubt Nvidia was trying hard in the first place. They are cruising with Nintendo Switch and have near total domination on PC right now, and switching from AMD to Nvidia is going to cause some issues.
 

Sqrt

Member
Oct 26, 2017
5,880
I honestly doubt Nvidia was trying hard in the first place. They are cruising with Nintendo Switch and have near total domination on PC right now, and switching from AMD to Nvidia is going to cause some issues.
What I believe happened is that Nvidia would indeed have liked the extra business, but they weren't nearly as desperate as AMD might have been for extra income, so AMD was willing to go lower. Plus, AMD had the advantage of designing their own CPUs, and therefore a unified SoC design. Nvidia would later get an ARM license and design their Denver cores, which ended up not being that great, TBF. After the PS4 and XB1 went x86, there was no other option for Sony and MS going forward.
 

Escaflow

Attempted to circumvent ban with alt account
Banned
Oct 29, 2017
1,317
I won't lie, I'm biased, I had AMD stuff in my PC and it wasn't a pleasant experience, never again.

Don't generalize if it's only on your end. I switch back and forth between AMD, Intel and Nvidia, and have never had any real issues between them. Been using a Ryzen 1600 @ 3.6GHz since day one and it has been rock solid.
 
Nov 8, 2017
13,096
There's no way AMD just pulled a competitive hardware based RT solution out of a magic hat within 8 months, after being completely blindsided by Nvidia's RTX hardware solution. It's impossible. R&D just doesn't work that way.

You're definitely right that they didn't do it in 8 months. Whether that means they had planned this much farther in advance, or whether they are just being deliberately vague and it's actually being done in compute with maybe minor tweaks at best, remains to be seen.
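As a rough illustration of what "done in compute" would mean (a toy intersection test, assumed purely for illustration and nothing to do with AMD's undisclosed implementation): the test below is the kind of inner-loop work that dedicated RT cores run as fixed-function hardware, and that a compute-based path has to execute as ordinary shader ALU work millions of times per frame.

```python
# Toy ray-sphere intersection: the per-ray cost that either RT hardware
# accelerates or a compute shader has to pay in ALU time.
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |o + t*d - c|^2 = r^2 for the nearest positive t
    # (direction is assumed normalized).
    oc = tuple(o - c for o, c in zip(origin, center))
    b = sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None                      # miss
    t = -b - math.sqrt(disc)
    return t if t > 0 else None          # nearest hit in front of the ray

# One ray against a tiny "scene"; a real traversal walks a BVH instead of
# brute-forcing every primitive, which is exactly what RT cores accelerate.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # ~4.0
```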
 

DonMigs85

Banned
Oct 28, 2017
2,770
I have a feeling AMD has a more smoke-and-mirrors approach to RT, but it should still look nice and fairly convincing.
 

Oticon

Member
Oct 30, 2017
1,446
I'm optimistic about Navi; it's been in the pipeline for a while now. Finally, AMD is moving on from the GCN architecture. As for Zen 2 (3rd-gen Ryzen processors), it's the least of my worries; I expect AMD to take the crown for single-threaded performance this time around.