
Papacheeks

Banned
Oct 27, 2017
5,620
Watertown, NY
Yes yes, and why isn't the Intel 9600K in these leaked benchmarks? Maybe because these leaked benchmarks are heavily GPU dependent (such as Superposition 4K), and if they were compared against lower-end Intel CPUs, AMD's $200 CPU wouldn't look as impressive? I mean, look at what's happening with FC5.

Let's check TPU's review of the 9600K: in pretty much all games at 1440p the 9600K is equal to the 9900K. https://www.techpowerup.com/review/intel-core-i5-9600k/14.html

Perspective, man, it's all about perspective. You can easily make the 9900K look bad if you try hard enough.

It is bad for the amount of money it costs, on top of how much power it draws and how hot it gets when clocked high. It takes a big heatsink or an AIO to cool it properly.

The 3600 was running on AMD's shitty Wraith Spire cooler, not overclocked, and not on a particularly good BIOS.

After the 1903 update I've seen a difference in performance on my 1600X, not in games but in overall tasks. Ryzen will keep getting optimizations to better utilize those cores and the way it works with Infinity Fabric.

It still shows how shitty Intel's proposition is. The 9600K is not a 12-thread CPU, so trying to run a game and a stream at the same time is not as smooth. I think the point of this early review is to show the IPC improvements for single-threaded apps and the gains in multi-threaded apps.

Games are starting to make use of more cores, and this lineup will push that further once the new consoles are released.
 

Serious Sam

Banned
Oct 27, 2017
4,354
It is bad for the amount of money it costs, on top of how much power it draws and how hot it gets when clocked high. It takes a big heatsink or an AIO to cool it properly. […]
The 9600K offers almost the same gaming performance as the 9900K at a fraction of the cost; how is that bad? After the upcoming price drop it will be a sweet mid-range CPU to get.
 

Deleted member 10847

User requested account closure
Banned
Oct 27, 2017
1,343
The 9600K offers almost the same gaming performance as the 9900K at a fraction of the cost; how is that bad? After the upcoming price drop it will be a sweet mid-range CPU to get.

If you only game, and if the benchmarks we are seeing are correct, then even at the same price you would be better off with double the threads offered by the 3600.
 

Green Yoshi

Attempted to circumvent ban with an alt account
Banned
Oct 27, 2017
2,597
Cologne (Germany)
I will buy a new PC later this year (my current one is nine years old, and Windows 7 support ends in January). Should I buy an AMD CPU and an AMD GPU, or is an RTX 2060 Super the better choice? Ryzen 3000 should be very similar to the CPU in the PS5 and Xbox Scarlett, but a 50%-scale Navi 10 GPU might not have much in common with a full Navi 20 GPU. As a console gamer I'm not really an expert, but if I buy a PC that's as powerful as the Xbox Scarlett, I'll only need a PlayStation 5 alongside it.
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
I'll echo some of the comments in this thread by saying that the gaming benches aren't exactly exciting.
A 9600K would fare better, going by those.
Still, I'm semi-interested in a potential upgrade from my 6700K to maybe a 3700X, so I'll wait for proper reviews.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,196
Dark Space
I find it amusing that every tech-related sub on Reddit has a zero-tolerance ban on WCCFTech, but we still rush to post them here like they are even a smidgen reliable.
 

Gestault

Member
Oct 26, 2017
13,352
FYI, the i7-6700K (in those first three benchmarks) was Intel's high-end consumer chip... four years ago.

I was super confused by this as well. That's the CPU in my tower from a few years back, and it didn't exactly break the bank when I bought it.

Edit: Looking at it now, that's still a $300 CPU, meaning the 3600 is 33% cheaper. That's danged good.
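Quick math check on that (assuming the 3600's $199 launch price against that $300):

```python
# Rough price comparison: 6700K street price vs. assumed 3600 launch MSRP.
i7_6700k = 300  # what the 6700K still sells for, per the post above
r5_3600 = 199   # Ryzen 5 3600 launch price (assumption)
print(f"{1 - r5_3600 / i7_6700k:.0%} cheaper")  # -> 34% cheaper
```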
 

Wraith

Member
Jun 28, 2018
8,892
I was super confused by this as well. That's the CPU in my tower from a few years back, and it didn't exactly break the bank when I bought it.

Edit: Looking at it now, that's still a $300 CPU, meaning the 3600 is 33% cheaper. That's danged good.
The reason these older chips show relatively high prices is supply. They're no longer in production, so retailers that still have them keep prices high. (Anyone doing a new build today shouldn't buy a 6700K for $300.) And there's still some demand from people looking to upgrade from an old i3/i5, but they'd probably be better served buying used. (Plenty on eBay have sold for ~$230.)
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
I was super confused by this as well. That's the CPU in my tower from a few years back, and it didn't exactly break the bank when I bought it.

Edit: Looking at it now, that's still a $300 CPU, meaning the 3600 is 33% cheaper. That's danged good.

As already said, the comparison with an EOL 6700K doesn't make much sense.
A 9600K would be a better one.
 

Jonnax

Member
Oct 26, 2017
4,918
Intel's deceitful marketing benchmarks, footnoted to admit they hadn't applied security patches to their own processors in the comparison, mean I won't be buying their CPUs next time I get one.
 

Pargon

Member
Oct 27, 2017
11,971
I hope so. I don't recall many games that scale particularly well past 4 cores. Performance improves, but returns diminish quickly with each additional core. Also, just in case I didn't word it clearly: when I say high framerate I'm referring to framerates well above 60.
The main thing with recent games is that you need to get away from 4 threads.
4c4t CPUs perform quite a bit worse than 4c8t CPUs now - particularly if you are paying attention to 1% and 0.1% lows.

But some games do really benefit from having 8 physical cores. Deus Ex: Mankind Divided shows a significant improvement with 8c8t vs 4c8t in my testing.
What's also surprising is that it doesn't seem to scale well beyond 8 threads. 8c16t performs worse than 8c8t. I'd be interested in how that ran on a 3950X which has 16 cores - to see if it's an issue with SMT, or how the engine splits up jobs.

[Chart: Deus Ex: Mankind Divided frame-rate scaling by core/thread configuration (min/avg/max, 1% and 0.1% Lows)]


It's the 1% and 0.1% Low values which really show a difference moving beyond 4 cores though - look at the 6c12t results compared to 4c8t. Minimum, Average, and Maximum frame rates are almost the same, but 0.1% Low is 37% higher.
Many sites are still only looking at average frame rate, or avg/max/min.
I've also done more testing which highlighted that a basic chart like the one above is insufficient for representing performance in many games. It worked for DXMD, but you really need percentile charts and "time spent beyond X" charts like the ones The Tech Report use. My testing in Watch Dogs 2 actually looked worse for 8c16t than 4c8t despite 8c16t clearly being better in-game. The way it scales with cores is quite interesting too. Maximum/average performance goes down a bit, but minimum and 1%/0.1% Lows go up - which is more important for how a game actually plays.
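If anyone wants to compute these metrics from their own frame time logs, here's a rough sketch (note that "1% Low" definitions vary between sites; this one averages the slowest 1% of frames, and the input format is an assumption):

```python
# Sketch: 1% / 0.1% Lows and Tech Report-style "time spent beyond X"
# from a list of per-frame render times in milliseconds (assumed input).

def percent_low(frametimes_ms, pct=0.01):
    """Average FPS over the slowest `pct` fraction of frames."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, int(len(worst) * pct))
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

def time_beyond(frametimes_ms, threshold_ms=16.7):
    """Total ms spent past the threshold on slow frames (60 FPS = 16.7 ms)."""
    return sum(t - threshold_ms for t in frametimes_ms if t > threshold_ms)

frames = [14.2, 15.1, 33.4, 16.0, 15.5, 41.0, 15.8]  # toy data
print(f"1% Low:   {percent_low(frames, 0.01):.1f} FPS")
print(f"0.1% Low: {percent_low(frames, 0.001):.1f} FPS")
print(f"Time beyond 16.7 ms: {time_beyond(frames):.1f} ms")
```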

It still shows how shitty Intel's proposition is. The 9600K is not a 12-thread CPU, so trying to run a game and a stream at the same time is not as smooth.
I don't really think the "game and stream" tests that AMD push are all that valid.
For one thing, NVIDIA's GPU encoder now produces higher-quality results than the x264 Fast preset, which is what most people use for CPU-based encoding in a single-PC setup.
I've also been doing a lot of testing in OBS recently for recording and streaming games, and even though I have an 8c16t Ryzen 1700X, I'd never use CPU-based encoding. It is far more likely to drop frames, and it affects game performance considerably more than GPU-based encoding does.
Though it doesn't apply to streaming, this is particularly noticeable once you start trying to record high-frame-rate gameplay at 120 FPS or higher. 180 FPS seems to be about the limit for my GTX 1070 at the resolutions I'm recording, but with CPU encoding the games will never reach those frame rates in the first place.
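If you want to A/B the two encoders on the same clip outside OBS, something like this works via ffmpeg (the file names and bitrate are placeholders, and it assumes an ffmpeg build with NVENC support):

```python
import subprocess

SRC = "capture.mkv"  # placeholder: some raw or lightly compressed recording

# CPU encode: x264 "fast", the preset most single-PC streamers use
subprocess.run(["ffmpeg", "-y", "-i", SRC,
                "-c:v", "libx264", "-preset", "fast", "-b:v", "6M",
                "-c:a", "copy", "x264_fast.mkv"], check=True)

# GPU encode: NVENC, offloading the work to the video card
subprocess.run(["ffmpeg", "-y", "-i", SRC,
                "-c:v", "h264_nvenc", "-b:v", "6M",
                "-c:a", "copy", "nvenc.mkv"], check=True)
```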

Should I buy an AMD CPU and an AMD GPU, or is an RTX 2060 Super the better choice? Ryzen 3000 should be very similar to the CPU in the PS5 and Xbox Scarlett, but a 50%-scale Navi 10 GPU might not have much in common with a full Navi 20 GPU. As a console gamer I'm not really an expert, but if I buy a PC that's as powerful as the Xbox Scarlett, I'll only need a PlayStation 5 alongside it.
There are no special benefits to pairing an AMD GPU with an AMD CPU.
I would be hesitant to buy any PC hardware right now if you are planning on keeping it around for the entire next generation without upgrading it.
We don't have full specs for the consoles, and we don't know how current hardware will perform in next-gen games. By the time next-gen consoles are out, Ryzen 4/Zen 3 should be available.
If you have a high-end NVIDIA GPU like a 2080 Ti it will probably do just fine with next-gen games, but I'd be concerned about buying mid-range PC hardware today that is supposedly on-par with what the consoles might be offering based on leaked specs. It's probably going to under-perform a year or two after launch.

Personally, I'm planning on upgrading to a 16 core CPU for next-gen.
If the consoles are using 8c16t Zen 2 CPUs, I want the option of putting that work onto 16 physical cores instead - that should help push next-gen games to higher frame rates, since the gap between console and desktop CPU performance will be narrowing.
I do wonder what that means for Intel though. I'd be very surprised if they had 16-core mainstream CPUs, and if they did, they're unlikely to be using the standard ring bus design which has helped keep latency low and plays a big role in their gaming performance. I don't think they've had any CPUs beyond 10 cores which use a single ring bus.
 

Wraith

Member
Jun 28, 2018
8,892
As already said, the comparison with an EOL 6700K doesn't make much sense.
A 9600K would be a better one.
Looking back at the benchmarks, we'll probably see the 3600 land somewhere between the 2600X and the 9600K in games. The 9600K will probably have an edge in most titles, but not as much as it used to. And for anyone looking to stream, or doing a budget build where the bundled cooler and the $20-lower sticker price help them stay under budget, the 3600 will be an easy choice.
 

laser

Member
Feb 17, 2018
310
So when Intel refreshes everything, they'll still likely beat AMD out, if a processor based on pretty old-generation stuff already beats AMD's newer stuff.

This is AMD's mid-range CPU. AMD's high-end CPU will likely match Intel's high end in single-threaded workloads and destroy it in multi-threaded workloads, though we'll need to wait for benchmarks. Intel's refresh will just be adding 2 more cores to an already hot 9900K; they've pretty much squeezed all they can out of the 14nm++++++++++ process. Intel's new architecture won't be coming to desktop until late 2020 or early 2021, and by that time Zen 3 will be out.
 

JahIthBer

Member
Jan 27, 2018
10,372
I will buy a new PC later this year (my current one is nine years old, and Windows 7 support ends in January). Should I buy an AMD CPU and an AMD GPU, or is an RTX 2060 Super the better choice? Ryzen 3000 should be very similar to the CPU in the PS5 and Xbox Scarlett, but a 50%-scale Navi 10 GPU might not have much in common with a full Navi 20 GPU. As a console gamer I'm not really an expert, but if I buy a PC that's as powerful as the Xbox Scarlett, I'll only need a PlayStation 5 alongside it.
Don't buy an AMD GPU at the moment. It's pretty likely at this point that the GPUs in the consoles will be Navi 10 (so 5700-class) with RDNA's ray tracing; if you buy a 5700 you are getting ripped off, and it will be obsolete when the PS5 comes out.
NVIDIA GPUs, with their own version of ray tracing, will last longer. I would suggest an RTX 2070; you might be able to get one for $400 when the 2070 Super comes out.
 

JahIthBer

Member
Jan 27, 2018
10,372
Personally, I'm planning on upgrading to a 16 core CPU for next-gen.
If the consoles are using 8c16t Zen 2 CPUs, I want the option of putting that work onto 16 physical cores instead - that should help push next-gen games to higher frame rates, since the gap between console and desktop CPU performance will be narrowing.
I do wonder what that means for Intel though. I'd be very surprised if they had 16-core mainstream CPUs, and if they did, they're unlikely to be using the standard ring bus design which has helped keep latency low and plays a big role in their gaming performance. I don't think they've had any CPUs beyond 10 cores which use a single ring bus.

Intel needs to get their ass in gear and stop refreshing Skylake. AMD's 16-core processor looks very good for its price; even as an Intel user, I find it tempting.
Keep in mind that with 16 cores you can comfortably disable HT and OC to 5 GHz safely; you really don't need all those threads for gaming when you've got 16 cores.
 

Trieu

Member
Feb 22, 2019
1,774
Looks like I'm buying a 3800X. Can finally retire my 2500K.

What makes you prefer the 3800X over the 3700X? Asking out of curiosity, because I am going with either the 3700X or the 3800X, but I don't see the value of the 3800X for $70 more. Well, maybe if you keep both at stock the average clock speed of the 3800X is higher, but that's about it.
 

LCGeek

Member
Oct 28, 2017
5,855
It still shows how shitty Intel's proposition is. The 9600K is not a 12-thread CPU, so trying to run a game and a stream at the same time is not as smooth. I think the point of this early review is to show the IPC improvements for single-threaded apps and the gains in multi-threaded apps.

I'm with this right here. I really hate how gaming benchmarks are made without addressing practical situations like those. Be it streaming or multitasking in some games, once other apps are open it's a free-for-all on CPU cycles, and Windows manages it badly.

Ryzen lets me ensure the OS, NIC, and third-party programs have their own cores while gaming has its own.
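You can script that kind of partitioning too. Rough sketch with psutil, where the thread split and the game's process name are made up for illustration:

```python
import psutil

GAME = "game.exe"               # hypothetical game process name
GAME_CPUS = list(range(4, 12))  # threads 4-11 for the game (12-thread CPU assumed)
REST_CPUS = list(range(0, 4))   # threads 0-3 for the OS and background apps

for proc in psutil.process_iter(["name"]):
    try:
        if proc.info["name"] == GAME:
            proc.cpu_affinity(GAME_CPUS)
        else:
            proc.cpu_affinity(REST_CPUS)
    except (psutil.AccessDenied, psutil.NoSuchProcess):
        pass  # protected system processes can't be touched
```

In practice you'd probably only pin the handful of noisy apps rather than every process, but it shows the idea.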
 

Wraith

Member
Jun 28, 2018
8,892
So, does Intel drop the 6700K, 7700K and higher by $100 now?
Intel doesn't sell the 6700K/7700K anymore. They're dropping MSRP on some of their current models by 10-15%, per recent news.

Any price changes on out-of-production CPUs are up to the individual retailers who still carry them, and I doubt we'll see much movement there. (The pricing retailers set on these isn't really based on what newly released CPUs are going for; it's about what people might pay to upgrade an old build, plus the relatively low availability of older CPUs. The prices for these older CPUs are already ridiculous vs. current-gen stuff.)
 

Kormora

Member
Nov 7, 2017
1,413
What makes you prefer the 3800X over the 3700X? Asking out of curiosity, because I am going with either the 3700X or the 3800X, but I don't see the value of the 3800X for $70 more. Well, maybe if you keep both at stock the average clock speed of the 3800X is higher, but that's about it.

You are most likely correct, but I was looking at the wrong one. I may get the 12-core one, which is actually the 3900X, for encoding stuff, or go for a 3700X and OC it.
 

Trieu

Member
Feb 22, 2019
1,774
You are most likely correct, but I was looking at the wrong one. I may get the 12-core one, which is actually the 3900X, for encoding stuff, or go for a 3700X and OC it.

Oh okay. Yeah, the 3900X is a sweet one. I'm actually wondering if I should get that one. I bet the 12c/24t is going to last a long time.
 

Bosch

Banned
May 15, 2019
3,680
I still need to upgrade. I will wait to see if the Ryzen 7 3800X can compete in games (at least the same performance as the 9900K); if not, I'm grabbing a 9700K at 15% off.

These benches don't give me much hope...
 

Kormora

Member
Nov 7, 2017
1,413
Oh okay. Yeah, the 3900X is a sweet one. I'm actually wondering if I should get that one. I bet the 12c/24t is going to last a long time.

I guess if I want to future-proof I could go for the 16-core one, but that means waiting till September, and I really badly want to switch from this 2500K in July.
 

RedSwirl

Member
Oct 25, 2017
10,048
Can you play any new Ubisoft game at or above 60?

I actually haven't been that interested in recent Ubisoft games. The most recent Ubisoft things I played on it were Assassin's Creed Unity, The Division 1, and the Wildlands beta. I have this thing hooked up to a TV, so going above 60 isn't a factor for me. I don't remember performance being tough in any of them, but it's been a while since I played them. I do have the full Wildlands game installed now, though, so I guess I could try that.
 
Apr 9, 2018
368
I'm with this right here. I really hate how gaming benchmarks are made without addressing practical situations like those. Be it streaming or multitasking in some games, once other apps are open it's a free-for-all on CPU cycles, and Windows manages it badly.

Ryzen lets me ensure the OS, NIC, and third-party programs have their own cores while gaming has its own.

Yeah, it surprises me that none of the major hardware review sites I read test CPU performance under realistic conditions. Out of interest, I would like to see game benches with a torrent client, two game clients (Steam and Origin), a music app, and Chrome open, for example. Those are my usual running conditions, with more crap running in the background on top.
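Short of reviewers doing it, you can roughly fake that scenario yourself: the sketch below spawns a few low-duty busy loops to stand in for background apps while you run a game benchmark (the ~10% duty per fake app is just a guess):

```python
import multiprocessing as mp
import time

def fake_app(duty=0.10, period=0.010):
    # Busy/sleep loop that holds one thread at roughly `duty` load:
    # burn CPU for duty * period seconds, then sleep the remainder.
    while True:
        start = time.perf_counter()
        while time.perf_counter() - start < period * duty:
            pass
        time.sleep(period * (1.0 - duty))

if __name__ == "__main__":
    # torrent client + Steam + Origin + music app + browser, say
    apps = [mp.Process(target=fake_app, daemon=True) for _ in range(5)]
    for a in apps:
        a.start()
    input("Fake background load running - start the benchmark, Enter to stop.")
```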
 

dgrdsv

Member
Oct 25, 2017
11,821
Yeah, it surprises me that none of the major hardware review sites I read test CPU performance under realistic conditions. Out of interest, I would like to see game benches with a torrent client, two game clients (Steam and Origin), a music app, and Chrome open, for example. Those are my usual running conditions, with more crap running in the background on top.
All this crap consumes about 10% of one CPU thread when it's in the background - unless it's actually doing something really heavy for some reason.
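Easy enough to check. Rough psutil sketch that sums what every process used over a 10-second window, with just your usual background apps open (100% = one fully loaded thread):

```python
import time
import psutil

procs = [p for p in psutil.process_iter() if p.pid != 0]  # skip the Idle process
for p in procs:
    try:
        p.cpu_percent(None)  # first call only primes the per-process counter
    except psutil.Error:
        pass

time.sleep(10)  # let the background apps do their usual nothing

total = 0.0
for p in procs:
    try:
        total += p.cpu_percent(None)  # usage since the priming call
    except psutil.Error:
        pass

print(f"Background load: ~{total:.0f}% of one thread")
```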
 
OP

Deleted member 4783

Oct 25, 2017
4,531
All this crap consumes about 10% of one CPU thread when it's in the background - unless it's actually doing something really heavy for some reason.
For real. My i7-4790 is practically unused when I have those apps running (Origin, Steam, Discord, mouse software, keyboard software). If I'm lucky it will reach 10% when running Firefox.
 

LCGeek

Member
Oct 28, 2017
5,855
All this crap consumes about 10% of one CPU thread when it's in the background - unless it's actually doing something really heavy for some reason.
Chrome or Firefox, depending on the site, don't merely take up 10%. If a site has shit design, like certain Path of Exile sites or Tumblr, 10% isn't even the minimum of what some pages will drain.
 

dgrdsv

Member
Oct 25, 2017
11,821
Chrome or Firefox, depending on the site, don't merely take up 10%. If a site has shit design, like certain Path of Exile sites or Tumblr, 10% isn't even the minimum of what some pages will drain.
Sure, but such sites are rare, and modern browsers are getting pretty good at throttling such sites to nearly zero when they are in the background.
 

LCGeek

Member
Oct 28, 2017
5,855
Sure, but such sites are rare, and modern browsers are getting pretty good at throttling such sites to nearly zero when they are in the background.

That's dependent on the site the browser is connected to. It being rare doesn't matter if I'm in a game or a tool I use is spawning tabs. The problem with some of the sites I mentioned is that they aren't inactive, so their designs are constantly stressing your CPU.

This also doesn't address his streaming or torrent points. Networking is no joke on any platform, and once you start generating a lot of packets, short of having very good offloading, your CPU will get stressed the more demanding the task is. Some streamers will literally encode, upload, and play a game at the same time; Intel sucks at that, and these Ryzens are game changers for types like us.
 

Deleted member 24021

User requested account closure
Banned
Oct 29, 2017
4,772
Can't wait to upgrade to the 3900X; those sweet 12 cores are gonna be mine. Some of the CPU-heavy games I've been playing have been murdering my i5-4690K.
 

dgrdsv

Member
Oct 25, 2017
11,821
That's dependent on the site the browser is connected to. It being rare doesn't matter if I'm in a game or a tool I use is spawning tabs. The problem with some of the sites I mentioned is that they aren't inactive, so their designs are constantly stressing your CPU.

This also doesn't address his streaming or torrent points. Networking is no joke on any platform, and once you start generating a lot of packets, short of having very good offloading, your CPU will get stressed the more demanding the task is. Some streamers will literally encode, upload, and play a game at the same time; Intel sucks at that, and these Ryzens are game changers for types like us.
If the rarity of something that can load up your CPU doesn't matter, then it doesn't matter which CPU you have - you'll run out of its resources no matter what.
 

LCGeek

Member
Oct 28, 2017
5,855
If the rarity of something that can load up your CPU doesn't matter, then it doesn't matter which CPU you have - you'll run out of its resources no matter what.


I'm saying the rarity doesn't matter because people doing what I do, and similar power users, will run into those problems. So yeah, for a consumer like myself it would be nice to know that the processor I'm getting is good at the task, or what's best for the way I'm actually using the CPU I'm buying. I lack the means to test like the benchmark sites do, and if I had those means I wouldn't need them.

Even if that weren't the case, it still doesn't address what the other user and I mentioned besides browsers: encoding and uploading, both of which can eat into any CPU threads they have access to. Affinity is a wonderful tool, especially when you have a lot of cores/threads, which Ryzen is making possible.

A Ryzen will better distribute programs that want CPU power. A Ryzen CPU with 12+ threads will simply let me partition enough of my cores that problematic programs get nowhere near my gaming or OS usage. I'm glad to be ditching my remaining 4-core systems soon.
 

icecold1983

Banned
Nov 3, 2017
4,243
The main thing with recent games is that you need to get away from 4 threads.
4c4t CPUs perform quite a bit worse than 4c8t CPUs now - particularly if you are paying attention to 1% and 0.1% lows. […]
Thanks for providing those benches. Yeah, 4c8t is what I was referring to. Average framerates just don't scale well past that number at this point.
 

Cipherr

Member
Oct 26, 2017
13,418
Yeah, it surprises me that none of the major hardware review sites I read test CPU performance under realistic conditions. Out of interest, I would like to see game benches with a torrent client, two game clients (Steam and Origin), a music app, and Chrome open, for example. Those are my usual running conditions, with more crap running in the background on top.

Yep, most people I know who game, like myself, are running a similar gamut of software while gaming. The idea that we shouldn't value more cores/threads should be dying, if it isn't dead already. How many of us are PC gaming on a single monitor versus gaming on one monitor and multitasking on the other?

The idea that you don't need more cores unless you are encoding or something needs to seriously die. It reminds me of the old "Oh, don't bother with more than a few gigs of RAM for gaming"... That one aged equally badly, and now you've got people crying about gaming with a Chrome window open and a few tabs, lol.

Core scaling in games is only going to continue to improve. If you are building a system now that you plan to use for 3 years, consider the rate of change and how quickly things are shifting.