Which 2070 do you have, if you don't mind me asking? Gonna get a 2070 come February and I'm trying to gauge which of the ones available is the best.
They both had the same brightness settings... PC uses different gamma... They couldn't match the brightness between the two, so it was really hard to tell. You really need everything equalized for a comparison like that.
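Even a small gamma mismatch shifts the mid-tones on its own, so matching peak brightness isn't enough. A minimal sketch of that effect, assuming a simple power-law display model rather than either platform's actual calibration:

```python
# Toy illustration: the same mid-grey signal decoded with two different
# display gammas lands at visibly different fractions of peak brightness.

def displayed_luminance(signal: float, gamma: float) -> float:
    """Simple power-law display model: luminance fraction = signal ** gamma."""
    return signal ** gamma

mid_grey = 0.5  # normalized video signal
for gamma in (2.2, 2.4):
    lum = displayed_luminance(mid_grey, gamma)
    print(f"gamma {gamma}: mid-grey displays at {lum:.3f} of peak luminance")

# gamma 2.2 -> ~0.218 of peak, gamma 2.4 -> ~0.189 of peak, so matching
# peak brightness alone still leaves the mid-tones visibly different.
```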
Runs flawlessly for me, 1440p60 all Ultra, nice work DICE
i5 6600K at 4.4 GHz + GTX 1070
Give me your secret sauce. I have the identical setup and I wouldn't even call it close to flawless at 1080p let alone 1440p.
Dropping frequently below 60 is "flawless" for some people. They consider hitting 60 sometimes to be 60 fps.
I bet they mostly play 32-player/Frontlines games. FPS is a lot better there. The i5 6600K is quite a bit faster at multi-core stuff than the i5 I had, so maybe that is the reason.
I have an i7 2700K and a 980 Ti and it's really inconsistent for me. Anyone running anything like an i7 2600K and a 970 want to post what performance is like?
No idea. G-Sync monitor, 16 GB of DDR4, and it's installed on an SSD. I drop the FPS limit to 60 in-game, as keeping it at 144 makes the CPU usage lock at 100% and then I get occasional stutters, although a lot of the time my FPS is in the low 70s.
Lol, I use Afterburner to monitor my performance enough to know it's stable. The few drops to 55 I get are not noticeable thanks to G-Sync.
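If anyone wants to put numbers on "stable" instead of eyeballing the counter, you can dump frametimes from Afterburner/RTSS and compute the lows. A rough sketch, assuming a plain text log with one frametime in milliseconds per line (adjust the parsing to whatever your capture tool actually exports):

```python
# Rough frametime analysis: average FPS, approximate 1% low, and how often
# the game misses 60 fps. Assumes "frametimes.txt" holds one frametime in
# milliseconds per line (adapt the parsing to your capture tool's export).

def load_frametimes_ms(path: str) -> list[float]:
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def summarize(frametimes_ms: list[float]) -> None:
    fps_sorted = sorted(1000.0 / ft for ft in frametimes_ms)
    avg_fps = len(frametimes_ms) * 1000.0 / sum(frametimes_ms)
    one_pct_low = fps_sorted[max(0, int(len(fps_sorted) * 0.01) - 1)]
    missed = sum(ft > 1000.0 / 60.0 for ft in frametimes_ms)
    print(f"average FPS:      {avg_fps:.1f}")
    print(f"~1% low FPS:      {one_pct_low:.1f}")
    print(f"frames > 16.7 ms: {missed}/{len(frametimes_ms)} "
          f"({100.0 * missed / len(frametimes_ms):.1f}%)")

if __name__ == "__main__":
    summarize(load_frametimes_ms("frametimes.txt"))
```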
I'm a bit bummed my game can't hold a smooth 60 fps with my i5 4670K. Just watched some Xbox One X gameplay and it looks like such a better experience than I'm having. I get bad framerates and some stuttering, and I don't see any of that in the Xbox footage. Shouldn't my CPU still be good enough to match up? Plus my GPU blows the Xbox One X away, so I feel it should help the performance more than it does.
I just don't know what the difference is; maybe they optimized harder on the console version compared to PC? People are known to just buy more expensive PC parts to brute-force performance, so maybe that drives them to not worry about it too much.
I have an i7-6700K and from what I can see the game is definitely using more cores/threads. Well, looks like there's a game that definitely benefits from more than 4 cores now.
Ok, that's cool. Glad BFV seems to love HT at least.
I had hyperthreading off for a while because Destiny 2 performs better with it disabled for me, and when I tried the BFV beta it ran like garbage, often 40 fps. Turning hyperthreading back on nearly doubled my FPS. The game really loves threads.
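If you want to sanity-check how well the game spreads across your threads, something like this rough psutil sketch (third-party package, run it while you're in a full server) gives a per-core picture; the 5-second sample window is arbitrary:

```python
# Rough per-core load check while the game runs (needs: pip install psutil).
# Load spread fairly evenly across all logical cores suggests the game is
# using the extra threads; a few pegged cores with the rest idle suggests not.

import psutil

physical = psutil.cpu_count(logical=False)
logical = psutil.cpu_count(logical=True)
print(f"{physical} physical cores, {logical} logical cores "
      f"(SMT/HT {'on' if logical > physical else 'off'})")

# Sample per-core utilization over a 5-second window.
per_core = psutil.cpu_percent(interval=5, percpu=True)
for core, load in enumerate(per_core):
    print(f"core {core:2d}: {load:5.1f}%")
print(f"average: {sum(per_core) / len(per_core):.1f}%")
```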
This is my system.
Ryzen 5 2600X
Strix Vega 64 (A goddamn furnace)
16 gig of RAM
3440 x 1440 Freesync ultrawide
I've only played a few single-player missions so far, but performance is great! I'm happy with locking it to 60 and it pretty much stays there. Minimal dips into the 50s that I don't notice thanks to Freesync doing its job. I think we can chalk this one up as a rare win for AMD cards.
I only have the non-K version of the 6600 paired with an R9 390. I probably won't enjoy this as much.
i5 6600K
GTX 970
16 GB of RAM
I can run Medium on all settings but the frame rate varies between 45 and 70 FPS :(
My i5 is definitely showing its age now.
I'm not an AMD aficionado, but I'd love to see PCs start to approach the efficiency level of consoles regardless of GPU vendor, so I'm still for properly implemented low-level APIs. id Tech 6 is a good example.
The thing is -- and this is what I always said about low-level APIs -- even when done very well, the result is that the game will be efficient on one or two GPU architectures.
A well-made game on a higher level API, with well-made driver support, can run just a bit less efficiently, but on tons of architectures, including ones that potentially aren't out at the time when the game is released.
In a way that might make the GPU market harder for a new/underdog player to be competitive in.
I have the same CPU and an RX 480 at 1440p. You'll be fine in MP too (it never drops below 70-80 with everything on high). The 2600X and non-X are awesome for that price. What temperatures do you get on the Vega 64, though?
What's your GPU, and which X footage are you referring to? But yes, generally consoles achieve much better CPU efficiency than PCs, especially under DX11.
This is the video I watched. I would love for my game to run that consistently. Even if I turn all settings to low and drop the resolution scale all the way to 50% of 1080p so it looks terrible, it still runs about the same. Just completely CPU bound. Guess the game really just needs more cores/threads, even if they are a lot slower. That console footage is a dream compared to when I play, even if I can make the graphics look a little better on the PC.
https://youtu.be/D23lqQYnCKg
My GPU is a GTX 1080. I've always known I'd be CPU bound since I upgraded to the 1080 two years ago, but no other game is really a problem since I only play at 1080p 60 fps, sometimes using DSR. Battlefield is basically the only thing that drags my performance down to where 1080p 60 fps isn't achievable, except for something like Planet Coaster with tons of stuff in the level. But I don't feel bad when that game runs slower, of course.
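That resolution-scale test is basically the standard way to tell the two apart: if halving the render resolution barely changes FPS, the GPU wasn't the limit. Written out as a sketch, with made-up example numbers and an arbitrary 10% threshold:

```python
# Rule of thumb behind the resolution-scale test: if dropping the render
# resolution barely changes FPS, the GPU wasn't the limit -- the CPU was.
# Threshold and sample numbers are illustrative, not measured.

def likely_bottleneck(fps_native: float, fps_low_res: float,
                      threshold: float = 1.10) -> str:
    """Compare FPS at normal resolution scale vs. a much lower one."""
    if fps_low_res > fps_native * threshold:
        return "GPU-bound: lowering resolution helped noticeably"
    return "CPU-bound: lowering resolution barely helped"

# Example: 52 fps at 100% scale vs. 54 fps at 50% scale -> CPU-bound.
print(likely_bottleneck(fps_native=52, fps_low_res=54))
```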
We need to get GameGPU some RTX 2000 series cards.
Same for me. It's most probably caused by Future Frame Rendering = On.
Deactivating it in DX11 massively drops GPU load, and the game starts to dip below 60 fps for me (vs. 80-140).
Deactivating it in DX12 doesn't introduce the GPU-load-stuck-at-70% problem, but it stutters like crazy.
The consoles can't keep 60 fps either. Even the X.
It's pretty wild that consoles with essentially bad laptop CPUs can run this engine well, but much more powerful desktop CPUs have a lot of trouble.
There's only ever realistically going to be two architectures anyway.
Yeah, screw DX12 and Vulkan, let's stay on DX11 for another 7 years. Very forward-thinking...
Same, except at 1080p.
I also have flawless performance on my i5-6600K at 4.3 GHz on 64-player servers, max detail (with Shadowplay recording simultaneously). I also usually have Afterburner running. I think there are a few people here with problems with their rigs; it might be software misconfiguration, slow RAM, or something.
There are at least 5 GPU architectures currently in use by PC gamers.
I'm simply pointing out that having games optimize for a very specific GPU architecture (or a small number of architectures) is absolutely not something that is exclusively advantageous on PC.
Hardware abstraction is something that exists for a reason, and moving to higher or lower levels of abstraction introduces a complex set of tradeoffs. I miss that aspect in the general discussion of it.
Maybe BF V has the same issue that I once had with BF 1 and Battlefront II, where I needed to disable Windows 10's fullscreen optimization on the game's exe. Without doing that, I was getting insane stuttering and hitching.
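If anyone wants to script that instead of clicking through the exe's Properties dialog: as far as I know the "Disable fullscreen optimizations" checkbox just writes a compatibility-layer flag to the registry, so a sketch like this should do the same thing (Windows-only; the exe path is a placeholder, and you should double-check the flag it writes on your own machine first):

```python
# Sketch: enable "Disable fullscreen optimizations" for a game exe by writing
# the compatibility-layer entry the Properties checkbox uses (to my knowledge,
# DISABLEDXMAXIMIZEDWINDOWEDMODE under AppCompatFlags\Layers). Windows-only;
# the exe path is a placeholder -- point it at your actual install.

import winreg

EXE_PATH = r"C:\Program Files (x86)\Origin Games\Battlefield V\bfv.exe"  # placeholder
LAYERS_KEY = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, LAYERS_KEY) as key:
    winreg.SetValueEx(key, EXE_PATH, 0, winreg.REG_SZ,
                      "~ DISABLEDXMAXIMIZEDWINDOWEDMODE")

print(f"Wrote fullscreen-optimization opt-out for {EXE_PATH}")
```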
The gameplay I see in this video:
https://www.youtube.com/watch?v=D23lqQYnCKg
looks way smoother. Even when I implement a 60 fps cap or run my monitor at 60 Hz with vsync, I can't get rid of the random hitching/stutters.