Hmm. Using V-Sync in the Nvidia Control Panel (CP) halves the refresh rate, knocking it down to 50Hz instead of 100Hz. In-game V-Sync keeps it at 100Hz (off in both leads to severe tearing). Pretty sure I just have triple buffering on in CP.
> the hdr is so broken. can't disable it. works ok at 1080p and 4k, but totally broken at 1440p.
>
> EDIT: i7 4790k and a 1070ti plays the intro missions at 1080p 60fps well. I can't try 1440p because the hdr makes my entire screen pink.

Do other games play OK in 1440p and HDR? I am uncertain if HDR supports 1440p. I know HDR doesn't support custom resolutions.
> they absolutely do. I just played Resident Evil 2 and HDR works great at 1440, so imagine my surprise that Anthem is totally busted.

Yep. Destiny 2 is fine in 1440p.
> Doing a little research, it seems like this might be good, and something I could reuse if I do a new build down the road? EVGA Supernova G3 750W
>
> Sorry for the slightly off-topic question, but this is the first game that's done this to my PC.

Yes and yes. I have an EVGA GQ 850W, but G3s are even better and 750W is plenty.
> they absolutely do. I just played Resident Evil 2 and HDR works great at 1440, so imagine my surprise that Anthem is totally busted.

Well, I wonder why it's not supported in Anthem? I remember having issues with ME: Andromeda and HDR. I can't imagine it'd be a hard fix, surely? Question is, would they fix it?
> they need to fix a lot. even if i slam settings to low or medium, targeting a stable 60fps is a crapshoot. HDR is one of many problems.

Had read there are more fixes coming before full launch next week, such as an HDR toggle. Looks like Early Access players got the short end of the stick.
> I have a Ryzen 1600 at 3.5GHz and a 970, 16GB RAM. I'm struggling to get 60 fps at 1080p with everything turned off or on low. What could be the problem?

Apparently CPU intensive.
How is the game running for people with a GTX 1060 at 1080p?
edit: welp, saw that post above.
> How on earth is there not a resolution scale in a modern Frostbite game?

Thank Nvidia for that. Anthem is a pack-in for new Nvidia cards. I'm sure an Nvidia PR person or engineer told the dev team it would be a shame if an easy-to-use alternative to DLSS were easily accessible in-game, i.e. resolution scaling at 80%; they need to push more new Nvidia card sales (and in turn more sales of Anthem). In fact, 80% resolution scaling looks sharper than DLSS in existing games that use it, so Nvidia doesn't want the superior option available in-game. At least we can edit the INI file to give ourselves the 80% resolution scaling.
It's permanent when put in the INI. I'm still trying to figure out what settings to use. 80% resolution scaling looks great with a 1080 Ti and Ryzen 2700X and performs well in the initial missions, but once you get to the multiplayer it drops below 60. Either I have to go with 75% resolution scaling (not nearly as nice) or try some settings at medium - I think lighting, post-processing, and effects are the big culprits? I have texture + texture filtering at Ultra and the rest at High, although I think I could probably put environment/plants at Ultra without much, if any, effect on frame rate.
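For anyone hunting for the INI tweak mentioned above, here is a minimal sketch. The key name `GstRender.ResolutionScale` and the `ProfileOptions_profile` file location are assumptions carried over from other Frostbite titles, not confirmed for Anthem:

```ini
; Assumed location (based on other Frostbite games, not verified for Anthem):
;   Documents\BioWare\Anthem\settings\ProfileOptions_profile
; 0.800000 renders each axis at 80% of the output resolution.
GstRender.ResolutionScale 0.800000
```

At 1440p, 80% per-axis scaling means roughly a 2048x1152 internal render, i.e. 0.8 x 0.8 = 64% of the pixels, which is where the frame-time savings come from.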
I'm still rocking an OC'd GTX 1080 at 4K, and setting resolution scale to 80 works just perfect in modern Frostbite titles. Does the game reset the INI file every time we open it, or is it permanent?
> Hmm. Using V-Sync in CP halves the refresh rate, knocking it down to 50Hz instead of 100Hz. In-game V-Sync keeps it at 100Hz (off in both leads to severe tearing). Pretty sure I just have triple buffering on in CP.

The triple buffering in CP is only for OpenGL games.
> OK, need a little help here because I'm not sure what the issue is. Aside from the long loading times, the assets of the game take forever to load in once I'm actually playing. Whether I'm out in the open world or in Tarsis, floors, walls, NPCs, etc. always take a good couple of seconds to load in. Out in the world it's more irritating, since I see people shooting at things that aren't there, or I lose health and go down to enemies that haven't even loaded in yet. I've done a clean install of my graphics driver (418.91) and I've even lowered all the settings down to Low to see if that would affect it, but to no avail.
>
> i7 4790K
> RTX 2070
> 16GB RAM
> Running the game at 1440p (tried 1080p and still the same thing)
>
> Any help would be appreciated, driving myself crazy.

Are you running this on an HDD or SSD?
Ah, OK, an SSD might be better for this game.
Ahh, OK. Wonder why setting V-Sync to on in CP reduced the refresh rate to 50Hz then.
> So is there no way to disable HDR in fullscreen mode? The HDR looks nice at times, I think, but it's practically making my eyes bleed.

The toggle will be in on 2/22.
> Ahh, OK. Wonder why setting V-Sync to on in CP reduced the refresh rate to 50Hz then.

That's how double-buffered V-Sync works: if the game can't maintain fps above your refresh rate, it drops to the next integer divisor of it (1/2, then 1/3, and so on). You need to use adaptive V-Sync if you want variable fps below 100, but there will possibly still be tearing when it drops. There used to be really handy tools to easily force triple-buffered V-Sync in D3D games, but it's now a bit of a crapshoot.
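The divisor behaviour described above can be sketched with a toy calculation (a simplified model of double-buffered V-Sync, not how the driver actually schedules frames; the 100Hz refresh rate and the frame times below are just illustrative numbers):

```python
import math

REFRESH_HZ = 100
INTERVAL_MS = 1000 / REFRESH_HZ  # one vblank every 10 ms at 100 Hz

def effective_fps(render_ms):
    """Effective framerate under double-buffered V-Sync: a finished frame
    can only be displayed on a vblank, so a frame that takes even slightly
    longer than one interval has to wait a whole extra interval."""
    vblanks_per_frame = math.ceil(render_ms / INTERVAL_MS)
    return REFRESH_HZ / vblanks_per_frame

print(effective_fps(9))     # renders faster than refresh -> 100.0 fps
print(effective_fps(10.5))  # just misses a vblank -> 50.0 fps (the halving above)
print(effective_fps(21))    # two missed vblanks -> ~33.3 fps
```

This is why a game hovering just under 100 fps snaps straight to 50 with plain double-buffered V-Sync instead of degrading gradually.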
> That's how double-buffered V-Sync works: if the game can't maintain fps above your refresh rate, it drops to the next integer divisor of it (1/2, then 1/3, and so on). You need to use adaptive V-Sync if you want variable fps below 100, but there will possibly still be tearing when it drops.

Well, weird thing. I have a G-Sync monitor, and the in-game V-Sync works fine, but using the CP V-Sync halved it. And I've used the CP V-Sync for other games fine. I'm just wondering if there is a setting in the CP that's causing it, or an issue with Anthem? With in-game V-Sync I can get near 100fps in Tarsis and up to 80+ in the world. I know what you're saying, but I don't know why in-game V-Sync gives me my refresh rate of 100Hz and CP V-Sync doesn't. I'll try again; maybe I missed something.
You can try playing in borderless mode and capping frame rate in RTSS. Borderless will force triple buffering through Windows, but it might introduce frame pacing issues and/or decreased performance.
This is why VRR solutions (G-Sync/FreeSync) are so highly regarded.
> Well, weird thing. I have a G-Sync monitor, and the in-game V-Sync works fine, but using the CP V-Sync halved it. And I've used the CP V-Sync for other games fine. I'm just wondering if there is a setting in the CP that's causing it, or an issue with Anthem?

You shouldn't be using V-Sync at all if you have G-Sync. It doesn't sound like you have G-Sync enabled.
> even on high this game struggles really hard to maintain 60 fps; in open zones with combat the framerate goes down into the 30s...
>
> do i really have to play this game on medium for an acceptable framerate? the performance for me is the same as in the demo, unfortunately. also, why can't i disable this game's washed-out hdr with a proper in-game option?

I'm on med/low/off at 1440p with a 1070 and still getting drops to 35-ish during really heavy fights :(
> How do I turn off HDR, or rather, how the hell do I keep it from turning back on?

Their non-existent HDR option is ridiculous. It just overrides my Windows setting (actually, other games do that too). HDR also seems really badly implemented (if it's implemented at all); it just looks washed out. I don't want to be forced to play in borderless mode because of that.
> I'm on med/low/off at 1440p with a 1070 and still getting drops to 35-ish during really heavy fights :(

Yeah, I'll stop here and wait for the performance patch later this week.
> Yeah, turning off V-Sync in-game and enabling it in CP along with G-Sync works fine for me. Every time I install new drivers the CP resets, so it's always a good idea to check on that before you start your game.

I don't even enable V-Sync in CP. You don't need to anymore if you play in exclusive fullscreen.
> You shouldn't be using V-Sync at all if you have G-Sync. It doesn't sound like you have G-Sync enabled.

I'll check. But still, all the same: in-game V-Sync does not reduce my refresh rate. CP V-Sync does.
> I'll check. But still, all the same: in-game V-Sync does not reduce my refresh rate. CP V-Sync does.

In-game V-Sync is triple-buffered or adaptive, then. Likely triple-buffered, since that's a native feature of the engine. Nvidia CP V-Sync is standard, old-school double-buffered.