Oct 27, 2017
2,272
Pittsburgh
What are your specs again, and driver version?

Something else I forgot that you could try.

Find your ACOrigins.exe
Right Click
Properties
Compatibility
Check "Disable Fullscreen Optimizations"
Apply
Click Change settings for all users and do it again just to be safe.

I actually just did that before I shut down for the night; it didn't help, sadly.

Running a 4770k with a 1080ti and the newest drivers. I actually rolled one driver back as well; same thing.

I've seen people mention fast sync; would that perhaps change anything? Or possibly turning on vsync in the Nvidia control panel and letting it control it?
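Side note on the quoted "Disable Fullscreen Optimizations" steps: as far as I know, that checkbox just writes a per-exe compatibility flag to the registry, so it can also be scripted. A rough Python sketch under that assumption - the flag name (DISABLEDXMAXIMIZEDWINDOWEDMODE) and the install path are both things you'd want to verify on your own machine:

Code:
# Rough sketch: set "Disable fullscreen optimizations" for ACOrigins.exe via the
# AppCompatFlags\Layers registry key. Flag name and path are assumptions - verify first.
import winreg

EXE_PATH = r"C:\Program Files (x86)\Ubisoft\Assassin's Creed Origins\ACOrigins.exe"  # adjust to your install
LAYERS_KEY = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"

def disable_fullscreen_optimizations(root=winreg.HKEY_CURRENT_USER):
    # HKEY_CURRENT_USER covers the current user; the "all users" checkbox writes the
    # same value under HKEY_LOCAL_MACHINE (which needs an elevated prompt).
    with winreg.CreateKey(root, LAYERS_KEY) as key:
        winreg.SetValueEx(key, EXE_PATH, 0, winreg.REG_SZ, "~ DISABLEDXMAXIMIZEDWINDOWEDMODE")

disable_fullscreen_optimizations()

Ticking the box in the Properties dialog does the same thing; this is only useful if you want to apply it to several exes at once.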
 

MrBS

Member
Oct 27, 2017
6,236
7700K & 1080ti at 4K here, frame rate remains high most of the time but fluctuates wildly. I refuse to lower any setting though, looks too good! Loving the game, frequently just floored by the visuals.
 

criesofthepast

Crash Test Dummy
Member
Oct 25, 2017
1,056
Is this game more or less demanding than The Division? Because that game looks and runs great on my PC.
 

flipswitch

Member
Oct 25, 2017
3,959
7700K & 1080ti at 4K here, frame rate remains high most of the time but fluctuates wildly. I refuse to lower any setting though, looks too good! Loving the game, frequently just floored by the visuals.


Does lowering any settings make much of a difference in performance? I tried lowering AA and shadows and couldn't tell.
 

Ion Stream

Member
Oct 31, 2017
398
Is an i7 3770K OC'd to 4.4GHz going to run this OK? Fairly old CPU. Half thinking I should just grab the PS4 version in a sale and be done with it.

Got a GTX 970 here btw.
 

Deleted member 10601

User requested account closure
Banned
Oct 27, 2017
348
Is an i7 3770K OC'd to 4.4GHz going to run this OK? Fairly old CPU. Half thinking I should just grab the PS4 version in a sale and be done with it.

Got a GTX 970 here btw.

This is from the Ubisoft forum. If you care about 60 fps gaming, your i7 may not be enough.

[Attached image: pc.png - benchmark screenshot from the Ubisoft forum]
 

SeñorPig

Member
Oct 27, 2017
16
Is an i7 3770K OC'd to 4.4GHz going to run this OK? Fairly old CPU. Half thinking I should just grab the PS4 version in a sale and be done with it.

Got a GTX 970 here btw.
I have the same processor with a 1080. I play at 3440x1440 ultra high with a 45 FPS lock. While 60 FPS would be better, 45 is not bad.

My laptop has a 980M, and I can play on ultra high with a 45 FPS lock at 1080p.
 

JordanKZ

Member
Oct 27, 2017
226
I seem to hold a pretty steady 60-70fps on an i7 4770K and Titan X Pascal. Alexandria and other complex areas tank the frame rate down to the mid-40s, though. G-Sync helps, but man, it's another clusterfuck from Ubisoft in terms of optimisation...

Looking at the stats and seeing my CPU pinned at 100% on all cores in some areas is insane.
 

ussjtrunks

Member
Oct 25, 2017
1,692
The cutscene stutter is the only thing stopping me from playing the game atm. Hopefully they fix it before I finish South Park.
 

bargeparty

Member
Oct 30, 2017
504
I actually just did that before I shut down for the night; it didn't help, sadly.

Running a 4770k with a 1080ti and the newest drivers. I actually rolled one driver back as well; same thing.

I've seen people mention fast sync; would that perhaps change anything? Or possibly turning on vsync in the Nvidia control panel and letting it control it?

I went back through your old posts... it must be something that happened as a result of the Fall Update, because you said performance was good before that, and I just watched a video of a person with your specs running native 4k 60fps. It wasn't in a super demanding area, but still. Their video was uploaded on Oct 30th.

You could try rolling back the update: http://www.thewindowsclub.com/rollback-uninstall-windows-10-creators-update

I'll put a few other links out here. I don't know what research you guys have done on your own, and I don't have the Fall Update, so I can't really say.

this one includes other links to discussions:
https://www.reddit.com/r/Windows10/comments/79utn0/fall_creators_update_build_1709_causing/

this person says they switched their power mode:
https://www.reddit.com/r/Windows10/...ws_10_feature_update_from_1017_seems_to_have/
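On the power-mode suggestion in that last link: if anyone wants to try it without digging through the Settings UI, the stock powercfg tool can switch plans from a script. A small Python sketch using the documented SCHEME_MIN alias for the High performance plan (Windows-only, obviously):

Code:
# Sketch: switch Windows to the built-in High performance power plan via powercfg.
# SCHEME_MIN is the documented alias for High performance ("minimum power savings").
import subprocess

def set_high_performance():
    subprocess.run(["powercfg", "/setactive", "SCHEME_MIN"], check=True)

def show_active_plan():
    subprocess.run(["powercfg", "/getactivescheme"], check=True)

set_high_performance()
show_active_plan()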
 

JD3Nine

The Fallen
Nov 6, 2017
1,866
Texas, United States
It's not so bad for me on a 6700k and 1070. Mostly 60 with everything maxed out besides shadows, AO and AA. CPU usage is high but not 100%. That cutscene stutter is nuts though. I don't get it.
 

TitanicFall

Member
Nov 12, 2017
8,277
They've been asking people to post whether they play with an HDR-capable TV or monitor, along with model numbers. I don't think they'd bother collecting the info if they weren't at least considering adding HDR support. Of course, it could just be for future titles.

Why even ask though and not just include it like they did with the consoles? You don't need to collect info on every TV or monitor in order to include HDR support. They didn't do that for the PS4 and Xbox One. It should just work.
 

scitek

Member
Oct 27, 2017
10,077
Why even ask though and not just include it like they did with the consoles? You don't need to collect info on every TV or monitor in order to include HDR support. They didn't do that for the PS4 and Xbox One. It should just work.
Honestly, I think it is coming, and I also think they were just ignorant of the fact that HDR on PC is a big deal now. It's pushed so heavily by the console manufacturers as a feature that of course they knew to push it there, but the "all supported platforms" blurb that's gotten them into hot water, along with the reaction that followed, just comes across as them being caught by surprise. How that could be the case when a thousand-plus people worked on the game, I don't really understand, but I think that's what happened. They should definitely take a step back and figure out why it happened, because it doesn't look good.
 

GrrImAFridge

ONE THOUSAND DOLLARYDOOS
Member
Oct 25, 2017
9,675
Western Australia
When's the next patch dropping?

Ubi decided against releasing v1.04 on the PC for some reason. Judging from v1.03, the PC version will be updated to v1.05 at ~midday UTC on the same day the console versions are updated, which should be later this week assuming said patch is currently going through cert (the "TU_5" branch on Steam hasn't been touched since Saturday).
 

Yibby

Member
Nov 10, 2017
1,780
Is this game more or less demanding than The Division? Because that game looks and runs great on my PC.

I think it's more demanding in certain areas. I get similar performance in Ghost Recon Wildlands, so I would compare it to that or Watch Dogs 2. The Division holds 60fps more often for me than AC Origins.
 

Klean

Banned
Nov 3, 2017
641
How did they manage to fuck up the 30fps cap by actually capping at 31fps instead?

Good lord.
 

catboy

Banned
Oct 25, 2017
4,322
Ubi decided against releasing v1.04 on the PC for some reason. Judging from v1.03, the PC version will be updated to v1.05 at ~midday UTC on the same day the console versions are updated, which should be later this week assuming said patch is currently going through cert (the "TU_5" branch on Steam hasn't been touched since Saturday).
I hope this doesn't introduce the weird LoD bugs seen on console.
 

dgrdsv

Member
Oct 25, 2017
11,885
Why even ask though and not just include it like they did with the consoles? You don't need to collect info on every TV or monitor in order to include HDR support. They didn't do that for the PS4 and Xbox One. It should just work.
I don't think it's that easy. I have a feeling that consoles are doing some system level gamma adjustment depending on a TV model which is absent from PC right now and thus should be done by the application. It's just a thought I had seeing how people are reporting that consoles work fine with their HDR TVs while PCs aren't.
 

Shocchiz

Member
Nov 7, 2017
577
I don't think it's that easy. I have a feeling that consoles are doing some system level gamma adjustment depending on a TV model which is absent from PC right now and thus should be done by the application. It's just a thought I had seeing how people are reporting that consoles work fine with their HDR TVs while PCs aren't.
How could a console possibly know the screen it's hooked to?
The PC HDR mess is Windows' fault: while in HDR mode the OS sends the wrong signal (it worked perfectly fine when Nvidia handled it).
BUT if you have an HDR TV (it doesn't work with monitors or projectors) and force the PC input to console/game mode, the HDR picture is fine on PC too.
If you don't, HDR is wrong on both colors and contrast.
Tested with my Samsung KS8000; it works.
See here
https://forums.geforce.com/default/topic/1003426/geforce-1000-series/hdr-problem/9/

So HDR on PC is something they could support; they just decided not to, and that's a shame, as I was forced to buy the X version (HDR is 1,000,000 times better than SDR).
 

dgrdsv

Member
Oct 25, 2017
11,885
How could a console possibly know the screen it's hooked to?
Via EDID, which provides a unique display ID to the output device. There aren't a lot of HDR TVs out there, and even fewer if you count different sizes of the same TV as one model.

The PC HDR mess is Windows' fault: while in HDR mode the OS sends the wrong signal (it worked perfectly fine when Nvidia handled it).
It's not; it sends precisely the signal you'd expect to work per the UHD Premium specs. However, many (if not most) "HDR" displays these days aren't actually UHD Premium compliant when it comes to HDR support.

BUT if you have an HDR TV (it doesn't work with monitors or projectors) and force the PC input to console/game mode, the HDR picture is fine on PC too.
If you don't, HDR is wrong on both colors and contrast.
Tested with my Samsung KS8000; it works.
This is exactly the type of tweaking I'm talking about, although in this case it seems to be TV-side tweaking.
I'm assuming you're on Win10 1703 or 1709 with an NV card?
You could try this:
- make sure that you've selected "Use default color settings" in NVCPL's "Change resolution" section
- turn Windows HDR option on
- launch the game you're having issues with and if it doesn't work in exclusive fullscreen try switching to borderless windowed
HDR should work in this case, but the results may differ from what you get on consoles due to the tweaking mentioned above, on either the console or the TV side.
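For anyone curious about the EDID point at the top of this post: the ID in question is carried in the first handful of EDID bytes (a three-letter manufacturer code plus a 16-bit product code). A rough Python sketch of how an application could read and decode it on Windows - the registry path and byte offsets follow the standard EDID layout, but treat it as illustrative rather than production code:

Code:
# Illustrative only: read monitor EDIDs from the registry and decode the manufacturer
# ID (three packed 5-bit letters in bytes 8-9) and product code (bytes 10-11).
import winreg

DISPLAY_ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def decode_edid(edid: bytes):
    mfg = (edid[8] << 8) | edid[9]               # big-endian 16-bit field
    letters = "".join(chr(((mfg >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    product = edid[10] | (edid[11] << 8)         # little-endian product code
    return letters, product

def enumerate_edids():
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_ROOT) as root:
        for i in range(winreg.QueryInfoKey(root)[0]):
            model = winreg.EnumKey(root, i)
            with winreg.OpenKey(root, model) as model_key:
                for j in range(winreg.QueryInfoKey(model_key)[0]):
                    instance = winreg.EnumKey(model_key, j)
                    try:
                        with winreg.OpenKey(model_key, instance + r"\Device Parameters") as params:
                            edid, _ = winreg.QueryValueEx(params, "EDID")
                            yield model, decode_edid(bytes(edid))
                    except OSError:
                        continue  # instance without an EDID blob

for model, (mfg, product) in enumerate_edids():
    print(model, mfg, hex(product))

Whether consoles actually key tone-mapping tweaks off this ID is dgrdsv's speculation above; the sketch just shows that the identifier itself is trivially available to the output device.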
 

datamage

Member
Oct 25, 2017
913
Via EDID, which provides a unique display ID to the output device. There aren't a lot of HDR TVs out there, and even fewer if you count different sizes of the same TV as one model.

I seriously doubt devs, or even the console, are using the EDID information to compensate in their HDR output. That doesn't even sound practical.

I had HDR properly configured in Windows; it doesn't detract from the fact that HDR on PC is currently a mess. For starters, I don't want my desktop washed out when not in a game, nor do I wish to toggle HDR manually before I launch a game. Forza 7 requires HDR to be manually turned on before launch; however, something like Destiny 2 will toggle HDR on its own. Then I tried the Injustice 2 beta (yes, a beta, mind you), and the output was extremely washed out regardless of whether it turned HDR on automatically (which it does) or I turned it on manually. HDR on PC right now is inconsistent and an afterthought.

As a side note, HDR functioned as one would expect prior to the Creators Update for Win10. Whether MS or NVIDIA is to blame here, I do not know. However, HDR should just work.
 

dgrdsv

Member
Oct 25, 2017
11,885
For starters, I don't want my desktop washed out when not in a game
That's why you want system-level gamma calibration for stuff that's shown in "SDR" space. Your desktop is washed out because Windows presumes that your HDR display will show it properly at 100 cd/m2 - which it obviously doesn't.

nor do I wish to toggle HDR manually before I launch a game
This design flaw is related to the first one. In Microsoft's perfect world you wouldn't need to turn off HDR at all, since your SDR desktop would look fine on your ideal HDR display.

however, something like Destiny 2 will toggle HDR on its own
This seems like a hack on the part of the Destiny 2 devs, tbh.

Then I tried the Injustice 2 beta (yes, a beta, mind you), and the output was extremely washed out regardless of whether it turned HDR on automatically (which it does) or I turned it on manually. HDR on PC right now is inconsistent and an afterthought.
...Or it's just not tweaked to the display by the system that outputs it, which to me seems the more likely option here, because a) it only happens in some games and b) the same games don't have any HDR issues on consoles.

As a side note, HDR functioned as one would expect prior to the Creators Update for Win10.
Prior to the Creators Update, all HDR games used either the NV or AMD HDR APIs. Said APIs could have done some tweaking on their own. It's also quite possible that NV and AMD were helping the devs with proper HDR implementations back then. Starting with 1703, though, these APIs are blocked by MS and the generic Windows HDR solution is put in their place. It's pretty easy to see who's to blame here.
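To put a number on the 100 cd/m2 assumption a few quotes up: under the ST 2084 (PQ) transfer function used for HDR10 signalling, a 100-nit white only sits around half of the code range, so how "washed out" SDR content ends up looking depends entirely on how the display tone-maps that region. A back-of-the-envelope sketch with the standard PQ constants (nothing here is specific to Windows or to this game):

Code:
# Back-of-the-envelope: SMPTE ST 2084 (PQ) encoding of a luminance value in nits.
# Shows that 100-nit SDR white lands at roughly 50% of the 10-bit HDR10 signal range.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for nits in (100, 300, 1000):
    signal = pq_encode(nits)
    print(f"{nits:>5} nits -> PQ {signal:.3f} (~10-bit code {round(signal * 1023)})")
# 100 nits comes out near PQ 0.51, i.e. around code 520 of 1023.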
 

datamage

Member
Oct 25, 2017
913
dgrdsv Herein lies the problem. Regardless of what is taking place behind the scenes, or what MS is trying to accomplish, to the end user, HDR is a clusterfuck on PC. The fact that some devs have no trouble implementing it on consoles, but won't bother on PC demonstrates this. Are you not seeing a problem with this?

Anyway, I don't want to deviate from the performance thread any further than I already have.
 

Shocchiz

Member
Nov 7, 2017
577
Via EDID, which provides a unique display ID to the output device. There aren't a lot of HDR TVs out there, and even fewer if you count different sizes of the same TV as one model.


It's not; it sends precisely the signal you'd expect to work per the UHD Premium specs. However, many (if not most) "HDR" displays these days aren't actually UHD Premium compliant when it comes to HDR support.


This is exactly the type of tweaking I'm talking about, although in this case it seems to be TV-side tweaking.
I'm assuming you're on Win10 1703 or 1709 with an NV card?
You could try this:
- make sure that you've selected "Use default color settings" in NVCPL's "Change resolution" section
- turn Windows HDR option on
- launch the game you're having issues with and if it doesn't work in exclusive fullscreen try switching to borderless windowed
HDR should work in this case, but the results may differ from what you get on consoles due to the tweaking mentioned above, on either the console or the TV side.
Well, the KS8000 sure is Ultra HD Premium certified; isn't that enough for Windows HDR?
And the EDID thing is something very unlikely and not needed at all.
Literally every screen on the planet worked with the Nvidia implementation, and no screen works now, so something is wrong.
Not blaming Windows and instead blaming every single monitor and TV is a bit much.
Thank God the game mode trick solves the problem on the KS8000 (it's literally the only way to get the same, correct output as the One X; Forza 7 is identical).
But my Epson has no game setting, so I'm stuck. I also tried the HDFury Linker; nothing.

EDIT: for reference, while using the game mode trick the desktop is NOT washed out.
 

Dmax3901

Member
Oct 25, 2017
7,899
i5 4690K and 1080ti at 1440p, dropping into the forties in Alexandria. Not the end of the world, but it'd sure be swell if I saw improvements. Am I likely bottlenecked by my CPU?

I keep getting an IRQL_NOT_LESS_OR_EQUAL blue screen of death that I can't pinpoint a cause for. I get it maybe two or three times a week, at random times.
 

dgrdsv

Member
Oct 25, 2017
11,885
The fact that some devs have no trouble implementing it on consoles, but won't bother on PC demonstrates this.
This is 99% an issue of market demand, really. Devs avoid adding the feature to PC versions simply because most PC gamers don't have HDR monitors; the number of people gaming on HDR 4K TVs with their PCs is very small.
The remaining 1% is the technical issues surrounding this. From a game dev's point of view, if HDR is already done for consoles, adding it to the PC version is very easy. They don't do it because, well, nobody asks for it.

Well, the KS8000 sure is Ultra HD Premium certified; isn't that enough for Windows HDR?
Who knows? It should be, as it has the certification, but then again it's just edge-lit and Samsung doesn't even specify things like peak brightness and contrast on its spec page.

And the EDID thing is something very unlikely and not needed at all.
On the contrary, it's very much needed and very likely. It would be even better if HDR displays told the output device their real specs, so the output device could adjust to them automatically.
 

Rewind

Member
Oct 27, 2017
569
i5 4690K and 1080ti at 1440p, dropping into the forties in Alexandria. Not the end of the world, but it'd sure be swell if I saw improvements. Am I likely bottlenecked by my CPU?

I keep getting an IRQL_NOT_LESS_OR_EQUAL blue screen of death that I can't pinpoint a cause for. I get it maybe two or three times a week, at random times.
Probably CPU bound. Blue screen is probably from an unstable overclock on the cpu (assuming you have it overclocked).
 

Dmax3901

Member
Oct 25, 2017
7,899
Probably CPU bound. Blue screen is probably from an unstable overclock on the cpu (assuming you have it overclocked).
My limited research into that specific error told me it was likely driver related though.

If it is CPU related, how would I go about making the overclock more stable?
 

criesofthepast

Crash Test Dummy
Member
Oct 25, 2017
1,056
Stupid question, but pressing F1 in game shows me some stats and I was wondering what the numbers mean. I know FPS, but what do the numbers beside GPU and CPU mean? What is the scale? Is it usage %? What are the max and minimum, 0-100?
 

Deleted member 30458

User requested account closure
Banned
Nov 3, 2017
205
i7 6700K here at 3440x1440.
I ended up with a 45fps lock, AA low, everything else maxed out, and vsync through the Nvidia control panel. It's not perfectly steady, but I'm not an fps nazi so it's alright for now. I can't avoid the occasional big dips (fire? loading cities?) or the big differences between areas (rural vs urban, mainly).
CPU at 100%, but I didn't benchmark it precisely; I don't want to spend hours figuring out whether I can gain 5fps here and there and what the best compromise is.

It's more or less what I got in Unity and Syndicate.
 

Ether_Snake

Banned
Oct 29, 2017
11,306
Everyone should use dynamic resolution; just set the target framerate at which it activates. You can barely notice it.
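For anyone wondering what that option is actually doing: conceptually, dynamic resolution is just a feedback loop on frame time. This is not Ubisoft's implementation, only a hypothetical minimal sketch of the idea:

Code:
# Hypothetical sketch of a dynamic-resolution controller: nudge the render scale
# down when frames run over budget, and back up when there is headroom.
def update_render_scale(scale: float, frame_ms: float, target_fps: float = 60.0,
                        step: float = 0.05, lo: float = 0.5, hi: float = 1.0) -> float:
    target_ms = 1000.0 / target_fps
    if frame_ms > target_ms * 1.05:      # over budget: drop resolution a notch
        scale -= step
    elif frame_ms < target_ms * 0.90:    # comfortable headroom: creep back up
        scale += step
    return max(lo, min(hi, scale))

# Example: a heavy city-scene spike followed by recovery.
scale = 1.0
for frame_ms in (16.0, 22.0, 21.0, 17.5, 15.0, 14.0):
    scale = update_render_scale(scale, frame_ms)
    print(f"{frame_ms:4.1f} ms -> render scale {scale:.2f}")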
 

criesofthepast

Crash Test Dummy
Member
Oct 25, 2017
1,056
I have no idea what the benchmark results mean. All I know is I just started playing the game on my two-year-old laptop (960M 4GB, i7-4720HQ), running things on high and medium at 1920x1080 and getting 30-40fps average unlocked, which I am good with. And the game looks really good. I am running an old Nvidia driver that I keep getting warned about on game start-up. I am a super noob, and if driver updates don't come through Windows Update then I have never updated my driver. (382.5 is my driver, and I have no idea when it was released.)

Thing is, I do not understand what the GPU and CPU colors and percentages mean. I would assume green is low usage and red is high, with yellow in the middle, but the percentages make no sense to me.

Using the in-game tools, my CPU shows an average of 15ms and is mostly green, with a yellow percentage of 33% and green of 67%.

GPU is red at 36% and yellow at 64%.

Can someone please explain those stats to me? Would really appreciate it.
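I can't say for certain what the colour split in Ubi's overlay represents, but the millisecond figure is a frame time, and frame times convert straight to FPS. A quick sketch of the arithmetic, using the 15 ms figure from the post above:

Code:
# Frame time (ms per frame) and FPS are reciprocals of each other.
def ms_to_fps(frame_ms: float) -> float:
    return 1000.0 / frame_ms

def fps_to_ms(fps: float) -> float:
    return 1000.0 / fps

print(f"15.0 ms/frame -> {ms_to_fps(15.0):.1f} fps")                # ~66.7 fps
print(f"33.3 ms/frame -> {ms_to_fps(33.3):.1f} fps")                # ~30 fps
print(f"60 fps target -> {fps_to_ms(60):.1f} ms budget per frame")  # ~16.7 ms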
 

Carian Knight

Member
Oct 27, 2017
1,986
Turkey
I have a 4690K @ 4.4GHz and a GTX 1080, and I'm using a 4K monitor. Can my PC do a better job than the PS4 Pro, or should I just buy it for the PS4 Pro?
 

bargeparty

Member
Oct 30, 2017
504
Interesting. Is there anything to explain why this would start happening after a long period of there being no issues?

Does your BSOD have any other info in it? Sometimes there is a file name at the bottom.

That BSOD error is very common, even in computers that have never been overclocked. My first thought would be driver related, but it's also easy to back down any OC a bit and see what happens.

If it is OC-related, it could be that AC:O is pushing harder than other games you've used, so the system never reached the point of instability before.
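If the BSODs keep coming, the kernel crash dumps usually name the offending driver once opened in a dump viewer (WinDbg, BlueScreenView). A tiny Python sketch just to locate the most recent ones - C:\Windows\Minidump is the default location, adjust if your system writes dumps elsewhere:

Code:
# List recent kernel minidumps so they can be opened in WinDbg or BlueScreenView.
# C:\Windows\Minidump is the default location for small dumps.
from pathlib import Path
from datetime import datetime

def recent_minidumps(folder: str = r"C:\Windows\Minidump", limit: int = 5):
    dumps = sorted(Path(folder).glob("*.dmp"), key=lambda p: p.stat().st_mtime, reverse=True)
    return dumps[:limit]

for dump in recent_minidumps():
    stamp = datetime.fromtimestamp(dump.stat().st_mtime)
    print(f"{stamp:%Y-%m-%d %H:%M}  {dump.name}  {dump.stat().st_size // 1024} KB")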