From my experience, it doesn't matter what FPS you cap it on. It's always at 100% usage.
But that's not true for everyone.
What are your specs again, and driver version?
Something else I forgot that you could try:
Find your ACOrigins.exe
Right Click
Properties
Compatibility
Check "Disable Fullscreen Optimizations"
Apply
Click Change settings for all users and do it again just to be safe.
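If you'd rather script the checkbox above (or apply it to several machines), the same setting lives in the registry as a compatibility-layer flag. A Windows-only sketch; the install path is an assumption, so point it at wherever your ACOrigins.exe actually is:

```python
# Windows-only sketch: sets the "Disable fullscreen optimizations" compatibility
# flag in the registry, which is what the checkbox writes under the hood.
# The install path below is an assumption -- change it to your own ACOrigins.exe.
import winreg

EXE = r"C:\Program Files (x86)\Ubisoft\Assassin's Creed Origins\ACOrigins.exe"
LAYERS = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, LAYERS) as key:  # current user only
    winreg.SetValueEx(key, EXE, 0, winreg.REG_SZ, "~ DISABLEDXMAXIMIZEDWINDOWEDMODE")

# "Change settings for all users" writes the same value under HKEY_LOCAL_MACHINE
# instead (requires an elevated prompt).
```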
So is HDR a definite no for the PC version?
My PS4 Pro and PC would run this in the same ballpark (I'd have to lock it to 30 at 1440p on my PC), and I prefer PC, but HDR might tip me over to PS4.
7700K & 1080ti at 4K here, frame rate remains high most of the time but fluctuates wildly. I refuse to lower any setting though, looks too good! Loving the game, frequently just floored by the visuals.
Sounds like a no to me: https://forums.ubi.com/showthread.p...DR-Support?p=13077711&viewfull=1#post13077711
Sounds like it will come at some point.
Is an i7 3770K OC'd to 4.4GHz going to run this OK? Fairly old CPU. Half thinking I should just grab the PS4 version in a sale and be done with it.
Got a GTX 970 here btw.
I have the same processor with 1080. I play at 3440x1440 ultra high with 45 FPS lock. While 60 FPS would be better, 45 is not bad.
My laptop is 980m, and I can play on ultra high with 45 FPS lock at 1080p.
I actually just did that before I shut down for the night, didn't help sadly.
Running a 4770K with a 1080 Ti and the newest drivers. I also rolled back one driver version; same thing.
I've seen people mention fast sync; would that perhaps change anything? Or turning on vsync in the NVIDIA Control Panel and letting it control it?
They've been asking people to post whether they play with an HDR-capable TV or monitor, along with model numbers. I don't think they'd bother collecting the info if they weren't at least considering adding HDR support. Of course, it could just be for future titles.
> Why even ask though and not just include it like they did with the consoles? You don't need to collect info on every TV or monitor in order to include HDR support. They didn't do that for the PS4 and Xbox One. It should just work.
Honestly, I think it is coming, and I also think they were simply ignorant of the fact that HDR on PC is a big deal now. It's pushed so heavily by the console manufacturers as a feature, so of course they knew to push it there, but the "all supported platforms" blurb that's gotten them into hot water, along with the reaction that followed, just comes across as them being surprised. How that could be the case when a thousand-plus people worked on the game, I don't really understand, but I think that's what happened. They should also take a step back and figure out why it happened, because it definitely doesn't look good.
Is this game more or less demanding than The Division? Because that game looks and runs great on my PC.
> Ubi decided against releasing v1.04 on the PC for some reason. Judging from v1.03, the PC version will be updated to v1.05 at ~midday UTC on the same day the console versions are updated, which should be later this week assuming said patch is currently going through cert (the "TU_5" branch on Steam hasn't been touched since Saturday).
I hope this doesn't introduce the weird LoD bugs on console.
> Why even ask though and not just include it like they did with the consoles? You don't need to collect info on every TV or monitor in order to include HDR support. They didn't do that for the PS4 and Xbox One. It should just work.
I don't think it's that easy. I have a feeling that consoles are doing some system-level gamma adjustment depending on the TV model, which is absent from PC right now and thus would have to be done by the application. It's just a thought I had, seeing how people report that consoles work fine with their HDR TVs while PCs don't.
> I don't think it's that easy. I have a feeling that consoles are doing some system-level gamma adjustment depending on the TV model, which is absent from PC right now and thus would have to be done by the application.
How could a console possibly know the screen it's hooked to?
Via EDID, which provides a unique display ID to the output device. There aren't a lot of HDR TVs out there, even fewer if you count different sizes of the same TV as one.
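For the curious, the display ID really is sitting in the first handful of EDID bytes: a 3-letter PNP manufacturer code packed into two bytes, plus a 16-bit product code. A minimal decoding sketch (the sample bytes are illustrative):

```python
HEADER = b"\x00\xff\xff\xff\xff\xff\xff\x00"  # fixed 8-byte EDID header

def parse_edid_id(edid: bytes):
    """Decode the PNP manufacturer ID (bytes 8-9, big-endian, three 5-bit
    letters) and the product code (bytes 10-11, little-endian)."""
    assert edid[:8] == HEADER, "not an EDID block"
    raw = (edid[8] << 8) | edid[9]
    letters = "".join(chr(ord("A") - 1 + ((raw >> s) & 0x1F)) for s in (10, 5, 0))
    product = edid[10] | (edid[11] << 8)
    return letters, product

# Illustrative sample: "SAM" (Samsung's PNP code) plus an arbitrary product code.
sample = HEADER + bytes([0x4C, 0x2D, 0xF0, 0x0D])
print(parse_edid_id(sample))  # -> ('SAM', 3568)
```

In practice the full 128-byte block also carries serial number, supported timings, and (in extension blocks) HDR metadata, which is exactly what a console or TV-aware tweak would key off.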
> PC HDR mess is Windows' fault: while in HDR mode the OS sends the wrong signal (it worked perfectly fine when NVIDIA handled it).
It's not; it sends precisely the signal you'd expect to work from the UHD Premium specs. However, many (if not most) "HDR" displays these days aren't actually UHD Premium compliant when it comes to HDR support.
> BUT if you have an HDR TV (it doesn't work with monitors or projectors) and force the PC input to console/game mode, the HDR picture is fine on PC too. If you don't, HDR is wrong on both colors and contrast. Tested with my Samsung KS8000; it works.
This is exactly the type of tweaking I'm talking about, although in this case it seems to be TV-side tweaking.
> For starters, I don't want my desktop washed out when not in a game.
That's why you want system-level gamma calibration for content shown in "SDR" space. Your desktop is washed out because Windows presumes that your HDR display will show it properly at 100 cd/m², which it obviously doesn't.
This design flaw is related to the first one. In Microsoft's perfect world you don't need to turn HDR off at all, since your SDR desktop would look fine on your ideal HDR display.
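The 100 cd/m² assumption can be made concrete with the PQ (SMPTE ST 2084) transfer function that HDR10 signaling uses. A minimal sketch with the standard published constants:

```python
def pq_encode(nits: float) -> float:
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in cd/m2 -> [0, 1] signal."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = nits / 10000.0            # PQ is specified up to 10,000 cd/m2
    y_m1 = y ** m1
    return ((c1 + c2 * y_m1) / (1 + c3 * y_m1)) ** m2

# SDR paper white at 100 cd/m2 sits near the middle of the PQ signal range:
print(round(pq_encode(100), 3))  # ~0.508
```

So when Windows pins the SDR desktop at 100 cd/m², it is emitting roughly a half-range PQ signal; a display whose tone mapping expects something else will render it dim or washed out.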
This seems like a hack on the part of the Destiny 2 devs, tbh.
> Then I tried the Injustice 2 beta (yes, a beta, mind you), and the output was extremely washed out regardless of whether it turned on HDR automatically (which it does) or whether I turned it on manually. HDR on PC right now is inconsistent and an afterthought.
...Or it's just not tweaked to the display by the system that outputs it. That seems the more likely explanation to me, because a) it only happens in some games and b) the same games don't have any HDR issues on consoles.
> As a side note, HDR functioned as one would expect prior to the Creators Update for Win10.
Prior to the Creators Update, all HDR games used either NV or AMD HDR APIs. Said APIs could've done some tweaking on their own, and it's also quite possible that NV and AMD were helping the devs with proper HDR implementation back then. Starting with 1703, though, these APIs are blocked by MS and the generic Windows HDR solution is put in their place. It's pretty easy to see who's to blame here.
> Via EDID, which provides a unique display ID to the output device.
Well, the KS8000 sure is Ultra HD Premium; isn't that enough for Windows HDR?
I'm assuming you're on Win10 1703 or 1709 and NV card?
You could try this:
- make sure that you've selected "Use default color settings" in NVCPL's "Change resolution" section
- turn Windows HDR option on
- launch the game you're having issues with and if it doesn't work in exclusive fullscreen try switching to borderless windowed
HDR should work in this case. But the results may differ from those you're getting on consoles, due to the aforementioned tweaking on either the console side or the TV side.
> The fact that some devs have no trouble implementing it on consoles, but won't bother on PC, demonstrates this.
This is 99% an issue of market demand, really. Devs avoid adding the feature to PC versions simply because most PC gamers don't have HDR monitors. The number of people gaming on HDR 4K TVs on their PCs is very small.
> Well, the KS8000 sure is Ultra HD Premium; isn't that enough for Windows HDR?
Who knows? It should be, as it has the certification, but then again it's just edge-lit, and Samsung doesn't even specify things like peak brightness and contrast on its specs page.
> And the EDID thing is something very unlikely and not needed at all.
On the contrary, it's very much needed and very likely. It would be even better if HDR displays told the output device their real specs, so that the output device could tweak to them automatically.
> i5 4690k and 1080 Ti at 1440p, dropping into the forties in Alexandria. Not the end of the world, but it'd sure be swell if I saw improvements. Am I likely bottlenecked by my CPU? I also keep getting an IRQL_NOT_LESS_OR_EQUAL blue screen of death which I can't pinpoint a cause for; I get it, say, two or three times a week at random times.
Probably CPU bound. The blue screen is probably from an unstable overclock on the CPU (assuming you have it overclocked).
> Probably CPU bound. The blue screen is probably from an unstable overclock on the CPU (assuming you have it overclocked).
My limited research into that specific error told me it was likely driver related, though.
> My limited research into that specific error told me it was likely driver related, though.
CPU stability, from my experience. Lower the OC or raise the voltage.
If it is CPU related, how would I go about making the overclock more stable?
Interesting. Is there anything to explain why this would start happening after a long period of there being no issues?
> Interesting. Is there anything to explain why this would start happening after a long period of there being no issues?
Silicon degrades over time, faster when you run it out of spec.
Everyone should use dynamic resolution; just adjust the target framerate at which it activates. You can barely notice it.
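For anyone curious how a dynamic-resolution system typically decides when to kick in, here is a minimal sketch of the usual feedback loop. The thresholds, step size, and bounds are illustrative assumptions, not the game's actual values:

```python
def adjust_render_scale(scale: float, frame_ms: float, target_ms: float,
                        step: float = 0.05, lo: float = 0.5, hi: float = 1.0) -> float:
    """One tick of a simple dynamic-resolution feedback loop:
    shed render resolution when a frame misses the target budget,
    creep back up when there's comfortable headroom."""
    if frame_ms > target_ms * 1.05:      # over budget: drop the scale
        scale -= step
    elif frame_ms < target_ms * 0.85:    # headroom: restore quality
        scale += step
    return max(lo, min(hi, scale))      # clamp to the allowed range
```

Because the scale only moves in small steps around your target frame time, the resolution change is gradual, which is why it is hard to notice in motion.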