
Madness

Member
Oct 25, 2017
791
Buyer's remorse setting in... texted DeWayne (D-Nice), who has both the new C8 and Q9FN in hand. Asked him if I should go for the 75" Samsung or the 77" OLED, and he's like 'stick with OLED' LOL.
Oh well, I ain't backing out. It's just going to make the next 2-3 weeks all the more suspenseful!

Thank god for the 30-day return window if it doesn't work out

I mean, you already have a C7. There is nothing revolutionary about the C8, as there haven't been massive jumps even since the C6 OLED televisions, whereas you are now getting a flagship LED/LCD. Yes, the black levels are not as good, but the brightness is ridiculous. I still say those who recommend OLED sight unseen have yet to experience a calibrated Z9D-type television displaying HDR. Boot up your favorite HDR disc when you get your set and just sit back and look at the highlights. Like you said, you still have 30 days to return...
 
OP

Jeremiah

Member
Oct 25, 2017
774
I mean, you already have a C7. There is nothing revolutionary about the C8, as there haven't been massive jumps even since the C6 OLED televisions, whereas you are now getting a flagship LED/LCD. Yes, the black levels are not as good, but the brightness is ridiculous. I still say those who recommend OLED sight unseen have yet to experience a calibrated Z9D-type television displaying HDR. Boot up your favorite HDR disc when you get your set and just sit back and look at the highlights. Like you said, you still have 30 days to return...

Aye...

You know, and this is just wishful thinking on my part, but if Samsung's new game interpolation of 30 FPS titles is artifact-free, it might be a game changer for console gamers:

The NU8000 has a new 'Game Motion Plus' feature, which adds motion interpolation (soap opera effect) without adding too much input lag; this feature is useful when gaming on older consoles that can only output 30 fps, or for games that have frequent framerate dips. The 'Judder Reduction' slider interpolates content up to 60 fps, while the 'Blur Reduction' slider interpolates up to 120 fps. When 'Blur Reduction' is used the input lag for 4k increases from 23.8 ms to 29.3 ms, but this increase shouldn't be noticeable during gaming.

On the C7, interpolating 30 FPS titles makes them look like 60 FPS titles. The illusion breaks when you start playing, of course -- the input lag is shit, and artifacts manifest themselves during fast camera movements. But if it works... I mean, I'd play 30 FPS console titles with interpolation if there were no artifacts and no added input lag!
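
To put rough numbers on the RTINGS figures above, here's a minimal Python sketch: a plain frame blend plus the latency arithmetic, not how Samsung's actual motion-vector processing works.

# Conceptual sketch of motion interpolation: synthesize an in-between frame by
# blending two real 30 fps frames, doubling the presented rate to 60 fps.
# Real TVs use motion-vector search; a naive blend like this is exactly what
# produces ghosting artifacts during fast camera movement.
def interpolate(frame_a, frame_b, t=0.5):
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

# Latency cost RTINGS quotes for 'Blur Reduction' on the NU8000:
added_lag_ms = 29.3 - 23.8      # 5.5 ms
frame_60fps_ms = 1000 / 60      # ~16.7 ms
print(f"extra lag ~ {added_lag_ms:.1f} ms, about {added_lag_ms / frame_60fps_ms:.0%} of one 60 fps frame")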
 

Edgar

User requested ban
Banned
Oct 29, 2017
7,180
I fell deep down a rabbit hole looking for HDR monitors, so if someone doesn't mind indulging me, I would appreciate it.
1. Hardware- vs software-based HDR: is the price difference worth it? Are there diminishing returns when not using it for professional work? Just gaming and films.
2. 10-bit vs 12-bit panels: it seems you need 12-bit to fully see the HDR effect?
3. I have around £600 for a monitor, so if anyone has suggestions I'd appreciate it. The one someone recommended was pretty good, but I'm not a fan of curved monitors. I don't care about response time, 144Hz and such; image quality / color reproduction / blacks are most important.
4. Will I need to configure HDR for every individual game/film?
 

GreenMonkey

Member
Oct 28, 2017
1,862
Michigan
Question here:

I have the 2017 LG B7A OLED, 4K. I notice everyone always says TruMotion OFF is best for movies, but when I have it off I see images stutter when they move, like small ships in Star Wars stuttering as they fly. Is this just normal for TV content now? The only time I can really get it to look smooth is when I set TruMotion to Smooth (the options are Smooth, Clear, and User), and there are almost always artifacts.

What do you do to get a clean movie image? Or is it because I'm watching a regular Blu-ray on a 4K TV and not a 4K Blu-ray?

Judder bugs me because I thought with 24Hz support I wouldn't really see it, but it's more obvious on my OLED than on my old plasma, which didn't have it.

The funny thing is, pay attention in the theater and you'll see it there too. It's highly content-related; judder can be seen in a lot of panning scenes.
 

tokkun

Member
Oct 27, 2017
5,413
Some really good VRR info here

Biggest tidbit I noticed is that FreeSync sits separate and distinct from the HDMI Forum's VRR tech. So maybe that doesn't completely rule out team green...

It will be an interesting couple of months for sure, especially if the Turing/Ampere stuff launches by June/July.

https://www.flatpanelshd.com/news.php?subaction=showfull&id=1522745972


I'm skeptical. Nvidia's motivation will be to keep user lock-in and to keep collecting those G-Sync royalties. It doesn't seem like VRR and FreeSync being slightly different is going to change that. It may mean that their cards won't work with current FreeSync monitors, but monitor makers can just add VRR compatibility to their next model.

Overall, it doesn't seem like the market of users connecting their PCs to televisions is large enough to make much difference. TV-related stuff gets busted in Nvidia drivers and takes months to fix, like HDR and setting the correct color space over HDMI.
 

RedlineRonin

Member
Oct 30, 2017
2,620
Minneapolis
I'm skeptical. Nvidia's motivation will be to keep user lock-in and to keep collecting those G-Sync royalties. It doesn't seem like VRR and FreeSync being slightly different is going to change that. It may mean that their cards won't work with current FreeSync monitors, but monitor makers can just add VRR compatibility to their next model.

Overall, it doesn't seem like the market of users connecting their PCs to televisions is large enough to make much difference. TV-related stuff gets busted in Nvidia drivers and takes months to fix, like HDR and setting the correct color space over HDMI.

I know the chance is slim. It just seems like a(nother) shitty NV business practice (like the GPP shit that's hitting the fan currently).
 

Smokey

Member
Oct 25, 2017
4,176
I fell deep down a rabbit hole looking for HDR monitors, so if someone doesn't mind indulging me, I would appreciate it.
1. Hardware- vs software-based HDR: is the price difference worth it? Are there diminishing returns when not using it for professional work? Just gaming and films.
2. 10-bit vs 12-bit panels: it seems you need 12-bit to fully see the HDR effect?
3. I have around £600 for a monitor, so if anyone has suggestions I'd appreciate it. The one someone recommended was pretty good, but I'm not a fan of curved monitors. I don't care about response time, 144Hz and such; image quality / color reproduction / blacks are most important.
4. Will I need to configure HDR for every individual game/film?

HDR monitors are a crapshoot right now. There are a bunch that will accept an HDR signal, but the panel is 8-bit and the nits aren't there to give a true representation of HDR.

I'd wait until the Asus/Acer line of 4K / HDR / 144Hz / local dimming monitors hits. Supposedly that's in April, after numerous delays. Unfortunately they are probably going to cost $1,500+, but we have to start somewhere.
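
On the 10-bit vs 12-bit question, the code-value arithmetic is simple enough to sketch (Python, just the numbers): HDR10 is mastered as a 10-bit PQ signal, so 10-bit is the practical baseline; 12-bit is the Dolby Vision signal format; and the 8-bit panels mentioned above are why HDR gradients show banding on cheap "HDR" monitors.

# Steps per channel and total colors at each bit depth.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} steps per channel, {levels ** 3:,} colors total")
# 8-bit:  256 steps   -> visible banding across the wide luminance range HDR asks for
# 10-bit: 1,024 steps -> the HDR10 baseline
# 12-bit: 4,096 steps -> finer gradation; not strictly required to "see" the HDR effect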
 

Edgar

User requested ban
Banned
Oct 29, 2017
7,180
HDR monitors are a crapshoot right now. There are a bunch that will accept an HDR signal, but the panel is 8-bit and the nits aren't there to give a true representation of HDR.

I'd wait until the Asus/Acer line of 4K / HDR / 144Hz / local dimming monitors hits. Supposedly that's in April, after numerous delays. Unfortunately they are probably going to cost $1,500+, but we have to start somewhere.
So TVs are actually better when it comes to that?
 

m23

Member
Oct 25, 2017
5,422
Finally picked up another TV set after my previous one was destroyed by a ball. I don't have that much money to spend at the moment so I picked up the UN55MU7000 (NA), which I understand to be a budget Samsung TV.

It seems to have HDR10 as well as WCG (wide color gamut; no idea what this really means in practice), so hopefully it satisfies my needs. I hear the brightness isn't very high, but it'll be sitting in the basement, where it's dark with very little sunlight. The low input lag was a plus for me, as I intend to play games on the Xbox One X on it.

Any thoughts/impressions on this set? I think this is the 6400 in the UK?
 

jstevenson

Developer at Insomniac Games
Verified
Oct 25, 2017
2,042
Burbank CA
I mean, you already have a C7. There is nothing revolutionary about the C8, as there haven't been massive jumps even since the C6 OLED televisions, whereas you are now getting a flagship LED/LCD. Yes, the black levels are not as good, but the brightness is ridiculous. I still say those who recommend OLED sight unseen have yet to experience a calibrated Z9D-type television displaying HDR. Boot up your favorite HDR disc when you get your set and just sit back and look at the highlights. Like you said, you still have 30 days to return...

We have a Z9D in the office, and it is awesome and cool, but I don't think the extra brightness is THAT much better than the blacks of the OLED. In a proper viewing environment I'd still take the OLED.

That said, the Z9D is a good choice, and if it's a particularly bright room, it might be the better selection.

It kind of goes back to when I used to sell plasmas: people would want them on walls with windows and wonder whether they'd be bright enough or have glare issues. That may not be the best application.


I will say overall, the combination of Dolby Vision and OLEDs has made for cinematic viewing that eclipses a fair number of my actual theatrical experiences in recent years, which is kind of the first time I've actually thought that despite chasing it for 15+ years now
 

CRIMSON-XIII

Member
Oct 25, 2017
6,182
Chicago, IL
Judder bugs me because I thought with 24Hz support I wouldn't really see it, but it's more obvious on my OLED than on my old plasma, which didn't have it.

The funny thing is, pay attention in the theater and you'll see it there too. It's highly content-related; judder can be seen in a lot of panning scenes.
I notice it in Blu-rays, tbh; thinking back to Netflix or Hulu or YouTube, I don't think I see it there?

So I can't watch The Last Jedi on my 4K OLED without seeing a stutter of some sort? Or should I just play with the Smooth or User settings more for de-judder and de-blur, which is a fight with artifacts in itself?


Kinda hoping I don't have to get an HDR 4K OLED 5-6 years from now just to watch my Star Wars ships fly without stutter.
 

Arih

Member
Jan 19, 2018
471
I will say overall, the combination of Dolby Vision and OLEDs has made for cinematic viewing that eclipses a fair number of my actual theatrical experiences in recent years, which is kind of the first time I've actually thought that despite chasing it for 15+ years now

Totally agree. When I got my C7 I watched Altered Carbon on Netflix and I was astounded. It looked better than my recent trip to the movies. So good.

Hopefully more movies/series come with Dolby Vision in the future.
 

GreenMonkey

Member
Oct 28, 2017
1,862
Michigan
I notice it in Blu-rays, tbh; thinking back to Netflix or Hulu or YouTube, I don't think I see it there?

So I can't watch The Last Jedi on my 4K OLED without seeing a stutter of some sort? Or should I just play with the Smooth or User settings more for de-judder and de-blur, which is a fight with artifacts in itself?


Kinda hoping I don't have to get an HDR 4K OLED 5-6 years from now just to watch my Star Wars ships fly without stutter.

Try watching a movie on Netflix, or something like House of Cards, filmed at 24fps. It should be the same.

TV content filmed at 30fps has it too; it's a lot less noticeable, but it's still there.

TVs can do interpolation to guess what should be in between frames, but it can lead to artifacts. Turn up the motion settings enough and you'll probably see those. They drive me crazier than the frame judder.

Word is that Sony has better image processing for this kind of thing, but I can't speak to that myself.

Next time you're at the theater, look carefully during panning scenes: the same effect is there in the theaters.
It's a source problem.

There's a reason 48fps high-frame-rate movies have been tried in theaters (like The Hobbit).
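
For anyone curious why 24fps content judders at all on a 60Hz output, here's a tiny Python sketch of the cadence arithmetic (just the numbers; whether a given TV uses 3:2 pulldown or a true multiple of 24Hz depends on the model).

# 24 fps film on a 60 Hz output: 60/24 = 2.5 refreshes per film frame, so frames
# alternate between being held for 3 and 2 refreshes ("3:2 pulldown"). That uneven
# hold time is the judder you notice in slow pans.
panel_hz, film_fps = 60, 24
print(panel_hz / film_fps)                                   # 2.5 -> not an integer
cadence = [3 if i % 2 == 0 else 2 for i in range(film_fps)]  # 3,2,3,2,...
print(sum(cadence) == panel_hz)                              # True: 24 frames fill exactly 60 refreshes
# A 120 Hz panel can hold every frame exactly 5 refreshes, which removes the pulldown
# unevenness; any stutter left over is just 24 fps itself (the source problem).
print(120 / film_fps)                                        # 5.0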
 
OP

Jeremiah

Member
Oct 25, 2017
774
We have a Z9D in the office, and it is awesome and cool, but I don't think the extra brightness is THAT much better than the blacks of the OLED. In a proper viewing environment I'd still take the OLED.

That said, the Z9D is a good choice, and if it's a particularly bright room, it might be the better selection.

It kind of goes back to when I used to sell plasmas: people would want them on walls with windows and wonder whether they'd be bright enough or have glare issues. That may not be the best application.


I will say overall, the combination of Dolby Vision and OLEDs has made for cinematic viewing that eclipses a fair number of my actual theatrical experiences in recent years, which is kind of the first time I've actually thought that despite chasing it for 15+ years now

Sooooo... how does Spiderman look on your OLED?


Yeah just saw that, thanks. Not feeling the comment about the AR reflecting whites... like what?
 

-PXG-

Banned
Oct 25, 2017
6,186
NJ
Cross post from Destiny 2 Community thread:

Alright tell me what y'all see.

I can't tell the damn difference. Keep in mind, whenever I do turn on HDR, I also switch to different calibration settings. My usual (non-HDR) settings look way overexposed and oversaturated whenever HDR is active (for anything, not just D2).

So whenever HDR content is being displayed, I change to my dedicated HDR configuration. Problem is, at least with D2, it looks almost exactly as it normally does without HDR, just with very minor differences.

1pMWxMu.jpg
HDR

NFxlwrf.jpg
No HDR

r5cOPvu.jpg

HDR

e09hmNj.jpg
No HDR
 

MazeHaze

Member
Nov 1, 2017
8,584
Cross post from Destiny 2 Community thread:

Alright tell me what y'all see.

I can't tell the damn difference. Keep in mind, whenever I do turn on HDR, I also switch to different calibration settings. My usual (non-HDR) settings look way overexposed and oversaturated whenever HDR is active (for anything, not just D2).

So whenever HDR content is being displayed, I change to my dedicated HDR configuration. Problem is, at least with D2, it looks almost exactly as it normally does without HDR, just with very minor differences.

1pMWxMu.jpg
HDR

NFxlwrf.jpg
No HDR

r5cOPvu.jpg

HDR

e09hmNj.jpg
No HDR
Off screen pics don't ever help anything. For what it's worth though, I don't find the HDR implementation in Destiny 2 to be particularly impressive.
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
It's cool that Vincent did that video - there are a lot of people who will be cross-shopping the 930E (as it presumably goes to clearance pricing) and the 900F.

I'm sure that some, who are simply fans of a specific technology, will ignore this video as always.

I'm definitely not one of them, I'd consider myself a TV fan in general, and find it all pretty interesting, regardless of what I own.
 

Deleted member 4346

User requested account closure
Banned
Oct 25, 2017
8,976
I fell deep down a rabbit hole looking for HDR monitors, so if someone doesn't mind indulging me, I would appreciate it.
1. Hardware- vs software-based HDR: is the price difference worth it? Are there diminishing returns when not using it for professional work? Just gaming and films.
2. 10-bit vs 12-bit panels: it seems you need 12-bit to fully see the HDR effect?
3. I have around £600 for a monitor, so if anyone has suggestions I'd appreciate it. The one someone recommended was pretty good, but I'm not a fan of curved monitors. I don't care about response time, 144Hz and such; image quality / color reproduction / blacks are most important.
4. Will I need to configure HDR for every individual game/film?

The monitor market is in a bad place right now. I've ranted about this numerous times so I won't get into it too much, but for monitors right now you have a wide range of poor compromised options. I would wait, if you possibly can, until next year. Personally I bought a cheap 32" VA-based HP Omen over the holiday sales, running that and my old AMD R9 290X, until RX Vega prices stabilize and a clear acceptable option (32"+, VRR, 1440p/4K, 144Hz, <$1000, VA-based) emerges.

Some really good VRR info here

Biggest tidbit I noticed is that FreeSync sits separate and distinct from the HDMI Forum's VRR tech. So maybe that doesn't completely rule out team green...

It will be an interesting couple of months for sure, especially if the Turing/Ampere stuff launches by June/July.

https://www.flatpanelshd.com/news.php?subaction=showfull&id=1522745972

NVIDIA isn't giving up their fat G-Sync royalties without being dragged kicking and screaming. It helps that AMD powers the biggest game consoles, but I'm not sure that's enough. Lots of gamers are locked into the NVIDIA/G-Sync ecosystem with no escape; it's hard to turn that entrenched market around.

Sad thing is that there are no technical hurdles; the entire thing is artificial: just vendor lock-in and licensing profit.
 

Lucentto

Member
Oct 25, 2017
363
Just got the 65B7 for $1,700 thanks to Best Buy price matching Microcenter. That deal was too good to pass up.
 

BAD

Member
Oct 25, 2017
9,565
USA
I'll officially say that, of all the 4K content I tested this week, Murder on the Orient Express (2017) and Dunkirk are the most reference-quality and impressive to me.

Both are gorgeous if you want demonstrations of real 4K transfers.
 

-PXG-

Banned
Oct 25, 2017
6,186
NJ
Off screen pics don't ever help anything. For what it's worth though, I don't find the HDR implementation in Destiny 2 to be particularly impressive.

I was thinking the same. Some HDR content I've seen is poorly implemented. I guess it's still relatively new, so content creators are getting used to it.
 

Edgar

User requested ban
Banned
Oct 29, 2017
7,180
The monitor market is in a bad place right now. I've ranted about this numerous times so I won't get into it too much, but for monitors right now you have a wide range of poor compromised options. I would wait, if you possibly can, until next year. Personally I bought a cheap 32" VA-based HP Omen over the holiday sales, running that and my old AMD R9 290X, until RX Vega prices stabilize and a clear acceptable option (32"+, VRR, 1440p/4K, 144Hz, <$1000, VA-based) emerges.



NVIDIA isn't giving up their fat G-Sync royalties without being dragged kicking and screaming. It helps that AMD powers the biggest game consoles, but I'm not sure that's enough. Lots of gamers are locked into the NVIDIA/G-Sync ecosystem with no escape; it's hard to turn that entrenched market around.

Sad thing is that there are no technical hurdles; the entire thing is artificial: just vendor lock-in and licensing profit.
In 2017 I got a Dell U2417H-2 UltraSharp 24-inch, paid £200. I needed something serviceable fast for my PS4 Pro, and I really liked it. But a week or two ago it suddenly started showing sorta-kinda crushed blacks? Not fully, like it's not overbearing for the most part, but it's a lot darker than it used to be. I tried resetting stuff and changing HDMI cables, nothing. And AFAIK the warranty is expired. And it seems it's only me noticing, since other people are saying it's fine.
 

GearDraxon

Member
Oct 25, 2017
2,786
I'll officially say that, of all the 4K content I tested this week, Murder on the Orient Express (2017) and Dunkirk are the most reference-quality and impressive to me.

Both are gorgeous if you want demonstrations of real 4K transfers.
The power of filming in 70mm. Just watched The Master for the first time this weekend, on a regular Blu-ray, and it was still visually stunning.
 

ShadowRunner

Member
Oct 29, 2017
166
So I bought a C7 recently but am struggling somewhat with brightness settings in games. I can't work out if I should be using in-game brightness sliders or not, especially in HDR games; should I just leave them at defaults? Shadow of the Colossus seems way too washed out in dark areas when I do that, though.
 

Manac0r

Member
Oct 30, 2017
435
UK
So I bought a C7 recently but am struggling somewhat with brightness settings in games. I can't work out if I should be using in-game brightness sliders or not, especially in HDR games; should I just leave them at defaults? Shadow of the Colossus seems way too washed out in dark areas when I do that, though.

Make sure your black levels are correct, as well as the colour gamut. Shadow of the Colossus is somewhat muted by artistic design, sombre by choice.

Always check the in-game options and adjust accordingly.
 

MazeHaze

Member
Nov 1, 2017
8,584
So I bought a C7 recently but am struggling somewhat with brightness settings in games. I can't work out if I should be using in-game brightness sliders or not, especially in HDR games; should I just leave them at defaults? Shadow of the Colossus seems way too washed out in dark areas when I do that, though.
Use the in-game adjustment for SotC. For that game in HDR, dark areas are properly dark, like hard-to-see dark. I actually bumped it up a click or two once I calibrated it, just enough to make dark sections a little more visible without washing out the image too much.
 

CRIMSON-XIII

Member
Oct 25, 2017
6,182
Chicago, IL
Try watching a movie on Netflix, or something like House of Cards, filmed at 24fps. It should be the same.

TV content filmed at 30fps has it too; it's a lot less noticeable, but it's still there.

TVs can do interpolation to guess what should be in between frames, but it can lead to artifacts. Turn up the motion settings enough and you'll probably see those. They drive me crazier than the frame judder.

Word is that Sony has better image processing for this kind of thing, but I can't speak to that myself.

Next time you're at the theater, look carefully during panning scenes: the same effect is there in the theaters.
It's a source problem.

There's a reason 48fps high-frame-rate movies have been tried in theaters (like The Hobbit).
So what I'm noticing, everyone notices anyway? And some movies simply won't have the stuttering?
 

RedlineRonin

Member
Oct 30, 2017
2,620
Minneapolis
It's cool that Vincent did that video - there are a lot of people who will be cross-shopping the 930E (as it presumably goes to clearance pricing) and the 900F.

I'm sure that some, who are simply fans of a specific technology, will ignore this video as always.

A) Lol

B) what's the consensus (can't watch rn)? 930E overall > 900F?
NVIDIA isn't giving up their fat G-Sync royalties without being dragged kicking and screaming. It helps that AMD powers the biggest game consoles, but I'm not sure that's enough. Lots of gamers are locked into the NVIDIA/G-Sync ecosystem with no escape; it's hard to turn that entrenched market around.

Sad thing is that there are no technical hurdles; the entire thing is artificial: just vendor lock-in and licensing profit.

Right. It honestly sucks. The input lag created by being forced to use v-sync sucks in and of itself, but then oftentimes the v-sync implementations are bad (if you can't lock 60fps, the game drops down to 30fps, for example), and I also have really bad luck every time I end up screwing around with RTSS, and then sometimes that doesn't even work. This is literally, short of more raw GPU power to lock 4K60, my single biggest gripe with PC gaming on my OLED right now, and VRR solves a world of problems for me (theoretically, on AMD).

If AMD makes even a remotely competitive GPU to counter Ampere/Turing stuff, I may need to seriously consider it (and then probably have a whole slew of new issues related to AMD's generally inferior drivers).

Ugh.
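
For anyone wondering why v-sync "drops down to 30fps": with strict double-buffered v-sync the GPU can only flip on a refresh boundary, so a frame that misses the 16.7 ms window waits a whole extra refresh. A rough Python sketch of that arithmetic (illustrative only, not any particular driver's behaviour; triple buffering and adaptive v-sync behave differently):

# Effective frame rate under strict double-buffered v-sync on a 60 Hz display:
# every frame occupies a whole number of refresh intervals, so the rate snaps
# to 60/n (60, 30, 20, ...) instead of degrading gradually.
def vsync_fps(frame_time_ms, refresh_hz=60):
    refresh_ms = 1000 / refresh_hz
    intervals = -(-frame_time_ms // refresh_ms)   # ceiling division
    return refresh_hz / intervals

print(vsync_fps(16.0))   # 60.0 - frame fits inside one refresh
print(vsync_fps(17.5))   # 30.0 - barely misses, so every frame waits an extra vblank
print(vsync_fps(34.0))   # 20.0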
 
OP

Jeremiah

Member
Oct 25, 2017
774
A) Lol

Right. It honestly sucks. The input lag created by being forced to use v-sync sucks in and of itself, but then oftentimes the v-sync implementations are bad (if you can't lock 60fps, the game drops down to 30fps, for example), and I also have really bad luck every time I end up screwing around with RTSS, and then sometimes that doesn't even work. This is literally, short of more raw GPU power to lock 4K60, my single biggest gripe with PC gaming on my OLED right now, and VRR solves a world of problems for me (theoretically, on AMD).

If AMD makes even a remotely competitive GPU to counter Ampere/Turing stuff, I may need to seriously consider it (and then probably have a whole slew of new issues related to AMD's generally inferior drivers).

Ugh.

At 1080p@120Hz on the OLED, when I set RTSS to limit frames to 119 globally, I get no tearing (at least none visible to me) with v-sync off... sure, there is judder, but it's wayyyyy less noticeable than judder below 60 fps.

Once 4K@120Hz is a thing, I'm sure the 1180 Ti will be able to keep pretty high frame rates in games... I wish SLI were still a real thing and/or worked better.
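
As for why capping a hair under the refresh rate (the 119-at-120Hz trick above) helps, here is just the pacing arithmetic in Python, not a claim about RTSS internals: at 119 fps each frame takes slightly longer than one 120Hz scanout, so the GPU never races ahead of the display and any tear line drifts very slowly instead of jumping around.

# Hypothetical numbers for a 119 fps cap on a 120 Hz panel:
refresh_hz, cap_fps = 120, 119
refresh_ms = 1000 / refresh_hz       # ~8.333 ms per scanout
frame_ms   = 1000 / cap_fps          # ~8.403 ms per rendered frame
drift_ms   = frame_ms - refresh_ms   # each flip lands ~0.07 ms later in the scanout
print(f"{drift_ms:.3f} ms of drift per frame")
print(f"~{refresh_ms / drift_ms:.0f} frames (about a second) for a tear line to wander the full screen")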

So I bought a C7 recently but am struggling somewhat with brightness settings in games. I can't work out if I should be using in-game brightness sliders or not, especially in HDR games; should I just leave them at defaults? Shadow of the Colossus seems way too washed out in dark areas when I do that, though.

I always adjust the in-game brightness for HDR on my C7. My two test games are HZD and TLL; for the former I set the in-game slider to 65%, and for the latter it's set at 6. This is in a room with no lights on.
 

RedlineRonin

Member
Oct 30, 2017
2,620
Minneapolis
At 1080p@120Hz on the OLED, when I set RTSS to limit frames to 119 globally, I get no tearing (at least none visible to me) with v-sync off... sure, there is judder, but it's wayyyyy less noticeable than judder below 60 fps.

Once 4K@120Hz is a thing, I'm sure the 1180 Ti will be able to keep pretty high frame rates in games... I wish SLI were still a real thing and/or worked better.

I probably need to spend more time just learning how to use RTSS properly. I tried to get it working in Wildlands on PC a few weeks ago and just gave up after 20 mins. I am this weird 3rd segment of the market where I would happily pay for a $1K console, if it meant it would match PC performance with zero effing around. I don't mind tinkering (I built the PC I have), but I feel like I'm having to do wayyyy more of it since I got a 4K set and HDR was thrown into the mix.


*fart noise*
 
OP

Jeremiah

Member
Oct 25, 2017
774
I probably need to spend more time just learning how to use RTSS properly. I tried to get it working in Wildlands on PC a few weeks ago and just gave up after 20 mins. I am this weird 3rd segment of the market where I would happily pay for a $1K console, if it meant it would match PC performance with zero effing around. I don't mind tinkering (I built the PC I have), but I feel like I'm having to do wayyyy more of it since I got a 4K set and HDR was thrown into the mix.



*fart noise*

Hahaha, I 100% relate, man.
 

RedlineRonin

Member
Oct 30, 2017
2,620
Minneapolis
Hahaha, I 100% relate, man.

Honestly tho. Shit, I would probably pay more than that. I just want to maximize the potential of the display and my enjoyment and minimize the fucking around. And it's totally one or the other. With the weak ass CPUs in this gen of consoles, that's off the table, so PC is pretty much the only way to go. I used to have more time to eff around with all that stuff but i'm doing the classic move to the right on the time versus money curve where I would gladly just throw money at it to not have to mess around with so much of it.

To be fair, PCs have gotten a lot better overall in the simplicity of building and getting stuff to work generally, but 4K and HDR (as well as HDMI versioning) have just thrown entirely new wrenches into the whole process.
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
Honestly tho. Shit, I would probably pay more than that. I just want to maximize the potential of the display and my enjoyment and minimize the fucking around. And it's totally one or the other. With the weak ass CPUs in this gen of consoles, that's off the table, so PC is pretty much the only way to go. I used to have more time to eff around with all that stuff but i'm doing the classic move to the right on the time versus money curve where I would gladly just throw money at it to not have to mess around with so much of it.

To be fair, PCs have gotten a lot better overall in the simplicity of building and getting stuff to work generally, but 4K and HDR (as well as HDMI versioning) have just thrown entirely new wrenches into the whole process.

I have to confess that my PC is gathering dust, probably for similar reasons to the ones you stated. The whole HDR thing has really pissed me off, and since I bought an X, the PC has more or less gone unused, and it houses a 1080 Ti too. Not good.
 

Kudo

Member
Oct 25, 2017
3,893
I've only got a 1070 but still find myself getting most HDR-enabled games on the Pro.
Usually it's a fight between higher fps and graphics, or the ease of HDR and use. If there were a console that did 4K60 HDR, I'd be willing to get a second job for it.
 

GearDraxon

Member
Oct 25, 2017
2,786
I am this weird 3rd segment of the market where I would happily pay for a $1K console, if it meant it would match PC performance with zero effing around.

I have to confess that my PC is gathering dust, probably for similar reasons to the ones you stated. The whole HDR thing has really pissed me off, and since I bought an X, the PC has more or less gone unused, and it houses a 1080 Ti too. Not good.
Meanwhile, I just traded in my X1S and PS4 Pro for Steam cards. I realized that the load times and frequent 30fps content annoyed me enough to where I hadn't used them for gaming in months.

I'll admit that for me, the tinkering is totally part of the hobby. If that weren't the case, I can see myself feeling otherwise.
 

MrBob

Member
Oct 25, 2017
6,670
I'm considering buying Far Cry 5 on Xbox right now instead of PC, because I fear I won't be able to get HDR to work properly. The thought of being stuck at 30FPS hurts, though.

PC, why do you suck at HDR?

I think I'll buy Far Cry 5 on Steam, and if I can't get HDR to work I'll return it. Hooray for digital returns.
 