Jeffram

Member
Oct 29, 2017
3,992
Neither the C9 nor the CX is "future-proof" in the sense of supporting the full HDMI 2.1 spec. Their panels are 10-bit and cannot handle the full 2.1 bandwidth that enables 4K at 120 FPS. But it's worth pointing out that there is no AV receiver or other device on the market that supports 2.1 anyway. They are future-proof in the sense that they support variable refresh rate, which I personally think is more important than 120 FPS.

If money isn't a problem, I think you'll do well with the 65" model. If you're somewhat budget-conscious, then I strongly recommend waiting. The 65C9 also launched at an MSRP of $3,500 or more, and it was discounted to $2,500 during Black Friday. It's somewhat likely that the CX will see steep discounts into the high $2,000s during the holidays.
Appreciate the response. Yeah, I don't feel like I'll be sweating over 12-bit color anytime soon, so today's 2.1 specs are more than enough. I only just now found out that my Sony receiver (STR-DN1080) from 2017 was updated for eARC, which is great.

My intention going into the year was to wait until Black Friday 2020 for deals on the CX. You're probably right that the 65" will come down another $1,000 this year. What's tempting me is the 77" price: the 77" CX at $6,500 is already cheaper than the 77" C9 ever got during Black Friday. Maybe I'm just focusing on that too much. It did start at a lower price, so there's probably more room to drop through the year.
 

superNESjoe

Developer at Limited Run Games
Verified
Oct 26, 2017
1,165
I'm in the market for a new 4K TV for the consoles in my home office. I can't spend a ton of money; I'd like to keep it under $1k. Any recommendations?
 

Orayn

Member
Oct 25, 2017
11,386
Is there a direct successor to the C9 that does support the full HDMI 2.1 spec? That's what I'd probably be after, at 65".
 

evilalien

Teyvat Traveler
Member
Oct 27, 2017
1,542
I've heard that maybe Micro-LED won't be a thing, and that Samsung is investing in OLED + quantum dots?

Micro-LED will be a thing, and Samsung intends to be selling complete shipped units (not requiring professional installation) this year, at an absurd price. However, they are moving into QD-OLED as you said, and you should take this to mean that Micro-LED will not be price-competitive with OLED for a very long time - at least 5 years, if not longer. It would be a complete waste of money for them to retool their production lines for any shorter timescale.
 

Broken Hope

Banned
Oct 27, 2017
1,316
Neither the C9 nor the CX is "future-proof" in the sense of supporting the full HDMI 2.1 spec. Their panels are 10-bit and cannot handle the full 2.1 bandwidth that enables 4K at 120 FPS. But it's worth pointing out that there is no AV receiver or other device on the market that supports 2.1 anyway. They are future-proof in the sense that they support variable refresh rate, which I personally think is more important than 120 FPS.

If money isn't a problem, I think you'll do well with the 65" model. If you're somewhat budget-conscious, then I strongly recommend waiting. The 65C9 also launched at an MSRP of $3,500 or more, and it was discounted to $2,500 during Black Friday. It's somewhat likely that the CX will see steep discounts into the high $2,000s during the holidays.
What are you talking about? They both support the full 2.1 spec, including 4K at 120 FPS.

Yes, they don't have 12-bit panels, but no one else does either.
 

Megapighead

Member
May 2, 2018
798
I just pulled the trigger on a 65-inch LG E9. A bit unnecessary considering how much cheaper the C9 is, but I felt like splurging a bit, and that form factor is just gorgeous.

Ready for next gen!
 

Mest08

Alt Account
Banned
Oct 30, 2017
1,184
Not sure if this is a good place to ask, but I'm planning on buying a 4K set in September or October for next gen. How many HDMI 2.1 ports are needed if I'm running a receiver and a PS4? Most sets seem to have just one.
 

ss_lemonade

Member
Oct 27, 2017
6,741
Thanks for providing. Good to hear it's better. My only hesitation is that it's more impressions than a proper side-by-side comparison. Have you seen anything like this with the new firmware?


That's the video that scared me away from Samsung

I've seen this video before in another thread and it does disappoint me quite a bit since I have an older Samsung KS9000 that does not have degraded picture quality while in game mode.
 

Pargon

Member
Oct 27, 2017
12,251
Is the jump to 77" worth it? I don't mind spending that much on a 77 inch if it's only going to be once in the next 5-6 years. BUT if TV tech is going to advance substantially in the next 3-4 years (like micro-led), I'd rather just get a 65" now and splurge on that future tech that can be my "forever" TV. Obviously not forever, but one I pretty much don't have to worry about changing away from as we're really into diminishing returns on resolution, color, contrast and brightness.

I've heard that maybe Micro-LED won't be a thing, and that Samsung is investing in OLED + quantum dots?
What do y'all think? Are today's OLEDs future-proof enough to go ahead and get one now without feeling buyer's remorse, given the tech track?
I'll say this much: the last time I bought a TV, I went with 46″ rather than 52″, back when 52″ was about as large as most LCDs got, because I did not expect to keep it for half as long as I have.
Here I am 10 years later, wishing that I had just spent the extra money back then.

At the same time, the current state of OLED is certainly far from perfect.
  • Brightness is low enough that they have to cheat by using a white subpixel, which has a tendency to add a blue tint to the image.
  • They are noticeably less-saturated than some competitors (QLED) in HDR content.
  • Motion handling is far from perfect.
So they are certainly not a "future-proof" purchase.
But they are very good, and a TV you could "live with" for a long time, because they have near-perfect contrast, better viewing angles than LCD, and none of LCD's motion-ghosting issues.

As good as OLEDs are, there's still a lot of room for display technology to improve from this point; but it depends how long you are prepared to wait.
I don't expect anything significantly better than current OLEDs within the first few years of next-gen - at least not within that kind of price range.

At 4K, no, the 77" is not worth it. I feel the 65" is a really good size overall.
Most people don't sit close enough to even a 77″ screen to require >4K.

That's the video that scared me away from Samsung
Those are the 2019 models and Samsung have improved things substantially this year.
I expect HDTVtest will do another comparison at some point, but a typical local-dimming LCD design is still never going to compare to OLED in dark scenes.


The video states that inverse ghosting was fixed at 60Hz but appears again at 60Hz with FreeSync enabled, and speculates that this won't be a problem since you'll be using 120Hz VRR next-gen.
I would not expect that to be the case. With VRR, a game running at 60 FPS will be updating at 60Hz - so I would expect the inverse ghosting to be present there as well.
This is one of the advantages that OLED has - the response times are significantly faster so ghosting like that is not an issue.

I've seen this video before in another thread and it does disappoint me quite a bit since I have an older Samsung KS9000 that does not have degraded picture quality while in game mode.
The KS9000 doesn't have a full-array local dimming feature to be disabled in Game Mode.
It can't lose contrast it never had to begin with.
 

ss_lemonade

Member
Oct 27, 2017
6,741
The KS9000 doesn't have a full-array local dimming feature to be disabled in Game Mode.
It can't lose contrast it never had to begin with.
True, but it also doesn't turn into that blue-ish, washed-out image shown in the Horizon comparison. I remember you commenting on this before though, about how the KS8000/9000 has better contrast than the Q90R in game mode, so that's probably why.
 

Pargon

Member
Oct 27, 2017
12,251
True, but it also doesn't turn into that blue-ish, washed-out image shown in the Horizon comparison. I remember you commenting on this before though, about how the KS8000/9000 has better contrast than the Q90R in game mode, so that's probably why.
That certainly is a factor. According to RTINGS' reviews:
  • The KS9000 has a 6800:1 native contrast ratio - which is extremely high for LCD.
  • The Q90R has a 3250:1 native contrast ratio, but is 11,200:1 with local dimming.
  • The Q90T has a 4000:1 native contrast ratio, but is 10,500:1 with local dimming.
I'm not entirely convinced that RTINGS' contrast ratio measurement of the KS9000 is accurate though - another review measured half of that.
Either way, it's unfortunate that RTINGS don't measure contrast ratio in Game Mode, so all we know is that it's somewhere between 3250:1 and 11,200:1 for the Q90R, and somewhere between 4000:1 and 10,500:1 for the Q90T.
It should be higher on the Q90T than the Q90R; but without measurements how much higher, and how that compares to the KS9000, is anyone's guess.

It should be noted that any LCD can look washed-out compared to an OLED in a dark room depending on the camera exposure.
How a TV looks to a camera can be different from how it looks to the eye - especially in the dark.
 

laxu

Member
Nov 26, 2017
2,788
I've seen this video before in another thread and it does disappoint me quite a bit since I have an older Samsung KS9000 that does not have degraded picture quality while in game mode.

I have a KS8000 and also own an LG C9. I also did not feel the KS8000 changed in any significant way in game mode; it still looks good.

But at the same time, it's got plenty of issues. HDR tends to be washed out due to its poor local dimming, HDR adds enough input lag to become noticeable, and it has VA black-smearing issues to a much worse degree than my Samsung CRG9 desktop monitor (a Samsung VA QLED panel).

The LG C9 is just better in every way. Input lag is a non-issue, HDR looks stellar, there's just way more depth to the overall image, and there are no response-time-related anomalies.
 

Hasney

One Winged Slayer
The Fallen
Oct 25, 2017
19,366
Is there a direct successor to the C9 that does support the full HDMI 2.1 spec? That's what I'd probably be after, at 65".

The C9 does support the full spec. Its successor, the CX, does not support the full bandwidth, but it's still enough for 4K/120Hz at 10-bit. This should only be an issue for people hooking up a PC with an Nvidia card, as they currently do not allow you to select 10-bit colour.
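For a rough sense of the numbers, here's some back-of-the-envelope Python (a sketch, not an official calculation: 4400×2250 is the standard CTA-861 total timing for a 3840×2160 frame, the 40/48 Gbps figures are the commonly reported CX/C9 port specs, and FRL coding overhead and audio are ignored):

def data_rate_gbps(h_total, v_total, refresh_hz, bits_per_component):
    # total pixels per second (active + blanking) x bits per pixel (RGB = 3 components)
    return h_total * v_total * refresh_hz * bits_per_component * 3 / 1e9

for bpc in (8, 10, 12):
    print(f"4K120 RGB {bpc}-bit: ~{data_rate_gbps(4400, 2250, 120, bpc):.1f} Gbps")
# ~28.5 Gbps (8-bit), ~35.6 Gbps (10-bit), ~42.8 Gbps (12-bit):
# 10-bit fits within the CX's 40 Gbps ports; 12-bit needs the C9's full 48 Gbps.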
 

Orayn

Member
Oct 25, 2017
11,386
The C9 does support the full spec. Its successor, the CX, does not support the full bandwidth, but it's still enough for 4K/120Hz at 10-bit. This should only be an issue for people hooking up a PC with an Nvidia card, as they currently do not allow you to select 10-bit colour.
Am I misunderstanding this post, which says neither supports the full bandwidth?
Neither the C9 nor the CX is "future-proof" in the sense of supporting the full HDMI 2.1 spec. Their panels are 10-bit and cannot handle the full 2.1 bandwidth that enables 4K at 120 FPS. But it's worth pointing out that there is no AV receiver or other device on the market that supports 2.1 anyway. They are future-proof in the sense that they support variable refresh rate, which I personally think is more important than 120 FPS.
 

AndyD

Mambo Number PS5
Member
Oct 27, 2017
8,602
Nashville
For folks that want OLED, the C9 and CX are going to be the way to go.

For folks that want a more affordable FALD set with brighter brights, despite the imperfect black levels and light halos, I've been partial to the Vizio P-Series Quantum (2018) and Quantum X (2019+). For future-proofing, the 2020 PQX sets will have HDMI 2.1, the brightest consumer picture in all the land for the best specular highlights (if that's what you prioritize), and decent (though not best-in-class) input lag. Bad smart features though - you need a proper streaming box to supplement the TV.
Honestly, pairing one of these 2020 Vizios with a rock-solid Roku or the new remote-equipped Chromecast for smart features is likely the way to go, as those devices are usually far better and more up to date with respect to apps, streaming quality, and features than the built-in apps.
 

Lump

One Winged Slayer
Member
Oct 25, 2017
16,438
The slightly lower bandwidth of the CX's HDMI ports compared to the C9 should realistically never matter, since the limitation only prevents a 12-bit signal at 4K/120Hz - and with the panel not being 12-bit, 10-bit should be fine anyway.

Realistically, the only company that needs to make sure there are no unnecessary technical issues with the limitation is Nvidia. No one else is going to push the limits of 4K 120Hz plus HDR any time soon. It's already on Nvidia to provide HDMI 2.1 on their next cards. There are also some minor issues they'll need to address in the Nvidia Control Panel when it comes to identifying the LG CX's refresh range properly, as the "use display's max refresh" per-game option simply isn't working right now for games that don't have in-game refresh-rate selection, like Rocket League. I have to imagine that's just a driver update away.
 

DavidDesu

Banned
Oct 29, 2017
5,718
Glasgow, Scotland
Are the B9 and C9 basically foolproof options for the coming generation, with great picture quality? I can maybe afford a B9, but I'm gonna need to know it will last for years. How's the software and speed of the B9? I have a crappy 4K LG and it is so slow, and it sometimes restarts apps when it feels like it. Will the B9 be much better in this regard? It seems like the ultimate next-gen TV that I can afford.
 

ss_lemonade

Member
Oct 27, 2017
6,741
I have a KS8000 and also own an LG C9. I also did not feel the KS8000 changed in any significant way in game mode; it still looks good.

But at the same time, it's got plenty of issues. HDR tends to be washed out due to its poor local dimming, HDR adds enough input lag to become noticeable, and it has VA black-smearing issues to a much worse degree than my Samsung CRG9 desktop monitor (a Samsung VA QLED panel).

The LG C9 is just better in every way. Input lag is a non-issue, HDR looks stellar, there's just way more depth to the overall image, and there are no response-time-related anomalies.
Interesting. I can't say I notice the difference in input lag between SDR and HDR. PC mode is where I start to notice the difference, but my brain gets used to it immediately, so maybe I'm just not as sensitive. No washed-out issues that I notice, but I am at a point where I wish I could do something about the massive blooming.

I'm interested in what you think about the brightness difference. I believe the Samsung can still get brighter than the OLED, but I'm guessing that becomes a non-issue when taking into account all the positives you get with the newer TV?
 

molnizzle

Banned
Oct 25, 2017
17,695
Sometimes I really hate the snail's pace at which TV manufacturers operate.

Why is there still no full 2.1 LCD on the market?
 

Dreamwriter

Member
Oct 27, 2017
7,461
Are the B9 and C9 basically foolproof options for the coming generation, with great picture quality?
Depends on what you prefer for picture quality. They are OLED, which means they don't get really bright in HDR, so you would be losing out there. But their black level is of course perfect, which leads to great contrast.
 

ReturnOfThaMack88

Alt-Account
Banned
May 30, 2020
567
So tempted to jump from a B8 to a C9 or E9. But I know I should probably just wait one more round of TVs huh? The B8 is still a beautiful picture for me.
 
Oct 27, 2017
4,696
The C9 does support the full spec. Its successor, the CX, does not support the full bandwidth, but it's still enough for 4K/120Hz at 10-bit. This should only be an issue for people hooking up a PC with an Nvidia card, as they currently do not allow you to select 10-bit colour.
The wildest thing about this is that I'm pretty sure most games internally max out at 10-bit, so I don't know why Nvidia doesn't have an option for it.
 

Dreamwriter

Member
Oct 27, 2017
7,461
Sometimes I really hate the snail's pace at which TV manufacturers operate.

Why is there still no full 2.1 LCD on the market?
What exact 2.1 feature are you waiting for? Just supporting "2.1" isn't helpful; it's the features that HDMI 2.1 makes possible that make it future-proof, and being able to accept a 2.1 input doesn't guarantee that any of those features will be supported. From what I understand, the only game-related features 2.1 gives a 4K TV that you can't get from older HDMI versions are 4K/120Hz and G-Sync/FreeSync without DisplayPort. And a new standard for dynamic HDR - but you already get dynamic HDR over normal HDMI with HDR10+ and Dolby Vision; this would be a third standard that content providers would have to support, and it's not better than the others, it just doesn't have a license fee.

Have they confirmed PS5/XSX are HDMI 2.1? I mean, I assume so, but...
Yeah, both have announced support for 4k/120fps, which requires HDMI 2.1.
 

molnizzle

Banned
Oct 25, 2017
17,695
What exact 2.1 feature are you waiting for? Just supporting "2.1" isn't helpful; it's the features that HDMI 2.1 makes possible that make it future-proof, and being able to accept a 2.1 input doesn't guarantee that any of those features will be supported. From what I understand, the only game-related features 2.1 gives a 4K TV that you can't get from older HDMI versions are 4K/120Hz and G-Sync/FreeSync without DisplayPort. And a new standard for dynamic HDR - but you already get dynamic HDR over normal HDMI with HDR10+ and Dolby Vision; this would be a third standard that content providers would have to support, and it's not better than the others, it just doesn't have a license fee.


Yeah, both have announced support for 4k/120fps, which requires HDMI 2.1.
Yes, the bolded are obviously what I want. Along with eARC, but that is already here for the most part.
 

Maple-Rebel

Attempted to circumvent ban with alt account
Banned
Oct 15, 2018
585
Sometimes I really hate the snail's pace at which TV manufacturers operate.

Why is there still no full 2.1 LCD on the market?

Well, to be fair, there is no consumer item you can buy that can use the 2.1 connection. I do think all the flagships should have at least two 2.1 ports, though.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,435
Dark Space
I've wanted to go OLED for a long time. I need knowledge.

Would my current gaming laptop at least be able to take advantage of the VRR support of an HDMI 2.1 equipped display? Or does the device connecting to the TV's port need an HDMI 2.1 port as well to communicate?
 

Dreamwriter

Member
Oct 27, 2017
7,461
I've wanted to go OLED for a long time. I need knowledge.

Would my current gaming laptop at least be able to take advantage of the VRR support of an HDMI 2.1 equipped display? Or does the device connecting to the TV's port need an HDMI 2.1 port as well to communicate?
The laptop would need to support HDMI 2.1, and its GPU would need to support VRR (as would the TV). HDMI 2.1 uses the same port but a different cable, and it needs to be supported on both the display and the computer/console.
 

mhayze

Member
Nov 18, 2017
556
Nvidia is hampered by bandwidth on current-gen GPUs. I can enable 12-bit with HDR on my Sony TV at 1080p, or 8-bit (HDR or not) at 4K60 or 2560@120, from an Nvidia GPU. It doesn't have the bandwidth for more than 8-bit at 4K over HDMI 2.0, IIRC.

The other HDMI 2.1 features (besides VRR and more modes) are auto low latency mode (ALLM) and quick media switching (QMS), both of which are nice for gaming: the former makes sure the TV is always using a low-latency mode, and the latter means you don't see delays and blank screens when switching between resolutions, frame rates and modes - which happens more often when you can quickly bop in and out of different games or switch to the dashboard.
 

xir

Member
Oct 27, 2017
12,814
Los Angeles, CA
What exact 2.1 feature are you waiting for? Just supporting "2.1" isn't helpful; it's the features that HDMI 2.1 makes possible that make it future-proof, and being able to accept a 2.1 input doesn't guarantee that any of those features will be supported. From what I understand, the only game-related features 2.1 gives a 4K TV that you can't get from older HDMI versions are 4K/120Hz and G-Sync/FreeSync without DisplayPort. And a new standard for dynamic HDR - but you already get dynamic HDR over normal HDMI with HDR10+ and Dolby Vision; this would be a third standard that content providers would have to support, and it's not better than the others, it just doesn't have a license fee.


Yeah, both have announced support for 4k/120fps, which requires HDMI 2.1.
Cool, so VRR as well?
 

evilalien

Teyvat Traveler
Member
Oct 27, 2017
1,542
Those are the 2019 models and Samsung have improved things substantially this year.
I expect HDTVtest will do another comparison at some point, but a typical local-dimming LCD design is still never going to compare to OLED in dark scenes.

They already did:

The T series is much improved over the R.
 

Pargon

Member
Oct 27, 2017
12,251
Would my current gaming laptop at least be able to take advantage of the VRR support of an HDMI 2.1 equipped display? Or does the device connecting to the TV's port need an HDMI 2.1 port as well to communicate?
It should work if it has a 16 or 20 series GPU. Nothing else supports HDMI-VRR.

They already did:

The T series is much improved over the R.

I meant a comparison against OLED. But yes, the 2020 models seem hugely improved over the 2019 models in Game Mode.
 

Dreamwriter

Member
Oct 27, 2017
7,461
Cool, so VRR as well?
Neither company has announced it, but the Xbox One S and X already have FreeSync support, so it's a pretty sure thing that the Series X will support it as well, eventually at least (as I mentioned, just supporting HDMI 2.1 doesn't mean the device or TV supports all the 2.1 features).
 

DrScruffleton

Member
Oct 26, 2017
12,879
Black Friday isn't the best time to buy high-end TVs; it's best for low-to-mid-range sets. If you want a C9, now is probably the best time to buy, as it'll be discontinued very shortly as the CX is being rolled out.

Oh, I ended up getting a C9 shortly after posting that, haha. Got impatient. Great TV!
 

Pargon

Member
Oct 27, 2017
12,251
HDMI 2.1 adds a few things, actually: 8K/60, 4K/120, variable refresh rate, eARC (for lossless audio), and auto low latency mode.
The important thing to note is that you basically need 120Hz support for VRR to work correctly - even if it's only operating below 60 FPS.
This is because most displays can only lower their refresh rate so far; 40Hz or 48Hz is a typical minimum supported refresh rate.

In order for VRR to work on frame rates below the minimum supported refresh rate, the maximum needs to be ≥2.5× the minimum.
So if the minimum refresh rate is 48Hz the maximum must be at least 120Hz.
A 48–60Hz range can only support 48–60 FPS. VRR will not be active if the frame rate drops to 47 FPS.

A 48–120Hz range can use Low Frame Rate Compensation (LFC) though.
If the frame rate drops to 47 FPS, it can be displayed at 94Hz (47 × 2).
This enables an active range of 0–120 FPS, rather than the 48–60 FPS you'd get if the display were limited to 60Hz.

Of course, it is possible to support LFC at 60Hz if the minimum is low enough.
If a display supported 24–60Hz, LFC could be used and VRR would work from 0–60 FPS.
But most displays have difficulty supporting very low refresh rates. LCDs actually have trouble holding an image on-screen for that long without updating it, which can result in flickering or other problems.
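If it helps, the logic above can be sketched in a few lines of Python (a toy illustration of the LFC frame-multiplication idea, not any vendor's actual algorithm; the 48–120Hz defaults just mirror the example ranges in this post):

def lfc_refresh(fps, vrr_min=48, vrr_max=120):
    """Return (panel_refresh_hz, frames_repeated), or None if VRR disengages."""
    if fps <= 0 or fps > vrr_max:
        return None
    mult = 1
    while fps * mult < vrr_min:
        mult += 1  # repeat each frame until the refresh re-enters the supported range
    return (fps * mult, mult) if fps * mult <= vrr_max else None

print(lfc_refresh(47))          # (94, 2): 47 FPS displayed at 94Hz
print(lfc_refresh(47, 48, 60))  # None: a 48-60Hz range can't cover 47 FPS
print(lfc_refresh(24, 24, 60))  # (24, 1): in range, tracked directly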

Neither company has announced it, but the Xbox One S and X already have FreeSync support, so it's a pretty sure thing that the Series X will support it as well, eventually at least (as I mentioned, just supporting HDMI 2.1 doesn't mean the device or TV supports all the 2.1 features).
HDMI 2.1 VRR support has been announced for both XSX and PS5.
I've seen it reported that XSX also supports FreeSync, but not from an official source - though I don't think it would be wrong to assume that it does, since the X1X already supports both FreeSync and HDMI-VRR.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,435
Dark Space
The laptop would need to support HDMI 2.1, and its GPU would need to support VRR (as would the TV). HDMI 2.1 uses the same port but a different cable, and it needs to be supported on both the display and the computer/console.
Thank you for the quick response.

Well, my RTX 2080 definitely qualifies. I guess I'm on the outside looking in for the rest?

Is there such a thing as an HDMI 2.0-compliant TV with VRR? I'm a 1080p/120Hz gamer, so the 4K bandwidth is wasted on me anyway.

It should work if it has a 16 or 20 series GPU. Nothing else supports HDMI-VRR.
Wait, now I'm getting conflicting info.

Let me be more specific. My laptop has HDMI 2.0b and a 2080. Could it connect to, say, a C9 and utilize VRR?
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,789
Thank you for the quick response.

Well, my RTX 2080 definitely qualifies. I guess I'm on the outside looking in for the rest?

Is there such a thing as an HDMI 2.0-compliant TV with VRR? I'm a 1080p/120Hz gamer, so the 4K bandwidth is wasted on me anyway.


Wait, now I'm getting conflicting info.

Let me be more specific. My laptop has HDMI 2.0b and a 2080. Could it connect to, say, a C9 and utilize VRR?

The Samsung TVs from the last few years do exactly that - they offer FreeSync over HDMI, at 4K60.

The Turing cards will do G-Sync-branded VRR with the C9 within the bandwidth of HDMI 2.0.
 

xir

Member
Oct 27, 2017
12,814
Los Angeles, CA
Neither company has announced it, but the Xbox One S and X already have FreeSync support, so it's a pretty sure thing that the Series X will support it as well, eventually at least (as I mentioned, just supporting HDMI 2.1 doesn't mean the device or TV supports all the 2.1 features).
Yeah, I've seen the charts. Confusing and dumb. But what's the over/under on the PS5 having it?
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,789
Neither company has announced it, but the Xbox One S and X already have FreeSync support, so it's a pretty sure thing that the Series X will support it as well, eventually at least (as I mentioned, just supporting HDMI 2.1 doesn't mean the device or TV supports all the 2.1 features).

It's part of the AMD drivers, and they have partnered with Samsung, who use it in most of their sets.
 

Pargon

Member
Oct 27, 2017
12,251
Wait, now I'm getting conflicting info.

Let me be more specific. My laptop has HDMI 2.0b and a 2080. Could it connect to, say, a C9 and utilize VRR?
It should be capable of 1440p120 or 4K60 VRR. I believe both are limited to 8-bit.
HDMI 2.1 is required for 4K120 and higher bit-depth support.
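Same back-of-the-envelope math as earlier in the thread, applied to HDMI 2.0 (a sketch under two assumptions: 18 Gbps of TMDS bandwidth with 8b/10b coding leaving ~14.4 Gbps for video, and the standard 4400×2250 total timing for a 4K frame):

def data_rate_gbps(h_total, v_total, refresh_hz, bits_per_component):
    # total pixels per second (active + blanking) x bits per pixel (RGB = 3 components)
    return h_total * v_total * refresh_hz * bits_per_component * 3 / 1e9

print(data_rate_gbps(4400, 2250, 60, 8))   # ~14.3 Gbps: 4K60 8-bit RGB just fits in 14.4
print(data_rate_gbps(4400, 2250, 60, 10))  # ~17.8 Gbps: 10-bit RGB does not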
 

ShroudOfFate

One Winged Slayer
Banned
Oct 30, 2017
1,551
I was thinking of getting a new display for next gen, and by happenstance the pawn shop I work at got an LG 65C7P, which seems to be a massive upgrade from my Sony X850D. I'm slightly worried about having an OLED and playing stuff with persistent HUD elements, but the picture is incredible and seems to tick all the boxes for "future-proof" - minus 4K/120fps, which I'm certain most games won't support for a while.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,435
Dark Space
The Samsung TVs from the last few years do exactly that - they offer FreeSync over HDMI, at 4K60.

The Turing cards will do G-Sync-branded VRR with the C9 within the bandwidth of HDMI 2.0.
I appreciate the help, as I am completely TV-ignorant.

I don't even watch TV anymore and haven't turned the one I have on in ages, I believe the set I have is over a decade old.

Did some digging, and RTINGS has this handy list of TVs that support adaptive sync with Nvidia GPUs. It seems to pretty much be just the LGs and one Samsung right now. Other FreeSync/VRR displays apparently allow G-Sync to be enabled but have flickering or go black.
 

wombleac

Member
Nov 8, 2017
712
I want 55-inch minimum, 60-inch maximum; currently rocking a 60-inch Kuro Elite (the last 8 years). Money is not an issue, but I don't want burn-in risks, because I usually play one HUD-heavy game at a time for 50 to 100 hours. The Kuro is still a beast, but I'm not sure how the PS5 is gonna work with it. Thanks in advance.