
tokkun

Member
Oct 27, 2017
5,413
All he posted was that the official certification tests won't be available until later this year. We have known that for a long time. The actual specs of HDMI 2.1 have been known by manufacturers since last year. The SECOND those tests are available, the manufacturers will run them on the chipsets they have had ready for months now and BOOM. HDMI 2.1 certified, in PLENTY of time for CES January 2019.

The timeline says the spec will be out in Q4. When in Q4? Who knows. If it comes out on October 1st, we're probably fine. If it comes out on December 31st and CES is on January 8th, no one will have time to certify. It really comes down to the risk tolerance of the manufacturers. Some may not want to risk designing around uncertified hardware. And again, this assumes that you trust the Q4 number. I'll remind you what happened with the original spec release. HDMI forum announced in January that it would be out in early Q2, so basically in 2-3 months. What happened was:

Q2: <crickets>
Q3: <crickets>
mid-Q4: Spec is finally released.

So I will be paying attention to whether they hit their schedule on the parts of the test specification that are supposed to be released earlier. If it's all crickets again, I will assume another delay is coming.

I'll tell you, one of the things that worries me is the lack of receivers being released with pre-HDMI 2.1 features. Normally AVR companies love to jump the gun on this type of stuff, release draft hardware, and promise firmware updates that may or may not ever come. It was a big thing with HDMI 2.0. The fact that I don't see it happening yet with HDMI 2.1 makes me think that the ASICs are not really ready yet.
 
May 3, 2018
390
And so it begins...

 
OP
Jeremiah

Member
Oct 25, 2017
774
Welp, I must have found a unicorn, because this is the cleanest-looking OLED I've ever owned, uniformity-, near-black-, and banding-wise. Running some slides to get to 4 hours so that first compensation cycle kicks in, fingers crossed!
The size difference is insane, more pronounced than going from 55 to 65. Nothing really makes up for size... even when taking into account field of view. 65 inches at 6.5 feet should look the same as 75 inches at 7.5, but it doesn't.
The motion also seems much improved in the 24p/30p content that I've tried. The 'stutter' associated with OLED is way less pronounced. And it seems like motion resolution has increased a little? Can't be certain yet whether the increase in size is masking this and it's all placebo.

Biggest difference is HDR game mode. It's certainly brighter by default, though by fiddling with in-game brightness settings on the 2017 you can match them very closely.
Tone mapping makes games too bright IMO for nighttime use, but it's great for daytime use. I'll continue to use it with movies, however.

If you don't notice banding or don't care about uniformity much, the 2017 OLED is the better buy. But shit, if you do, that's really the biggest difference this year, and for me it's well worth the price of admission. No banding, uniform, and no DSE!

The only problem is that even though the VESA mount is the same as the C7's, the screen actually hangs 3 inches lower than on the 65-inch model. My center channel now clips the bottom of the screen lol

How about you let me get that unused Q9 for the price of...on the house?

Hah! I actually cancelled it. Got a chance to see it at Value Electronics in New York during a layover and preferred the OLED.
 

Kyle Cross

Member
Oct 25, 2017
8,439
On a B7, if gamma 2.4/BT.1886 is considered the correct gamma setting, what is it in Game Mode? The options are Low, Medium, High1, and High2. Which would be the 2.4/BT.1886 equivalent?
 

Deleted member 14649

User requested account closure
Banned
Oct 27, 2017
3,524
If you don't notice banding or don't care about uniformity much, the 2017 OLED is the better buy. But shit, if you do, that's really the biggest difference this year, and for me it's well worth the price of admission. No banding, uniform, and no DSE!

You probably just got lucky, as the 77" model is using the same panel as the 2017 range. They might also have higher QC for the more expensive set.
 

Slackbladder

Member
Nov 24, 2017
1,146
Kent
I use my PC on my TV. Are OLEDs bad for this because of burn-in? Should I go with LED instead?
For absolute peace of mind, go with LCD. My C8 shows games from my PC fine (the HDR is glorious), but ABL (the auto brightness limiter, which is on all OLEDs) makes a Google page or anything with a bright white background dim. Also, extended periods with things like tabs and icons could easily lead to image retention (which can be fixed with the TV's options or goes away over time) or, in worst-case scenarios, permanent burn-in. I only use my C8 for games rather than browsing etc. The best option is a FALD (full-array local dimming) LCD like the new Samsung Q9FN. If you only want to game with a PC on your TV, I'd say an OLED is fine.
 

Kompis

Member
Oct 27, 2017
1,021
Nothing really makes up for size... even when taking into account field of view. 65 inches at 6.5 feet should look the same as 75 inches at 7.5, but it doesn't.

It shouldn't really.

The 77" has 40% more screen area than a 65", but apparent size scales with the diagonal, not the area: to look the same as the 65" at 6.5 feet, you'd need to sit about 7.7 feet from the 77" (6.5 × 77/65). So at 7.5 feet it actually fills slightly more of your field of view, and all that extra area is a big part of why it feels so much bigger.
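Quick back-of-the-envelope in Python if anyone wants to check the numbers (my sketch, assuming 16:9 panels and simple straight-on trigonometry):

```python
import math

def viewing_angle(diagonal_in, distance_ft):
    """Horizontal angle (degrees) a 16:9 screen subtends at a given distance."""
    width = diagonal_in * 16 / math.sqrt(16**2 + 9**2)  # diagonal -> width
    distance = distance_ft * 12                          # feet -> inches
    return 2 * math.degrees(math.atan((width / 2) / distance))

print(viewing_angle(65, 6.5))  # ~39.9 degrees
print(viewing_angle(77, 7.5))  # ~40.9 degrees (slightly bigger than the 65")
print(viewing_angle(77, 7.7))  # ~39.9 degrees (the actual match)
```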

Congrats on the TV though! :3
 

Family

Banned
Feb 25, 2018
152
A smaller screen up close is definitely just not the same as a bigger screen further away.

There is something about a large screen AND lots of space between you and the screen that makes for a much better experience. It feels so much better.

You can also jack up the brightness and get better HDR in the process, but if it's too close, the brightness is too much.

I'm at 65 inches now and would love 75, but in terms of being an eyesore and manoeuvrability, I think going over 65 doesn't work in my current setup. Maybe when I move back to my other house.
 

MazeHaze

Member
Nov 1, 2017
8,584
Can someone explain to me why HFR is being praised as the "next" big thing? I've heard this for a while, and noticed it yesterday watching HDTVtest review the C8. Vincent said the TV supports 120 Hz via internal apps, but not over HDMI. It DOES support 120 Hz via HDMI from a PC, and the C7 does as well. Tons of TVs over the past few years have supported 120 Hz input from a PC, yet there are all these articles about how HFR will be implemented in TVs next year.

Am I missing something?
 

Scently

Member
Oct 27, 2017
1,464
Can someone explain to me why HFR is being praised as the "next" big thing? I've heard this for a while, and noticed it yesterday watching HDTVtest review the C8. Vincent said the TV supports 120 Hz via internal apps, but not over HDMI. It DOES support 120 Hz via HDMI from a PC, and the C7 does as well. Tons of TVs over the past few years have supported 120 Hz input from a PC, yet there are all these articles about how HFR will be implemented in TVs next year.

Am I missing something?
In 4K. Those other TVs only support 120 Hz at 1080p.
 

tokkun

Member
Oct 27, 2017
5,413
Can someone explain to me why HFR is being praised as the "next" big thing? I've heard this for a while, and noticed it yesterday watching HDTVtest review the C8. Vincent said the TV supports 120 Hz via internal apps, but not over HDMI. It DOES support 120 Hz via HDMI from a PC, and the C7 does as well. Tons of TVs over the past few years have supported 120 Hz input from a PC, yet there are all these articles about how HFR will be implemented in TVs next year.

Am I missing something?

It now supports 120 Hz at native resolution, so that's new. And although 1080p/120 worked in the past, if I'm not mistaken it was not advertised and only worked with PCs. Vincent and a lot of the A/V crowd tend to be pretty clueless about gaming, so I am not surprised that he didn't know about a feature that was only of interest to PC gamers.

As for why it is getting attention this year, it's a marketing thing. My guess is that LG originally thought these 2018 models would be HDMI 2.1. When the standard ended up getting delayed, the 2018 OLEDs turned out to be a pretty incremental upgrade over the 2017 models, which you can see in a lot of the reviews from sites like CNet and Rtings, where they tell people to just buy the 2017 model instead. LG is trying hard to come up with ways to convince people that the 2018 models are worth double the price of the 2017s, and this is one differentiator, however pointless it is to most people.
 

RedlineRonin

Member
Oct 30, 2017
2,620
Minneapolis
It now supports 120 Hz at native resolution, so that's new. And although 1080p/120 worked in the past, if I'm not mistaken it was not advertised and only worked with PCs. Vincent and a lot of the A/V crowd tend to be pretty clueless about gaming, so I am not surprised that he didn't know about a feature that was only of interest to PC gamers.

As for why it is getting attention this year, it's a marketing thing. My guess is that LG originally thought these 2018 models would be HDMI 2.1. When the standard ended up getting delayed, the 2018 OLEDs turned out to be a pretty incremental upgrade over the 2017 models, which you can see in a lot of the reviews from sites like CNet and Rtings, where they tell people to just buy the 2017 model instead. LG is trying hard to come up with ways to convince people that the 2018 models are worth double the price of the 2017s, and this is one differentiator, however pointless it is to most people.
CEDIA in particular usually has a bigger focus on AVRs. To your previous comments, I think if we don't see anything 2.1-enabled in AVRs by CEDIA/IFA, it'll definitely be an indicator of what to expect at CES.

Until you actually brought up the AVR point, I had completely forgotten about the fever pitch there was 10-ish years back, around the time 1.3 was being pushed out. It was going to enable bitstreaming of lossless audio, which was the biggest evolution in audio since Dolby Digital. And to your point, manufacturers were doing everything they could to market gear as being ready for lossless audio.
 
Apr 21, 2018
240
On a B7, if gamma 2.4/BT.1886 is considered the correct gamma setting, what is it in Game Mode? The options are Low, Medium, High1, and High2. Which would be the 2.4/BT.1886 equivalent?

It's not the only correct setting; you should select gamma depending on your room environment:

2.2/Medium for a bright room
2.4/BT.1886/High for a dark room
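To make the bright-room/dark-room advice concrete, here's the plain power-law math (my illustration, not from the post above): a higher gamma pushes shadow levels darker for the same signal, which looks right in a dark room but gets swallowed by ambient light in a bright one.

```python
# Relative light output (0 = black, 1 = peak white) for a few signal
# levels under gamma 2.2 vs gamma 2.4 -- higher gamma = darker shadows.
for signal in (0.10, 0.25, 0.50):
    print(f"signal {signal:.2f}: "
          f"2.2 -> {signal**2.2:.4f}, "
          f"2.4 -> {signal**2.4:.4f}")

# signal 0.10: 2.2 -> 0.0063, 2.4 -> 0.0040
# signal 0.25: 2.2 -> 0.0474, 2.4 -> 0.0359
# signal 0.50: 2.2 -> 0.2176, 2.4 -> 0.1895
```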
 

tokkun

Member
Oct 27, 2017
5,413
Like I said, my guess is they are hitting technical hurdles. And clearly they are much more challenging than they initially thought, because it takes some serious issues to miss your estimated window by 9-12 months.

I remember after the draft spec was first unveiled, some of the professional cable makers posted comments like "we have no idea how they are going to get this to work".

Some interesting comments on the finalized spec here:
https://www.bluejeanscable.com/articles/what-about-hdmi-2.1.htm

Those who've been following our writing on HDMI for a while will recall that, long ago, we argued that because of the inherently poorer impedance stability of twisted pair cable and the high-frequency demands of HDMI signaling, the writers of the original spec ought to have built it around a coaxial cable geometry rather than a bundle of data pairs (which was done to maintain backward compatibility with the DVI specification). This, which we felt from a wire-and-cable perspective was rather obvious, led to a surprising number of people letting us know that, in their view, we had no idea what we were talking about. It was therefore a bit amusing to see that the HDMI 2.1 spec's example of a compliant Category 3 cable design replaces the four data pairs with eight coaxes -- still running differential signals, but in twinned coaxes rather than in conventional twisted pairs.

While we suppose we ought to feel somewhat vindicated there, it does point out some unfortunate features of the HDMI and HDMI 2.1 worlds. HDMI ought to have been run, originally, in coaxes without differential signaling; coax, if properly constructed for the application, has the bandwidth to run these signals -- even the signals called for in 2.1. We are, in fact, now selling a full line of 12G SDI coaxial cables for professional video camera feeds and video production. But a proper coax-based HDMI design would have had the video signals simply running unbalanced, not in a differential signaling configuration. Doing it this way opens up a range of difficulties in cable construction, including:

<further technical details>

These are just a few of the issues. Getting 12 Gbps performance out of micro-coaxes, or out of conventional twisted pairs, is going to be troublesome; it's easy to make a sketch of the cable, and much harder to make the actual cable stock. Whatever you may hear casually said about it being "just ones and zeros," there is nothing easy about shoving twelve billion of those ones and zeros down a length of cable every second.
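To put the article's per-conductor numbers next to the headline 48 Gbps figure (my own arithmetic sketch, not from the article): HDMI 2.1's 48 Gbps is the aggregate of four lanes at 12 Gbps each, so 12 Gbps is what each individual pair (or twinned coax) actually has to carry.

```python
# HDMI 2.1 FRL: 4 lanes x 12 Gbps = 48 Gbps on the wire; 16b/18b line
# coding leaves roughly 42.7 Gbps for actual data.
lanes, per_lane_gbps = 4, 12
raw_gbps = lanes * per_lane_gbps          # 48
usable_gbps = raw_gbps * 16 / 18          # ~42.7

# Compare with uncompressed 4K120 10-bit RGB (active pixels only;
# real links also spend bandwidth on blanking, so the true need is higher).
w, h, fps, bits_per_pixel = 3840, 2160, 120, 30
video_gbps = w * h * fps * bits_per_pixel / 1e9   # ~29.9

print(f"usable: {usable_gbps:.1f} Gbps, 4K120 10-bit video: {video_gbps:.1f} Gbps")
```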
 

Branson

Member
Oct 27, 2017
2,772
For absolute peace of mind, go with LCD. My C8 shows games from my PC fine (the HDR is glorious), but ABL (the auto brightness limiter, which is on all OLEDs) makes a Google page or anything with a bright white background dim. Also, extended periods with things like tabs and icons could easily lead to image retention (which can be fixed with the TV's options or goes away over time) or, in worst-case scenarios, permanent burn-in. I only use my C8 for games rather than browsing etc. The best option is a FALD (full-array local dimming) LCD like the new Samsung Q9FN. If you only want to game with a PC on your TV, I'd say an OLED is fine.
I do everything with my PC. And lol, I meant LCD, not LED. I mainly game, but I use it like a PC too, with windows and tabs up for a couple of hours sometimes, watching YouTube videos or side-by-side windows, like a game on one side and a video on the other. I need to get a second monitor for that stuff.
 

RedlineRonin

Member
Oct 30, 2017
2,620
Minneapolis
I remember after the draft spec was first unveiled, some of the professional cable makers posted comments like "we have no idea how they are going to get this to work".

Some interesting comments on the finalized spec here:
https://www.bluejeanscable.com/articles/what-about-hdmi-2.1.htm
Very interesting! Thanks for sharing.

If anyone knows what they're talking about in regards to cables, it's the BJC guys.

E: Also hilarious that he's talking about the challenges of 12 Gbps per lane when the full 2.1 spec is 48 Gbps across four lanes. I can't imagine the challenges they're facing.
 
Apr 21, 2018
240
I know, my room is dimly lit, so BT.1886/2.4 is the "correct" gamma. However, HDR Game mode is locked to Medium. I also hear High1 is 2.4 while High2 is BT.1886?

It's not that HDR is locked to Medium; HDR uses a new transfer standard called PQ (SMPTE ST.2084), which maps signal values to absolute luminance levels to create the HDR curve, so the SDR gamma presets don't really apply.

Remember that the gamma presets only shape SDR; in HDR the PQ curve takes over.

And yes, High1 is 2.4 and High2 is BT.1886. They are very close to each other; maybe BT.1886 comes up a little faster in shadow detail, but the difference is hard to see.
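For reference, here is the published BT.1886 curve (the standard definition, not from the post above); it also shows why 2.4 and BT.1886 land so close on an OLED, since BT.1886 is just a 2.4 power law offset by the display's black level:

```latex
% BT.1886 EOTF, anchored to the display's white level L_W and black level L_B
L = a\,\bigl(\max(V + b,\, 0)\bigr)^{2.4},\qquad
a = \bigl(L_W^{1/2.4} - L_B^{1/2.4}\bigr)^{2.4},\qquad
b = \frac{L_B^{1/2.4}}{L_W^{1/2.4} - L_B^{1/2.4}}
```

With an OLED's black level L_B = 0, this gives a = L_W and b = 0, so the curve collapses to L = L_W·V^2.4, i.e. a pure 2.4 gamma; any visible gap between the High1 and High2 presets would come from the TV's implementation, not the standards.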
 

Kyle Cross

Member
Oct 25, 2017
8,439
It's not that HDR is locked to Medium; HDR uses a new transfer standard called PQ (SMPTE ST.2084), which maps signal values to absolute luminance levels to create the HDR curve, so the SDR gamma presets don't really apply.

Remember that the gamma presets only shape SDR; in HDR the PQ curve takes over.

And yes, High1 is 2.4 and High2 is BT.1886. They are very close to each other; maybe BT.1886 comes up a little faster in shadow detail, but the difference is hard to see.
Ah, okay. So don't worry about HDR being locked to Medium, it all works out due to the nature of HDR? I haven't noticed any raised blacks in HDR like I have in SDR when set to 2.2/Medium.
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
Ah, okay. So don't worry about HDR being locked to Medium, it all works out due to the nature of HDR? I haven't noticed any raised blacks in HDR like I have in SDR when set to 2.2/Medium.

Gamma does act similarly to brightness, with SDR especially: the lower you go with gamma, the lighter above-black shades will appear, and if you use 51/52 brightness coupled with a high OLED Light setting, it will make some scenes look truly dreadful.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
For all the dimness FUD on OLEDs, in game mode or not, what are y'all smoking? My B7's speculars actually hurt my eyes in GOW and 4K BRs. Even in a bright room it's dramatically brighter than my X800D. Is it just that you've never had an OLED and want to justify your LED?
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
For all the dimness FUD on OLEDs, in game mode or not, what are y'all smoking? My B7's speculars actually hurt my eyes in GOW and 4K BRs. Even in a bright room it's dramatically brighter than my X800D. Is it just that you've never had an OLED and want to justify your LED?

It's crazy talk on a whole other level! It's all starting to come out in the wash now, with 'professionals' actually coming out with the truth... at last :)
 

jstevenson

Developer at Insomniac Games
Verified
Oct 25, 2017
2,042
Burbank CA
the downside of over the years having gone from a 13" to a 20" to a 34" to a 50" to a 55" to a 58" to a 65"

is I know sometime in the next few years I'm gonna be springing for a 75-77" flat panel, and that shit is gonna hurt the ole wallet
 

Deleted member 12177

User requested account closure
Banned
Oct 27, 2017
375
the downside of over the years having gone from a 13" to a 20" to a 34" to a 50" to a 55" to a 58" to a 65"

is I know sometime in the next few years I'm gonna be springing for a 75-77" flat panel, and that shit is gonna hurt the ole wallet

Not really. The reason average sizes are going up is that the cost to produce, and ultimately the consumer cost, is coming down.

Later this year, you'll be able to get a 77-inch OLED for $5k or under. That wasn't possible even with a 55-inch version a few years ago.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Not really. The reason average sizes are going up is that the cost to produce, and ultimately the consumer cost, is coming down.

Later this year, you'll be able to get a 77-inch OLED for $5k or under. That wasn't possible even with a 55-inch version a few years ago.

TV tech is like Moore's law to the extreme. We will see an entry-level 55" OLED for like $700 within 2 years, and it will be better than my B7. Just a 65" 4K panel with no HDR was like $2,200 3 years ago. Now you can get a more advanced TCL or Vizio for a third of the cost. And yeah, the OLED comparison is even crazier, like 1/4 to 1/5 the cost in a few years.
 

Yibby

Member
Nov 10, 2017
1,781
For all the dimness FUD on OLEDs, in game mode or not, what are y'all smoking? My B7 speculars actually hurt my eyes in GOW and 4k brs. In a bright room even it's dramatically brighter than my x800d. Is it just you've never had an OLED and want to justify your LED?
Watch the "It" 4k Bluray in a dark room. Those damn kids and their flashlights :D. I can't imagine how the Z9D must be, with its 1600 nits.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Watch the "It" 4k Bluray in a dark room. Those damn kids and their flashlights :D. I can't imagine how the Z9D must be, with its 1600 nits.

Well, that's just it (pun intended): with the infinite contrast of an OLED, I have a hard time believing those 1600 nits are actually going to give you better results in most situations. It's like 0 to 100 vs 75 to 150.
 
Apr 21, 2018
240
Ah, okay. So don't worry about HDR being locked to Medium, it all works out due to the nature of HDR? I haven't noticed any raised blacks in HDR like I have in SDR when set to 2.2/Medium.

For example, I use gamma 2.2 even in my dark room; it's what my plasma used and what my monitors are calibrated to. Also, 89.9% of BDs are mastered at gamma 2.2, and the same goes for internet video (streaming) and games.

But I had to modify the luminance levels a little, because 2.2 was a little bit washed out out of the box.

A well-calibrated 2.2 gamma on an OLED is amazing: tons of shadow detail without the picture looking washed out, too saturated, or too dark.

Is PC mode in LG 7 OLEDs broken on all modes or just HDR?

Just HDR. PC mode works great for SDR gaming and PC use; you get low input lag and the sharpness of 4:4:4 chroma. The only downside is that it destroys the HDR modes.
 

Kyle Cross

Member
Oct 25, 2017
8,439
For example, I use gamma 2.2 even in my dark room; it's what my plasma used and what my monitors are calibrated to. Also, 89.9% of BDs are mastered at gamma 2.2, and the same goes for internet video (streaming) and games.

But I had to modify the luminance levels a little, because 2.2 was a little bit washed out out of the box.

A well-calibrated 2.2 gamma on an OLED is amazing: tons of shadow detail without the picture looking washed out, too saturated, or too dark.



Just HDR. PC mode works great for SDR gaming and PC use; you get low input lag and the sharpness of 4:4:4 chroma. The only downside is that it destroys the HDR modes.
I'm so confused. I've heard 2.4 is what to use for my room conditions, but the TV and content are actually mastered in 2.2?
 

wbloop

Member
Oct 26, 2017
2,273
Germany
I'm so confused. I've heard 2.4 is what to use for my room conditions, but the TV and content are actually mastered in 2.2?
I'm wondering too, now. I darken my room when I play on the TV, and I have SDR and HDR set to 2.2/Medium gamma according to multiple calibration guides. Have I been doing it wrong all this time, and is that the reason why God of War looked kinda washed out in the darker colors?
 

KCsoLucky

Member
Oct 29, 2017
1,585
I've had a 65" B7 in my living room for a while now. I recently bought a 49" X900E for my gaming/PC/non-wife room at a good price and I'm surprised at how well it holds up with me being used to the OLED. I didn't care for the difference in picture at first but since I've gotten used to it, it's been great especially since that room sees a lot of light. I just plugged my 60" Samsung FN8500 back in that I hadn't used in over a year and I am also surprised with how well that has held up. I would have kept using it as the primary TV for a while if it wasn't for the abysmal input lag. It's perfect for bedroom viewing for the wife and kids though.
 

Gatti-man

Banned
Jan 31, 2018
2,359
So I took delivery of my C8 77" OLED. God of War is stunning. The difference in colors and fine detail compared to my FALD Vizio P75 is pretty huge. I'm seriously impressed.

Motion is also better. For example, when I use Murder of Crows in GoW I see far more detail and enemy movement than on the Vizio. It's a fabulous TV.
 