> Not to pile on but I see burn-in. This is probably a floor model.

True, but look at all that grain! ;)
All he posted was that the official certification tests won't be available until later this year. We have known that for a long time. The actual specs of HDMI 2.1 have been known by manufacturers since last year. The SECOND those tests are available, the manufacturers will run them on the chipsets they have had ready for months now and BOOM. HDMI 2.1 certified, in PLENTY of time for CES January 2019.
How about you let me get that unused Q9 for the price of...on the house?
If you don't notice banding or care much about uniformity, the 2017 OLED is the better buy. But shit, if you do, that's really the biggest difference this year, and for me it's well worth the price of admission. No banding, uniform, and no DSE!
> You probably just got lucky, as the 77" model is using the same panel as the 2017 range. They might also have higher QC for the more expensive set.

Probably true, as Vincent said his 77" C8 was also great in regards to uniformity.
> I use my PC on my TV. Are OLEDs bad for this because of burn-in? Should I go with LED instead?

For absolute peace of mind go with LCD. My C8 shows games from my PC fine (the HDR is glorious), but ABL (the auto brightness limiter that is on all OLEDs) makes a Google page or anything with a bright white background dim. Extended periods with static elements like tabs and icons could also easily lead to image retention (which can be fixed with the TV's options or goes away over time) or, in worst-case scenarios, permanent burn-in. I only use my C8 for games rather than browsing etc. The best option is a FALD (full-array local dimming) LCD like the new Samsung Q9FN. If you only want to game with a PC on your TV, I'd say an OLED is fine.
Nothing really makes up for size... even when taking into account field of view. 65 inches at 6.5 feet should look the same as 75 inches at 7.5, but it doesn't.
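For what it's worth, the geometry part of that claim checks out. A quick sketch (Python; assumes 16:9 panels and the sizes quoted above) shows both setups subtend exactly the same horizontal angle, so whatever difference people perceive isn't field of view:

```python
import math

def visual_angle_deg(diagonal_in, distance_ft):
    """Horizontal visual angle of a 16:9 screen, in degrees."""
    # 16:9 width from the diagonal: w = d * 16 / sqrt(16^2 + 9^2)
    width_in = diagonal_in * 16 / math.sqrt(16**2 + 9**2)
    distance_in = distance_ft * 12
    return 2 * math.degrees(math.atan((width_in / 2) / distance_in))

print(round(visual_angle_deg(65, 6.5), 1))  # 65" at 6.5 ft -> 39.9
print(round(visual_angle_deg(75, 7.5), 1))  # 75" at 7.5 ft -> 39.9 (identical)
```

Because 65/6.5 and 75/7.5 are the same ratio, the angles come out exactly equal.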
> Can someone explain to me why HFR is being praised as the "next" big thing? I've heard this for a while, and noticed it yesterday watching HDTVtest review the C8. Vincent said the TV supports 120hz via internal apps, but not over HDMI. It DOES support 120hz via hdmi on a PC, the C7 does as well. Tons of TVs over the past few years support 120hz input from a PC, yet there's all these articles about how HFR will be implemented in TVs next year.
> Am I missing something?

In 4k. Those other TVs support 120fps in 1080p.
Ahhhh, that's it thanks.
> It now supports 120 Hz at native resolution, so that's new. And although 1080p/120 worked in the past, if I'm not mistaken it was not advertised and only worked with PCs. Vincent and a lot of the A/V crowd tend to be pretty clueless about gaming, so I am not surprised that he didn't know about a feature that was only of interest to PC gamers.
> As for why it is getting attention this year, it's a marketing thing. My guess is that LG originally thought these 2018 models would be HDMI 2.1. When the standard ended up getting delayed, the 2018 OLEDs turned out to be a pretty incremental upgrade over the 2017 models, which you can see in a lot of the reviews from sites like CNet and Rtings, where they tell people to just buy the 2017 model instead. LG is trying hard to come up with ways to convince people that the 2018 models are worth double the price of the 2017s, and this is one differentiator, however pointless it is to most people.

Usually CEDIA in particular has a bigger focus on AVRs. I think, to your previous comments, if we don't see anything 2.1 enabled in AVRs by CEDIA/IFA, it'll definitely be an indicator of what to expect at CES.
Like I said, my guess is they are hitting technical hurdles. And clearly they are much more challenging than they initially thought, because it takes some serious issues to miss your estimated window by 9-12 months.
On a B7, if Gamma 2.4/BT.1886 is considered the correct gamma setting, what is it in Game Mode? The options there are Low, Medium, High1, and High2. Which would be the 2.4/BT.1886 equivalent?
> It's not like it's the only correct setting; you should select gamma depending on your room environment.
> 2.2/Medium for a bright room
> 2.4/BT.1886/High for a dark room

I know, my room is dimly lit so BT.1886/2.4 is the "correct" gamma. However, HDR Game mode is locked to Medium. I also hear High1 is 2.4 while High2 is BT.1886?
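For anyone curious what those presets actually do to the signal, here's a rough sketch (Python; the BT.1886 formula is the one from the ITU spec, and the 100-nit white / 0.01-nit black levels are just example values, not anything LG publishes). Note that if you plug in a true-zero black level like an OLED's, BT.1886 collapses to a plain 2.4 power law, which fits the claim that High2 and 2.4 look nearly identical:

```python
def power_gamma(v, gamma):
    """Simple power-law EOTF: signal v in [0,1] -> relative light out."""
    return v ** gamma

def bt1886(v, lw=100.0, lb=0.01):
    """ITU-R BT.1886 EOTF, divided by the white level so output is relative.

    lw/lb are display white/black luminance in nits (example values)."""
    k = lw ** (1 / 2.4) - lb ** (1 / 2.4)
    a = k ** 2.4
    b = lb ** (1 / 2.4) / k
    return a * max(v + b, 0.0) ** 2.4 / lw

# Shadow region: BT.1886 with a real (non-zero) black level lifts
# shadows slightly relative to a pure 2.4 power law.
for v in (0.05, 0.1, 0.5):
    print(v, round(power_gamma(v, 2.2), 5),
          round(power_gamma(v, 2.4), 5), round(bt1886(v), 5))
```

With lb = 0 the b term vanishes and bt1886(v) is exactly v ** 2.4.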
Those who've been following our writing on HDMI for a while will recall that, long ago, we argued that because of the inherently poorer impedance stability of twisted pair cable and the high-frequency demands of HDMI signaling, the writers of the original spec ought to have built it around a coaxial cable geometry rather than a bundle of data pairs (which was done to maintain backward compatibility with the DVI specification). This, which we felt from a wire-and-cable perspective was rather obvious, led to a surprising number of people letting us know that, in their view, we had no idea what we were talking about. It was therefore a bit amusing to see that the HDMI 2.1 spec's example of a compliant Category 3 cable design replaces the four data pairs with eight coaxes -- still running differential signals, but in twinned coaxes rather than in conventional twisted pairs.
While we suppose we ought to feel somewhat vindicated there, it does point out some unfortunate features of the HDMI and HDMI 2.1 worlds. HDMI ought to have been run, originally, in coaxes without differential signaling; coax, if properly constructed for the application, has the bandwidth to run these signals -- even the signals called for in 2.1. We are, in fact, now selling a full line of 12G SDI coaxial cables for professional video camera feeds and video production. But a proper coax-based HDMI design would have had the video signals simply running unbalanced, not in a differential signaling configuration. Doing it this way opens up a range of difficulties in cable construction, including:
<further technical details>
These are just a few of the issues. Getting 12 Gbps performance out of micro-coaxes, or out of conventional twisted pairs, is going to be troublesome; it's easy to make a sketch of the cable, and much harder to make the actual cable stock. Whatever you may hear casually said about it being "just ones and zeros," there is nothing easy about shoving twelve billion of those ones and zeros down a length of cable every second.
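To put numbers on those "twelve billion ones and zeros": HDMI 2.1's FRL link is 48 Gbps total across four lanes, i.e. the 12 Gbps per lane the article is talking about. A back-of-the-envelope sketch (Python; active pixels only, ignoring blanking intervals and the 16b/18b line-coding overhead, both of which push the real on-the-wire figure higher):

```python
def payload_gbps(h, v, fps, bits_per_sample, samples_per_pixel=3):
    """Uncompressed active-video payload in Gbps (no blanking, no coding overhead)."""
    return h * v * fps * bits_per_sample * samples_per_pixel / 1e9

print(payload_gbps(3840, 2160, 120, 10))  # 4K120, 10-bit 4:4:4 -> 29.85984
print(payload_gbps(3840, 2160, 60, 8))    # 4K60, 8-bit 4:4:4 -> 11.943936
```

Even this stripped-down estimate for 4K120 is roughly double what HDMI 2.0 can carry, which is why 2.1 needed the new cable category at all.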
> For absolute peace of mind go with LCD. My C8 shows games from my PC fine (the HDR is glorious) but ABL (auto brightness limiter which is on all OLEDs) makes a Google page or anything with a bright white background dim. Also extended periods of time with things like tabs and icons could easily lead to image retention (which can be fixed with the TV options or goes away over time) or, in worst case scenarios, permanent burn-in. I only use my C8 for games rather than browsing etc. Best option is a FALD (full array local dimming) LCD like the new Samsung Q9FN. If you only want to game with a PC on your TV I'd say an OLED is fine.

I do everything with my PC. And lol, I meant LCD not LED. I mainly game, but I also use it like a PC with windows and tabs up for a couple of hours sometimes, watching YouTube videos or side-by-side windows, like a game on one side and a video on the other. I need to get a second monitor for that stuff.
> I remember after the draft spec was first unveiled, some of the professional cable makers posted comments like "we have no idea how they are going to get this to work".
> Some interesting comments on the finalized spec here:
> https://www.bluejeanscable.com/articles/what-about-hdmi-2.1.htm

Very interesting! Thanks for sharing.
> It's not that HDR is locked to Medium; HDR uses a new gamma standard called PQ (SMPTE ST.2084), which uses gamma 2.2 as a base to create the HDR curve.
> Remember that gamma 2.2 is still the native gamma of every panel, and HDR is based on that.
> And yes, High1 is 2.4 and High2 is BT.1886. They are very close to each other; maybe BT.1886 comes up a little faster in shadow detail, but the difference is hard to see.

Ah, okay. So don't worry about HDR being locked to Medium, it all works out due to the nature of HDR? I haven't noticed any raised blacks in HDR like I have in SDR when set to 2.2/Medium.
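Since PQ (SMPTE ST.2084) came up: unlike a gamma curve, which is relative to whatever the display's white level is, PQ maps signal values to absolute luminance, up to 10,000 nits. A sketch of the EOTF using the constants from the spec (Python):

```python
# SMPTE ST.2084 (PQ) EOTF constants, straight from the spec
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(e):
    """Map a PQ-coded signal e in [0,1] to absolute luminance in nits."""
    p = e ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(round(pq_eotf(1.0)))  # full signal -> 10000 nits
print(round(pq_eotf(0.5)))  # mid signal -> 92 nits
```

The curve is extremely skewed toward the dark end: half the signal range only gets you to about 92 nits, which is part of why there is no user gamma control to worry about in HDR.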
For all the dimness FUD on OLEDs, in game mode or not, what are y'all smoking? My B7 speculars actually hurt my eyes in GOW and 4k brs. In a bright room even it's dramatically brighter than my x800d. Is it just you've never had an OLED and want to justify your LED?
> For all the dimness FUD on OLEDs, in game mode or not, what are y'all smoking? My B7 speculars actually hurt my eyes in GOW and 4k brs. In a bright room even it's dramatically brighter than my x800d. Is it just you've never had an OLED and want to justify your LED?

Edit: Misread.
Just HDR
The downside of having gone, over the years, from a 13" to a 20" to a 34" to a 50" to a 55" to a 58" to a 65" is that I know sometime in the next few years I'm gonna be springing for a 75-77" flat panel, and that shit is gonna hurt the ole wallet.
Not really. The reason the sizes are going up on average is that the cost to produce, and ultimately the consumer cost, is coming down.
Later this year, you'll be able to get a 77-inch OLED for 5k or under. That wasn't possible even with a 55-inch version a few years ago.
> For all the dimness FUD on OLEDs, in game mode or not, what are y'all smoking? My B7 speculars actually hurt my eyes in GOW and 4k brs. In a bright room even it's dramatically brighter than my x800d. Is it just you've never had an OLED and want to justify your LED?

Watch the "It" 4k Bluray in a dark room. Those damn kids and their flashlights :D. I can't imagine how the Z9D must be, with its 1600 nits.
What specifically is "PC mode" on the b7 anyway? I haven't found anything labeled that way as I've been setting up my b7 over the last week.
> For example, I use gamma 2.2 even in my dark room; it's what my plasma uses and what my monitors are calibrated to. Also, 89.9% of BDs are mastered at gamma 2.2, same for internet video (streaming) and games.
> But I have to modify the luminance level a little, because 2.2 was a little bit washed out OOTB.
> A well-calibrated 2.2 gamma on an OLED is amazing: tons of shadow detail without the picture looking washed out, too saturated, or too dark.

I'm so confused. I've heard 2.4 is what to use for my room condition, but the TV and content is actually mastered in 2.2?
Just HDR. PC mode works great for SDR gaming and PC use: you get low input lag and the sharpness of 4:4:4 chroma. The only downside is that it destroys the HDR modes.
> I'm so confused. I've heard 2.4 is what to use for my room condition, but the TV and content is actually mastered in 2.2?

I'm wondering too, now. I darken my room when I play on the TV, and I have SDR and HDR set to 2.2/Medium gamma according to multiple calibration guides. Have I been doing it wrong all this time, and is that the reason why God of War looked kinda washed out in the darker color department?
Anybody here buy the 4K version of Blade Runner 2049? I want to know where to redeem the code so I can get the 4K version.