
EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
Sabrina is so dark that it fools B7s into thinking there's just a static image on screen, so it auto-dims even more. I noticed it when I had to get up and paused the show, and suddenly the screen got a hell of a lot brighter. It sucks.

Oh dear, that is bad ^
I really don't know what on earth they've done with that show. I'm all for some cinema-style low-brightness grading, but what in the dark lord's name were they thinking?
 

FrankNitty

Member
Oct 25, 2017
593
SoCal
Random question, since you are a more experienced calibrator (I've been practicing with HCFR and an i1 Display Pro). I've seen some mixed answers regarding contrast. My LG B8 displays whiter-than-white video levels at default contrast in the ISF preset. I've seen some argue that you should preserve them because some media is mastered with whiter-than-white levels in highlights etc., but I've A/B'd between the two, and it doesn't really appear that I'm missing anything in the highlights when raising my contrast to clip just above video level.

I was concerned that maybe it would negatively affect gamma, but upon measuring, that doesn't seem to be the case. It appears to still hit target gammas.

Do you really think clipping "whiter than white" is as big a deal as some people make of it? Even THX's patterns seem to imply that clipping whiter than white is "proper," as they don't want you to see the "upper" contrast bars flashing.
When setting contrast with patterns, I seem to get a less flat, punchier image, and I'm not seeing any visible blowout.

Would love your thoughts.

In some media there is content above 235, but that's what full range is for. In video, very little content will be above 235, but there is no reason not to display it. If it's there and your display is capable of showing anything above 235, you should display it, and if you can do it without clipping, then that is best. In all cases, if something has to be thrown away, it is always at the high end, since so little can exist there. I personally wouldn't clip it out intentionally unless it was the very highest end and I had a need to.

What you want to do, though, is also check the flashing bars for all colors, to see if anything is clipping in the primaries and secondaries. There is a pattern that displays white, black, red, green, blue, cyan, yellow, and magenta bars that show video level and flash. See if you are clipping below video level on that.

With good controls on a display, contrast will not impact gamma.

The way I see it personally: if it's there, I want it. If the display can show it, I want to see it.
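To put some numbers on that: in 8-bit limited-range ("video level") encoding, black sits at code 16 and reference white at 235, so codes 236-254 are the whiter-than-white headroom. A quick Python sketch of what clipping at 235 throws away (the mapping is the standard 8-bit limited-range one; the function names are just mine):

Code:
# 8-bit limited ("video") range: black = 16, reference white = 235,
# "whiter than white" (WTW) = 236-254.
def limited_to_full(y):
    """Map a limited-range luma code to full range (0-255)."""
    return round((y - 16) * 255 / 219)

def clip_wtw(y):
    """Clipping WTW: everything above reference white collapses to 235."""
    return min(y, 235)

for code in (16, 235, 240, 254):
    print(code, limited_to_full(code), limited_to_full(clip_wtw(code)))
# 16 -> 0 and 235 -> 255, but 240 and 254 land *above* 255 unless clipped,
# which is why WTW detail only survives if the display has headroom for it.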
 

Hasney

One Winged Slayer
The Fallen
Oct 25, 2017
18,589
So my current TV is a Sony Android TV and I'm moving to WebOS on LG with the C9. What functionality will I be missing?

My mum has an LG and I've used it the last couple of years I've been home for Christmas, so from that experience and a quick bit of Googling, I'm missing:

  • Easy Google Cast support. Some apps like YouTube can support it, but if I want to cast other sources, I need to use a 3rd party app?
  • Steam streaming since there's no equivalent WebOS app
If I want these, especially the latter, is it worth getting an NVidia Shield? I'm half wanting to wait to see if NVidia updates it with an HDMI 2.1 model, but all seems quiet on that front.
 

TheModestGun

Banned
Dec 5, 2017
3,781
In some media there is content above 235, but that's what full range is for. In video, very little content will be above 235, but there is no reason not to display it. If it's there and your display is capable of showing anything above 235, you should display it, and if you can do it without clipping, then that is best. In all cases, if something has to be thrown away, it is always at the high end, since so little can exist there. I personally wouldn't clip it out intentionally unless it was the very highest end and I had a need to.

What you want to do, though, is also check the flashing bars for all colors, to see if anything is clipping in the primaries and secondaries. There is a pattern that displays white, black, red, green, blue, cyan, yellow, and magenta bars that show video level and flash. See if you are clipping below video level on that.

With good controls on a display, contrast will not impact gamma.

The way I see it personally: if it's there, I want it. If the display can show it, I want to see it.
Interesting! So you are saying that contrast set too high has the potential to introduce color clipping? Do you have a link to this pattern, by any chance? I think I've seen it before, but I'm not sure what specifically to search for. I understand the argument for wanting to see all the upper detail if it's there, but if all of my devices are set for limited video level and my THX patterns suggest you should clip WTW, aren't I just artificially limiting the amount of contrast I could be using before actually clipping? Very little uses whiter than white other than highlights that aren't necessarily even noticeable to my eyes in most movies.

Or at this point would it make more sense to bring the "OLED light" setting up more if I want the image to feel a bit less flat?
 

marecki

Member
Aug 2, 2018
251
I suspect that some of the problems are down to the Xbox. On my Q9FN, when FreeSync is on, ALLM can't switch off, so game mode is always on, even if you boot a Blu-ray.
Disabling VRR stops this behaviour
That is exactly the same on the C9. I didn't mean to say it doesn't work, just that it gets triggered on boot and stays on, which isn't right. However, when switching the Xbox to 1440p, ALLM works correctly (it triggers when a game is started).
 

FrankNitty

Member
Oct 25, 2017
593
SoCal
Interesting! So you are saying that contrast set too high has the potential to introduce color clipping? Do you have a link to this pattern, by any chance? I think I've seen it before, but I'm not sure what specifically to search for. I understand the argument for wanting to see all the upper detail if it's there, but if all of my devices are set for limited video level and my THX patterns suggest you should clip WTW, aren't I just artificially limiting the amount of contrast I could be using before actually clipping? Very little uses whiter than white other than highlights that aren't necessarily even noticeable to my eyes in most movies.

Scroll down about halfway through the page; it's called "7 Color Clipping Bars".
https://displaycalibrations.com/patterns_overview.html#Color-Reproduction-02
It looks similar to this https://www.youtube.com/watch?v=jbxABSglKE0

I think the AVS HD 709 disc has something similar as well, but it's been a while since I loaded it up. Yes, you are limiting the overall contrast of the display if you clip WTW. Neither way is really wrong. I always go for making sure it can display the full dynamic range. If your preference is to calibrate only to legal video level (235) and clip everything above, that's not wrong: as long as you are calibrating to the coordinates for the desired color space (which, if you are looking at THX, would be Rec. 709), the calibration is correct.

I was taught to display everything WTW just before clipping and color shift, and that's how I have been doing it for 10+ years. When you calibrate, you calibrate to coordinates, and there is no coordinate we calibrate to that is for WTW; everything is within legal video level. So yes, you can clip WTW and it is still correct. No reason to clip it, though. If it can display it, display it.
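For anyone curious what "calibrating to coordinates" means concretely, these are the Rec. 709 chromaticity targets; note there is no entry for WTW, which is why it has no target of its own. The measured reading below is made up for illustration:

Code:
# Rec. 709 targets in CIE 1931 xy chromaticity coordinates.
REC709_TARGETS = {
    "red":   (0.640, 0.330),
    "green": (0.300, 0.600),
    "blue":  (0.150, 0.060),
    "white": (0.3127, 0.3290),  # D65
}

measured_white = (0.3140, 0.3300)  # hypothetical probe reading

tx, ty = REC709_TARGETS["white"]
mx, my = measured_white
print(f"dx={mx - tx:+.4f}, dy={my - ty:+.4f}")  # how far off the white point is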
 

Winstano

Editor-in-chief at nextgenbase.com
Verified
Oct 28, 2017
1,828
Just watched Into the Spider-Verse on 4K Blu-ray on my 803... Holy flurking schmidt. This is by far the best purchase I've ever made. Absolutely unreal. The scenes with Kingpin where he filled the screen were as creepy as they get: just a little ball of flesh in a sea of pure black. I want more!
 
Oct 27, 2017
4,641
I just stumbled upon this fairly recent video of Vincent doing a comparison between HDR10+ and HDR10.

It's very interesting. Frankly, the difference is surprisingly underwhelming...

It sure doesn't make me regret owning a TV that only supports HDR10 and Dolby Vision.



I do wonder why LG doesn't support HDR10+ when they already have good support for HDR10, HLG & Dolby Vision. Does anyone know? And is there any chance they could patch it in, or is it a hardware-dependent thing?
 

Skyfireblaze

Member
Oct 25, 2017
11,257
I have a question: my mum has a new LG TV now and is happy with it, and considering the price we paid I think we got a good deal, especially since we wanted something that could be maintained locally warranty-wise. But while shopping around online I also came across the fabled TCL 6-series, or so I think at least. If I understand things correctly, is this my local variant of a 43" TCL 6-series TV? It's called the TCL 43DP640. Are they really as good as everyone says online? If so, I might bite, because I could grab it for a good price.

I'm not particularly fussed about TVs, as my TV usage mostly consists of local anime dubs, soccer whenever the German national team plays, and gaming whenever friends are over. Most of my gaming is done on my 144 Hz 1080p VRR monitor, as I'm primarily a PC gamer, and while I'm not really bothered about 4K, I am interested in HDR and happen to have a PS4 Pro. Plus, as weird as it might sound, since I play lots of emulated retro games too, the prospect of being able to run CRT-Royale properly at 4K is enticing.

So yeah, what is this TCL 43DP640?
 

aevanhoe

Slayer of the Eternal Voidslurper
Member
Aug 28, 2018
7,323
I do wonder why LG doesn't support HDR10+ when they already have good support for HDR10, HLG & Dolby Vision. Does anyone know? And is there any chance they could patch it in, or is it a hardware-dependent thing?

Because HDR10+ is a Samsung thing, I guess. The real question is why don't Samsung and others support Dolby Vision? HDR is a mess and has been for a while; I hope we get a standard accepted by everyone that supports dynamic metadata.
 
Oct 27, 2017
4,641
Because HDR10+ is a Samsung thing, I guess. The real question is why don't Samsung and others support Dolby Vision? HDR is a mess and has been for a while; I hope we get a standard accepted by everyone that supports dynamic metadata.
Samsung probably don't support Dolby Vision because of the licensing fees (which are supposedly quite high). HDR10 & HDR10+ are supposedly license free.

A quick search of Dolby's site doesn't tell you the costs; you have to contact them, provide company details, and, I assume, negotiate something... The HDR10 license program seems much simpler in comparison: the website says there is actually a fee (I think an "admin fee" rather than a license fee), but it seems pretty low upfront.

In the event that an Adopter enters into multiple license agreements with Licensor, the maximum aggregate annual administration fee Adopter shall pay under all such license agreements shall not exceed US$10,000.00.

That seems pretty reasonable for a company of LG's size. I hate format wars...
 

Chamber

Member
Oct 25, 2017
5,279
TCL can pay the DV licensing fee, but the 5th most profitable company in the world can't swing it? There's probably a bigger reason than that.
 

Skyfireblaze

Member
Oct 25, 2017
11,257
I have another question while I'm thinking about the TCL 43DP640 I asked about above. Is it likely that OLED will eventually trickle down to the 300-400€ range in the next two or so years, or will it take longer than that?
 

MrBob

Member
Oct 25, 2017
6,668
I do wonder why LG doesn't support HDR10+ when they already have good support for HDR10, HLG & Dolby Vision. Does anyone know? And is there any chance they could patch it in, or is it a hardware-dependent thing?
I think it's because LG has already implemented its own version of HDR10+ since the 2017 OLED series, with Active HDR. Sony has a version of Active HDR on their TVs too.
 
Oct 26, 2017
5,435
Hi, guys. Could use some input.

My TCL 607 is a goner, and Best Buy has set me up for an exchange, which I can exercise within 90 days. So my question is: what should I go with? I could go for a TCL 615, but I wanted to ask in case new sets are on their way in the next 90 days and prices across current sets change.

What I'm working with here is the price point they are crediting me ($499.99).

edit: I'm just having them deliver the R615
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
I think it's because LG has already implemented its own version of HDR10+ since the 2017 OLED series, with Active HDR. Sony has a version of Active HDR on their TVs too.

People are confusing Samsung's HDR+, which is the fake HDR mode that Sony and LG also have equivalents of, with HDR10+.

HDR10+ is dynamic metadata specified at the point of mastering, and it requires support to read it.
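Roughly the distinction, sketched as data (field names simplified and values made up; HDR10's static metadata is SMPTE ST 2086 plus MaxCLL/MaxFALL, while HDR10+ carries per-scene data):

Code:
# Static HDR10: one set of values for the entire title.
hdr10_static = {
    "max_cll":  4000,  # brightest single pixel anywhere in the title (nits)
    "max_fall": 400,   # highest frame-average light level (nits)
}

# HDR10+-style dynamic metadata: authored per scene at mastering time,
# so the TV can tone-map a dim scene gently instead of assuming a
# 4000-nit highlight could appear at any moment.
dynamic_per_scene = [
    {"scene": 1, "peak_nits": 350},
    {"scene": 2, "peak_nits": 4000},
    {"scene": 3, "peak_nits": 120},
]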
 

Deleted member 49179

User requested account closure
Banned
Oct 30, 2018
4,140
People are confusing Samsung's HDR+, which is the fake HDR mode that Sony and LG also have equivalents of, with HDR10+.

HDR10+ is dynamic metadata specified at the point of mastering, and it requires support to read it.

Oh, so is that what "Active HDR" is all about? The fake SDR-to-HDR conversion the TV has an option to do?
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
Oh, so is that what "Active HDR" is all about? The fake SDR-to-HDR conversion the TV has an option to do?

Active HDR = dynamic tone mapping. They changed the name, I think, when they went to a newer processor. I'm sure I will be corrected.

I'm not sure if this is the HDR Effect mode on LG or if that's a separate thing. It could quite easily be, as again, that can be handled via tone mapping.
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
Active HDR = dynamic tone mapping. They changed the name, I think, when they went to a newer processor. I'm sure I will be corrected.

I'm not sure if this is the HDR Effect mode on LG or if that's a separate thing. It could quite easily be, as again, that can be handled via tone mapping.

HDR effect is SDR to HDR conversion.
 

MrBob

Member
Oct 25, 2017
6,668
Oh, so is that what "Active HDR" is all about? The fake SDR-to-HDR conversion the TV has an option to do?
No, that is a separate mode. You can add fake "HDR" to SDR content; there are 3 levels of intensity, and I'm slightly surprised how well the lowest level does for movies. It works better than I thought it would.

Active HDR uses an algorithm to add dynamic metadata to HDR10 content. I've been hoping to see Vincent do a comparison with HDR10+ to see how LG's solution stacks up.
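Roughly the idea, for anyone wondering what "dynamic" buys you. This is only a toy curve I made up, not LG's actual algorithm:

Code:
PANEL_PEAK = 700.0  # nits; hypothetical OLED peak

def tone_map(nits, scene_peak):
    """Linear below a knee, soft roll-off above it."""
    knee = 0.75 * PANEL_PEAK
    if nits <= knee or scene_peak <= PANEL_PEAK:
        return min(nits, PANEL_PEAK)
    # Compress [knee, scene_peak] into the remaining panel headroom.
    t = (nits - knee) / (scene_peak - knee)
    return knee + (PANEL_PEAK - knee) * (t / (1 + t))

# The same 600-nit highlight, tone-mapped with a per-scene peak vs.
# the title's static 4000-nit peak:
print(tone_map(600, scene_peak=650))   # 600.0 -- dim scene left intact
print(tone_map(600, scene_peak=4000))  # ~528.7 -- squashed by the static peak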
 
Nov 20, 2017
335
Looks like Samsung released the RU series to replace the NU series this year? Is the RU series still a decent line to go with?

For the most part, the TVs are identical, with the main upgrade being Bluetooth on more models.

The one exception would be the RU8000, which got worse local dimming than the NU series; they wanted to make it cheaper and to better differentiate it from the Q60.
 

Sanctuary

Member
Oct 27, 2017
14,198
Active HDR = dynamic tone mapping. They changed the name, I think, when they went to a newer processor. I'm sure I will be corrected.

I'm not sure if this is the HDR Effect mode on LG or if that's a separate thing. It could quite easily be, as again, that can be handled via tone mapping.

LG's response to HDR10+ was that their dynamic tone mapping was just as good and they didn't feel obliged to support it. Which is a silly thing to say when they already support Dolby Vision, which is superior to both. "HDR Effect" might be using something similar behind the scenes, but it's a separate picture mode. Dynamic tone mapping is an option you can turn on or off within a legitimate HDR picture mode. LG must believe it's hot shit, though, since it's always on by default out of the box.

I honestly can't decide yet whether I actually like it. For content mastered to 1000 nits it's not really needed, and it lowers the gamma in some areas (it doesn't look bad, but it's not accurate); for content mastered up to 4000 nits or even higher, it helps somewhat.
 

rou021

Member
Oct 27, 2017
526
That's not really an Rtings-specific issue. I was watching HDTVTest's preview video of the C9, and they said the same thing almost word for word.

I think AV enthusiasts often tend to make this type of mistake, where they can't see the forest for the trees. Take calibration: unless your TV is noticeably inaccurate, I don't see the advantage. Aside from skin tones, people aren't that good at noticing differences in color unless they're doing a side-by-side test. You spend hundreds of dollars to improve color accuracy (which you probably won't notice), and the side effect in many cases is increased posterization (which is much easier to notice). These days, when TVs come with highly accurate ISF or Technicolor modes out of the box, it just seems like a waste.

There are a bunch of other examples I could rant about, but I'll stop there. At the end of the day, I think it comes down to people feeling insecure about dedicating so much of their time and money to a hobby, so they get pretentious about it.
I certainly understand your skepticism towards better picture accuracy. For a lot of people, it probably isn't worth the trouble when factoring in the time and money required to get there. That doesn't mean people can't notice the difference between an accurate and an inaccurate picture, though. To illustrate the point: the targets used in consumer display calibration typically aren't about how close the TV gets to "perfect," or about reaching some arbitrary ideal standard that nobody can see. In actuality, the goal is to get visual errors just below the threshold of being noticeable to the average viewer. These thresholds are determined by years of research, and the data say regular people can notice these differences (see the quick sketch at the end of this post).

That said, there are multiple reasons why not everybody is complaining about how off the picture on their display looks. For one, most people aren't paying close attention to the various aspects of the picture, and even if they did, they probably still wouldn't care that it's not correct. Another is that most people have been watching an inaccurate picture their whole lives, so they don't know what's right and what's wrong. Indeed, many people are shocked at how their TV looks after it's been calibrated and think it looks worse. Just look through older posts in this thread to see what happens when people simply change to a warmer, though more accurate, color temperature setting. The opinions vary, but they don't usually amount to "I can't really tell much of a difference."

The environment and viewing conditions in which a display is viewed also affect the perception of contrast, gamma, color, and resolution/sharpness. Even a Grade 1 studio monitor can look wrong in the wrong kind of environment. And that's not to mention the content being viewed: if someone's watching something that's low-resolution, bit-starved, and poorly mastered, the inaccuracies may not stand out as much. In one context the difference between a calibrated and an uncalibrated picture can be dramatic; in others it can be minimal. This is why these factors are considered before choosing what targets to hit for a calibration (if the calibrator is a good one, anyway).

I should also add that the ISF and Technicolor modes, while more accurate than most presets, aren't "highly accurate." I'm not even sure I'd describe them as accurate to begin with, but that's another story. Will they be better than a lot of the other options and sufficient for many people, though? Probably, but they still have a lot of inaccuracies that will be, to a degree, noticeable. And you should NOT be getting increased posterization from a calibration; in fact, you should get LESS or even no posterization afterwards. If it's worse, I'd suggest getting a refund.

I mean, I get it. Sometimes enthusiasts can get a little too... well, enthusiastic about this stuff. Whether a proper calibration is worth the time and money will depend on a number of factors, and truth be told, for some it probably isn't. Nonetheless, it's going a bit far to say that people who are into picture accuracy and calibration are insecure and pretentious.
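If anyone wants to see what "just below the threshold of being noticeable" looks like in practice, here's a minimal sketch using the old CIE76 delta E (calibration software typically uses the fancier CIEDE2000, and the readings below are made up):

Code:
import math

def delta_e76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in CIELAB."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

target   = (50.0, 0.0, 0.0)   # neutral 50% grey (L*, a*, b*)
measured = (50.5, 1.2, -2.0)  # hypothetical probe reading

print(f"dE76 = {delta_e76(target, measured):.2f}")
# ~2.39 -- under the common ~3.0 rule of thumb for what average viewers notice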
 

CrypticSlayer

Member
Oct 27, 2017
647
Vizio isn't competing with the TCL with the Quantum; they already have lower-budget sets, the P and M series, that compete in that space.
Isn't this year's P Quantum a renaming of the P-series line? It's just 1000 nits.
I'm wondering this too. 65" is too big for my apartment, so I was hoping they'd put out a 55" model for me to consider.
Yup, this is me as well. Removing the 55-inch option would be backwards to me, especially since the M series has been 600 nits and last year's model failed to hit even that.
 

henhowc

Member
Oct 26, 2017
33,454
Los Angeles, CA
Isn't this year's P Quantum a renaming of the P-series line? It's just 1000 nits.

Yup, this is me as well. Removing the 55-inch option would be backwards to me, especially since the M series has been 600 nits and last year's model failed to hit even that.

Ah, I didn't even realize they were renaming. So confusing. The 2018 PQ is now the PQX for 2019, the 2018 P is now the PQ, and the 2018 M is now the MQ. lol, what the hell. My bad.
 

ss_lemonade

Member
Oct 27, 2017
6,648
So yeah, what is this TCL 43DP640?
My guess is this is different from the TCL P series in the US, since the US P605/607 support HDR10 and Dolby Vision while this DP640 only seems to support HDR10. I actually have the same issue in the Philippines identifying TVs: our TCL P series is also different and seems to be an even more budget model than the P607, while also being identified as a 2018-series TV (the P series in the US is the 2017 lineup; I think the R series is the 2018 one).
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
Anyone else use BFI on a B8 or C8 for gaming? I really like it; it clears up the blur really well, and the flickering blends in when you're playing. I used to love it on my Sony W905.
 

Hasney

One Winged Slayer
The Fallen
Oct 25, 2017
18,589
Anyone else use BFI on a B8 or C8 for gaming? I really like it; it clears up the blur really well, and the flickering blends in when you're playing. I used to love it on my Sony W905.

With my current TV, I had it on during UFC 3 and was thinking it looked really smooth, but fuck me, did I purchase the wrong TV, because the input lag was horrendous. It's horrific for response times.
 

Skyfireblaze

Member
Oct 25, 2017
11,257
My guess is this is different from the TCL P series in the US, since the US P605/607 support HDR10 and Dolby Vision while this DP640 only seems to support HDR10. I actually have the same issue in the Philippines identifying TVs: our TCL P series is also different and seems to be an even more budget model than the P607, while also being identified as a 2018-series TV (the P series in the US is the 2017 lineup; I think the R series is the 2018 one).

Yeah, that's really infuriating; with local variants being so common, it's impossible to say for sure what you're buying. So from what you're saying, this 350€ TCL I saw isn't nearly as good as I thought it would be?
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
With my current TV, I had it on during UFC 3 and was thinking it looked really smooth, but fuck me, did I purchase the wrong TV, because the input lag was horrendous. It's horrific for response times.

That's the good thing about the 8-series OLEDs: there is no increase in lag. It's way better than I thought it was going to be. All the horror stories are from people using it with the wrong content and/or refresh rates; it has to be 60 Hz only, or used in conjunction with frame interpolation for video content, for it to be effective. I feel like I just got the best feature back from my Panasonic DX and Sony, with all the benefits of OLED.
 

ss_lemonade

Member
Oct 27, 2017
6,648
Yeah, that's really infuriating; with local variants being so common, it's impossible to say for sure what you're buying. So from what you're saying, this 350€ TCL I saw isn't nearly as good as I thought it would be?
I can't really say. That price, though, is very similar to the P6 that we have in the Philippines, and the P6 we have (after looking at impressions online from sites like https://forums.whirlpool.net.au/archive/2701106, since it seems we have the same TV models as Australia) doesn't seem to get bright enough for HDR and/or is considered one of those "fake" budget HDR TVs. Again, I'm not sure if the specific model you are looking at has the same issues, but it seems to be in the same price range.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
So, ALLM actually works better on my TV when it's enabled on the TV but not on the Xbox.
Very odd
 

ss_lemonade

Member
Oct 27, 2017
6,648
Anyone else use BFI on a B8 or C8 for gaming? I really like it; it clears up the blur really well, and the flickering blends in when you're playing. I used to love it on my Sony W905.
Impulse is awesome on my W900A, but the drop in brightness sucks when not playing in a dark environment. Is brightness an issue with these OLEDs and BFI too?
 

Skyfireblaze

Member
Oct 25, 2017
11,257
I can't really say. That price, though, is very similar to the P6 that we have in the Philippines, and the P6 we have (after looking at impressions online from sites like https://forums.whirlpool.net.au/archive/2701106, since it seems we have the same TV models as Australia) doesn't seem to get bright enough for HDR and/or is considered one of those "fake" budget HDR TVs. Again, I'm not sure if the specific model you are looking at has the same issues, but it seems to be in the same price range.

I see, thanks for the info! I'll refrain from buying it, then. I don't really need a new TV with how little I use mine anyway, but I thought if I could get good HDR for that price I'd take it. If that TV can't do HDR well, then it's a moot point for me.
 

tokkun

Member
Oct 27, 2017
5,399
Nonetheless, it's going a bit far to say that people who are into picture accuracy and calibration are insecure and pretentious.

That's not what I intended to convey.

Let me give you an illustrative example. The other day on AVSForum, some dude was posting pictures of his OLED TV setup with some Philips Hue lightstrips doing an Ambilight-type thing with Hue Sync. If you're unfamiliar with the concept, it analyzes the content of the image and projects colored light onto your wall to give you a sense of a more expansive visual field. It's a bit gimmicky, but it's a fun gimmick from time to time. Anyway, you can probably guess the direction of the comments. A bunch of people piled on him, saying "You don't need accent lighting with an OLED, you'll compromise your black levels" and "Any bias lighting that is not 6500K will throw off your color accuracy." Some people allowed that it was OK if you were playing a video game, but not when doing "critical viewing."

The pretentious part here is not that these people enjoy the image they get from a calibrated set; it's that they believe this is the only correct way for anyone to use a TV, and that people doing anything else are getting an objectively inferior experience.

Also, if you describe what you are doing as "critical viewing" and it is not related to your job, that's pretty pretentious on its own.
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
Impulse is awesome on my W900A, but the drop in brightness sucks when not playing in a dark environment. Is brightness an issue with these OLEDs and BFI too?

I loved it on my W900 (W905A). You can make the brightness back up by going to 100 (max), which is about the equivalent of 50 without it.

Like I was saying, though, people use it in the wrong circumstances and then say it's bad, but at 60 Hz it's damn good.
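For anyone wondering why 100 with BFI lands around 50 without: the dimming is just duty cycle. A toy calculation (numbers illustrative):

Code:
def effective_brightness(peak_nits, duty_cycle):
    """Average luminance when the panel is lit only duty_cycle of the time."""
    return peak_nits * duty_cycle

print(effective_brightness(400, 1.0))  # 400 nits, no BFI
print(effective_brightness(400, 0.5))  # 200 nits: a 50% black frame halves it
# Doubling the light output setting roughly cancels the 50% duty cycle.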