
Deleted member 14649

User requested account closure
Banned
Oct 27, 2017
3,524
Yeah, it's the same across all their TVs from 2017 to 2019 when playing DV on HDMI. Same DV profile too (profile 5).

On my x930e, DV at gamma default 0 is a bit dark also. And I resisted modifying the settings for the longest time because I use the default gamma in every other picture mode.

But since I bumped the DV gamma a bit (2 points in my case) it's now exactly the same "brightness" level as regular HDR. I tested it on an Apple TV while switching back and forth between HDR and DV on both an Apple movie, and a Netflix DV show.

I just use Dolby Vision Bright with my calibrated colour temperature. Might not be accurate but it is certainly bright enough then.
 

Haint

Banned
Oct 14, 2018
1,361
Hopefully Samsung ends their battle with Dolby by the time these are out.

I don't think there's any kind of feud. Dolby's licensing fee is simply more than Samsung is willing to pay. Based on the Windows/Xbox Atmos Headphone license @ $10, I would guess Vision is probably more expensive considering it has broader support and they've likely got a lot more invested (e.g. the Pulsar mastering displays). Dolby probably sets a non-negotiable flat rate per unit sold that applies to all manufacturers equally, but Samsung wants a large discount based on their volume.
 

Rated-G

Member
Oct 29, 2017
1,328
If you're in any game or app, the system will remain on and active.

It still shuts it off. The second the TV is off, it's like power is completely cut off from the PlayStation. I tried it with GTA V paused, with GTA and the PS4 in rest mode, while a YouTube video was playing, and while a Blu-ray was playing.
 

Emick81

Member
Jan 17, 2018
973
Yeah, it's the same across all their TVs from 2017 to 2019 when playing DV on HDMI. Same DV profile too (profile 5).

On my x930e, DV at gamma default 0 is a bit dark also. And I resisted modifying the settings for the longest time because I use the default gamma in every other picture mode.

But since I bumped the DV gamma a bit (2 points in my case) it's now exactly the same "brightness" level as regular HDR. I tested it on an Apple TV while switching back and forth between HDR and DV on both an Apple movie, and a Netflix DV show.
Is gamma an option you can change while in the forced DV mode when using Netflix through the TV's own app?

DV on my 930 is way too dark, so for those shows I always switch to my Shield, which doesn't support DV.
 

MazeHaze

Member
Nov 1, 2017
8,582
After all that Qled marketing they want to sell OLEDs? Why not just jump to micro led TVs? How do they want to compete with LG? Top emission OLED?
I'm pretty sure microLED still requires a major technological breakthrough in the manufacturing process before it can be viable. These displays have millions of pixels, and the current microLED manufacturing process involves placing each pixel individually. So the cost and time it takes to produce just one panel FAR exceed what anybody would pay for it, and mass production is not feasible. It will get there eventually, but it's hard to say when until that breakthrough happens.
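
To give a rough sense of the scale involved, here is my own back-of-the-envelope count, assuming one discrete LED per subpixel (not a manufacturer figure):

# Rough LED count for a single 4K microLED panel
width, height = 3840, 2160
subpixels_per_pixel = 3  # red, green, blue
leds_to_place = width * height * subpixels_per_pixel
print(f"{leds_to_place:,} individual LEDs")  # ~24.9 million placements per panel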
 

Hobbun

Member
Oct 27, 2017
4,396
Hopefully it's sooner than later.

What kind of drawbacks, besides cost (at first), would be anticipated for MicroLED? While I don't anticipate we will ever get a 'perfect' TV (although it would be nice), I would like to finally get a television where we don't have to worry about burn-in, banding, artifacting, yellowing of the screen, blooming, ghosting or anything else visual.

Seems like there is always at least one thing for every kind of technology. I would prefer a TV that is 'not perfect' only because it may not last quite as long as hoped (60,000-70,000 hours instead of 100,000), or because I have to pay a little extra money since it 'is' better quality (no visual issues). I would be OK with that.
 

Deleted member 49179

User requested account closure
Banned
Oct 30, 2018
4,140
I just use Dolby Vision Bright with my calibrated colour temperature. Might not be accurate but it is certainly bright enough then.

Oh, that's right! This is something they added with their latest TVs. Unfortunately, my x930e doesn't have different picture modes for Dolby Vision (Bright, Dark, etc.) There is only "Dolby Vision". So I'm not sure how those modes are interacting with the picture.

I also have a feeling that properly doing the 10 point grayscale calibration would greatly help with the dim DV picture over HDMI. But I don't have the tools...

Is gamma an option you can change while in the forced DV mode when using Netflix through the TV's own app?

DV on my 930 is way too dark, so for those shows I always switch to my Shield, which doesn't support DV.

I never use the native apps of the TV, but I'm pretty sure you have access to the settings of the picture modes while using them. Including the Dolby Vision mode.
 

DangerMouse

Member
Oct 25, 2017
22,402
My A8F arrived today and I love it so far! PQ is beautiful and no headaches so far! Though I have not used HDR much at all. Just playing games in reg SDR and they look amazing. Wish the 1080p input lag was better but I guess I'll have to upgrade to PS4 pro. Watched some Endgame in Dolby Vision and the sound thru the acoustic surface sound system sounded sooo good for built-in TV sound. Very happy right now!
Congrats!
 

MrBob

Member
Oct 25, 2017
6,670
I don't think there's any kind of feud. Dolby's licensing fee is simply more than Samsung is willing to pay. Based on the Windows/Xbox Atmos Headphone license @ $10, I would guess Vision is probably more expensive considering it has broader support and they've likely got a lot more invested (e.g. the Pulsar mastering displays). Dolby probably sets a non-negotiable flat rate per unit sold that applies to all manufacturers equally, but Samsung wants a large discount based on their volume.
Yeah, maybe Samsung is just salty that Dolby won't give them a discount. The fee can't be too bad, though, because we are seeing Dolby Vision show up in sub-$400 TVs. Maybe it is just Samsung pride with HDR10+ right now, because it makes them look bad to sell $3,500 TVs without Dolby Vision when TVs under $500 support DV.
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
Sometimes it's easy to lose sight of just how good these top-end TVs are. I'm staying at a cottage which has a 37" JVC 1080p LCD, and I connected my Switch up. It's not a lie to say that my 65" C8 does a lot better at displaying 1080p than a much smaller native 1080p set. Pretty amazing, tbh.
 

DangerMouse

Member
Oct 25, 2017
22,402
Alita is on sale on itunes for 12.99. Is it show off tv worthy?
I think it is. Some reviews mention softness to the image but I think it still looks stunning on the screen with all that environment detail, very visible action sequences, and great CG/animation on Alita and the others. The visual style was awesome to me as an anime/scifi fan.
In addition to being a great, fun movie, despite some flaws, they really adapted it well to live action.
 

MrBob

Member
Oct 25, 2017
6,670
Sometimes it's easy to lose sight of just how good these top-end TVs are. I'm staying at a cottage which has a 37" JVC 1080p LCD, and I connected my Switch up. It's not a lie to say that my 65" C8 does a lot better at displaying 1080p than a much smaller native 1080p set. Pretty amazing, tbh.

On this topic of how good TVs have become, TV manufacturers are way ahead of monitor manufacturers in terms of features and are closing the gap in input lag. Dell just announced a refresh of their three-year-old 34-inch ultrawide, and it's a pile of trash for its price point.


DP 1.2 and HDMI 1.4 (Really???)
A 120Hz screen that is limited by the old G-Sync module, even though it has the same LG panel they use on their 144Hz 34-inch FreeSync monitor.
A $1,499 price point that literally offers almost nothing new over the old model, which you can buy right now for around $800. Heck, even the LG I listed before goes on sale for $800-900, and Dell is trying to sell a worse model for almost twice the price.

Dell is even launching a 55-inch OLED monitor for $4,000 that has DP 1.2 (LOL at that, not even DP 1.4) and no HDR. It's unbelievable. The TV market is so much better right now. You can almost buy two 65-inch C9s for that 55-inch Dell. It's really the dark ages for monitors.
 

EeK9X

Member
Jan 31, 2019
1,068
I enable dual monitor in the BIOS, and send one HDMI cable from the motherboard to the receiver for audio, and another from the GPU to the TV. It's a little bit of a wonky solution, but it works for me. In my case my receiver didn't support HDR, but I needed to send audio through the receiver to get surround sound (a PC hooked directly into a TV won't be supplied with the EDID for surround sound; even with ARC, there is no way to send anything more than a stereo signal)

Can you get lossless HD audio using the mobo's HDMI? If so, does it work well with Atmos? What drivers are you using? I'm always afraid of trying new audio solutions because it took Microsoft and Nvidia months to fix their Atmos issues (introduced with 1809 and only fixed on 1903).

The problem with connecting a GPU directly to the TV, as I understand it, is that the graphics drivers will only pick up the TV's internal speakers (two) from the EDID, regardless of the TV's connection to the AVR, restricting the output to two channels. That's why only "Stereo" will be available under "Speaker Setup", on Windows. An Nvidia representative said that the software team was in "early discussions" about incorporating a fix as part of a "possible HDTV connectivity makeover". That was eight years ago.

However, even with a direct connection, all audio formats (even the higher resolution ones) still show as supported, so you can bitstream/pass-through digital audio using an optical cable, and your receiver will play the appropriate format in 5.1 surround sound. The problem, in that scenario, is that only HDMI can pass lossless HD audio (including Atmos). Also, most games send uncompressed audio (multichannel PCM), which would then be played in stereo, since a digital optical connection has enough "room" (bandwidth) to transfer just two channels of PCM audio.
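
As a rough back-of-the-envelope on why optical runs out of room (my own numbers, assuming 48 kHz audio; exact S/PDIF limits vary by device):

# Uncompressed PCM bitrates vs. a typical Dolby Digital 5.1 bitstream
sample_rate_hz = 48_000
stereo_pcm_bps = 2 * sample_rate_hz * 16    # ~1.5 Mbps: fits over TosLink
eight_ch_pcm_bps = 8 * sample_rate_hz * 24  # ~9.2 Mbps: far beyond what optical carries
dolby_digital_bps = 640_000                 # AC-3 maxes out at 640 kbps, so it fits easily
print(stereo_pcm_bps, eight_ch_pcm_bps, dolby_digital_bps)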

Weirdly, with my current setup (GPU connected to the TV via HDMI; TV sending audio to the AVR via a TosLink optical digital audio cable), I can select "Dolby Atmos for Home Theater" on Speaker Setup and even test all of my speakers - with a caveat: most of the time, audio is all over the place.

I have a 6.1 setup, so, in an Atmos 7.1 configuration, the left and right back channels should be spread evenly between the corresponding left or right back speaker and also the center back speaker; meanwhile, the left and right surround channels would go solely to the left and right back speakers, respectively. The way it is now, both the side left and rear back channels are being output by the rear left speaker; the rear right channel is sometimes output by the appropriate rear right speaker, but, other times, by the rear left and the rear center speakers combined; and the side right channel is usually output by the rear right speaker, but can also be heard on occasion from the front right speaker. Oh, and no matter how many times I select all of the speakers as "full-range" during setup, the surround speakers default to their unchecked state with every retry.

All sound files that I tested (downloaded from here) are decoded as either "Dolby D + DDS" or "DTS + Neural:X", after the AVR's front panel briefly flashes "Dolby Surround". That's likely due to the implementation of Atmos on PC, and Windows is just remixing the audio, I assume. I do hear surround sound, and, unlike when testing speakers, it sounds right.

I was thinking of overriding the TV's or AVR's EDID as a possible workaround to the audio/video issues. I've asked the creator of CRU (Custom Resolution Utility) for some help and am now waiting for his response.

I have an X4200W and it passes 1440p/120 HDR10 just fine, so it would be a surprise if it doesn't work on a newer model in the same price class. Maybe you need to disable some video processing settings in the AVR to make sure it is just doing HDMI passthrough?

In practice, though, I usually choose to run at 1080p/120 instead so I get sharper text and other fine details.

I believe I have everything configured correctly. Is your GPU also Nvidia? If it's connected directly to your Denon AVR via HDMI, would you mind verifying that you can actually change the refresh rate to anything above 60Hz when 2560 x 1440 is selected in the Nvidia Control Panel?

Not that I'm doubting you, but I've yet to see a single person who got anything other than 1080p at 120Hz working through any receiver - not just Denon. If you can, would you please share your settings with me, so I can see what I did wrong?

If I connect the GPU through the AVR, all available resolutions are listed under "Ultra HD, HD, SD" in the Nvidia Control Panel, with "1080p, 1920 x 1080" and below being the only ones with a maximum possible refresh rate of 120Hz. "4k x 2k, 2560 x 1440" (the only one even remotely resembling 1440p) and "4k x 2k, 1920 x 1080" are 60Hz only. Trying to add a custom resolution of 2560 x 1440 causes the display to freak out and the drivers to crash.

When connecting the GPU directly to the TV, a new set of resolutions appears under the "PC" category, and all of those have 120Hz as the only possible refresh rate. "4k x 2k, 2560 x 1440" can again be found under the first set, with all sorts of refresh rates - up to 60Hz. In that same category, "1080p, 1920 x 1080" has available refresh rates of 119Hz and 100Hz, whereas the "PC" resolution of "1920 x 1080" has 120Hz.

From what I've read, receivers are built with standard TV and PC resolutions in mind, which is why none of them support 1440p at 120Hz.
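
For what it's worth, here is my own rough arithmetic on the bandwidth side (assumed figures, roughly reduced-blanking timings, not anything I have measured), which would also explain why an older HDMI 1.4-era video path in a receiver chokes on that mode:

# Approximate pixel clock for 2560x1440 @ 120 Hz with reduced blanking
active_pixels = 2560 * 1440
blanking_overhead = 1.12                              # ~12% is a rough reduced-blanking estimate
pixel_clock_mhz = active_pixels * blanking_overhead * 120 / 1e6
print(round(pixel_clock_mhz), "MHz")                  # ~495 MHz
# HDMI 1.4 tops out around a 340 MHz pixel clock; HDMI 2.0 around 600 MHz.
# So a receiver with an HDMI 1.4-class video path can't pass this mode even
# if it were listed in the EDID.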
 
Last edited:

MazeHaze

Member
Nov 1, 2017
8,582
Can you get lossless HD audio using the mobo's HDMI? If so, does it work well with Atmos? What drivers are you using? I'm always afraid of trying new audio solutions because it took Microsoft and Nvidia months to fix their Atmos issues (introduced with 1809 and only fixed on 1903).

The problem with connecting a GPU directly to the TV, as I understand it, is that the graphics drivers will only pick up the TV's internal speakers (two) from the EDID, regardless of the TV's connection to the AVR, restricting the output to two channels. That's why only "Stereo" will be available under "Speaker Setup", on Windows. An Nvidia representative said that the software team was in "early discussions" about incorporating a fix as part of a "possible HDTV connectivity makeover". That was eight years ago.

However, even with a direct connection, all audio formats (even the higher resolution ones) still show as supported, so you can bitstream/pass-through digital audio using an optical cable, and your receiver will play the appropriate format in 5.1 surround sound. The problem, in that scenario, is that only HDMI can pass lossless HD audio (including Atmos). Also, most games send uncompressed audio (multichannel PCM), which would then be played in stereo, since a digital optical connection has enough "room" (bandwidth) to transfer just two channels of PCM audio.

Weirdly, with my current setup (GPU connected to the TV via HDMI; TV sending audio to the AVR via a TosLink optical digital audio cable), I can select "Dolby Atmos for Home Theater" on Speaker Setup and even test all of my speakers - with a caveat: most of the time, audio is all over the place.

I have a 6.1 setup, so, in an Atmos 7.1 configuration, the left and right back channels should be spread evenly between the corresponding left or right back speaker and also the center back speaker; meanwhile, the left and right surround channels would go solely to the left and right back speakers, respectively. The way it is now, both the side left and rear back channels are being output by the rear left speaker; the rear right channel is sometimes output by the appropriate rear right speaker, but, other times, by the rear left and the rear center speakers combined; and the side right channel is usually output by the rear right speaker, but can also be heard on occasion from the front right speaker. Oh, and no matter how many times I select all of the speakers as "full-range" during setup, the surround speakers default to their unchecked state with every retry.

All sound files that I tested (downloaded from here) are decoded as either "Dolby D + DDS" or "DTS + Neural:X", after the AVR's front panel briefly flashes "Dolby Surround". That's likely due to the implementation of Atmos on PC, and Windows is just remixing the audio, I assume. I do hear surround sound, and, unlike when testing speakers, it sounds right.

I was thinking of overriding the TV's or AVR's EDID as a possible workaround to the audio/video issues. I've asked the creator of CRU (Custom Resolution Utility) for some help and am now waiting for his response.



I believe I have everything configured correctly. Is your GPU also Nvidia? If it's connected directly to your Denon AVR via HDMI, would you mind verifying that you can actually change the refresh rate to anything above 60Hz when 2560 x 1440 is selected in the Nvidia Control Panel?

Not that I'm doubting you, but I've yet to see a single person who got anything other than 1080p at 120Hz working through any receiver - not just Denon. If you can, would you please share your settings with me, so I can see what I did wrong?

If I connect the GPU through the AVR, all available resolutions are listed under "Ultra HD, HD, SD" in the Nvidia Control Panel, with "1080p, 1920 x 1080" and below being the only ones with a maximum possible refresh rate of 120Hz. "4k x 2k, 2560 x 1440" (the only one even remotely resembling 1440p) and "4k x 2k, 1920 x 1080" are 60Hz only. Trying to add a custom resolution of 2560 x 1440 causes the display to freak out and the drivers to crash.

When connecting the GPU directly to the TV, a new set of resolutions appears under the "PC" category, and all of those have 120Hz as the only possible refresh rate. "4k x 2k, 2560 x 1440" can again be found under the first set, with all sorts of refresh rates - up to 60Hz. In that same category, "1080p, 1920 x 1080" has available refresh rates of 119Hz and 100Hz, whereas the "PC" resolution of "1920 x 1080" has 120Hz.

From what I've read, receivers are built with standard TV and PC resolutions in mind, which is why none of them support 1440p at 120Hz.
I don't have an Atmos setup, but my mobo HDMI sends PCM/lossless 5.1 just fine.
 

tokkun

Member
Oct 27, 2017
5,407
I believe I have everything configured correctly. Is your GPU also Nvidia? If it's connected directly to your Denon AVR via HDMI, would you mind verifying that you can actually change the refresh rate to anything above 60Hz when 2560 x 1440 is selected in the Nvidia Control Panel?

Not that I'm doubting you, but I've yet to see a single person who got anything other than 1080p at 120Hz working through any receiver - not just Denon. If you can, would you please share your settings with me, so I can see what I did wrong?

If I connect the GPU through the AVR, all available resolutions are listed under "Ultra HD, HD, SD" in the Nvidia Control Panel, with "1080p, 1920 x 1080" and below being the only ones with a maximum possible refresh rate of 120Hz. "4k x 2k, 2560 x 1440" (the only one even remotely resembling 1440p) and "4k x 2k, 1920 x 1080" are 60Hz only. Trying to add a custom resolution of 2560 x 1440 causes the display to freak out and the drivers to crash.

When connecting the GPU directly to the TV, a new set of resolutions appears under the "PC" category, and all of those have 120Hz as the only possible refresh rate. "4k x 2k, 2560 x 1440" can again be found under the first set, with all sorts of refresh rates - up to 60Hz. In that same category, "1080p, 1920 x 1080" has available refresh rates of 119Hz and 100Hz, whereas the "PC" resolution of "1920 x 1080" has 120Hz.

From what I've read, receivers are built with standard TV and PC resolutions in mind, which is why none of them support 1440p at 120Hz.

You need to enable 'unexposed' resolutions. Here's a screenshot of my Nvidia Control Panel:



To get there, do the following:
1. Select the AVR as the display you want to change
2. Click the customize button.
3. Check "Enable resolutions not exposed by the display"
4. Check 2560 x 1440 at 120Hz (32-bit)
5. Click OK to exit the submenu.

It should now appear as an option under the PC category. It's possible you may need to go through the Create Custom Resolution dialog in the previous menu, but I don't think it is necessary. You can ignore the 64-bit custom resolution in the screenshot; that was just me experimenting with the custom resolution feature.

Anyway, it works fine for me. The TV is clearly displaying 1440p/120. I did originally have problems with signal integrity when I was trying to run it with HDR enabled over a 45-foot passive HDMI cable, and I had to upgrade to a fiber optic cable to fix that. Also, while the 980 Ti will happily drive 3 displays at 1440p/120, it chokes when I try to switch my TV to 4K/60 unless I disable one of my monitors.
 

EeK9X

Member
Jan 31, 2019
1,068
I don't have an Atmos setup, but my mobo HDMI sends PCM/lossless 5.1 just fine.

Good to know. Thanks for replying!

You need to enable 'unexposed' resolutions. Here's a screenshot of my Nvidia Control Panel:

To get there, do the following:
1. Select the AVR as the display you want to change
2. Click the customize button.
3. Check "Enable resolutions not exposed by the display"
4. Check 2560 x 1440 at 120Hz (32-bit)
5. Click OK to exit the submenu.

It should now appear as an option under the PC category. It's possible you may need to go through the Create Custom Resolution dialog in the previous menu, but I don't think it is necessary. You can ignore the 64-bit custom resolution in the screenshot; that was just me experimenting with the custom resolution feature.

Anyway, it works fine for me. The TV is clearly displaying 1440p/120. I did originally have problems with signal integrity when I was trying to run it with HDR enabled over a 45-foot passive HDMI cable, and I had to upgrade to a fiber optic cable to fix that. Also, while the 980 Ti will happily drive 3 displays at 1440p/120, it chokes when I try to switch my TV to 4K/60 unless I disable one of my monitors.

See, I don't have any resolutions other than a single 1024x768 (or something equally crazy old - I'm not near my desktop atm) in that screen. And as I mentioned, if I try to add a custom 2560x1440@120Hz one myself, the picture isn't even displayed and the drivers crash, forcing me to restart the machine.

The max length of any HDMI cables that I'm using is 10ft, and they're all "Premium Certified", by Monoprice (the preferred brand over at the AVSForums). No issues with handshaking, even in 4K@60Hz (either in 4:4:4 8-bits or 4:2:2 HDR).

What displays are you using? Only one of them is connected through the AVR? What's its native resolution?
 

Vimto

Member
Oct 29, 2017
3,714
Hello people, I have a small (stupid?) question..

I just purchased an LG C9 OLED and I noticed the 120Hz mode has lower input lag than the 60Hz mode.

My question is: can I take advantage of that even if my game is running at 60fps?
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
On this topic of how good TVs have become, TV manufacturers are way ahead of monitor manufacturers in terms of features and are closing the gap in input lag. Dell just announced a refresh of their three-year-old 34-inch ultrawide, and it's a pile of trash for its price point.


DP 1.2 and HDMI 1.4 (Really???)
A 120Hz screen that is limited by the old G-Sync module, even though it has the same LG panel they use on their 144Hz 34-inch FreeSync monitor.
A $1,499 price point that literally offers almost nothing new over the old model, which you can buy right now for around $800. Heck, even the LG I listed before goes on sale for $800-900, and Dell is trying to sell a worse model for almost twice the price.

Dell is even launching a 55-inch OLED monitor for $4,000 that has DP 1.2 (LOL at that, not even DP 1.4) and no HDR. It's unbelievable. The TV market is so much better right now. You can almost buy two 65-inch C9s for that 55-inch Dell. It's really the dark ages for monitors.

Yeah I often think that about monitors, such poor pricing vs the best TV's.
 

Deleted member 35478

User-requested account closure
Banned
Dec 6, 2017
1,788
So I have a 65" LG B7; I used to post in this thread quite often, but lately I've been out of the loop. How out of date is my TV at this point, and I suppose going forward with next-gen consoles? Also, I know the B7's game mode can appear a little dark with certain games in HDR mode; there was a recent update my TV received, but I don't think that update has changed anything in that regard. Any news I may have missed on that front? Thank you
 

Doc Holliday

Member
Oct 27, 2017
5,814
So I got a good deal on Greentoe for the 65" Sony 950g, 5 year warranty in NYC. Is Greentoe legit? I've never heard of them before I read this thread. Is it white glove delivery?
 

Starwing

One Winged Slayer
The Fallen
Oct 31, 2018
4,119
Hello people, I have a small (stupid?) question..

I just purchased an LG C9 OLED and I noticed the 120Hz mode has lower input lag than the 60Hz mode.

My question is: can I take advantage of that even if my game is running at 60fps?
I believe that as long as the device is sending a 120Hz signal to the C9, you should get the lower input lag at any framerate below it, e.g. a computer sending a 1440p @ 120Hz signal even though the content is displaying at 60fps. Someone correct me if I'm wrong.
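
A rough illustration of the frame-delivery side of that (my own arithmetic; the TV's internal processing delay is a separate component on top):

# Time to scan one frame across the link at each signal rate
for refresh_hz in (60, 120):
    print(f"{refresh_hz} Hz signal: {1000 / refresh_hz:.1f} ms per frame scan-out")
# A 60 fps game carried in a 120 Hz signal still gets each frame onto the
# panel in ~8.3 ms instead of ~16.7 ms, which is part of why the measured
# input lag drops even though the game itself isn't running any faster.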
 

shinbojan

Member
Oct 27, 2017
1,101
So I have a 65" LG B7; I used to post in this thread quite often, but lately I've been out of the loop. How out of date is my TV at this point, and I suppose going forward with next-gen consoles? Also, I know the B7's game mode can appear a little dark with certain games in HDR mode; there was a recent update my TV received, but I don't think that update has changed anything in that regard. Any news I may have missed on that front? Thank you

If you are playing on PS4, you can change HDR settings in the OS now.
That solves problems with older games that don't have brightness settings.
 

Ghost Rider

Member
Oct 27, 2017
860
Just got a 65" X900F Sony to replace my stupid KS8000. So far I love it, but I now realize that my receiver does not support Dolby Vision. It has everything but that.
Is there a way to run my Xbox One X directly to the TV but have the sound output through the receiver to maintain surround sound?
 

Deleted member 35478

User-requested account closure
Banned
Dec 6, 2017
1,788
Just got a 65" X900F Sony to replace my stupid KS8000. So far I love it, but I now realize that my receiver does not support Dolby Vision. It has everything but that.
Is there a way to run my Xbox One X directly to the TV but have the sound output through the receiver to maintain surround sound?

If your receiver has HDMI ARC support, yes. Run the Xbox to a TV input, and run an HDMI cable from the TV to the receiver's output/ARC-labeled port. I believe ARC is limited to Dolby Digital 5.1 bitstream, though, and can't carry lossless formats or multichannel PCM.
 

Ghost Rider

Member
Oct 27, 2017
860
If your receiver has HDMI ARC support, yes. Run the Xbox to a TV input, and run an HDMI cable from the TV to the receiver's output/ARC-labeled port. I believe ARC is limited to Dolby Digital 5.1 bitstream, though, and can't carry lossless formats or multichannel PCM.
Thank you for the quick reply!
Now to decide if Dolby Vision is worth the hassle
**edit** took me a minute to change the cables. Receiver was already hooked up to the ARC input so all I had to do was run the HDMI from the OneX to the tv.
 
Last edited:

nib95

Contains No Misinformation on Philly Cheesesteaks
Banned
Oct 28, 2017
18,498
HDTVtest's final blind test shootout video is out.

Again, this is an important test (arguably the best shootout ever done) because these were all retail units that had all their bezels etc. covered up, and they were placed directly adjacent to the world's best mastering reference monitor, a set used by industry professionals to grade movies, so the audience knew exactly what things should have looked like on the TVs.

Also, to keep things impartial, even when they changed source material they did so behind curtains so nobody could see the UI features and discern which TV was which.





Results with individual points

Best TV of 2019
  1. LG C9 - 33.16
  2. Panasonic GZ 2000 - 32.90
  3. Sony AG9 - 30.16
  4. Samsung Q90 - 27.75
Best Home Theatre TV
  1. Panasonic GZ 2000 - 20.84
  2. LG C9 - 20.81
  3. Sony AG9 - 20.14
  4. Samsung Q90 - 15.78
Best Living Room TV
  1. Panasonic GZ 2000 - 23.73
  2. Samsung Q90 - 23.44
  3. Sony AG9 - 22.86
  4. LG C9 - 22.43
Best Gaming TV
  1. LG C9 - 4.67
  2. Panasonic GZ 2000 - 3.92
  3. Samsung Q90 - 3.66
  4. Sony AG9 - 3.39
Best HDR TV
  1. LG C9 - 17.34
  2. Panasonic GZ 2000 - 16.81
  3. Sony AG9 - 15.05
  4. Samsung Q90 - 12.74
Individual results.

Blacks and Shadow Detail
  1. LG C9 - 4.69
  2. Panasonic GZ 2000 - 4.31
  3. Sony AG9 - 4.03
  4. Samsung Q90 - 2.63
Colour Accuracy
  1. LG C9 - 4.47
  2. Panasonic GZ 2000 - 4.36
  3. Sony AG9 - 4.28
  4. Samsung Q90 - 2.92
Tone Mapping
  1. Panasonic GZ 2000 - 4.32
  2. LG C9 - 4.11
  3. Samsung Q90 - 3.51
  4. Sony AG9 - 3.04
Motion Handling
  1. Sony AG9 - 4.15
  2. Panasonic GZ 2000 - 4.12
  3. LG C9 - 4.05
  4. Samsung Q90 - 3.62
Uniformity
  1. Panasonic GZ 2000 - 4.25
  2. Sony AG9 - 3.96
  3. LG C9 - 3.80
  4. Samsung Q90 - 3.44
Video Processing
  1. Panasonic GZ 2000 - 3.92
  2. Sony AG9 - 3.85
  3. LG C9 - 3.63
  4. Samsung Q90 - 3.60
Gaming
  1. LG C9 - 4.67
  2. Panasonic GZ 2000 - 3.92
  3. Samsung Q90 - 3.66
  4. Sony AG9 - 3.39
Bright Room
  1. Samsung Q90 - 4.37
  2. LG C9 - 3.75
  3. Panasonic GZ 2000 - 3.69
  4. Sony AG9 - 3.47
 

DOTDASHDOT

Helios Abandoned. Atropos Conquered.
Member
Oct 26, 2017
3,076
HDTVtest's final blind test shootout video is out.

Again, this is an important test (arguably the best shootout ever done) because these were all retail units that had all their bezels etc. covered up, and they were placed directly adjacent to the world's best mastering reference monitor, a set used by industry professionals to grade movies, so the audience knew exactly what things should have looked like on the TVs.

Also, to keep things impartial, even when they changed source material they did so behind curtains so nobody could see the UI features and discern which TV was which.





Results with individual points

Best TV of 2019
  1. LG C9 - 33.16
  2. Panasonic GZ 2000 - 32.90
  3. Sony AG9 - 30.16
  4. Samsung Q90 - 27.75
Best Home Theatre TV
  1. Panasonic GZ 2000 - 20.84
  2. LG C9 - 20.81
  3. Sony AG9 - 20.14
  4. Samsung Q90 - 15.78
Best Living Room TV
  1. Panasonic GZ 2000 - 23.73
  2. Samsung Q90 - 23.44
  3. Sony AG9 - 22.86
  4. LG C9 - 22.43
Best Gaming TV
  1. LG C9 - 4.67
  2. Panasonic GZ 2000 - 3.92
  3. Samsung Q90 - 3.66
  4. Sony AG9 - 3.39
Best HDR TV
  1. LG C9 - 17.34
  2. Panasonic GZ 2000 - 16.81
  3. Sony AG9 - 15.05
  4. Samsung Q90 - 12.74
Individual results.

Blacks and Shadow Detail
  1. LG C9 - 4.69
  2. Panasonic GZ 2000 - 4.31
  3. Sony AG9 - 4.03
  4. Samsung Q90 - 2.63
Colour Accuracy
  1. LG C9 - 4.47
  2. Panasonic GZ 2000 - 4.36
  3. Sony AG9 - 4.28
  4. Samsung Q90 - 2.92
Tone Mapping
  1. Panasonic GZ 2000 - 4.32
  2. LG C9 - 4.11
  3. Samsung Q90 - 3.51
  4. Sony AG9 - 3.04
Motion Handling
  1. Sony AG9 - 4.15
  2. Panasonic GZ 2000 - 4.12
  3. LG C9 - 4.05
  4. Samsung Q90 - 3.62
Uniformity
  1. Panasonic GZ 2000 - 4.25
  2. Sony AG9 - 3.96
  3. LG C9 - 3.80
  4. Samsung Q90 - 3.44
Video Processing
  1. Panasonic GZ 2000 - 3.92
  2. Sony AG9 - 3.85
  3. LG C9 - 3.63
  4. Samsung Q90 - 3.60
Gaming
  1. LG C9 - 4.67
  2. Panasonic GZ 2000 - 3.92
  3. Samsung Q90 - 3.66
  4. Sony AG9 - 3.39
Bright Room
  1. Samsung Q90 - 4.37
  2. LG C9 - 3.75
  3. Panasonic GZ 2000 - 3.69
  4. Sony AG9 - 3.47


Fair play to LG really; it goes to show that, once calibrated, it's a contender for best TV of 2019, full stop. But if there's no intention of getting the TV calibrated and/or you don't play games, plus you have access to Panasonic, then the GZ2000 is the best choice.
 

Deleted member 16452

User requested account closure
Banned
Oct 27, 2017
7,276
HDTVtest's final blind test shootout video is out.

Again, this is an important test (arguably the best shootout ever done) because these were all retail units that had all their bezels etc. covered up, and they were placed directly adjacent to the world's best mastering reference monitor, a set used by industry professionals to grade movies, so the audience knew exactly what things should have looked like on the TVs.

Also, to keep things impartial, even when they changed source material they did so behind curtains so nobody could see the UI features and discern which TV was which.





Results with individual points

Best TV of 2019
  1. LG C9 - 33.16
  2. Panasonic GZ 2000 - 32.90
  3. Sony AG9 - 30.16
  4. Samsung Q90 - 27.75
Best Home Theatre TV
  1. Panasonic GZ 2000 - 20.84
  2. LG C9 - 20.81
  3. Sony AG9 - 20.14
  4. Samsung Q90 - 15.78
Best Living Room TV
  1. Panasonic GZ 2000 - 23.73
  2. Samsung Q90 - 23.44
  3. Sony AG9 - 22.86
  4. LG C9 - 22.43
Best Gaming TV
  1. LG C9 - 4.67
  2. Panasonic GZ 2000 - 3.92
  3. Samsung Q90 - 3.66
  4. Sony AG9 - 3.39
Best HDR TV
  1. LG C9 - 17.34
  2. Panasonic GZ 2000 - 16.81
  3. Sony AG9 - 15.05
  4. Samsung Q90 - 12.74
Individual results.

Blacks and Shadow Detail
  1. LG C9 - 4.69
  2. Panasonic GZ 2000 - 4.31
  3. Sony AG9 - 4.03
  4. Samsung Q90 - 2.63
Colour Accuracy
  1. LG C9 - 4.47
  2. Panasonic GZ 2000 - 4.36
  3. Sony AG9 - 4.28
  4. Samsung Q90 - 2.92
Tone Mapping
  1. Panasonic GZ 2000 - 4.32
  2. LG C9 - 4.11
  3. Samsung Q90 - 3.51
  4. Sony AG9 - 3.04
Motion Handling
  1. Sony AG9 - 4.15
  2. Panasonic GZ 2000 - 4.12
  3. LG C9 - 4.05
  4. Samsung Q90 - 3.62
Uniformity
  1. Panasonic GZ 2000 - 4.25
  2. Sony AG9 - 3.96
  3. LG C9 - 3.80
  4. Samsung Q90 - 3.44
Video Processing
  1. Panasonic GZ 2000 - 3.92
  2. Sony AG9 - 3.85
  3. LG C9 - 3.63
  4. Samsung Q90 - 3.60
Gaming
  1. LG C9 - 4.67
  2. Panasonic GZ 2000 - 3.92
  3. Samsung Q90 - 3.66
  4. Sony AG9 - 3.39
Bright Room
  1. Samsung Q90 - 4.37
  2. LG C9 - 3.75
  3. Panasonic GZ 2000 - 3.69
  4. Sony AG9 - 3.47


Thanks for posting this.

The LG C9 seems like an amazing TV for just about anything. I do wish we had access to Panasonic in the US tho.
 

Nothing

Member
Oct 30, 2017
2,095
Age has nothing to do with the top disc performers. Film has no resolution. A lot of the older movies translate much better to 4K UHD than many of the early movies shot on digital around the turn of the century.

Many people would probably be surprised to know that all of these MCU movies are finished on 2K digital intermediates too. Many films are. They still look great on disc.

Karate Kid is good? Might grab it then.
It's one of the best-looking discs out there, along with A Few Good Men, Leon: The Professional, and Blade Runner.
 

Rbk_3

Banned
Oct 27, 2017
661
I am thinking of returning my 65C8 and waiting until the C9 drops below $3K CAD. I feel I am going to regret not having VRR for next gen.

If it follows last year's trend, the C8 was available here in early September for as low as a $2,800 street price.