
shockdude

Member
Oct 26, 2017
1,309
A common belief I've seen here and elsewhere is that CRT displays have "zero input lag", making them inherently superior to LCD displays and other modern "laggy" display technologies.

Using the industry-standard definition of input lag, 60Hz CRTs don't have 0ms input lag. 60Hz CRTs have 8.3ms of input lag.

Moreover, LCD monitors have been within 2ms of CRT input lag for over a decade.

If the industry defined input lag as literal raw processing latency, i.e. actual lag over a CRT, then CRT input lag would be 0ms, LCD monitor input lag would be ~2ms, and we wouldn't need this OP. But that's not the definition used today, so here we are.

------

We hear these "10ms", "20ms", input lag numbers get thrown around all the time. But what do they actually mean?

A video consists of multiple frames, or images, which are rapidly displayed in succession to produce the illusion of movement. For 60Hz displays, this means a new frame is drawn every 1/60th of a second, or every 16.7ms.

While it may seem like it, frames are not instantaneously displayed every 16.7ms. All CRTs and nearly all modern displays draw their frames from top to bottom. See this excellent video by the Slow Mo Guys for a visualization, particularly 2:12 for a CRT and 4:25 for an LCD HDTV.



This "top-to-bottom" drawing behavior means that the input lag of a display varies based on where you look on the screen. You have the least input lag at the top of the screen and the most input lag at the bottom.

Input lag is almost always measured using a Leo Bodnar lag tester or a similar device, which measures lag at three zones: the top, the middle, and the bottom of the screen. The top of the screen will report the least input lag, while the bottom of the screen will report the most input lag.

The industry standard for reporting input lag is the lag measured at the middle of the screen, usually at 60Hz. Whenever you see "10ms", "20ms", etc., that's the time it takes for a line to be drawn at the middle of the screen.

-----

Given that the industry standard of input lag is "lag at the middle of the screen", how much lag does a CRT have?

A 60Hz CRT takes 16.7ms to draw one frame from top to bottom. Assuming a CRT with literally perfect input lag, the top line of the screen would be drawn at 0ms, and the bottom of the screen would be drawn at 16.7ms. The middle of the screen would therefore be drawn at 8.3ms. This value might vary by a few ms depending on the CRT's overscan, but in general 8.3ms is the input lag you should expect from a 60Hz CRT.

That LCD PC monitor from 2008 with 10ms input lag? Only 2ms laggier than a CRT.
That HDTV with 25ms input lag? 1 frame laggier than a CRT.
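The subtraction above, written out as a quick Python sketch (the display names and lag figures are just the examples from this post, not new measurements):

```python
# Middle-of-screen scan-out lag: half a refresh period.
def scan_lag_ms(refresh_hz: float) -> float:
    return 1000 / refresh_hz / 2

crt_lag = scan_lag_ms(60)  # ~8.3 ms for a 60Hz CRT

# "Raw" lag = measured lag minus the unavoidable scan-out lag.
for name, measured_ms in [("2008 LCD monitor", 10.0), ("HDTV", 25.0)]:
    raw = measured_ms - crt_lag
    print(f"{name}: {raw:.1f} ms laggier than a 60Hz CRT")
# -> 2008 LCD monitor: 1.7 ms laggier than a 60Hz CRT
# -> HDTV: 16.7 ms laggier than a 60Hz CRT
# (16.7 ms is exactly one 60Hz frame)
```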

-----

To further demonstrate this point, see this input lag analysis by Fizzi36 from the Smash Bros subreddit. Smash Bros players are known to be some of the pickiest players with regard to input lag. Note that Fizzi36 used a 14" CRT as a baseline for his tests; in other words, he defined "input lag" as "lag compared to this CRT" instead of "time until drawing a line at the middle of the screen".

Looking at Fizzi36's raw data, a Wii connected to a Sewell Wii-to-HDMI adapter and then connected to an LCD monitor had only 2ms of input lag compared to a CRT. "2ms" is exactly the amount of input lag we expect for an LCD over a CRT.

(This also means that the Sewell Wii-to-HDMI adapter is a magical device that adds nearly 0 latency).

-----

Recognizing that 60Hz CRTs have 8.3ms of input lag has the following implications:
  • Your monitor/TV is not as laggy as the numbers make it seem. To compare your input lag numbers to a CRT, subtract 8.3ms.
  • High-refresh-rate monitors have lower input lag because they draw frames faster. A 120Hz CRT would have 4.2ms of input lag. A 144Hz CRT would have 3.5ms of input lag. In general, the minimum possible input lag of a display with a given refresh rate is "1/(refresh rate*2)".
  • Without a CRT, playing games on a (good) PC monitor is the closest you can get to experiencing CRT-like input lag.
  • Again, this OP would be unnecessary if the industry-standard definition of "input lag" were literal raw input lag, i.e. actual lag over a CRT, instead of "lag at the middle of the screen" as it is currently defined. The raw input lag of modern LCD monitors is ~2ms.
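The rule of thumb in the second bullet, checked in a few lines of Python (the 240Hz row is extra illustration beyond the rates named above):

```python
# Minimum possible middle-of-screen lag: lag = 1 / (refresh_rate * 2).
for hz in (60, 120, 144, 240):
    min_lag_ms = 1000 / (hz * 2)
    print(f"{hz:>3} Hz -> {min_lag_ms:.1f} ms")
# ->  60 Hz -> 8.3 ms
# -> 120 Hz -> 4.2 ms
# -> 144 Hz -> 3.5 ms
# -> 240 Hz -> 2.1 ms
```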
Edit: Revised thesis, clarified "60Hz CRT" as necessary, made a statement about the industry-standard definition of input lag vs. raw processing lag.
Edit 2: Changed "LCD technology" to "LCD monitors" since LCD TVs are known to be laggy for reasons unrelated to LCD technology.
 
Last edited:

ghibli99

Member
Oct 27, 2017
17,670
This stuff fascinates me, but it doesn't really affect my day-to-day gaming, especially since I don't play competitively. I have a 144Hz G-Sync monitor and a Sony X800D TV. I game 50/50 on each, sometimes the same game on one or the other display, depending on what size I prefer during a given session. It's obviously smoother at 144Hz, but even dropping to 60Hz with the added lag makes no noticeable difference.

That being said, lag associated with audio is the thing that tends to bother me more than anything... whether it's with lip syncing of dialogue, hitting notes in music games being off, etc. It's been a long time since something just plain worked right out the box without having to do some relatively time-consuming calibration first.
 

Dreamboum

Member
Oct 28, 2017
22,834
That's cool information, thank you. I'm still stuck with my CRT because they're still cheap and the input lag should stay the same, whereas I can't trust my monitor when I don't have the equipment to test it.
 

D65

Member
Oct 26, 2017
6,862
You just proved that they are inherently superior...

And they have "zero lag" from the top, which is the point.
 

Deleted member 11018

User requested account closure
Banned
Oct 27, 2017
2,419
Guitar Hero taught me the hard way what real lag is, because I had direct output from the console for audio while the video went through the TV, and it was not good for fast covers compared to the CRT. To me, the real lag is not how fast your TV is to display a frame, it's the time it takes to preprocess input and signals. You could have a hellishly fast display but a 250ms delay between input and result. Hence the "game modes" on TVs, which skip some of the time-consuming processing.
 

TheZynster

Member
Oct 26, 2017
13,285
So an MX279H, which is the monitor I use, only has 0.7ms added... no wonder it feels so damn good to play on, and why I play competitively on it only if I'm doing shooters.
 

Ahasverus

Banned
Oct 27, 2017
4,599
Colombia
More threads like these and less "This is my opinion, period. Discuss"

Very interesting. LCDs are not that bad nowadays.
 
OP

shockdude

Member
Oct 26, 2017
1,309
That being said, lag associated with audio is the thing that tends to bother me more than anything... whether it's with lip syncing of dialogue, hitting notes in music games being off, etc. It's been a long time since something just plain worked right out the box without having to do some relatively time-consuming calibration first.
Guitar Hero taught me the hard way what real lag is, because I had direct output from the console for audio while the video went through the TV, and it was not good for fast covers compared to the CRT. To me, the real lag is not how fast your TV is to display a frame, it's the time it takes to preprocess input and signals. You could have a hellishly fast display but a 250ms delay between input and result. Hence the "game modes" on TVs, which skip some of the time-consuming processing.
Audio lag and sync sucks. I truthfully believe that audio/display lag mismatch is part of why games like Guitar Hero died. I made a thread on audio latency on the old forum; it was stupidly long and nobody read it lol.
You just proved that they are inherently superior...

And they have "zero lag" from the top, which is the point.
Well yeah, but they're superior by 2ms (arguably imperceptible), not by 10ms, as the numbers in the Display Lag Database might lead people to believe.
 

TheZynster

Member
Oct 26, 2017
13,285
Audio lag and sync sucks. I truthfully believe that audio/display lag mismatch is part of why games like Guitar Hero died. I made a thread on audio latency on the old forum; it was stupidly long and nobody read it lol.

Well yeah, but they're superior by 2ms (arguably imperceptible), not by 10ms, as the numbers in the Display Lag Database might lead people to believe.


My monitor was the last time I felt like anything matched on a music game.....problem is I would rather play on a big TV for a music game than a monitor of course. But I know that when RB2 came out and had the auto calibration it was fantastic.....for the guitars, then the drums had their own numbers due to latency.

I NEVER once had perfect calibration after going to a big HDTV when playing a music game again. Every miss I always felt there was doubt that it was calibration and not me if I ever missed a simple note on expert. It just never felt the same, and that was due to an audio system and a big HDTV.
 

hibikase

User requested ban
Banned
Oct 26, 2017
6,820
You're just arguing semantics; CRTs have no processing lag, which is what any sane person means when they say they have no lag. The act of drawing the picture on the screen is not instant, but that's a given. Come on now.
 

gcwy

Member
Oct 27, 2017
8,685
Houston, TX
Huh. I always thought people who pushed this were one of those contrarian hipsters or people blinded by nostalgia. I've played video games on CRTs most of my life and I could never see myself going back to them.
 

Deleted member 176

User requested account closure
Banned
Oct 25, 2017
37,160
You're just arguing semantics; CRTs have no processing lag, which is what any sane person means when they say they have no lag. The act of drawing the picture on the screen is not instant, but that's a given. Come on now.
I think the point is that 10ms is actually really good because it's really 2ms.
 

Aztechnology

Community Resettler
Avenger
Oct 25, 2017
14,131
My monitor was the last time I felt like anything matched on a music game.....problem is I would rather play on a big TV for a music game than a monitor of course. But I know that when RB2 came out and had the auto calibration it was fantastic.....for the guitars, then the drums had their own numbers due to latency.

I NEVER once had perfect calibration after going to a big HDTV when playing a music game again. Every miss I always felt there was doubt that it was calibration and not me if I ever missed a simple note on expert. It just never felt the same, and that was due to an audio system and a big HDTV.
Well, what's your HDTV? I mean, consider that there are a lot of TVs now with sub-20ms input lag. I personally absolutely notice 40+, but below about 20 it's significantly less perceptible. Of course that largely depends on your input device as well. For instance, at 30ms with a mouse, depending on the refresh rate, it can feel a little off for me. Sub-20, though, I usually can't really tell.
 

medyej

Member
Oct 26, 2017
6,409
Is this mainly about CRT televisions and not CRT PC monitors? Because PC monitors before LCDs ran at high resolutions with high refresh rates, and it was a huge downgrade going to LCDs for many years; only when 120-144Hz LCDs with reduced input lag became mainstream did it feel like a worthy tradeoff.
 
OP

shockdude

Member
Oct 26, 2017
1,309
Thanks for reading the thread everyone. Glad to hear that people were able to learn something from it.
My monitor was the last time I felt like anything matched on a music game.....problem is I would rather play on a big TV for a music game than a monitor of course. But I know that when RB2 came out and had the auto calibration it was fantastic.....for the guitars, then the drums had their own numbers due to latency.

I NEVER once had perfect calibration after going to a big HDTV when playing a music game again. Every miss I always felt there was doubt that it was calibration and not me if I ever missed a simple note on expert. It just never felt the same, and that was due to an audio system and a big HDTV.
For my rhythm game setup I had to measure audio lag using Audacity and a laptop mic. By comparing the latency of my laptop speakers vs. my soundbar, I was able to compute audio latency of my setup, which was 60ms. Imagine 60ms of input lag, man.
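For anyone curious how that Audacity comparison works in principle: you find the offset at which the delayed recording best lines up with the reference recording. A toy Python sketch of the idea (the signal, sample rate, and 60ms delay here are made-up stand-ins for real recordings, not a substitute for actually measuring):

```python
import random

FS = 1_000  # toy sample rate in Hz; real captures would be 44.1/48 kHz

def best_lag(ref, rec, max_lag):
    """Return the shift (in samples) where rec best matches ref,
    via a brute-force cross-correlation peak search."""
    def score(lag):
        return sum(ref[n] * rec[n + lag] for n in range(len(ref)))
    return max(range(max_lag + 1), key=score)

random.seed(1)
click = [random.uniform(-1.0, 1.0) for _ in range(200)]  # reference burst
delay = 60                                               # pretend soundbar lag, in samples
recording = [0.0] * delay + click + [0.0] * 120          # delayed copy, zero-padded

lag = best_lag(click, recording, max_lag=120)
print(f"measured latency: {lag / FS * 1000:.1f} ms")     # -> 60.0 ms
```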
You're just arguing semantics; CRTs have no processing lag, which is what any sane person means when they say they have no lag. The act of drawing the picture on the screen is not instant, but that's a given. Come on now.
Semantics are a crucial part of this argument, yes. The "zero input lag" of a CRT cannot be directly compared with the input lag numbers of modern displays without knowing what those numbers actually mean. An LCD with 10ms of measured input lag is not 10ms laggier than a CRT.
Is this mainly about CRT televisions and not CRT PC monitors? Because PC monitors before LCDs ran at high resolutions with high refresh rates, and it was a huge downgrade going to LCDs for many years; only when 120-144Hz LCDs with reduced input lag became mainstream did it feel like a worthy tradeoff.
This is mostly about 60Hz CRTs, PC or TV. High refresh rates are briefly addressed at the end of the post.
It's immediately apparent that inputs are faster going from an LCD (Zowie or whatever monitor that has low input lag) to a crt.
Unless you're comparing a console's HDMI-out to the LCD with the console's analog out to a CRT, there's probably going to be some processing adding delay. My ASUS LCD is noticeably laggier with VGA vs DVI, for example.
 
Last edited:

FriedConsole

Member
Oct 27, 2017
1,187
There was a time early in HD TVs when it was noticeable. Now it's not even worth checking the number when buying a TV.
 

TheZynster

Member
Oct 26, 2017
13,285
Well, what's your HDTV? I mean, consider that there are a lot of TVs now with sub-20ms input lag. I personally absolutely notice 40+, but below about 20 it's significantly less perceptible. Of course that largely depends on your input device as well. For instance, at 30ms with a mouse, depending on the refresh rate, it can feel a little off for me. Sub-20, though, I usually can't really tell.


I haven't tested it since getting my OLED C7 which is fucking amazing for a 55 inch, I bet now I would be happy enough if I picked up a rock band rivals kit.
 

Fugu

Member
Oct 26, 2017
2,726
This is a very odd post for a few reasons.

1. The OP's thesis is not "your LCDs perform better than you think they do", it's "no display can draw an image instantly".
2. Many CRTs do not operate at 60Hz and do not take as long as 16ms to draw a frame.
3. The OP's concern for semantics seems to only manifest when discussing the relative merits of LCDs. To say that CRTs have 8.3ms of input lag is both not really true and not really relevant, particularly in a comparison to LCDs.
 

Aztechnology

Community Resettler
Avenger
Oct 25, 2017
14,131
I haven't tested it since getting my OLED C7 which is fucking amazing for a 55 inch, I bet now I would be happy enough if I picked up a rock band rivals kit.
I think a huge part of that like someone else mentioned higher up was audio latency. Something that's taken for granted, but was a real issue. Yea they had the calibration, at least for like Rock Band, where you would try to sync it. But on the wrong setup, it still never felt quite right.
 
OP

shockdude

Member
Oct 26, 2017
1,309
This is a very odd post for a few reasons.

1. The OP's thesis is not "your LCDs perform better than you think they do", it's "no display can draw an image instantly".
2. Many CRTs do not operate at 60Hz and do not take as long as 16ms to draw a frame.
3. The OP's concern for semantics seems to only manifest when discussing the relative merits of LCDs. To say that CRTs have 8.3ms of input lag is both not really true and not really relevant, particularly in a comparison to LCDs.
1. Yes. The idea of "LCDs performing better than you think they do" stems from that thesis.
2. High refresh rates are addressed at the end of the OP. However, display input lag is almost always measured at 60Hz, so it is only fair to use 60Hz CRTs as a reference.
3. The semantics are defined by the industry who decided that "input lag" is the lag at the center of the screen. Using the industry definition, CRTs have 8.3ms of input lag, and this number is the only way that we can properly compare a CRT's input lag with the reported input lag of every other display. This definition is also a way to properly compare the input lag difference between a 60Hz and a 120Hz+ display, CRT or otherwise.

There are other benefits of CRTs unrelated to input lag such as motion that other people have mentioned.

If there's anything about the OP which you (or anyone) think should be revised or clarified, please let me know.
 

Arkaign

Member
Nov 25, 2017
1,991
Is this mainly about CRT televisions and not CRT PC monitors? Because PC monitors before LCDs ran at high resolutions with high refresh rates, and it was a huge downgrade going to LCDs for many years; only when 120-144Hz LCDs with reduced input lag became mainstream did it feel like a worthy tradeoff.

Definitely. I played Quake 3, the UT series, etc. on a 21" at 1280x1024 100Hz. It felt better than my 1440p 144Hz G-Sync in responsiveness. I couldn't go back for newer titles, but the old ones still shine on that CRT.
 

Narroo

Banned
Feb 27, 2018
1,819
So, this is a little bit misleading.

CRTs don't 'save' frames or load them into a buffer. They project whatever is at their input at a given moment, and it is possible to change a frame mid-render. Most systems don't do this because it's bad form, but the claim that CRTs have minimal input lag isn't quite unfounded.

More relevant for video games is the Vertical Blanking Interval. CRT systems had ~22 'lines' of dead time between frames to allow systems to transmit data or catch up. It's not necessary, and CRTs can move/draw very fast, but it's part of the standard. This ends up being around 10% or so of the draw time, I believe. Not huge, but it's really ~15ms per 'frame' to draw.
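The ~15ms figure above checks out against standard NTSC numbers, as a quick sketch (262.5 scanlines per field and ~22 blanked lines are the standard NTSC values, assumed here rather than measured):

```python
# Rough NTSC field timing using standard (assumed, not measured) figures:
# 262.5 scanlines per 60Hz field, ~22 of them in the vertical blanking interval.
FIELD_MS = 1000 / 60          # ~16.7 ms per field
TOTAL_LINES = 262.5
VBI_LINES = 22

# Only the non-blanked lines spend time drawing visible picture.
active_ms = FIELD_MS * (TOTAL_LINES - VBI_LINES) / TOTAL_LINES
print(f"active draw time: {active_ms:.1f} ms per field")  # -> 15.3 ms
```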
 

SweetVermouth

Banned
Mar 5, 2018
4,272
The biggest factors introducing lag are things like vsync or the analog-to-digital conversion done by TVs. This wasn't an issue when all consoles output analog video, because everyone had CRTs. Nowadays we are using flat screens, and when emulating old games they're most likely using some kind of vsync, introducing more lag. Some emulators, like RetroArch, allow you to reduce that lag, and it really makes a difference and plays perfectly fine.

The math in the OP is still not 100% correct though. Here is a video explaining more about how CRTs show images and also how the SNES calculates everything from frame to frame:
 

Vipu

Banned
Oct 26, 2017
2,276
LCDs are nice these days on input lag but damn the motion needs to get much cleaner, ULMB is nice but usually cant be on at same time with g-sync or if fps is not stable.
 

Fugu

Member
Oct 26, 2017
2,726
1. Yes. The idea of "LCDs performing better than you think they do" stems from that thesis.
2. High refresh rates are addressed at the end of the OP. However, display input lag is almost always measured at 60Hz, so it is only fair to use 60Hz CRTs as a reference.
3. The semantics are defined by the industry who decided that "input lag" is the lag at the center of the screen. Using the industry definition, CRTs have 8.3ms of input lag, and this number is the only way that we can properly compare a CRT's input lag with the reported input lag of every other display. This definition is also a way to properly compare the input lag difference between a 60Hz and a 120Hz+ display, CRT or otherwise.

There are other benefits of CRTs unrelated to input lag such as motion that other people have mentioned.

If there's anything about the OP which you (or anyone) think should be revised or clarified, please let me know.
They don't perform better than people think they do, they perform better relative to certain CRTs. The absolute performance is still accurately measured by the raw input lag number.
It isn't fair to use 60Hz CRTs as a reference because the only determining factor for input lag in CRTs is refresh rate. Increasing the refresh rate reduces the input lag by the same amount, and the only reason that CRTs ever have any input lag at all is because of the non-infinite refresh rate. The input lag of a CRT is essentially the hypothetical minimum input lag.
 

khaz

Member
Oct 25, 2017
274
Light guns are the ultimate display lag test. If your display doesn't work with them, then it has lag.
 
OP

shockdude

Member
Oct 26, 2017
1,309
So, this is a little bit misleading.

CRTs don't 'save' frames or load them into a buffer. They project whatever is at their input at a given moment, and it is possible to change a frame mid-render. Most systems don't do this because it's bad form, but the claim that CRTs have minimal input lag isn't quite unfounded.

More relevant for video games is the Vertical Blanking Interval. CRT systems had ~22 'lines' of dead time between frames to allow systems to transmit data or catch up. It's not necessary, and CRTs can move/draw very fast, but it's part of the standard. This ends up being around 10% or so of the draw time, I believe. Not huge, but it's really ~15ms per 'frame' to draw.
The biggest factors introducing lag are things like vsync or the analog-to-digital conversion done by TVs. This wasn't an issue when all consoles output analog video, because everyone had CRTs. Nowadays we are using flat screens, and when emulating old games they're most likely using some kind of vsync, introducing more lag. Some emulators, like RetroArch, allow you to reduce that lag, and it really makes a difference and plays perfectly fine.

The math in the OP is still not 100% correct though. Here is a video explaining more about how CRTs show images and also how the SNES calculates everything from frame to frame:

I don't think LCDs buffer (entire) frames, otherwise 10ms input lag would be impossible. There might be buffering elsewhere such as within the GPU, though.
Fantastic points on CRT behavior, thanks for your input. I'm going to leave the OP unchanged since explaining vblanking would take a while and the overall point of the OP remains unchanged.

They don't perform better than people think they do, they perform better relative to certain CRTs. The absolute performance is still accurately measured by the raw input lag number.
It isn't fair to use 60Hz CRTs as a reference because the only determining factor for input lag in CRTs is refresh rate. Increasing the refresh rate reduces the input lag by the same amount, and the only reason that CRTs ever have any input lag at all is because of the non-infinite refresh rate. The input lag of a CRT is essentially the hypothetical minimum input lag.
If CRT input lag is hypothetical minimum input lag, then input lag "relative to certain CRTs" is mathematically equivalent to "raw input lag".
The raw input lag number of a good PC monitor is ~2ms. How do we know this? By comparing the measured 10ms input lag of a typical LCD monitor with the expected 8.3ms input lag of a 60Hz CRT.
See also: this 144Hz LCD Monitor with 4.6ms input lag. A 144Hz CRT has 3.5ms of input lag, meaning this LCD has ~1ms of raw input lag.

No one ever reports raw input lag; if they did we wouldn't need this OP in the first place. Edited OP to make this point clear.
 
Last edited:

LCGeek

Member
Oct 28, 2017
5,855
Huh. I always thought people who pushed this were one of those contrarian hipsters or people blinded by nostalgia. I've played video games on CRTs most of my life and I could never see myself going back to them.

It's a false argument to begin with, and I'm glad the OP is fighting back. We need to get out of the mindset that a display has input lag; to me it doesn't, it's literally only displaying the signal given to it, and since it has no awareness of your inputs I see it as a dumb notion to link the two.

An ungodly CRT like the GDM-FW900 has no competition, stock or tweaked, even compared to OLED at 1080p. The same for Princeton monitors, back when the few of them that existed could be gotten by customers and not just big companies or very pricey clients. I say that from experience, and I've yet to see any flat panel since CRTs or the SED demonstrations that has topped either.


As for the OP:

How about, for a topic like this, we inform people a little more about options in the past that LCDs and flat panels made us regress in, such as high refresh rates? That's a huge caveat to leave out: CRTs had no problem with high refresh rates even in the 90s. Not only that, refresh rate, as we now know from the Blur Busters studies, has a direct impact on the pixel persistence of LCDs, so even for LCDs it would seem dumb not to mention that that factor changes at higher rates. I don't even need to debate this part, it's a fact, and it should be highlighted that 60Hz alone is 16ms of pixel persistence. There's no doubt that in a high-refresh-rate or ULMB test, LCDs would have even less input lag, which would only help the argument that they aren't as weak as CRTs in this area. I know this because I use such products and immediately see the difference vs. shit 60Hz.

That makes most of the testing bunk, and it should clearly state said standards with a giant asterisk.

I don't like the tone, because while 60Hz is common it is not the top of the mountain, not even close. We have 440Hz monitors, but the test only answers the question at 60Hz, which is the bare minimum these days; they phased out 15Hz/30Hz screens a long time ago. I laugh when people use the word "standards" as if we should ignore the larger implications of what the information is telling us. Just as I call out bad networking standards or policies, I will do the same on a subject where consumers can easily enjoy the benefits of a better display.

Simply due to the nature of the topic, I think this should be highlighted and explained well so that the typical FUD on this subject, even what you acknowledge, doesn't become more muddled.

Don't get me wrong, the claim you make is real and should be fought back, but we should also highlight the flaws of the debate on either end. As a LightBoost lover I can't let this stand, and I have given ample technical reasons why.
 
Last edited:
OP

shockdude

Member
Oct 26, 2017
1,309
It's a false argument to begin with, and I'm glad the OP is fighting back. We need to get out of the mindset that a display has input lag; to me it doesn't, it's literally only displaying the signal given to it, and since it has no awareness of your inputs I see it as a dumb notion to link the two.

An ungodly CRT like the GDM-FW900 has no competition, stock or tweaked, even compared to OLED at 1080p. The same for Princeton monitors, back when the few of them that existed could be gotten by customers and not just big companies or very pricey clients. I say that from experience, and I've yet to see any flat panel since CRTs or the SED demonstrations that has topped either.


As for the OP:

How about, for a topic like this, we inform people a little more about options in the past that LCDs and flat panels made us regress in, such as high refresh rates? That's a huge caveat to leave out: CRTs had no problem with high refresh rates even in the 90s. Not only that, refresh rate, as we now know from the Blur Busters studies, has a direct impact on the pixel persistence of LCDs, so even for LCDs it would seem dumb not to mention that that factor changes at higher rates. I don't even need to debate this part, it's a fact, and it should be highlighted that 60Hz alone is 16ms of pixel persistence. There's no doubt that in a high-refresh-rate or ULMB test, LCDs would have even less input lag, which would only help the argument that they aren't as weak as CRTs in this area. I know this because I use such products and immediately see the difference vs. shit 60Hz.

That makes most of the testing bunk, and it should clearly state said standards with a giant asterisk.

I don't like the tone, because while 60Hz is common it is not the top of the mountain, not even close. We have 440Hz monitors, but the test only answers the question at 60Hz, which is the bare minimum these days; they phased out 15Hz/30Hz screens a long time ago. I laugh when people use the word "standards" as if we should ignore the larger implications of what the information is telling us. Just as I call out bad networking standards or policies, I will do the same on a subject where consumers can easily enjoy the benefits of a better display.

Simply due to the nature of the topic, I think this should be highlighted and explained well so that the typical FUD on this subject, even what you acknowledge, doesn't become more muddled.

Don't get me wrong, the claim you make is real and should be fought back, but we should also highlight the flaws of the debate on either end. As a LightBoost lover I can't let this stand, and I have given ample technical reasons why.
Thanks for the input. I've edited the OP and thesis to make it more clear that I'm referring to 60Hz CRTs and that the reason for this OP is due to the industry standard input lag definition.
Again, let me know if there's anything else you'd like to add or edit.
 

LCGeek

Member
Oct 28, 2017
5,855
Thanks for the input. I've edited the OP and thesis to make it more clear that I'm referring to 60Hz CRTs and that the reason for this OP is due to the industry standard input lag definition.
Again, let me know if there's anything else you'd like to add or edit.

I just noticed that, and big ups for explaining this shit to consumers in general. Not enough people care about their games once they hit the display, despite telling themselves they love graphics. Seems odd to gimp the one thing that solely defines how well your game will look once it has been rendered.
 

Farmerboy

Member
Oct 25, 2017
337
Melbourne Australia
I game on a 10-year-old TH-50PZ850a Panny plasma.

It's been a great TV, but I shudder to think what its input lag is.

Plus the PS4 is connected through a Denon receiver. What input lag is that adding? Would it be better to connect directly to the TV?
 

lvl 99 Pixel

Member
Oct 25, 2017
44,607
No

CRTs have better motion handling than LCDs, lower input lag and deeper blacks than LCDs. Don't forget that retro consoles work on them without the hassle of an upscaler and look way, way better than if you would just hook them up to your LCD TV.

I prefer a CRT for Smash Bros. Melee and all, but I still loathe the high-pitched sound they can emit, and the convex CRT screens are super ugly. Get a room full of them and it's a migraine for me.
 

user__

Member
Oct 25, 2017
570
It could be a question of semantics, but I never really thought about it. Thanks for the thread. In the past I was kind of obsessed with input lag, but now I'm much less finicky about it. I have a Samsung MU8000 and it doesn't really feel laggy at all, even if RTINGS says it has about 22ms. I see now why it's not a big deal, even technically speaking...

Audio lag and sync suck. I truthfully believe that audio/display lag mismatch is part of why games like Guitar Hero died. I made a thread on audio latency on the old forum; it was stupidly long and nobody read it lol.
I don't know if the new HDMI standards will address this problem, but the audio sent to the TV through the digital cable is really, really slow. And Sony and the other companies have done a really bad job by removing the analog audio outputs from their consoles. Fortunately an HDMI audio splitter covers this flaw, but adding cheap boxes in between doesn't sound like a good solution, as it could add more video lag and degrade the signal quality.
If I try to play rhythm games on PS4 and forget to switch the audio channel on the amp, I feel like a novice, struggling in any slightly fast section; then 20 seconds in I remember to switch back to the audio extractor, and everything is super-responsive again and it feels like there's actually a lot of time to hit the note right. I'm sure a lot of people don't know this, as the games often don't say a word about it, and they just think they're bad. It's sad, really.
 

Narroo

Banned
Feb 27, 2018
1,819
I don't think LCDs buffer (entire) frames, otherwise 10ms input lag would be impossible. There might be buffering elsewhere such as within the GPU, though.
Fantastic points on CRT behavior, thanks for your input. I'm going to leave the OP unchanged since explaining vblanking would take a while and the overall point of the OP remains unchanged.


If CRT input lag is hypothetical minimum input lag, then input lag "relative to certain CRTs" is mathematically equivalent to "raw input lag".
The raw input lag number of a good PC monitor is ~2ms. How do we know this? By comparing the measured 10ms input lag of a typical LCD monitor with the expected 8.3ms input lag of a 60Hz CRT.
See also: this 144Hz LCD Monitor with 4.6ms input lag. A 144Hz CRT has 3.5ms of input lag, meaning this LCD has ~1ms of raw input lag.

No one ever reports raw input lag; if they did we wouldn't need this OP in the first place.
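The arithmetic in the quote above can be sketched in Python. The numbers are the ones from the thread; "raw lag" here just means measured mid-screen lag minus the unavoidable scanout component, which is my shorthand for the quote's reasoning, not an industry term:

```python
# Mid-screen scanout lag: the scanout takes half a refresh period to
# reach the middle of the screen, so even a "zero-lag" display shows
# this much lag at the standard (mid-screen) measurement point.
def scanout_lag_ms(refresh_hz, screen_fraction=0.5):
    return (1000.0 / refresh_hz) * screen_fraction

# "Raw" processing lag: measured mid-screen lag minus the scanout component.
def raw_lag_ms(measured_mid_screen_ms, refresh_hz):
    return measured_mid_screen_ms - scanout_lag_ms(refresh_hz)

print(round(scanout_lag_ms(60), 1))    # 8.3  (60Hz CRT mid-screen lag)
print(round(raw_lag_ms(10.0, 60), 1))  # 1.7  (~2ms raw lag for a 10ms LCD)
print(round(raw_lag_ms(4.6, 144), 1))  # 1.1  (~1ms raw lag for the 144Hz LCD)
```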

If we want to be really technical, what you're discussing is response time. CRTs really do have near-zero input lag. A good example is the NES: the NES locked frame data during drawing. It used the vertical blanking signal to update the output for the next frame, which would then be drawn as expected on the next frame. This is by definition pretty much zero input lag. For an LCD, you can have several frames of delay on top of a 16ms response time. LCD response times tend to be a bit dishonest and can be in the 40ms range. See this link. LCD response times can vary with what's being displayed: static images update quickly; moving images update slowly.

It also depends on your resolution. For example, 480i draws two 240p images in succession. So really, the middle of the screen is being drawn in ~7.5ms, not 15ms. Vertical blanking can be important too for minimizing that response time.

Anyway, at this level we're at sub-1 frame of 'effective input lag,' in which case the system's input lag becomes important. For example, do inputs on frame one affect frame 2, or frame 3? It's kind of hard to find data, though most systems have 1-2 frames of lag as a rule of thumb.
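As a rough sketch of how those pieces stack up: total button-to-pixel latency is approximately the system's frames of internal lag times the frame period, plus the display's mid-screen lag. The frame counts below are illustrative rule-of-thumb values, not measurements:

```python
# Rough model: total latency = (system frames of lag x frame period)
#                              + display mid-screen lag.
# The 1-2 frame counts are the rule of thumb mentioned above.
def total_latency_ms(system_frames_of_lag, refresh_hz, display_mid_screen_ms):
    frame_ms = 1000.0 / refresh_hz
    return system_frames_of_lag * frame_ms + display_mid_screen_ms

# e.g. a game with 2 frames of internal lag at 60Hz on a CRT (8.3ms mid-screen):
print(round(total_latency_ms(2, 60, 8.3), 1))   # 41.6
# the same game on an LCD monitor measuring 10ms mid-screen:
print(round(total_latency_ms(2, 60, 10.0), 1))  # 43.3
```

This is why the display is often the smallest contributor: the system's own frames of lag dominate the total.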

Also, using 144Hz 'effective input lag' is cheating a bit here, isn't it? A 144Hz CRT, and those do exist, very much would have a lower response time. At worst, 3.47ms for mid-frame. Including vertical blanking? I've no clue what the VBI would be for 144Hz. If it's the same as 480i, though (1.34ms of blanking per frame), then it's 2.1ms response time for mid-frame.
 

Fugu

Member
Oct 26, 2017
2,726
I don't think LCDs buffer (entire) frames, otherwise 10ms input lag would be impossible. There might be buffering elsewhere such as within the GPU, though.
Fantastic points on CRT behavior, thanks for your input. I'm going to leave the OP unchanged since explaining vblanking would take a while and the overall point of the OP remains unchanged.
crtscanpxo3p.png


If CRT input lag is hypothetical minimum input lag, then input lag "relative to certain CRTs" is mathematically equivalent to "raw input lag".
The raw input lag number of a good PC monitor is ~2ms. How do we know this? By comparing the measured 10ms input lag of a typical LCD monitor with the expected 8.3ms input lag of a 60Hz CRT.
See also: this 144Hz LCD Monitor with 4.6ms input lag. A 144Hz CRT has 3.5ms of input lag, meaning this LCD has ~1ms of raw input lag.

No one ever reports raw input lag; if they did we wouldn't need this OP in the first place. Edited OP to make this point clear.
No, the raw input lag is the 10ms of input lag since that's how much input lag there actually is.
 

Durante

Dark Souls Man
Member
Oct 24, 2017
5,074
Moreover, LCD technology has been within 2ms of CRT input lag for over a decade.
I think that's an incredibly misleading statement when unqualified.

The practical input lag is the sum of processing lag and average pixel switching times, and for the vast majority of LCD monitors (never mind TVs!) that is far in excess of 10 ms (what you define as "within 2 ms of CRT") -- not only a decade ago but also today still.
 
OP
OP
shockdude

shockdude

Member
Oct 26, 2017
1,309
I don't know if the new HDMI standards will address this problem, but the audio sent to the TV through the digital cable is really, really slow. And Sony and the other companies have done a really bad job by removing the analog audio outputs from their consoles. Fortunately an HDMI audio splitter covers this flaw, but adding cheap boxes in between doesn't sound like a good solution, as it could add more video lag and degrade the signal quality.
If I try to play rhythm games on PS4 and forget to switch the audio channel on the amp, I feel like a novice, struggling in any slightly fast section; then 20 seconds in I remember to switch back to the audio extractor, and everything is super-responsive again and it feels like there's actually a lot of time to hit the note right. I'm sure a lot of people don't know this, as the games often don't say a word about it, and they just think they're bad. It's sad, really.
The worst part is that HDMI does not have significant inherent audio lag. My monitor speakers over HDMI add about 1ms of audio latency compared to my laptop speakers, which is well within the margin of error due to the whole "speed of sound" thing.
If we want to be really technical, what you're discussing is response time. CRTs really do have near-zero input lag. A good example is the NES: the NES locked frame data during drawing. It used the vertical blanking signal to update the output for the next frame, which would then be drawn as expected on the next frame. This is by definition pretty much zero input lag. For an LCD, you can have several frames of delay on top of a 16ms response time. LCD response times tend to be a bit dishonest and can be in the 40ms range. See this link. LCD response times can vary with what's being displayed: static images update quickly; moving images update slowly.

It also depends on your resolution. For example, 480i draws two 240p images in succession. So really, the middle of the screen is being drawn in ~7.5ms, not 15ms. Vertical blanking can be important too for minimizing that response time.

Anyway, at this level we're at sub-1 frame of 'effective input lag,' in which case the system's input lag becomes important. For example, do inputs on frame one affect frame 2, or frame 3? It's kind of hard to find data, though most systems have 1-2 frames of lag as a rule of thumb.

Also, using 144Hz 'effective input lag' is cheating a bit here, isn't it? A 144Hz CRT, and those do exist, very much would have a lower response time. At worst, 3.47ms for mid-frame. Including vertical blanking? I've no clue what the VBI would be for 144Hz. If it's the same as 480i, though (1.34ms of blanking per frame), then it's 2.1ms response time for mid-frame.
I think that's an incredibly misleading statement when unqualified.

The practical input lag is the sum of processing lag and average pixel switching times, and for the vast majority of LCD monitors (never mind TVs!) that is far in excess of 10 ms (what you define as "within 2 ms of CRT") -- not only a decade ago but also today still.
I thought a 480i frame was drawn in 15ms, not 7.5ms? Isn't drawing one 480i frame equivalent to drawing one 240p frame?

And yeah, LCD response time does muck things up. Input lag testers measure black-to-white response time as part of their input lag measurement, but LCDs have so many other color transitions with high response times.
I'm thinking about how best to integrate response time into the OP. Having to qualify all the components of LCD input lag seems messy...

Edit: I'm going to leave the OP as is. I don't think a 15ms response time is the same thing as 15ms of added input lag.
See Rtings' Acer Predator 144Hz monitor review. While the 100% response time is 13.1ms, the 80% response time is 2.8ms, and 80% imo is more than enough of a transition to count as a "noticeable change" with regards to input lag. Input lag measurements already take some of the black-to-white response time into account anyway, and I wouldn't be surprised if the measured input lag of LCD monitors were almost entirely response-time lag instead of processing lag.
Though response time does cause a lot of motion blur compared to CRTs.
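One way to read that argument as a sketch, using the Rtings figures quoted above. Treating the 80% point as the "noticeable change" threshold is this post's assumption, not something the lag tester itself reports:

```python
# Figures from the cited Rtings review of the 144Hz Acer Predator monitor.
response_100_ms = 13.1  # time for a pixel transition to fully complete
response_80_ms = 2.8    # time for the transition to reach 80%

# If you (wrongly) counted the full transition as added input lag:
pessimistic_added_lag = response_100_ms  # 13.1ms
# If an 80% transition already reads as a "noticeable change":
felt_added_lag = response_80_ms          # 2.8ms

# The leftover 80%-to-100% tail shows up as motion blur, not as extra lag.
print(round(pessimistic_added_lag - felt_added_lag, 1))  # 10.3
```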
No, the raw input lag is the 10ms of input lag since that's how much input lag there actually is.
A major point of the OP is that the measured "10ms" of an LCD monitor is not the "raw input lag" of an LCD, because of how the industry defines input lag.
Can you link? I never saw this!
I'm not really proud of it because it's stupidly long and pretty unfocused, and it should've gone through more revisions. If you're really interested:
https://www./threads/audio-lag-how-to-hear-a-gunshot-50ms-faster-than-your-enemy.1335534/
 
Last edited: