A common belief I've seen here and elsewhere is that CRT displays have "zero input lag", making them inherently superior to LCD displays and other modern "laggy" display technologies.
Using the industry-standard definition of input lag, 60Hz CRTs don't have 0ms input lag. 60Hz CRTs have 8.3ms of input lag.
Moreover, LCD monitors have been within 2ms of CRT input lag for over a decade.
If the industry defined input lag as literal raw processing latency, i.e. added lag relative to a CRT, then CRT input lag would be 0ms, LCD monitor input lag would be 2ms, and we wouldn't need this OP. But that's not the definition used today, so here we are.
------
We hear "10ms" and "20ms" input lag numbers thrown around all the time. But what do they actually mean?
A video consists of multiple frames, or images, which are rapidly displayed in succession to produce the illusion of movement. For 60Hz displays, this means a new frame is drawn every 1/60th of a second, or every 16.7ms.
While it may seem like it, frames are not instantaneously displayed every 16.7ms. All CRTs and nearly all modern displays draw their frames from top to bottom. See this excellent video by the Slow Mo Guys for a visualization, particularly 2:12 for a CRT and 4:25 for an LCD HDTV.
This "top-to-bottom" drawing behavior means that the input lag of a display varies based on where you look on the screen. You have the least input lag at the top of the screen and the most input lag at the bottom.
Input lag is almost always measured using a Leo Bodnar lag tester or a similar device, which measures input lag at three zones: the top, the middle, and the bottom of the screen. The top of the screen will report the least input lag, while the bottom of the screen will report the most.
The industry standard for reporting input lag is the lag measured at the middle of the screen, usually at 60Hz. Whenever you see "10ms", "20ms", etc., that's the time it takes for a line to be drawn at the middle of the screen.
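The relationship between screen position and scanout lag can be sketched in a few lines of Python. This is a simplified model (the function name is mine, and it ignores the small slice of each refresh interval spent in vertical blanking), but it captures why the top, middle, and bottom of the screen report different numbers:

```python
# Scanout lag at a given vertical screen position, for a display with
# zero processing delay. Simplified model: the display draws top to
# bottom at a constant rate over one full refresh interval (real
# displays also spend a little of that interval in vertical blanking).

def scanout_lag_ms(position, refresh_hz=60):
    """position: 0.0 = top of screen, 0.5 = middle, 1.0 = bottom."""
    frame_time_ms = 1000 / refresh_hz
    return position * frame_time_ms

print(f"{scanout_lag_ms(0.0):.1f}")  # top:    0.0
print(f"{scanout_lag_ms(0.5):.1f}")  # middle: 8.3
print(f"{scanout_lag_ms(1.0):.1f}")  # bottom: 16.7
```

The middle-of-screen value at 60Hz, 8.3ms, is exactly the number the rest of this post revolves around.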
-----
Given that the industry standard of input lag is "lag at the middle of the screen", how much lag does a CRT have?
A 60Hz CRT takes 16.7ms to draw one frame from top to bottom. Assuming a CRT with literally perfect input lag, the top line of the screen would be drawn in 0ms, and the bottom of the screen would be drawn in 16.7ms. The middle of the screen would therefore be drawn in 8.3ms. This value might vary by a few ms depending on the CRT's overscan, but in general 8.3ms is the input lag you should expect from a 60Hz CRT.
That LCD PC monitor from 2008 with 10ms input lag? Only 2ms laggier than a CRT.
That HDTV with 25ms input lag? 1 frame laggier than a CRT.
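The conversion behind those two examples is just a subtraction. A minimal sketch (the constant and function names are mine) that turns an industry-standard lag figure into "lag over a CRT":

```python
# Convert an industry-standard input lag figure (measured at the middle
# of the screen at 60Hz) into "lag over a CRT" by subtracting the CRT's
# own middle-of-screen scanout time: half of one 60Hz frame.

CRT_BASELINE_MS = 1000 / 60 / 2  # ~8.3 ms

def lag_over_crt_ms(reported_lag_ms):
    return reported_lag_ms - CRT_BASELINE_MS

print(round(lag_over_crt_ms(10), 1))  # 1.7  -> the ~2ms-over-CRT LCD monitor
print(round(lag_over_crt_ms(25), 1))  # 16.7 -> one 60Hz frame over a CRT
```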
-----
To further demonstrate this point, see this input lag analysis by Fizzi36 from the Smash Bros subreddit. Smash Bros players are known to be some of the pickiest players with regard to input lag. Note that Fizzi36 used a 14" CRT as a baseline for his tests; in other words, he defined "input lag" as "lag compared to this CRT" instead of "time until drawing a line at the middle of the screen".
Looking at Fizzi36's raw data, a Wii connected to a Sewell Wii-to-HDMI adapter and then to an LCD monitor had only 2ms of input lag compared to a CRT. "2ms" is exactly the amount of input lag we expect for an LCD over a CRT.
(This also means that the Sewell Wii-to-HDMI adapter is a magical device that adds nearly 0 latency).
-----
Recognizing that 60Hz CRTs have 8.3ms of input lag has the following implications:
- Your monitor/TV is not as laggy as the numbers make it seem. To compare your input lag numbers to a CRT, subtract 8.3ms.
- High-refresh-rate monitors have lower input lag because they draw frames faster. A 120Hz CRT would have 4.2ms of input lag. A 144Hz CRT would have 3.5ms of input lag. In general, the minimum possible input lag of a display at a given refresh rate is 1/(2 × refresh rate) seconds.
- Without a CRT, playing games on a (good) PC monitor is the closest you can get to experiencing CRT-like input lag.
- Again, this OP would be unnecessary if the industry-standard definition of "input lag" were literal raw input lag, i.e. added lag relative to a CRT, instead of "lag at the middle of the screen" as it is currently defined. The raw input lag of modern LCD monitors is ~2ms.
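The minimum-lag formula from the list above can be checked for a few common refresh rates. A quick sketch (function name is mine):

```python
# Minimum possible middle-of-screen input lag for a display with zero
# processing delay: half of one refresh interval, i.e. 1/(2 * refresh rate).

def min_input_lag_ms(refresh_hz):
    return 1000 / (2 * refresh_hz)

for hz in (60, 120, 144, 240):
    print(f"{hz}Hz: {min_input_lag_ms(hz):.1f} ms")
# 60Hz: 8.3 ms
# 120Hz: 4.2 ms
# 144Hz: 3.5 ms
# 240Hz: 2.1 ms
```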
Edit 2: Changed "LCD technology" to "LCD monitors" since LCD TVs are known to be laggy for reasons unrelated to LCD technology.