
user__

Member
Oct 25, 2017
570
The worst part is that HDMI does not have significant inherent audio lag. My monitor speakers over HDMI have only 1 ms of audio latency compared to my laptop speakers, which is well within the margin of error given the whole "speed of sound" thing (sound only travels ~34 cm per millisecond, so speaker placement alone accounts for that much).
I read your GAF thread too. It's clear that HDMI is not the culprit, otherwise my HDMI audio splitter would not work. So it's absurd that my TV needs so much time to process sound, and this is in PCM too; I don't care about fancy spatial sound because I almost always use wired headphones attached to the amp. The audio goes to the amp through optical, because ARC is not supported by my old Yamaha and the TV doesn't have a damn audio jack.
But it could be the amp at fault too, though I doubt it, because my amp is instantaneous through its RCA inputs. I don't have a sure way to verify, but I could try playing with the TV speakers.

Edit: I just discovered that my TV has configurable audio lag! It had 70 ms of audio lag set by default, probably because the normal TV modes output ~70 ms of video lag; but for some obscure reason the same value is used in game and PC mode. This means that with it set to 0 I probably have slightly negative lag, but that's probably not a big deal.
The lag can be adjusted from 0 to +250 ms, and using this video I can notice the difference quite well: with a 0 setting I can see that the audio comes out when the yellow circle appears, and scrolling the lag slider in the TV settings while the YouTube video plays, I can clearly hear the beeps shifting earlier or later.
This could mean I can ditch the audio extractor! Without this thread I probably would never have known.
 
Last edited:

Deleted member 11517

User requested account closure
Banned
Oct 27, 2017
4,260
So the MX279H, which is the monitor I use, only has .7ms added... no wonder it feels so damn good to play on, and why I only play competitively on it if I'm doing shooters.
Hah, that's my monitor!

In any case, what's weird and barely ever mentioned is that the Bluetooth controllers we have nowadays also produce some input lag; to me it's most noticeable in rhythm games and fighting games.



What's also weird: they say the launch models of the DS4 *always* use the Bluetooth connection, even when plugged in.

The strange part is that my PC doesn't have Bluetooth, and yet my DS4 feels so much snappier there compared to using it on the PS4.

There definitely is a difference.

Games produce their own input lag as well; it's an interesting subject.


I think the comparison - no matter how you measure input lag on CRTs - isn't really valid either, when most TVs nowadays come with huge amounts of input lag and most people never bother to calibrate their settings accordingly. They may wonder why they keep dying in competitive games, but would never want to miss out on all the "picture enhancing" features of their (probably) expensive sets.
 

user__

Member
Oct 25, 2017
570
Sorry for the slight off-topic, just an update on my discovery:
The audio lag setting on the TV works extremely well, just tested with a song on Project Diva Future Tone.
Audio through TV out with default setting (70 ms): score around 85%
Audio through HDMI audio extractor (supposedly around 0 ms): score of 95%
Audio through TV out with 0 ms setting: score of 96%
Audio through TV out with 20 ms setting (supposedly in sync with the video lag): score of 97%

shockdude, in your GAF thread you mention that the TV "cheats", and I came to the same conclusion.
It's likely that most TVs account for the latency of the video path, post-processing included, and purposely delay the audio to keep it in sync. But they also offer a game mode that significantly reduces the video lag without adjusting the audio delay accordingly, which is why a solution like an audio extractor is necessary for the majority of TVs, and probably why modern PC monitors with built-in audio don't need anything.
 

Polioliolio

Member
Nov 6, 2017
5,396
Thank you for clarifying.

But saying that 8 ms of input lag is 0 ms of input lag is dumb. For one, CRTs are gone, and who is going to use that metric except antique game players? And if new TV technology ever improves on CRT lag, you'd end up with a negative lag measurement.

That's interesting that the higher part of the screen has less lag than the lower. I can imagine a stuck-up Smash player preferring to hang out on platforms to get an input-lag edge over a guy on the ground.
 

petran79

Banned
Oct 27, 2017
3,025
Greece
Also found this:

https://danluu.com/input-lag/

At 144 Hz, each frame takes 7 ms. A change to the screen will have 0 ms to 7 ms of extra latency as it waits for the next frame boundary before getting rendered (on average, we expect half of the maximum latency, or 3.5 ms). On top of that, even though my display at home advertises a 1 ms switching time, it actually appears to take 10 ms to fully change color once the display has started changing color. When we add up the latency from waiting for the next frame to the latency of an actual color change, we get an expected latency of 7/2 + 10 = 13.5 ms.


With the old CRT in the apple 2e, we'd expect half of a 60 Hz refresh (16.7 ms / 2) plus a negligible delay, or 8.3 ms. That's hard to beat today: a state of the art "gaming monitor" can get the total display latency down into the same range, but in terms of marketshare, very few people have such displays, and even displays that are advertised as being fast aren't always actually fast.


[Images from the article: LatencyComparison.png, s34rhNc.jpg (latency comparison chart)]
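The arithmetic in the quote generalizes to a simple model: expected latency is half the refresh interval (the average wait for the next frame boundary) plus the pixel response time. A quick Python sketch using only the numbers from the quote (function name mine; back-of-the-envelope, not a measurement):

```python
def expected_display_latency_ms(refresh_hz, response_ms):
    """Average latency from the model in the quote: half a refresh
    interval waiting for the next frame boundary, plus the time the
    pixels take to finish changing color."""
    frame_ms = 1000.0 / refresh_hz
    return frame_ms / 2 + response_ms

# 144 Hz LCD whose pixels take ~10 ms to fully change (per the quote)
print(expected_display_latency_ms(144, 10))  # ~13.5 ms
# 60 Hz CRT: phosphor response is negligible
print(expected_display_latency_ms(60, 0))    # ~8.3 ms
```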
 

Beer Monkey

Banned
Oct 30, 2017
9,308
8-bit and 16-bit systems especially do not buffer frames, so there's literally no half frame of lag (average) there on a CRT. Nothing is buffered. The latency of the video itself is literally a fraction of a millisecond.

Of course, classic game systems have game loops coded that may introduce some input latency. Hell, Capcom CPS-2 boards and game logic have at least a couple of frames.
 

LewieP

Member
Oct 26, 2017
18,097
It also takes time for the light to travel from the display to your eye, then from your eye to your brain.
 

JahIthBer

Member
Jan 27, 2018
10,382
With RetroArch's run-ahead/frame delay and Dolphin's low-latency options, it's kinda hard to justify keeping a CRT around, really. Input lag was the main reason to have one. Even Melee pros will eventually have to jump to laptops/PCs with monitors; lugging those CRTs around is a pain.
 

Karsticles

Self-Requested Ban
Banned
Oct 25, 2017
2,198
Does this mean that bigger TV screens will always have higher lag than smaller ones due to draw time?
 

X1 Two

Banned
Oct 26, 2017
3,023
You're just arguing semantics. CRTs have no processing lag, which is what any sane person means when they say they have no lag. The act of drawing the picture on the screen is not instant, but that's a given. Come on now.
Yeah, it's ridiculous. Also who ever used 60 Hz with a CRT?
 

SeeingeyeDug

Member
Oct 28, 2017
3,004
Audio lag and sync sucks. I truthfully believe that audio/display lag mismatch is part of why games like Guitar Hero died. I made a thread on audio latency on the old forum; it was stupidly long and nobody read it lol.

Well yeah, but they're superior by 2ms (arguably imperceptible), not by 10ms, as the numbers in the Display Lag Database might lead people to believe.

Guitar Hero and Rock Band had fairly robust in-game calibration to match up any video lag and audio lag. If you redid the calibration each time you changed your display or audio setup, you were matched up perfectly. It was easy to use as well: press a button at certain repeating visual cues, or press a button at certain repeating audio cues, and the lag is set automatically. In the last iterations the guitar itself had audio and video sensors to hear tones and read screen flashes, removing the reliance on the user pressing the button accurately.

Video and audio lag had zero to do with why those games died.
 

Chaosblade

Resettlement Advisor
Member
Oct 25, 2017
6,596
This is something a lot of Melee fans struggle with too. There are definitely a large number of players/viewers who think CRTs are truly instantaneous and lagless, while any LCD inherently adds several frames of lag, making the game unplayable.

Melee is in a unique spot in that, unlike most other games, it needs scaling to be played on a modern display, plus analog-to-digital conversion. But there are plenty of fast monitors that get around the first issue, and quality HDMI dongles like the Sewell and GCVideo solutions for the second.
 

Nothing Loud

Literally Cinderella
Member
Oct 25, 2017
9,981
No

CRTs have better motion handling than LCDs, lower input lag, and deeper blacks than LCDs. Don't forget that retro consoles work on them without the hassle of an upscaler and look way, way better than if you just hooked them up to your LCD TV.

But they weigh like bricks and are not worth the ER bills for herniating your discs trying to pick up a CRT bigger than 20"

Thanks for the informative OP
 
OP
shockdude

Member
Oct 26, 2017
1,311
Sorry for the slight off-topic, just an update on my discovery:
Glad to hear you were able to figure things out with your audio lag.
I'd still recommend the Audacity-Laptop method of computing audio lag if you want to be super precise. You only have to do it once for your setup, after all.
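For anyone wondering what that kind of measurement looks like: the thread doesn't spell out the exact Audacity procedure, but the general idea is to record the reference audio and the delayed output together and find the offset that best aligns them. A rough sketch of that idea with numpy cross-correlation (illustrative, not shockdude's exact method):

```python
import numpy as np

def audio_lag_ms(reference, delayed, sample_rate):
    """Estimate how many milliseconds `delayed` trails `reference`
    via cross-correlation of two mono recordings."""
    corr = np.correlate(delayed, reference, mode="full")
    lag_samples = int(np.argmax(corr)) - (len(reference) - 1)
    return 1000.0 * lag_samples / sample_rate

# Toy check: a click delayed by 100 samples at 48 kHz is ~2.08 ms late.
rate = 48_000
ref = np.zeros(1000); ref[200] = 1.0
dly = np.zeros(1000); dly[300] = 1.0
print(round(audio_lag_ms(ref, dly, rate), 2))  # 2.08
```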
8-bit and 16-bit systems especially do not buffer frames, so there's literally no half frame of lag (average) there on a CRT. Nothing is buffered. The latency of the video itself is literally a fraction of a millisecond.

Of course, classic game systems have game loops coded that may introduce some input latency. Hell, Capcom CPS-2 boards and game logic have at least a couple of frames.
Imo this lack of buffering is a console-specific input optimization; the same optimization would theoretically work on LCD monitors which are also unbuffered (otherwise latency-free converters like the Sewell Wii-to-HDMI adapter would be impossible).
OP confusing pixel response time with input lag and display lag?
Pixel response time is a component of input lag and is already accounted for due to how input lag testers work. I'd consider pixel response time to be a component of display processing latency, since it's literally the time needed for the LCD panel to process the color change.
Pixel response time does have an effect on motion, though.
Guitar Hero and Rock Band had fairly robust in-game calibration to match up any video lag and audio lag. If you redid the calibration each time you changed your display or audio setup, you were matched up perfectly. It was easy to use as well: press a button at certain repeating visual cues, or press a button at certain repeating audio cues, and the lag is set automatically. In the last iterations the guitar itself had audio and video sensors to hear tones and read screen flashes, removing the reliance on the user pressing the button accurately.

Video and audio lag had zero to do with why those games died.
I agree there were much bigger reasons why those games died. But I don't believe latency had zero impact on their death. I'm running with the assumption that the majority of players did not know how to calibrate rhythm games, even with robust calibration tools, and there's only so much in-game calibration can do to compensate for input lag. I believe people would have been more accepting of expensive rhythm game peripherals if they worked perfectly in-sync out of the box, as they did in the CRT days. But it's too late for good sync to revive the genre now.
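To make that limit concrete, here's a minimal hypothetical sketch of what calibration can and can't do: it shifts when the game judges presses, but it can't remove the lag itself (names and numbers invented for illustration):

```python
# Hypothetical sketch of how a rhythm game applies a calibrated offset:
# it judges presses against when the player actually saw/heard the note.
CALIBRATED_LAG_MS = 70.0  # measured by the in-game calibration step

def judge_hit(press_time_ms, note_time_ms, window_ms=50.0):
    # The note reaches the player's eyes/ears CALIBRATED_LAG_MS after
    # the game emits it, so shift the judgment target by that amount.
    perceived_note_time = note_time_ms + CALIBRATED_LAG_MS
    return abs(press_time_ms - perceived_note_time) <= window_ms

# A press that is 70 ms "late" by the internal clock is dead on-beat:
print(judge_hit(press_time_ms=1070.0, note_time_ms=1000.0))  # True
```

This is also where the limit shows: the offset fixes scoring, but the sound of your own strum still comes out of the speakers late, which no calibration can hide.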
 
Last edited:

Deleted member 2171

User requested account closure
Banned
Oct 25, 2017
3,731
I think it's been less about the actual amount of lag and more about how predictable the lag is. Predictable lag means Rock Band needs little or no calibration to play nicely on CRTs, but once TVs started having random processing and display lags, you -have- to calibrate to have a good experience. CRTs had to have uniform, predictable lag to even work the way they did.
 

godofcookery

Avenger
Oct 25, 2017
949
As I am reading this thread, is there a term for 'button press' -> 'drawn to middle of screen' across both CRT and LCD? Is that not input lag?
 

HTupolev

Member
Oct 27, 2017
2,436
I thought a 480i frame was drawn in 15ms, not 7.5ms? Drawing 1 480i frame is equivalent to drawing 1 240p frame?
It's useful to use the term "fields" when discussing interlacing. Interlaced displays alternate drawing even/odd lines on successive sweeps; each set of 240 lines is a "field."

Whether a field is a frame... In the case of 240p, there's no ambiguity: the television is being tricked into drawing only even fields or only odd fields, so you get a fresh 240p image in the same place on screen 60 times each second. In the case of 480-line 30fps video, two fields make a frame, more or less. In the case of 480-line 60fps video, where each field is stepped forward in time but you're still alternating lines, it's really hard to say what constitutes a "frame."
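A toy illustration of the field pattern, assuming ideal 60 Hz timing (the function and line numbering are mine, purely for illustration):

```python
def lines_for_sweep(sweep_index, mode):
    """Which of the 480 scanlines a single ~16.7 ms sweep lights up.
    480i alternates even/odd fields; 240p repeats one field type."""
    if mode == "480i":
        start = sweep_index % 2  # even field, odd field, even, ...
    else:  # "240p": the console tricks the TV into one field type
        start = 0
    return list(range(start, 480, 2))

print(lines_for_sweep(0, "480i")[:3])  # [0, 2, 4]
print(lines_for_sweep(1, "480i")[:3])  # [1, 3, 5]
print(lines_for_sweep(1, "240p")[:3])  # [0, 2, 4] - same lines again
```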
 

TAJ

Banned
Oct 28, 2017
12,446
My monitor was the last time I felt like anything matched in a music game... problem is, I would rather play a music game on a big TV than on a monitor, of course. But I know that when RB2 came out with auto calibration it was fantastic... for the guitars; the drums had their own numbers due to latency.

I NEVER once had perfect calibration after moving to a big HDTV for music games. On every miss there was always doubt about whether it was the calibration or me, whenever I missed a simple note on expert. It just never felt the same, and that was due to the audio system and the big HDTV.

Starting with Rock Band 2 they had automated lag calibration using a sensor and microphone on the guitar controller.

I agree there were much bigger reasons why those games died. But I don't believe latency had zero impact on their death. I'm running with the assumption that the majority of players did not know how to calibrate rhythm games, even with robust calibration tools, and there's only so much in-game calibration can do to compensate for input lag. I believe people would have been more accepting of expensive rhythm game peripherals if they worked perfectly in-sync out of the box, as they did in the CRT days. But it's too late for good sync to revive the genre now.

No. The way the gameplay worked in those games, correct calibration completely nullified all of the effects of lag, regardless of how much there was.
Rocksmith was a different beast, though. Calibrated delay couldn't save that one.
 

Pargon

Member
Oct 27, 2017
12,014
It's misleading to say that a CRT has "8.3ms latency".
CRTs have effectively zero latency. 60Hz scanout takes ~8.3ms to reach the middle of the frame, but that's nothing to do with the CRT.

This is more of an issue with how review sites are measuring display latency.
I have long argued that they should not be taking measurements at the middle of the display - that's the worst possible location for it.

They should be taking two measurements:
  1. The very top of the display. This tells you how much processing latency the display has.
  2. The very bottom of the display. This tells you how much latency scanout adds.
Why two measurements? Because not all displays draw the image the same way.

A plasma TV updates the panel globally rather than scanning out the image. This means that it has to buffer a frame (minimum latency possible is ~16.7ms) but then it draws an image to the entire display at once in a very short amount of time.
So a plasma TV might have "20ms" latency when measured at the center of the screen, but it will measure 20ms at the top and bottom of the screen too.

It's the same with LCDs. There are no globally-updated LCD TVs/Monitors that I'm aware of, but some displays use a "fast scanout" to draw the image.
This means that they will buffer a frame, but then scan out the image in a fraction of that time. It may take 16.7ms to render a frame at 60 FPS, but a 240Hz G-Sync monitor will draw that on the screen in 4.2ms (one frame at 240Hz). Many televisions do the same thing.
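A toy model makes the case for those two measurements concrete. Assuming a 60 FPS source, zero processing latency, and the 240 Hz fast-scanout figure above (illustrative numbers only):

```python
FRAME_MS = 1000 / 60  # ~16.7 ms per frame from a 60 FPS source

def native_scanout_latency_ms(position):
    """No buffering: each line is drawn as the signal arrives.
    position: 0.0 = top of screen, 1.0 = bottom."""
    return position * FRAME_MS

def buffered_fast_scanout_latency_ms(position, scanout_ms=1000 / 240):
    """Buffer the whole frame, then scan it out in one 240 Hz sweep."""
    return FRAME_MS + position * scanout_ms

for pos in (0.0, 0.5, 1.0):
    print(pos,
          round(native_scanout_latency_ms(pos), 1),
          round(buffered_fast_scanout_latency_ms(pos), 1))
# top:    0.0 vs 16.7 | middle: 8.3 vs 18.8 | bottom: 16.7 vs 20.8
```

Measured only at the center of the screen, you can't tell how much of the number is processing delay and how much is scanout; the top and bottom measurements separate the two.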

The faster the scanout, the less the image skews with fast horizontal motion - which is certainly a benefit to gaming.
If you have a computer or a tablet, drag a full-height window side-to-side and look at the bottom edge. You will see that vertical lines bend as the bottom lags behind the top of the image.
I'd argue that the delay required to buffer a frame is probably worth it for the improved motion handling that you get with a global scanout - especially as we move to higher refresh rates.

I believe some (or all?) VR displays are using global scanout because it's a factor in preventing motion sickness.
The smoothness of image on my 21" CRT 10+ years ago.. at this point I doubt technology can reach that level again. 144hz is nice but still feels worse.
Yes, this is often overlooked. The flicker of a CRT not only improved motion clarity, it also significantly improved motion smoothness.

There are two white circles moving back and forth across the screen. Both are animated at ~10 FPS.
The upper circle is displayed for the entire duration, while the lower circle is only visible for 1/6 of the frames.

If you cover up the circles and view them one at a time, you should see that the lower circle is more distinct in motion, and also appears to be moving much more smoothly.
This demonstrates how modern displays compare against CRTs when you display a source at the same framerate - though the difference is larger than this example can show.
It's why 60 FPS games at home or in the arcades were so impressive on CRTs, while 60 FPS doesn't look nearly as impressive on a fast gaming monitor or OLED TV these days.
Have you gamed on an OLED?
Current OLEDs are pretty bad for gaming.
In theory they could be near-perfect gaming displays but they are only just starting to get black frame insertion options, and the options they have are extremely limited since they're literally drawing black frames in-between every other video frame, rather than driving the panel in a more CRT-like manner.
It's an improvement, but a minor one. The best option right now is NVIDIA's ULMB, but it also has several limitations - such as not working below 85Hz (though people have figured out a way to force that on some monitors).
CRTs are the new vinyl
No, CRTs are still significantly better than any other display type in a few key areas, such as motion clarity and smoothness.
The way they render low resolution computer-generated content (not video) can also be preferable to upscaling on modern displays.
Vinyl records are worse than modern digital audio in every possible way, but they make nice collectible items for people that like having them framed or laid out on a shelf.

If I could buy a modern, high resolution, high refresh rate CRT monitor, I would.
CRT televisions - even HDTVs - were far lower quality displays, except perhaps for broadcast monitors.
I game on a 10-year-old TH-50PZ850a Panny plasma.
It's been a great TV, but I shudder to think what its input lag is.
Plus the PS4 is connected through a Denon receiver. What input lag is that adding? Would it be better to connect directly to the TV?
It shouldn't be adding any latency if it's just set to pass through the signal and not apply any video processing.
Panasonic's Plasmas have typically been in the 20-40ms range for latency, depending on the model.
The math in the OP is still not 100% correct though. Here is a video explaining more about how CRTs show images and also how the SNES calculates everything from frame to frame:

That video was great, thanks.
 
Oct 25, 2017
2,935
And a thread like this is the reason why I'm still kind of bummed that testing of the Zisworks x28 (120 Hz to 480 Hz monitor) was never publicly finished beyond the panel's response times - there was supposed to be an input lag test (for individual resolutions and backlight modes) and a test/overview of the backlight modes, but those never happened because the site owner was swamped with other stuff. Last word was at the end of 2017.

Official word from the designer was:
Input lag is 2 lines on the DP chips and 4 lines in the tcon+panel (plus up to three more in dual input mode). This results in typically 22~33 microseconds of lag (basically zero, most monitors have input lag measurements in the tens of milliseconds)
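Those "lines of lag" convert to time easily: one line takes the frame period divided by the total number of lines per frame. A quick sanity check of the quoted figure (the 562-line total is back-solved for illustration; the x28's exact timings aren't given in the thread):

```python
def lines_to_lag_us(lag_lines, refresh_hz, total_lines_per_frame):
    """Convert lag counted in scanlines into microseconds."""
    line_time_us = 1e6 / (refresh_hz * total_lines_per_frame)
    return lag_lines * line_time_us

# ~562 total lines per frame at 480 Hz -> ~3.7 us per line, which
# matches the quoted 22~33 us for 6 (2 + 4) to 9 lines of lag.
print(round(lines_to_lag_us(6, 480, 562), 1))  # ~22.2
print(round(lines_to_lag_us(9, 480, 562), 1))  # ~33.4
```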

LCD monitors have gotten substantially faster... but a combination of internals optimized around expansive feature sets (rather than speed) and the use of slower VA and IPS panels set the tone of the conversation.

I don't have any testing equipment or scientific drive to analyze the two variants I have, so all I can say is "it looks smooth/cool af and I can do Tekken combos real gud".
 

SeeingeyeDug

Member
Oct 28, 2017
3,004
Glad to hear you were able to figure things out with your audio lag.
I'd still recommend the Audacity-Laptop method of computing audio lag if you want to be super precise. You only have to do it once for your setup, after all.

Imo this lack of buffering is a console-specific input optimization; the same optimization would theoretically work on LCD monitors which are also unbuffered (otherwise latency-free converters like the Sewell Wii-to-HDMI adapter would be impossible).


Pixel response time is a component of input lag and is already accounted for due to how input lag testers work. I'd consider pixel response time to be a component of display processing latency, since it's literally the time needed for the LCD panel to process the color change.
Pixel response time does have an effect on motion, though.

I agree there were much bigger reasons why those games died. But I don't believe latency had zero impact on their death. I'm running with the assumption that the majority of players did not know how to calibrate rhythm games, even with robust calibration tools, and there's only so much in-game calibration can do to compensate for input lag. I believe people would have been more accepting of expensive rhythm game peripherals if they worked perfectly in-sync out of the box, as they did in the CRT days. But it's too late for good sync to revive the genre now.

Even if you had the greatest zero-input-lag CRT on the market, people really into the genre would have been playing the audio through a stereo system, introducing audio lag that would still need calibrating with the tools. I ran a Rock Band night at a local bar for years, plugging into their house audio system and feeding the video through a projector to a small screen so the people on stage could still face the audience. I ran with the basic calibration tools in the system, and people who were really good walked in and 100%'d plenty of songs on expert drums. The calibration tools worked well enough for people to play through songs without error.
 

Iris

Member
Oct 28, 2017
102
This doesn't take into account scalers for SD consoles, so it's completely pointless for most people to consider. An LCD TV is absolutely going to be laggier than a CRT TV because it has to deinterlace and scale the signal before displaying it. You could use an external scaler like an OSSC, which works well, but there are other reasons to use a CRT aside from input lag.
 

ty_hot

Banned
Dec 14, 2017
7,176
It makes no sense to use the CRT lag as the base, the zero. But thanks for the explanation.

Edit: BTW, what was the usual input lag for low-end 720p LCD TVs? I have one at home, bought in 2008, and I never felt any lag (or was I just used to it?); it's only now, with 4K in play, that I started reading about input lag. (At the time the real problem with LCDs was ghosting, which was also measured in ms, 8 being a decent number.)
 

pswii60

Member
Oct 27, 2017
26,670
The Milky Way
There was a time early in the HD TV era when it was noticeable. Now it's not even worth checking the number when buying a TV.
Only really true for higher end TVs which are indeed mostly in the 10-20ms range these days.

The lower end stuff can still vary wildly though, like the Sony X800G at a miserable 30 ms+, or even worse the likes of Samsung's N5300 with a pretty damn shocking 50 ms+ of lag. So it's still something that should be checked, especially if someone is on a budget. And for anyone considering game streaming like Stadia, every ms will count.
 

Deleted member 4346

User requested account closure
Banned
Oct 25, 2017
8,976
It makes no sense to use the CRT lag as the base, the zero. But thanks for the explanation.

Edit: BTW, what was the usual input lag for low-end 720p LCD TVs? I have one at home, bought in 2008, and I never felt any lag (or was I just used to it?); it's only now, with 4K in play, that I started reading about input lag. (At the time the real problem with LCDs was ghosting, which was also measured in ms, 8 being a decent number.)

Of course it does. Standard-def CRTs and CRT monitors do not cache a frame before drawing. The act of scanning out the picture takes some time, but the first pixel in the upper-left corner of a CRT will be drawn with near-zero lag.

CRTs aren't just desirable for this reason, however. CRT motion resolution beats out LCD and OLED, and CRT black level beats LCD. I maintain that, if they made high-end CRT displays today in the premium market, there would still be a customer base for them.

You are used to the lag on your 720p LCD. There's no usual input lag: Samsung has had 100 ms+ of input lag across entire model years in the past. Sets from the last couple of years have made big strides in this area, but 10 years ago or more, high input lag was common.
 

TAJ

Banned
Oct 28, 2017
12,446
My projector is a pretty great example of how big the gulf can be between how fast the screen is redrawn and input lag.
The entire screen can be redrawn in 1.11ms. A full RGB frame can be completed in 3.33ms. But the input lag is 16.67ms.
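Those numbers fit the buffer-then-fast-scanout pattern discussed earlier in the thread: 16.67 ms is exactly one frame at 60 Hz, suggesting a full source frame is buffered before the very fast redraw (my reading of the numbers, not a confirmed spec):

```python
FRAME_60HZ_MS = 1000 / 60           # 16.67 ms: one frame from a 60 Hz source
FIELD_REDRAW_MS = 1.11              # full screen redrawn once
RGB_FRAME_MS = 3 * FIELD_REDRAW_MS  # 3.33 ms for all three color fields

# Measured input lag (16.67 ms) equals one buffered source frame;
# the very fast redraw itself contributes almost nothing on top.
print(round(FRAME_60HZ_MS, 2), round(RGB_FRAME_MS, 2))  # 16.67 3.33
```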
 

shark97

Banned
Nov 7, 2017
5,327
Aren't the lowest-latency LCD PC monitors only crappy TN panels that no one would actually want to use? Or am I confused on that point?
 

Beer Monkey

Banned
Oct 30, 2017
9,308
This thread is still, somewhat, misinformation.

Even your fastest LCD over HDMI is still receiving its data top to bottom, with each scanline fed through the pipeline from left to right.

Nothing is actually faster than CRT. Everything else is CRT plus more.

Edit: BTW, what was the usual input lag for low-end 720p LCD TVs?

Your "720p" LCD TV was 768p and it scanned and buffered the video and sucked for latency compared to CRT.

Every. Fucking. Time.
 

TAJ

Banned
Oct 28, 2017
12,446
This thread is still, somewhat, misinformation.

Even your fastest LCD over HDMI is still receiving its data top to bottom, with each scanline fed through the pipeline from left to right.

Nothing is actually faster than CRT. Everything else is CRT plus more.

DLP is faster. There has been a 1,000Hz DLP projector that actually accepted 1,000fps video. Input is the biggest limiting factor, too.
 

Beer Monkey

Banned
Oct 30, 2017
9,308
DLP is faster. There has been a 1,000Hz DLP projector that actually accepted 1,000fps video. Input is the biggest limiting factor, too.

That doesn't make it faster unless you have a source that is faster, as you allude to.

The bottom line is nothing is faster than CRT in practice, not theory. The thread is, and was, misleading.
 

Irrotational

Prophet of Truth
Member
Oct 25, 2017
7,151
Beer Monkey - what about the bottom of the screen on TVs which use the global scanout approach described by Pargon above?

The 2019 LG OLEDs have VRR/G-Sync and so are presumably doing a global scanout @120Hz. Surely the bottom of those TVs is getting updated faster than the bottom of a CRT?

Or am I missing something?
 

DealWithIt

Member
Oct 28, 2017
2,690
This thread is still, somewhat, misinformation.

Even your fastest LCD over HDMI is still receiving its data top to bottom, with each scanline fed through the pipeline from left to right.

Nothing is actually faster than CRT. Everything else is CRT plus more.

Perhaps I'm misunderstanding, but the OP explains this very thing. As far as I can tell.
 

laxu

Member
Nov 26, 2017
2,782
I'd say that input lag has become a non-issue as long as you own a proper gaming monitor or TV. The only things CRTs do better are handling old low-res games and motion resolution: they manage to look better for old games because they are not pin-sharp like LCDs, and their motion resolution is better because they don't have sample-and-hold blur, thanks to flickering at their refresh rate. For LCDs you need strobing backlights or black frame insertion to match that, and those have their own issues (reduced brightness or added input lag) and are not available on many displays.

Response times and overdrive implementations on non-OLED screens are also a very mixed bag depending on the LCD panel type and the particular panel. E.g. some VA panels are awful for dark transitions while on others it's a non-issue, and some displays have bad overdrive implementations even when new. For example, the 4K 120 Hz ASUS XG438Q looks like a great display on paper but combines horrible black smearing with bad inverse ghosting, making it something that nobody should buy. I don't know how manufacturers with years of experience making displays that are just fine manage to mess these things up on a display-by-display basis.
 

giblet

Banned
Oct 28, 2017
179
Turn off VSync and you will get torn frames "half way" down with 0 lag.

Not a good comparison to be fair.