BFI? I run mine at 120Hz, looks smooth to me with no ghosting.
For most things I'd say motion looks better on OLED than my CRT monitor or SD CRT TV. Both of those have noticeable phosphor trails, but aside from that they still feel snappy to use and some SD content arguably looks better on them.
If you're seeing long trails on the CRT, you probably have brightness set too low.
CRTs were never meant to turn off completely when displaying black, and anything bright moving over a dark area will leave a trail. The tube needs to be kept 'active'.
As for OLED: 60 FPS games are not nearly as smooth on them as they are on a CRT at 60Hz.
Even after enabling BFI, OLED still lacks the crystal-clear motion of a CRT; but it does a good job reducing judder.
I've mostly been playing PC games at high frame rates on my C1 OLEDs, but recently tried playing some retro games again and had to set up BFI, because I was seeing bad judder and double-images as the screen scrolled; e.g. the candles on the walls in Symphony of the Night.
It's a dramatic improvement for anything locked to 60 FPS, like old console games.
Thankfully that can often be set up in the emulator itself (at 120Hz) rather than having to completely disable VRR on the TV every time.
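For example, in RetroArch it comes down to a couple of lines in retroarch.cfg (key names as in recent RetroArch builds - treat this as a sketch and check your version's video settings menu):

```
# Assumes a 120Hz-capable display; one black frame is inserted
# per 60 FPS content frame.
video_refresh_rate = "120.000000"
video_black_frame_insertion = "1"
# BFI and VRR don't mix, so keep the runloop at fixed refresh:
vrr_runloop_enable = "false"
```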
That being said: HDR shaders combined with BFI on an OLED is starting to look pretty damn good.
I'd still prefer to have an actual CRT if I could get one, but things are much better now than they were even just a few years ago.
You can get better resolution and smooth framerates now on panels. CRTs were great, but the color bleeding and low light output left much to be desired, IMO. Heck, I had a 32" WEGA TV and even a Sony VPH-1271 projector at one point, twenty-some years ago, so I've seen what they can do. But the higher-end modern tech just seems better now. But hey, if you dig it, go for it. With video, everyone perceives things differently, so enjoy what you like.
Color bleeding mostly just means that you weren't using an RGB connection.
I say "mostly" because some consumer sets (especially modded ones) are still going to have some bleeding/blending in the darker shades, compared to a professional display/PC monitor.
CRTs hurt my eyes like crazy. Is this something that's addressable?
Not really - at least not for console games.
The most you can do for them is lower the "contrast" control and make the set dimmer - which should make the flicker less noticeable.
One of the main benefits to CRT is how smooth and clear motion is on them, compared to flat panels; but that only works for 60 FPS at 60Hz.
If you were to double the refresh rate and display 60 FPS at 120Hz to minimize flicker, you get very clear and distinct double-images as the screen scrolls - which defeats the purpose.
But if you're playing PC games on a monitor, you could easily run games at high frame rates, matching the refresh rate; e.g. 120 FPS at 120Hz.
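The double-image artifact above is just timing arithmetic: the eye tracks the scroll continuously, but each frame is flashed refresh/fps times, so the repeated flashes land at different spots on the retina. A quick sketch (the function and numbers are illustrative, not from any source):

```python
# Why 60 FPS at 120Hz strobes as a double image: each content frame
# is flashed twice, and the tracking eye keeps moving between flashes.

def double_image_separation_px(speed_px_per_frame: float,
                               content_fps: float = 60.0,
                               refresh_hz: float = 120.0) -> float:
    """Pixel offset between repeated flashes of one content frame."""
    flashes_per_frame = refresh_hz / content_fps
    # Between two flashes (1/refresh_hz apart) the tracked eye moves
    # speed * content_fps / refresh_hz = speed / flashes_per_frame px;
    # with N flashes per frame there are N-1 such gaps end to end.
    return speed_px_per_frame * (flashes_per_frame - 1.0) / flashes_per_frame

# An 8 px/frame scroll at 60 FPS on a 120Hz screen: two copies 4 px apart.
print(double_image_separation_px(8.0))  # 4.0
# Matching frame rate to refresh (120 FPS at 120Hz) collapses it to zero.
print(double_image_separation_px(8.0, 120.0, 120.0))  # 0.0
```

Which is exactly why matching frame rate to refresh rate fixes it for PC games but not for 60 FPS console games.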
The fact that LCD TVs killed FED and SED research still saddens me. We could have progressed CRT technology in new ways and still had the refresh and color benefits today.
People should be thinking of SED/FED as improved Plasma technology rather than "flat-panel CRTs."
They had a fixed pixel grid, and were driven using PWM - just like Plasma TVs.
At the time, the main advantages were: power efficiency, contrast, and motion clarity.
Good improvements; but they do little to make the technology any more like a CRT, or something that would compete with OLED for modern content.
I did appreciate their push for motion clarity, but it seems that no-one really cares to compete on that with televisions these days.
The problem is that people are so used to flicker-free displays now that it would be tough to convince most of them to go back.
BFI at 60Hz can be pretty harsh on OLED at times (especially at higher brightness levels) - and it would have to be more extreme to match a CRT.
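To put rough numbers on that trade-off: strobing costs brightness in proportion to how short the "on" time is per refresh. A back-of-the-envelope sketch, where the ~1.5ms CRT persistence figure is a ballpark assumption rather than a measurement:

```python
# Brightness cost of strobing at 60Hz: a sample-and-hold panel is lit
# for the full ~16.7ms frame; strobing only lights a fraction of it.

def brightness_factor(persistence_ms: float, refresh_hz: float = 60.0) -> float:
    """Fraction of full sample-and-hold brightness left after strobing."""
    frame_ms = 1000.0 / refresh_hz
    return persistence_ms / frame_ms

# 120Hz BFI (half on, half off) keeps 50% of the light:
print(brightness_factor(1000.0 / 120.0))  # 0.5
# Matching a ~1.5ms CRT-style flash keeps under a tenth of it:
print(round(brightness_factor(1.5), 2))  # 0.09
```

So a panel would need to sustain several times its current brightness to strobe at CRT-like persistence without looking dim - which is why BFI already feels harsh at high brightness levels.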