I'm not sure how you could look at any game rendered in 8K and not see it as a big improvement in image quality over 4K.
If TAA is being used, the image looks quite a bit sharper too.
With that said, I don't think the performance cost is remotely worth it. Even native 4K is a waste of resources with current GPU hardware.
Motion resolution is lagging far behind static resolution. Games need to stop targeting 30 FPS.
HDMI 2.1 brings 120Hz variable refresh rate support, so games should start to target 120 FPS next-gen instead of chasing after higher and higher static resolutions. That would be a far bigger leap than even going to 8K native.
I think colour depth, frame rate and image compression are more important past 4K.
You have to remember that most films are, at best, finished and delivered to cinemas in 4K. If an image looks good on a cinema screen, there's limited argument for needing more on even a 100" home screen.
I barely see a difference between 2K and 4K projection at a big-ass movie theater. Gaming at 4K with ever-improving TAA is beyond good enough.
Projection is considerably softer and lower-contrast than a direct-view display, even with single-chip DLP projectors, so resolution tends to be far less noticeable despite the large image size.