Again: the main reason for this is that you're probably watching SDR way out of spec.
I don't think many people would struggle to see the benefit of HDR if they were comparing a calibrated SDR image to a calibrated HDR one.
But TVs now have the option to display SDR far brighter and more vividly than it was ever intended to look, and that boosted SDR picture can outshine HDR content, which by default is displayed in a much more accurate way.
Calibrated SDR on the left, calibrated HDR on the right:
Both images have a similar brightness, but HDR has a more natural color palette and an extended dynamic range.
Note how the sky in SDR washes out to a flat gray area (limited dynamic range), while in HDR the sun and lens flare stand out as much brighter objects against that same sky.
The sun might be 100 nits against a 90-nit sky in SDR, but it could be 1000 nits against a 90-nit sky on the HDR display - roughly an 11x highlight instead of a 1.1x one - and it would stand out a lot more when viewed in person.
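To put numbers on that, here's a back-of-the-envelope sketch using the nit values quoted above (the values are illustrative, not measurements):

```python
# Sun-vs-sky contrast, using the illustrative nit values from the text.
sky_nits = 90
sdr_sun_nits = 100    # calibrated SDR highlight
hdr_sun_nits = 1000   # the same highlight on an HDR display

print(f"SDR sun/sky ratio: {sdr_sun_nits / sky_nits:.2f}x")  # ~1.11x
print(f"HDR sun/sky ratio: {hdr_sun_nits / sky_nits:.2f}x")  # ~11.11x
```

An 11x highlight against the same sky is what makes the sun read as a genuinely bright object rather than a slightly lighter patch of gray.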
Out-of-spec SDR on the left, calibrated HDR on the right:
SDR is bright, vivid, and unnatural.
Instead of using the extra brightness to extend the dynamic range, the display just makes the flat gray sky extremely bright: rather than the intended 90 nits, the sky might be pushed to 450 nits.
There's no extra detail - the sun still barely stands out against the sky, because everything was scaled up by the same factor and the limited dynamic range is unchanged. It's just bright.
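To make that concrete, here's a minimal sketch, assuming the out-of-spec mode simply scales the whole image linearly (a simplification - real "vivid" modes apply more complicated processing, but the principle holds):

```python
# An out-of-spec SDR mode effectively multiplies everything by the same
# factor (5x here, taking the sky from the intended 90 nits to 450 nits).
boost = 5.0
sky_nits, sun_nits = 90, 100               # calibrated SDR values from the text

boosted_sky = sky_nits * boost             # 450 nits
boosted_sun = sun_nits * boost             # 500 nits

# The sun-to-sky ratio is unchanged, so no dynamic range was gained:
print(f"calibrated ratio: {sun_nits / sky_nits:.2f}x")        # 1.11x
print(f"boosted ratio:    {boosted_sun / boosted_sky:.2f}x")  # 1.11x
```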