I think that's how most people understand it, but it's not just about what happens above the TV's capabilities; it's also about what happens on the way there.
The graph below shows the measured output from an OLED TV: the yellow line represents the HDR standard and the grey line represents what the TV is actually outputting.
What you see here is that as the TV approaches its highest values, it starts to deviate from the standard, tone mapping the image with a curve into that 700-nit area. It does this to ensure there are no sudden steps in colour or brightness.
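As a rough illustration of that roll-off, here is a minimal sketch in Python. This is not any manufacturer's actual curve; the knee point, panel peak and source maximum are assumed values for a 700-nit panel fed 1000-nit content.

```python
# Minimal sketch of the kind of roll-off described above (not any
# manufacturer's actual algorithm). All values are in nits, and the
# knee, peak and source_max numbers are assumptions for illustration.
def rolloff_tonemap(source_nits, knee=400.0, peak=700.0, source_max=1000.0):
    """Track the standard 1:1 up to the knee, then compress the rest of
    the source range smoothly into the headroom below the panel's peak,
    so there is no sudden step in brightness."""
    if source_nits <= knee:
        return source_nits                                # follow the HDR standard exactly
    t = (source_nits - knee) / (source_max - knee)        # 0..1 above the knee
    return knee + (peak - knee) * (1 - (1 - t) ** 2)      # ease gently into the peak

for n in (100, 400, 700, 1000):
    print(n, "->", round(rolloff_tonemap(n), 1))
```

Below the knee the output tracks the input exactly; above it, the curve bends so that the very brightest source values land at the panel's 700-nit peak instead of stepping or clipping.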
Another TV may tone map differently.
Here is a Samsung LCD.
This one behaves differently: it starts off meeting the standard, then once it gets past 300 nits everything becomes much brighter. This keeps going all the way up to its maximum of 1400 nits, at which point it just hard clips.
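For contrast, here is the same kind of sketch for that behaviour. Again this is illustrative only: the 300-nit knee and 1400-nit peak come from the measurement, but the flat boost factor is just an assumption to make the shape visible.

```python
# Equally rough sketch of the behaviour described above (illustrative
# only; the flat boost factor is an assumption, not how the TV works).
def boost_then_clip_tonemap(source_nits, knee=300.0, peak=1400.0, boost=1.6):
    """Follow the standard up to the knee, then run brighter than the
    standard until the panel's peak, where the output simply hard clips."""
    if source_nits <= knee:
        return source_nits                                 # on-standard below the knee
    return min(knee + (source_nits - knee) * boost, peak)  # brighter, then clipped

for n in (100, 300, 600, 1000):
    print(n, "->", round(boost_then_clip_tonemap(n), 1))
```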
This is an example of the TV being fed 1000-nit metadata.
It may well be that TVs also tone map differently depending on the metadata; after all, that is what it is there for.
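A hedged sketch of what that could look like: a TV might move the point where it starts rolling off based on the content maximum signalled in the metadata. The formula and numbers here are pure assumption, just to show the idea.

```python
# Pure assumption, just to show the idea: let the metadata's content
# maximum decide where the roll-off knee sits for a 700-nit panel.
def pick_knee(content_max_nits, panel_peak=700.0):
    """Content that fits the panel needs no roll-off; brighter-mastered
    content moves the knee down so the roll-off can start earlier."""
    if content_max_nits <= panel_peak:
        return panel_peak
    return panel_peak * (panel_peak / content_max_nits)

print(pick_knee(1000))   # 1000-nit metadata -> knee at 490 nits
print(pick_knee(4000))   # 4000-nit metadata -> knee at 122.5 nits
```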