I could be wrong - perhaps you have been supplied with information from Microsoft about what they're doing - but I think you made some assumptions here that are not correct.
I should point out that I don't have any FreeSync hardware, but do have experience with G-Sync - and a lot of the issues that you have reported appear to behave exactly the same as they do on a G-Sync monitor.
It doesn't look like there is any kind of low framerate compensation active here at all.
For LFC to operate, the maximum refresh rate must be at least double the minimum refresh rate - and FreeSync actually requires 2.5x, because game framerates are highly variable.
With a 40-60Hz range, it could not possibly support LFC because 40 x 2 = 80Hz, and for FreeSync 40 x 2.5 = 100Hz - both well above the 60Hz maximum.
So with a 40Hz minimum refresh rate, the monitor would have to be running at 100Hz - not just supporting it.
I don't believe the Xbox One will output anything higher than 60Hz, which means that a display would have to support a range of 24-60Hz for LFC to be active, if the Xbox even supports LFC. Nothing currently supports a range that low.
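The ratio check above can be sketched as a few lines of Python. This is just my own illustration of the rule as described here (the 2.5x figure and the `supports_lfc` name are mine, not anything from an AMD API):

```python
# Sketch of the LFC eligibility rule described above: the maximum refresh
# rate must be at least `ratio` times the minimum for frame multiplication
# to have room to work. The 2.5x default is the FreeSync figure quoted here.

def supports_lfc(min_hz: float, max_hz: float, ratio: float = 2.5) -> bool:
    """Return True if the VRR range is wide enough for LFC."""
    return max_hz >= min_hz * ratio

# A 40-60Hz range: 40 x 2.5 = 100Hz needed, so no LFC.
print(supports_lfc(40, 60))   # False
# The 24-60Hz range a 60Hz-max display would need: 24 x 2.5 = 60Hz exactly.
print(supports_lfc(24, 60))   # True
```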
What you are seeing when the monitor reports 60Hz on the counter for games like Dark Souls 3 is FreeSync being deactivated, not some form of LFC.
Now, as for tearing: VRR can only guarantee a tear-free experience inside the active range.
If you are outside the active range and V-Sync is disabled, it will tear.
If you go outside the active range with V-Sync enabled, it transitions to standard V-Sync behavior.
With an LFC-capable setup, the active range is effectively 0 FPS up to the display's maximum refresh rate.
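Those cases can be summarized as a small decision function. This is only my shorthand for the behavior described above - a simplified sketch, not anything exposed by a driver (and note it ignores the near-the-ceiling tearing G-Sync can still show, which I get to below):

```python
# Simplified summary of the VRR cases above: inside the range you get
# tear-free VRR; below the range, LFC (if present) multiplies frames back
# into range; otherwise you fall back to tearing or standard V-Sync.

def vrr_behavior(fps: float, range_min: float, range_max: float,
                 vsync: bool, lfc: bool) -> str:
    in_range = range_min <= fps <= range_max
    if lfc and fps < range_min:
        in_range = True  # LFC repeats frames so the panel stays in range
    if in_range:
        return "VRR (tear-free)"
    return "V-Sync (laggy)" if vsync else "tearing"

print(vrr_behavior(30, 40, 60, vsync=False, lfc=False))   # tearing
print(vrr_behavior(30, 40, 60, vsync=True, lfc=False))    # V-Sync (laggy)
print(vrr_behavior(30, 48, 120, vsync=False, lfc=True))   # VRR (tear-free)
```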
But with G-Sync, if you disable V-Sync you will find that you can still get tearing when the framerate is pushing against the upper limit of the display's range.
This is why games that otherwise run well, but set the display to 60Hz automatically, often have people reporting screen tearing problems on G-Sync monitors.
Bayonetta is one example where I saw that complaint a lot. It runs at 60 FPS on a modest gaming PC, but since it sets the refresh rate to 60Hz, that will cause tearing if V-Sync is disabled - despite G-Sync also being enabled.
With my ASUS monitor, there is a "Turbo" button that will force the refresh rate to its maximum while the game is running to prevent that, and for other monitors there are other potential solutions like running in borderless mode or using software that lets you set the refresh rate via a keyboard hotkey.
My understanding is that this is caused by frame-time spikes that most software won't pick up.
With a 100Hz G-Sync display, I have found that under extreme circumstances - doing things that cause highly variable performance in a game - you can push the display into tearing as much as 20% below the maximum refresh rate if V-Sync is disabled.
But that's why G-Sync originally didn't give you the option to disable V-Sync at all.
With G-Sync and V-Sync enabled, it should not be possible for a game to tear at any framerate. What can happen is that if your framerate hits the maximum refresh rate, it will transition over to standard V-Sync behavior and start to feel quite laggy - particularly if you are using a mouse.
This is why people with G-Sync displays use a framerate limiter (RTSS is recommended) to cap the framerate 3 FPS below the maximum refresh rate. That cap keeps the display from transitioning out of G-Sync into V-Sync behavior, while leaving V-Sync enabled still prevents sudden frame-time spikes from tearing - something a framerate limiter alone cannot do - without adding V-Sync's latency.
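The arithmetic is trivial, but worth making explicit since the margin matters more than the exact number. A sketch (the 3 FPS margin is the figure quoted above; `vrr_frame_cap` is just an illustrative name - in practice you type the number into RTSS yourself):

```python
# The recommendation above: cap the framerate a few FPS below the display's
# maximum refresh rate so the framerate can never reach the ceiling and
# force a transition into standard V-Sync behavior.

def vrr_frame_cap(max_refresh_hz: int, margin_fps: int = 3) -> int:
    """Framerate cap that keeps a VRR display inside its variable range."""
    return max_refresh_hz - margin_fps

print(vrr_frame_cap(144))   # 141 - the classic G-Sync recommendation
print(vrr_frame_cap(100))   # 97  - for a 100Hz display like mine
```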
That's likely why games such as F1 2017 are still tearing even if the framerate is reporting 60 or is in the very high 50s.
As for Final Fantasy XV stuttering inside the VRR range, well, I don't have that specific game on PC and the situation may be different on the Xbox anyway.
However, I have found several games on PC with stuttering issues that G-Sync cannot solve. As far as I can tell, it's a frame-presentation issue with the games themselves.
SOMA and Dishonored 2 both look like they have very bad "microstutter" on a G-Sync display if you disable their internal 60 FPS framerate limiters.
Dishonored 2 running at 90 FPS via G-Sync on my 100Hz display - whether it's using the internal framerate limiter or an external one - looks like it's running at 45 FPS. Use the internal 60 FPS cap, and it runs buttery smooth (so long as you have the latest patch).
So in some situations it appears to be the engine at fault.
It will be interesting to see what Samsung do with their 2018 TVs.
The problem with limited VRR ranges is that most LCD panels do not like refreshing at less than 40Hz or so. LCDs cannot hold an image indefinitely. It will start to fade to white the longer it is held, so they need to refresh in order to sustain the image and prevent flickering. I don't believe that is the case for OLED though.
So I don't expect them to natively support <40Hz despite being a television with "24Hz" capabilities.
But their televisions are already using 4K 120Hz native panels.
You cannot send them a 4K120 signal due to the current limitations of HDMI, but they update at 120Hz for things like motion interpolation and displaying 24p content without judder.
In theory - and this is pure speculation on my part - a television with a 40-120Hz capable panel could report that it supports a 1-60Hz range for input, and perform LFC inside the display using a 48-120Hz range, since the panel itself can do it. But I'm being very hopeful there - nothing currently does this.
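To make the speculation concrete, here is roughly what that in-display frame multiplication would have to do - pick the smallest whole-number repetition of each incoming frame that lands the panel inside its native range. This is entirely hypothetical; the 48-120Hz figures mirror the example above:

```python
# Speculative sketch: a TV with a 40-120Hz panel accepts a 1-60Hz VRR input
# and repeats each incoming frame enough times that the panel refreshes
# inside a comfortable 48-120Hz window.

def panel_multiplier(input_hz: float, panel_min: float = 48.0,
                     panel_max: float = 120.0) -> int:
    """Smallest frame-repetition count that puts the panel in its range."""
    n = 1
    while input_hz * n < panel_min:
        n += 1
    if input_hz * n > panel_max:
        raise ValueError("no integer multiple fits the panel range")
    return n

print(panel_multiplier(60))   # 1 -> panel runs at 60Hz
print(panel_multiplier(24))   # 2 -> panel runs at 48Hz
print(panel_multiplier(13))   # 4 -> panel runs at 52Hz
```

This is essentially what LFC already does on the GPU side; the hopeful part is moving it inside the display so a 60Hz-limited source like the Xbox could still benefit.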
Samsung's higher-end TVs from 2018 are supposed to get a software update with support for 120Hz and FreeSync later this year - though only 60Hz VRR.
120Hz VRR will need full-bandwidth HDMI 2.1 support - which will hopefully come with the 2019 displays.