I have a ViewSonic XG2401 (144Hz). It's incredible, but honestly, I wish I'd gone with 120Hz instead. I can't feel the difference between 120 and 144Hz (anything lower and I can), but it's much easier to hit 120fps than 144fps.
Since it appears to have FreeSync, there's no need to be hitting 144 FPS in games. The whole point of VRR is that the entire range from 0 to your maximum refresh rate will be synchronized, rather than having to lock to divisors of the refresh rate. If it runs at 120 FPS, the display will be updating at 120Hz. Refresh rate is only a maximum on a VRR display.
If it were a fixed-refresh 144Hz display, you'd have to run games at 144/72/48/36 FPS if you wanted them to be smooth - and in that case, 120Hz may be preferable (120/60/40/30). Not that I'd recommend it, since it's a VRR display, but can't you set it to 120Hz anyway?
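To make the divisor math concrete, here's a minimal Python sketch (the function name and the 30fps floor are mine, just for illustration) that lists the frame rates a fixed-refresh display can show with even pacing:

```python
def smooth_framerates(refresh_hz, min_fps=30):
    """Frame rates a fixed-refresh display can show without judder.

    Without VRR, pacing is only even when every frame is held for a
    whole number of refresh cycles, i.e. fps = refresh_hz / n.
    """
    rates, n = [], 1
    while refresh_hz / n >= min_fps:
        rates.append(refresh_hz / n)
        n += 1
    return rates

print(smooth_framerates(144))  # [144.0, 72.0, 48.0, 36.0]
print(smooth_framerates(120))  # [120.0, 60.0, 40.0, 30.0]
```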
Panel type is preference. I hate IPS due to the glow it has; my monitor has a VA panel, and I'd much rather put up with the "blur" of VA than the glow of IPS. There is no perfect monitor, yet.
Instead of "IPS glow", which varies depending on the panel used (AUO AHVA panels are much worse than LG IPS in my experience), you get a completely washed-out image at the edges due to VA gamma shift.
Here's an IPS ultrawide (ASUS PG348Q) compared against a VA ultrawide (HP Omen X35).
Look at how badly washed-out the VA panel gets at an angle. With larger displays, or ultrawides, you even see that starting to happen in the corners when viewed straight-on.
"Glow" is a non-issue on the PG348Q since it uses an LG IPS panel.
This image has the on-axis photo up top, and the wide-angle photo at the bottom, using perspective-correction:
Even in the lower-right corner, I'm not seeing any glow. Just a loss of brightness due to the steep angle.
Eh, all IPS panels have some glow in dark rooms. IPS also has awful contrast ratios. I'd never buy another one, but that's me.
The thing with VA contrast ratios is that they only apply when viewed at a perfect 90°. Contrast falls off rapidly off-axis, and the higher the panel's contrast, the narrower the "sweet spot" is. Just moving your head slightly is enough to change the image on a VA panel, and you don't get the full contrast ratio in the corners of the display.
This is a 5000:1 native VA panel viewed up close:
The camera exaggerates how washed-out the image gets at the edges, but it's a good demonstration of how narrow the viewing angle is on those panels.
I've always found it strange that people think V-sync is "supposed" to be enabled for VRR. I know we have to for some games that are just made that way and cause issues, but in theory it shouldn't be necessary, and you should be able to play without tearing with V-sync off. (I keep it off in almost every game, except the ones that give me trouble and lead to tearing.)
The entire point of VRR is that your monitor adapts its refresh rate to the framerate, so there's no mismatch between refresh rate and frame output, which means no unfinished frames being pushed out to the display, and as such no tearing.
The lack of judder from duplicate frames on a VRR display is more of a side effect; it's not its primary purpose.
But by enabling V-sync, you basically make that side effect the primary purpose, since you're making the game do the work of preventing the mismatch, which kind of defeats the primary purpose of G-Sync.
In short, most properly made games should be played without V-sync to get the full benefit of VRR. Just be sure to cap the framerate at your monitor's maximum refresh rate, or you'll get tearing if the framerate goes over it.
G-Sync originally did not even have the option to disable V-Sync. It's supposed to be enabled.
The only reason there's an option to disable it now is because AMD launched FreeSync without low frame-rate compensation, and gave users the option to disable V-Sync instead.
NVIDIA only added that option so they weren't "missing a feature" despite LFC being a requirement for G-Sync displays from day one.
When you get within ~20% of the display's maximum refresh rate, there is the possibility of tearing from sudden changes in performance; e.g. turning the camera very quickly to a scene which has significantly lower complexity.
So long as there is a frame rate limiter set at least 3 FPS below the maximum refresh rate, having v-sync enabled does not cause it to engage fully (no noticeable latency) but will prevent screen tearing in those scenarios.
A frame-rate limiter alone cannot do this: without V-sync, you'd have to set it 20-30 FPS below the monitor's maximum refresh rate to avoid tearing.
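A minimal sketch of that cap, assuming the 3 FPS margin suggested above (the helper name is hypothetical; it's just the arithmetic):

```python
def vrr_frame_cap(max_refresh_hz, margin_fps=3):
    """Frame cap that keeps a VRR display inside its variable window.

    Staying a few FPS under the maximum refresh means V-sync never
    fully engages (no added latency) while still catching sudden
    frame-rate spikes that would otherwise tear.
    """
    cap = max_refresh_hz - margin_fps
    frame_time_ms = 1000.0 / cap  # per-frame render budget at the cap
    return cap, frame_time_ms

print(vrr_frame_cap(144))  # (141, ~7.09 ms)
print(vrr_frame_cap(120))  # (117, ~8.55 ms)
```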
There are legit reasons to prefer 120Hz for emulation, since 144Hz isn't evenly divisible by 30/60fps content but 120Hz is; 144Hz can cause micro-stutters in that situation. Otherwise, yeah, as long as you have FreeSync/G-Sync there's no reason not to do 144.
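A quick way to see that micro-stutter: simulate how many refresh cycles each frame of 60fps content is held for on a fixed 144Hz vs 120Hz display (a toy sketch; the function is my own, not from any emulator):

```python
def frame_hold_pattern(content_fps, refresh_hz, frames=10):
    """Refresh cycles each content frame stays on screen with plain
    V-sync on a fixed-refresh display (no VRR).

    Frame i should be swapped at time i / content_fps, but the swap can
    only happen on a refresh tick, so it waits for the next one. Uneven
    hold counts are what you perceive as micro-stutter.
    """
    holds, prev_tick = [], 0
    for i in range(1, frames + 1):
        # integer ceiling of i * refresh_hz / content_fps
        tick = (i * refresh_hz + content_fps - 1) // content_fps
        holds.append(tick - prev_tick)
        prev_tick = tick
    return holds

print(frame_hold_pattern(60, 144))  # [3, 2, 3, 2, 2, 3, 2, 3, 2, 2] -> uneven
print(frame_hold_pattern(60, 120))  # [2, 2, 2, 2, 2, 2, 2, 2, 2, 2] -> even
```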
RetroArch has a "Sync to exact content frame rate" option for VRR displays, so this should not be necessary.
Emulators for 3D systems generally don't seem to have an issue working with G-Sync normally. It's only 2D emulators where I ever had issues with G-Sync.