I recently reinstalled Windows on my PC, and saw a post on pcmr on reddit from someone complaining that gsync, on nvidia systems, isn't enabled by default everywhere.
That reminded me of what a pain it is to help someone enable it, with an ugly and (for noobs) hard to understand nvidia tutorial, and it's even worse with a freesync display.
On my system, with an amd card and a freesync premium display, freesync was enabled as soon as the drivers were installed. No issues, nothing to fiddle with; it was just enabled automatically for the whole system and Windows to use.
Wonder why nvidia can’t do that.
It even automatically set my display to 165Hz (tho maybe that was because it was already at 165 before the reinstall?).
There is still one trick you have to teach people: capping the max fps 3-4 fps below the display's max Hz, for better smoothness. But that's an easy trick to do.
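The cap trick above is just arithmetic. A minimal sketch (the 3-4 fps margin is the rule of thumb from this post, and the function name is made up for illustration):

```python
def recommended_fps_cap(max_refresh_hz: int, margin: int = 3) -> int:
    """Cap fps a few frames below the display's max refresh rate so the
    frame rate stays inside the variable-refresh range instead of
    bumping against the ceiling."""
    return max_refresh_hz - margin

# For a 165Hz display, cap games at 162fps; for a 144Hz one, 141fps.
print(recommended_fps_cap(165))  # 162
print(recommended_fps_cap(144))  # 141
```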
What is “freesync”?
It's amd's counterpart to nvidia's gsync, but it works differently.
Both do roughly the same thing: match the monitor's refresh rate (Hz) to the game's frame rate (fps), within a minimum and maximum Hz range.
So if your game is running at 103fps and your monitor can do 40-144Hz, the monitor will run at 103Hz.
It reduces tearing and can maybe reduce the perception of lag, but it doesn't remove it. If you have frame drops you will still see them.
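The matching behavior described above can be sketched as a simple clamp (illustrative only; real monitors negotiate this in hardware, and many handle fps below the minimum with frame doubling / LFC rather than a hard clamp):

```python
def vrr_refresh_hz(fps: int, min_hz: int = 40, max_hz: int = 144) -> int:
    """Inside the VRR range the monitor's refresh rate follows the game's
    frame rate; outside the range, it sticks to the nearest limit."""
    return max(min_hz, min(fps, max_hz))

print(vrr_refresh_hz(103))  # 103 -> the 40-144Hz monitor refreshes at 103Hz
print(vrr_refresh_hz(200))  # 144 -> capped at the range maximum
```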
As for how they work:
-Gsync uses a physical chip in the monitor to do the variable refresh work. On top of being a paid technology, the chip adds to the monitor's cost, and nvidia also does a quality control check on the monitors, which increases costs further. Gsync can only be used with nvidia gpus.
-Freesync is based on the open VESA Adaptive-Sync standard, so there is no extra chip and no royalties, which keeps monitor costs down.
There are some limitations tho: variable refresh rate support needs DisplayPort 1.4+ (or 1.2+, I don't remember) or HDMI 2.1+. The exception is amd gpus with freesync: they also support freesync over older hdmi versions.
Wow thanks for the explanation! I’d heard something about variable refresh rates but I don’t think my monitors support it and I never looked into it