Since the dawn of time (of computer hardware, anyway) it has always been the display monitor that is 100% in charge of how many times per second a computer program, such as a video game, can update the screen. For example, a 60-hertz display forces the computer to deliver a new frame every 60th of a second, period, and if it misses that deadline then tough luck: the next opportunity is the next screen refresh, which in practice means a game that can't keep up ends up updating at 30 Hz. There is no in-between.
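To put rough numbers on that: with a 60 Hz display and ordinary double-buffered vsync, each frame has a budget of about 16.7 ms, and a frame that takes even slightly longer has to wait for the next refresh, so the effective framerate snaps to 60, 30, 20 Hz and so on. Here's a minimal sketch of that quantization (plain Python arithmetic, not any real graphics API):

```python
# A minimal sketch of how double-buffered vsync quantizes frame times:
# a finished frame can only be shown on the next monitor refresh, so
# anything slower than the 16.7 ms budget waits a full extra refresh.
import math

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh

def displayed_fps(render_time_ms: float) -> float:
    """Effective framerate when every frame must wait for a vsync boundary."""
    refreshes_needed = math.ceil(render_time_ms / REFRESH_INTERVAL_MS)
    return REFRESH_HZ / refreshes_needed

for t in (10.0, 16.0, 17.0, 25.0, 34.0):
    print(f"render {t:5.1f} ms -> displayed at {displayed_fps(t):.1f} fps")
# render  10.0 ms -> displayed at 60.0 fps
# render  16.0 ms -> displayed at 60.0 fps
# render  17.0 ms -> displayed at 30.0 fps   (just barely missed the budget)
# render  25.0 ms -> displayed at 30.0 fps
# render  34.0 ms -> displayed at 20.0 fps
```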
Unless, of course, the computer simply ignores the display's refresh rate by turning off vertical sync and updates the image at whatever speed it likes. This frees up the game's framerate but has the rather annoying consequence of so-called screen tearing: the picture changes in the middle of the monitor refreshing the display, so the upper section of the screen still shows the previous frame while the lower section already shows the next one. If the image is e.g. scrolling horizontally, there will be a clear cut line where the two pictures are offset. Since the computer can rarely hold perfectly consistent timing, this horizontal tear line tends to jump around randomly, which can be quite annoying.
In 2013 Nvidia introduced a technological solution, which they called G-Sync, that effectively reverses this relationship: instead of the monitor deciding the refresh rate and the computer adapting to it, it's the computer that decides the refresh rate and the monitor adapts, refreshing the display as soon as a new picture is sent to it (within certain maximum and minimum limits). AMD soon followed with its open FreeSync standard.
This offers the best of both worlds: the computer is no longer restricted to a small set of refresh rates (such as 60 Hz and fractions of it), the framerate can vary freely (within the monitor's supported range), and there is no screen tearing. If the game renders at 53 Hz at one point, then at 47 Hz, and then at 71 Hz, it doesn't matter: the monitor displays every frame as soon as it becomes available.
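To make the "within limits" caveat concrete, here's a tiny sketch; the 30-144 Hz range is purely a hypothetical example, and how a display handles rates outside its supported range varies by implementation (typically capping at the maximum, or repeating frames below the minimum):

```python
# Hypothetical adaptive-sync range; real monitors vary (the 30-144 Hz
# figures here are only an example, not any particular model's spec).
ADAPTIVE_MIN_HZ, ADAPTIVE_MAX_HZ = 30, 144

def describe(fps: float) -> str:
    """Describe how an adaptive-sync display would treat this framerate."""
    if ADAPTIVE_MIN_HZ <= fps <= ADAPTIVE_MAX_HZ:
        return f"{fps:.0f} Hz: shown exactly when ready ({1000 / fps:.1f} ms per frame)"
    return f"{fps:.0f} Hz: outside the adaptive range, handled by the display's fallback"

for fps in (53, 47, 71, 160, 20):
    print(describe(fps))
```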
Nvidia has always marketed this feature to professional and hard-core gamers, essentially arguing that they can get maximum display performance out of video games without being locked to fractions of the refresh rate, and without the annoyance of screen tearing. The argument is that if a particular PC can run a particular game at, say, 107 frames per second, you'll get 107 frames per second, not 60, so your reaction times will be better and gameplay smoother. You'll always get the most out of your rig rather than being limited by the display (up to the maximum refresh rate the display supports, e.g. 144 Hz).
However, I think that for a more regular gamer (like me) there's a much better practical reason why adaptive sync technology, such as G-Sync, is especially handy and useful. It's somewhat related to the above, but it's not about squeezing out maximum performance or improving your reaction times.
As a more normal gamer I don't really care if a game is running at, let's say, 50 Hz, 70 Hz, or 120 Hz. It all looks essentially the same to me. (It's only somewhere around 40-45 Hz that I start noticing the slower framerate in casual gameplay.)
Much more important and practical for me is that G-Sync almost completely frees me from having to worry about optimizing the game settings to hit that golden 60 Hz threshold.
With a regular non-adaptive-sync display and vsync enabled (which is a must if you don't want screen tearing), if your PC can't run a particular game consistently at 60 Hz or above, every time it falls short the framerate immediately drops all the way to 30 Hz. This can be quite annoying, especially when the game keeps randomly flipping between 60 and 30 Hz. That constant switching between the two framerates causes an uneven jitter that is even more annoying than if the game just always ran at 30 Hz.
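Here's a rough simulation of that flip-flopping, assuming idealized double-buffered vsync (a frame is only shown on a refresh boundary) versus an adaptive-sync display that simply waits for each frame; the render times are made up, hovering just around the 16.7 ms budget:

```python
import math

REFRESH_HZ = 60
BUDGET_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms between refreshes

# Made-up render times hovering just around the budget.
render_times_ms = [15.8, 16.2, 17.1, 16.4, 17.5, 16.1, 17.3, 16.0]

for t in render_times_ms:
    # With vsync the frame waits for the next refresh boundary,
    # so the interval snaps to 16.7 ms (60 Hz) or 33.3 ms (30 Hz).
    vsync_interval = math.ceil(t / BUDGET_MS) * BUDGET_MS
    # With adaptive sync the monitor refreshes as soon as the frame is ready.
    adaptive_interval = t
    print(f"render {t:4.1f} ms | vsync: {vsync_interval:4.1f} ms | adaptive: {adaptive_interval:4.1f} ms")
```

With those numbers the vsync column keeps jumping between 16.7 and 33.3 ms (i.e. 60 and 30 Hz) while the adaptive-sync column stays in a narrow 16-18 ms band, which is exactly the uneven jitter versus smooth motion described above.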
In fact, prior to getting a G-Sync monitor, if a game was really heavy to render and I didn't want to reduce the resolution any further, I would just cap it at 30 Hz. A constant 30 Hz is less annoying than random jumping between 60 and 30.
G-Sync, however, solves this problem beautifully. It almost completely removes any need to worry about hitting that magical 60 Hz threshold: it doesn't matter if the game occasionally drops a bit below 60 Hz, to 55 or even 50; it still looks good, and you don't even notice. You can completely forget about the number 60, because it doesn't mean anything anymore. There is no magical framerate number.
(As mentioned, the only situation where I start noticing some jitter is when the framerate drops to about 40 Hz or below, which with my quite hefty gaming PC doesn't happen all that often in most games. But even then the slowdown is smooth and gradual, not the jarring jumping between 60 and 30 Hz you get without G-Sync, so even a drop to something like 40 isn't nearly as annoying to look at.)
This is the main reason why I enjoy G-Sync. It's not the high framerates that matter to me, but the ones below 60, which were a pain before.