
FreeSync vs. G-Sync

If you’ve ever experienced screen tearing in a PC game, you know how annoying it can be — an otherwise correctly rendered frame ruined by gross horizontal lines and stuttering. You can turn on V-Sync, but that can be detrimental to system performance.
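The performance cost of V-Sync comes from its fixed refresh intervals: a finished frame must wait for the next interval before it can be shown. A small sketch (illustrative arithmetic, not driver code) shows how a frame that barely misses the deadline halves the effective frame rate:

```python
# Sketch: why V-Sync can hurt performance. With V-Sync on, a finished
# frame waits for the next fixed refresh interval before it appears,
# so narrowly missing a deadline costs a whole extra interval.

def vsync_display_time_ms(render_ms: float, refresh_hz: float) -> float:
    """Time at which a frame actually appears on a V-Sync'd display."""
    interval = 1000.0 / refresh_hz               # ~16.7 ms at 60Hz
    intervals_needed = -(-render_ms // interval)  # ceiling division
    return intervals_needed * interval

# A 15 ms frame makes the 16.7 ms deadline; a 17 ms frame waits a full
# extra interval and appears at ~33.3 ms -- an effective 30 fps.
print(round(vsync_display_time_ms(15, 60), 1))  # 16.7
print(round(vsync_display_time_ms(17, 60), 1))  # 33.3
```

Adaptive refresh avoids this penalty by letting the monitor wait for the frame instead of the other way around.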

Nvidia and AMD have stepped up to solve the issue while preserving frame rates, and both manufacturers have turned to adaptive refresh technology for the solution. That often leads to a very obvious recommendation: If you have an Nvidia GPU, use G-Sync. If you have an AMD GPU, use FreeSync.


But if you have a choice in monitors or graphics cards, you may be wondering exactly what the differences are and which syncing technology is best for your setup. Let’s break it down to reveal which is a better option for you.

Performance

G-Sync and FreeSync are both designed to smooth out gameplay, reduce input lag, and prevent screen tearing. They have different methods for accomplishing these goals, but what sets them apart is that the former keeps its approach close to the vest, while the latter is shared freely. Nvidia’s G-Sync works through a proprietary module built into the monitor itself. FreeSync instead uses the graphics card to manage the monitor’s refresh rate via the Adaptive-Sync standard built into DisplayPort — and the result is a difference in performance.

Users note that enabling FreeSync reduces tearing and stuttering, but some monitors exhibit another problem: ghosting. As objects move on the screen, they leave shadowy images of their last position. It’s an artifact that some people don’t notice at all, but it annoys others.

There are many theories about what causes it, but the physical reason is power management. If you don’t apply enough power to the pixels, your image will have gaps in it — too much power, and you’ll see ghosting. Balancing the adaptive refresh technology with proper power distribution is hard.

Both FreeSync and G-Sync also suffer when the frame rate isn’t consistently syncing within the monitor’s refresh range. G-Sync can show problems with flickering at very low frame rates, and while the technology usually compensates to fix it, there are exceptions. FreeSync, meanwhile, has stuttering problems if the frame rate drops below a monitor’s stated minimum refresh rate. Some FreeSync monitors have an extremely narrow adaptive refresh range, and if your video card can’t deliver frames within that range, problems arise.
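The narrow-range problem is easy to illustrate. In this sketch (my own illustrative logic, not vendor code), adaptive sync only engages while the game’s frame rate sits inside the monitor’s stated refresh range — so a narrow range leaves little headroom before tearing or stutter returns:

```python
# Sketch (illustrative, not vendor code): adaptive sync only engages
# when the game's frame rate falls inside the monitor's refresh range.

def in_vrr_range(fps: float, range_hz: tuple) -> bool:
    """True if the monitor can sync its refresh rate to this frame rate."""
    lo, hi = range_hz
    return lo <= fps <= hi

wide = (30, 144)    # a generous adaptive range
narrow = (48, 75)   # a narrow range seen on some budget FreeSync panels

for fps in (40, 60, 100):
    print(fps, in_vrr_range(fps, wide), in_vrr_range(fps, narrow))
# 40 fps sits inside the wide range but below the narrow one,
# so the narrow-range monitor falls back to tearing or stutter.
```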

Most reviewers who’ve compared the two side-by-side seem to prefer the quality of G-Sync, which does not show stutter issues at low frame rates and is thus smoother in real-world situations. It’s also important to note that upgrades to syncing technology (and GPUs) are slowly improving these problems for both technologies.

Selection

One of the first differences you’ll hear people talk about with adaptive refresh technology, besides the general rivalry between AMD and Nvidia, is the difference between a closed and an open standard. While G-Sync is proprietary Nvidia technology and requires the company’s permission and cooperation to use, FreeSync is free for any developer or manufacturer to use. Thus, there are more monitors available with FreeSync support.

In most cases, you can’t mix and match between the two technologies. While the monitors themselves will work irrespective of the graphics card’s brand and can offer both FreeSync and G-Sync support, G-Sync is only available on Nvidia graphics cards. FreeSync works on all AMD cards and some Nvidia cards, too. But there’s a catch — it’s only guaranteed to work correctly on FreeSync monitors that are certified Nvidia G-Sync Compatible. These monitors have undergone rigorous testing and are approved by Nvidia to ensure that FreeSync runs smoothly across its card range. Here’s a current list of certified monitors.

If you go the Nvidia route, the monitor’s built-in module will handle the heavy lifting involved in adjusting the refresh rate. G-Sync monitors tend to be more expensive than their FreeSync counterparts, although there are now more affordable G-Sync monitors available, like the Acer Predator XB241H.

Most recent-generation Nvidia graphics cards support G-Sync. Blur Busters has a good list of compatible Nvidia GPUs you can consult to see if your current card supports it. Nvidia, meanwhile, has special requirements for G-Sync rated desktops and laptops for a more thorough check of your system.

A FreeSync monitor on a desk.

You won’t end up paying much extra for a monitor with FreeSync. There’s no premium for the manufacturer to include it, unlike G-Sync. A FreeSync monitor in the mid-hundreds frequently comes with a 1440p display and a 144Hz refresh rate (where a G-Sync counterpart at that price might not), and monitors without those features can run as low as $160.

Premium versions

Picture of a monitor supporting G-Sync.

G-Sync and FreeSync aren’t just features; they’re also certifications that monitor manufacturers have to meet. While the basic specifications cover frame syncing, more stringent premium versions of both G-Sync and FreeSync exist, too. When monitor manufacturers meet these more demanding standards, users can feel confident the monitors are of higher quality as well.

AMD’s premium options include:

  • FreeSync Premium: Premium requires monitors to support a native 120Hz refresh rate at 1080p resolution. It also adds low framerate compensation (LFC), which duplicates frames when the frame rate drops below the monitor’s minimum refresh rate, smoothing out dips.
  • FreeSync Premium Pro: Previously known as FreeSync 2 HDR, this premium version of FreeSync is specifically designed for HDR content. Monitors that support it must guarantee at least 400 nits of HDR brightness, along with all the benefits of FreeSync Premium.
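The LFC idea above can be sketched in a few lines (illustrative logic only — actual driver behavior is more sophisticated): when the frame rate drops below the monitor’s minimum refresh rate, each frame is repeated enough times that the effective refresh rate lands back inside the supported range.

```python
# Sketch of low framerate compensation (LFC): repeat each frame so the
# effective refresh rate returns to the monitor's supported range.

def lfc_multiplier(fps: float, min_hz: float, max_hz: float) -> int:
    """Smallest repeat count that puts fps * n inside [min_hz, max_hz]."""
    n = 1
    while fps * n < min_hz:
        n += 1
    # If the multiplied rate overshoots the range, LFC can't help here.
    return n if fps * n <= max_hz else 1

# 25 fps on a 48-144Hz panel: show each frame twice -> 50Hz, in range.
print(lfc_multiplier(25, 48, 144))  # 2
print(lfc_multiplier(60, 48, 144))  # 1 (already in range)
```

This is also why LFC generally needs the maximum refresh rate to be at least roughly double the minimum: otherwise doubling a just-too-slow frame rate overshoots the top of the range.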

Nvidia’s G-Sync options are tiered, with G-Sync Compatible at the bottom, offering basic G-Sync functionality in monitors that aren’t designed with G-Sync in mind (some FreeSync monitors meet its minimum requirements). G-Sync is the next option up, with the most capable monitors given G-Sync Ultimate status:

  • G-Sync Ultimate: Ultimate is similar to FreeSync Premium Pro, a more advanced tier for powerful GPUs and monitors designed for HDR support and low latency. It used to demand a minimum brightness of 1,000 nits, but that was recently relaxed to require just VESA DisplayHDR 400 compatibility, or around 400 nits.

Conclusion

Nvidia’s G-Sync and AMD’s FreeSync both offer features that can meaningfully improve your gaming experience. Compared head to head, G-Sync monitors have a slightly stronger feature list, especially products rated at the G-Sync Ultimate level. That said, the difference between the two isn’t so great that you should never buy a FreeSync monitor. Indeed, if you already have a decent graphics card, then buying a monitor to match your GPU makes the most sense. (Side note for console owners: the Xbox Series X supports FreeSync and the PS5 is expected to support it in the future, but neither offers G-Sync support.)

Setting aside the price of any additional components, you can expect to shell out at least a few hundred dollars on a G-Sync monitor. Even our budget pick is available for about $330. Unfortunately, due to product shortages, prices can vary significantly for a compatible G-Sync graphics card. Mid-range options like the RTX 3060 are releasing shortly and will offer fantastic performance for around $400, but they’ll also be in short supply. Any other new-generation cards will also be tough to find and could set you back at least $500 when available.

If you need to save a few bucks, FreeSync monitors and FreeSync-supported GPUs cost a bit less, on average. For example, the AMD Radeon RX 590 graphics card costs around $200. That said, all of the most powerful graphics cards were pretty difficult to find in the early part of 2021. It may be best to wait a few months and then buy a new RX 6000 card at a more budget-friendly price instead of paying today’s inflated prices.

Max Kwass-Mason
Former Digital Trends Contributor
Max is a student at Columbia University, studying Philosophy and Computer science in the scholars program. He's interested…