Back in 2013, Nvidia introduced a new technology called G-Sync to eliminate screen tearing and stuttering effects and reduce input lag when playing PC games. The company accomplished this by tying your display’s refresh rate to the actual frame rate of the game you were playing, and similar variable refresh-rate (VRR) technology has become a mainstay even in budget monitors and TVs today.

The issue for Nvidia is that G-Sync isn’t what has been driving most of that adoption. G-Sync has always required extra dedicated hardware inside of displays, increasing the costs for both users and monitor manufacturers. The VRR technology in most low-end to mid-range screens these days is usually some version of the royalty-free AMD FreeSync or the similar VESA Adaptive-Sync standard, both of which provide G-Sync’s most important features without requiring extra hardware. Nvidia more or less acknowledged that the free-to-use, cheap-to-implement VRR technologies had won in 2019 when it announced its “G-Sync Compatible” certification tier for FreeSync monitors. The list of G-Sync Compatible screens now vastly outnumbers the list of G-Sync and G-Sync Ultimate screens.

  • conciselyverbose@sh.itjust.works · 4 months ago

    This is silly.

    Gsync solved a problem that couldn’t be solved before they made it. They stayed committed to that good solution until there was an alternative that reached a reasonable level of performance, then supported both until they could get close without the expensive extra hardware.

    Was it worth it? For most people no. But it’s still technically superior today and there are loads of options without the extra cost.

      • Overshoot2648@lemm.ee · 4 months ago

        They could literally just transition to Vulkan with a Metal wrapper for pre-existing software at any time, but no, they have to keep their ecosystem locked down for some reason.

        • xavier666@lemm.ee · 4 months ago

          but no, they have to keep their ecosystem locked down for some reason

          money 😘

    • barsoap@lemm.ee · 4 months ago

      VESA Adaptive-Sync goes back to the eDP standard, 2009. AMD simply took that and said, “Hey, why aren’t we doing that over external DisplayPort?” And they did.

      So instead of over-engineering a solution that nobody asked for to create vendor lock-in that nobody (but fanboys with Stockholm Syndrome) wants, they exposed functionality that many, many panels already had anyway, because manufacturers don’t use completely different control circuitry for laptop (eDP) and stand-alone monitors.

      And, no, nvidia’s tech is not superior. From what I gather they have stricter certification requirements but that’s it.

      • AngryMob@lemmy.one · 4 months ago

        G-Sync modules have a lower sync-window floor before LFC kicks in (usually around 30 Hz), and faster pixel response (overdrive) anywhere in the sync window. Those are benefits for both high-framerate and low-framerate content.

        Even today, FreeSync usually bottoms out around 48 Hz. That constantly puts you at the LFC boundary for a lot of AAA games if you're on a popular midrange graphics card and aiming for a 60 fps average.
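
        To illustrate that boundary, here is a minimal sketch (the window values are made-up examples, not either vendor's spec) of when a frame rate falls below a panel's VRR floor and LFC has to take over:

        ```python
        # Hypothetical VRR windows in Hz; real panels vary.
        GSYNC_MODULE_WINDOW = (30, 144)   # typical of module-based G-Sync panels
        FREESYNC_WINDOW = (48, 144)       # common on budget FreeSync panels

        def needs_lfc(fps, window):
            """True if the frame rate is below the panel's VRR floor,
            so frames must be duplicated (LFC) to stay in sync."""
            low, _high = window
            return fps < low

        for fps in (40, 45, 55, 60):
            print(fps, "fps ->",
                  "LFC" if needs_lfc(fps, GSYNC_MODULE_WINDOW) else "native", "(module),",
                  "LFC" if needs_lfc(fps, FREESYNC_WINDOW) else "native", "(FreeSync)")
        ```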

        • Eideen@lemmy.world · 4 months ago

          Is it a problem that LFC is used? It only duplicates frames.

          When the framerate drops below the minimum refresh rate of the display, frames are duplicated and displayed multiple times so that they can sync to a refresh rate that is within the display's refresh rate range. For example, a display with a 60–144 Hz refresh rate would be able to sync the frames of a game running at 40 FPS by doubling them, so that the display could sync and run at 80 Hz. A display with LFC effectively results in the removal of the minimum refresh rate boundary.

          https://www.amd.com/en/products/graphics/technologies/freesync.html
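
          To make the quoted example concrete, here's a rough sketch of the multiplier logic (my own illustration of the idea, not AMD's actual algorithm):

          ```python
          def lfc_refresh(fps, vrr_min, vrr_max):
              """Return (multiplier, effective refresh rate in Hz) for a panel
              whose VRR window is [vrr_min, vrr_max]."""
              if fps >= vrr_min:
                  return 1, fps                     # inside the window, no duplication needed
              multiplier = 2
              while fps * multiplier < vrr_min:     # repeat each frame until we re-enter the window
                  multiplier += 1
              # Assumes fps * multiplier still fits under vrr_max, as in AMD's 40 fps example.
              return multiplier, fps * multiplier

          # AMD's example: a 60-144 Hz panel showing a 40 fps game at 80 Hz.
          print(lfc_refresh(40, 60, 144))   # -> (2, 80)
          ```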

        • barsoap@lemm.ee · 4 months ago

          That constantly puts you at the LFC boundary for a lot of AAA games if you're on a popular midrange graphics card and aiming for a 60 fps average.

          That constantly puts you at the point where you should lower graphics settings. Average fps might be a thing to put on benchmarks, but for actual playing you want to go by minimum fps (non-cutscene if necessary). And it's not like Adaptive-Sync can't go down that low, protocol-wise; it's that monitor producers don't care to.

          Overdrive, too, is a matter of implementation not the sync protocol.

          • AngryMob@lemmy.one · 4 months ago

            Part of the point of VRR for the end user is to not have to worry about settings and system performance, isn't it? The average person is gonna pick a graphics preset and play. If the game feels smooth off the rip, that's the preset they'll stick with. They aren't going to make sure that the heaviest scenes stay above their LFC threshold. They don't even know what half this shit means. And arguably they won't even notice LFC stutter in the first place, which is probably why, like you said, manufacturers don't care to make the threshold lower.

            To be clear though, I agree with you. I do manage settings to keep my minimum where I like it. And having an older G-Sync-chipped monitor, which lets me put that minimum around 45 fps, is quite nice for path-traced games and the like.

            I also want to be able to replace this monitor someday and not lose that option.

        • frezik@midwest.social · 4 months ago

          Just to address this from a high level, I see this as typical of the Nvidia and AMD approaches. Nvidia makes something that's engineered to perfection, but adds a bunch of requirements on it that make it expensive and support vendor lock-in. Even if you're willing to put up with that to have The Best, you might hesitate when finding out what assholes Nvidia are about everything.

          AMD then makes something 95% as good, and it’s cheap and you can work with them without yelling.

          See also: FSR vs DLSS.

    • Tetsuo@jlai.lu · 4 months ago

      The problem was solved by Nvidia; then AMD made it cheap and accessible and removed the need for a dedicated hardware module.

      For years and years, Nvidia artificially inflated the price of many G-Sync screens by up to 150 euros, for no legitimate reason. Initially there was NO compatibility with FreeSync at all.

      Nvidia wasn't kindly solving a gamer's problem, at least not beyond the first year after that tech's release. They were forcibly selling expensive hardware modules nobody needed or wanted, and they kept doing it long after FreeSync showed you could do it just as well without this expensive requirement.

      This hardware module they insisted on selling wasn’t solving a technical problem but a money one.

      I don't even think anyone was ever able to tell the difference in quality between the various "sync techs".

      • conciselyverbose@sh.itjust.works · edit-2 · 4 months ago

        There absolutely was a legitimate reason. The existing display hardware was not capable of processing the signals. They didn't use FPGAs on a whim; they used them because they were necessary to handle the signals properly.

        And you just haven’t followed the tech if you think they were indistinguishable. Gsync has supported a much wider variance of frame times over its entire lifespan.