• verdi@tarte.nuage-libre.fr · 14 hours ago

    Oh, Digital Foundry. I’ll see myself out; I thought it was an actual analysis. Even Tim from Hardware Unboxed is more thorough and knowledgeable than Alex…

    And that’s even before mentioning the enshittification of DF to get exclusive hardware reveal videos out, which 100% compromises every single word coming out of them. Member when NVIDIA threatened Steve from GN with loss of access if he didn’t basically become a mouthpiece? Well, DF never lost access, so the logical conclusion is…

  • NihilsineNefas@slrpnk.net · 1 day ago

    How about, instead of wasting power on genAI interpolation, we use that processing power to show the frames the game is outputting?

    • Coelacanth@feddit.nu · 1 day ago

      Like the other user said, DLSS is literally more power efficient than native rendering if you care about power draw. You can harp on it all you want for not achieving perfect visual fidelity, especially in modes like Ultra Performance, but efficiency is the whole point of it.
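
      To put rough numbers on the efficiency point, here’s a back-of-the-envelope sketch in Python (the per-axis render scales below are the commonly quoted approximations, and the cost of the upscaling pass itself isn’t counted):

      # Pixels the GPU actually shades per frame at 4K output: native vs. DLSS internal resolution.
      # Per-axis render scales are approximate, commonly quoted values.
      NATIVE_4K = 3840 * 2160

      modes = {
          "Native": 1.0,
          "Quality": 0.667,
          "Balanced": 0.58,
          "Performance": 0.50,
          "Ultra Performance": 0.333,
      }

      for name, scale in modes.items():
          shaded = int(3840 * scale) * int(2160 * scale)
          print(f"{name:18s} {shaded:>10,} px  ({shaded / NATIVE_4K:.0%} of native shading work)")

      Fewer shaded pixels per frame is where the efficiency comes from; the upscaling pass adds some cost back, but far less than rendering every pixel at full resolution.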

      • NihilsineNefas@slrpnk.net · 1 day ago

        Oh, I’m not harping on it for visual fidelity. I’m harping on it for being a useless feature that’s become a major part of the genAI problem with the 50 series, and as a result part of why RAM/GPU/CPU prices are massively overinflated despite barely any increase in actual graphical power over the previous generation.

        • Coelacanth@feddit.nu · 1 day ago

          You can rage against the machine if that makes you happy, but DLSS is patently not a useless feature. It lets you sacrifice visual fidelity for performance; that’s it. Many people find it useful. Any hardware you buy will be obsolete at some point. You may be able to play new releases at native resolution now, but in a few years your card won’t keep up anymore. Instead of buying a new card, you can keep using your old one and turn on DLSS. That’s useful. DLDSR is also a fantastic use of AI that is especially impactful on older games, but it will make almost any game look better, particularly games that don’t have good native anti-aliasing.

          DLSS is also a very minor part of the AI landscape - in fact I think the only reason Nvidia hasn’t scrapped selling gaming cards entirely is that it’s part of their “legacy”. If you want to hate on every scrap of AI in existence because of a dogmatic hatred of AI in general then that’s fair enough, but then say so instead of calling a technology useless and inefficient when it’s neither.

          • NihilsineNefas@slrpnk.net · 14 hours ago

            You know what else reduces visual fidelity for more frames, and has worked with every generation of game since the ‘settings’ option existed? Running the game on lower graphics.

            I call it useless because its only ‘use’ is faking a higher frame rate and making it so game developers don’t have to put in the effort to optimise their games.

            The main gripe I have with it is that NVIDIA decided to implement it and sell it as a new generation of card without changing any real architecture, pushing a “hey datacenters, buy more cards, they’re designed to work with your AI systems” pitch to shift their stock, because they knew they could make a quick buck off the AI bubble.

            As a result, the general public gets absolutely shafted with higher prices for the same tech as the previous generation, just with a bit of code in it that works well with the programs actively making the world a worse place.

            I don’t have a dogmatic hatred of AI. I very much enjoy the fact that nowadays we can use analytical machines to search the absolutely astronomical amount of data coming from observatories and telescopes like Webb and Hubble, without having to put an undergrad through what amounts to mental torture by making them sift through all of it.

            What I dislike is that generative AI’s current uses are deepfakes, conspiracy content, short-form video slop, astroturfed political opinions, summaries of the wrong answers on every search engine, and some of the worst writing ever put to digital paper, to name a few.

    • Steve@communick.news · 1 day ago

      That’s kind of a silly question.
      The whole point is that this is more efficient.
      You trade roughly 10% of your rendered frame rate for 300% more frames from generation.
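
      To make that arithmetic concrete, here’s a rough sketch with made-up illustrative numbers (assuming something like 4x multi-frame generation, where three of every four displayed frames are generated):

      base_fps = 60          # hypothetical frame rate with frame generation off
      render_penalty = 0.10  # ~10% of rendered frames lost to frame-gen overhead
      gen_factor = 4         # 1 rendered frame + 3 generated frames per group

      rendered_fps = base_fps * (1 - render_penalty)  # 54 rendered frames per second
      displayed_fps = rendered_fps * gen_factor       # 216 frames per second on screen

      print(f"rendered: {rendered_fps:.0f} fps, displayed: {displayed_fps:.0f} fps "
            f"(+{displayed_fps / base_fps - 1:.0%} over frame generation off)")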

      Besides, it’s always an option to just turn it off.
      That’s what I usually do. Then I don’t have to give it another thought.

  • Coelacanth@feddit.nu · 1 day ago

    Interesting. I’ll have to look deeper at some other comparisons, but I guess preset K is still the go-to for Quality and Balanced?