I’m so done with Win11, and 12 of my 15 machines are already on Linux anyway, but AFAIK HDR (on an Nvidia GPU) is still impossible? Are you all on AMD, or just not using HDR for gaming/media? Rather than relying on outdated info, I figured I’d just ask the pros :)

  • artyom@piefed.social · +12/-5 · 1 day ago

    We’re definitely not all on AMD, but most of us are.

    Personally, I don’t understand what all the hubbub is about HDR anyway. It always makes the picture look all washed out and desaturated, so I keep it off. I’m obviously missing something.

    • WolfLink@sh.itjust.works · +8 · 24 hours ago (edited)

      It always makes the picture look all washed out and desaturated

      This is a typical symptom when part of the HDR pipeline is working but not all of it: the HDR image is getting (poorly) converted to SDR before it’s displayed.

      Actual HDR gives you richer colors than SDR. Note that you basically need an OLED monitor to display it properly; on most LCD monitors that advertise “HDR” support, it won’t look very different.

        • WolfLink@sh.itjust.works · +6 · 23 hours ago (edited)

          Then you definitely have some settings wrong.

          Make sure the monitor is set to HDR mode (in the monitor’s built-in settings) and that the OS is set to treat the monitor as HDR. Depending on the OS, there may be other things to play with. For example, I was getting the issue with things looking washed out after the latest Bazzite update until I manually installed VK_HDR_LAYER.

          Here is a site I usually use to test that HDR is working correctly: https://www.wide-gamut.com/test (you may need a Chrome-based browser; Firefox doesn’t always render it correctly).
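
          If you want a quick sanity check that the layer actually ended up installed, something like this should work (my own rough sketch, not from any official docs; it just scans the standard Vulkan implicit-layer manifest directories for anything HDR-related, and the exact file/package name varies by distro):

          ```python
          # Rough sketch: look for an installed Vulkan implicit layer whose manifest
          # mentions "hdr" (e.g. vk_hdr_layer). These are the standard loader search
          # directories; your distro/package may use others.
          import json
          from pathlib import Path

          LAYER_DIRS = [
              Path("/usr/share/vulkan/implicit_layer.d"),
              Path("/etc/vulkan/implicit_layer.d"),
              Path.home() / ".local/share/vulkan/implicit_layer.d",
          ]

          found = []
          for d in LAYER_DIRS:
              for manifest in (d.glob("*.json") if d.is_dir() else []):
                  try:
                      name = json.loads(manifest.read_text()).get("layer", {}).get("name", "")
                  except (OSError, ValueError):
                      continue
                  if "hdr" in name.lower() or "hdr" in manifest.name.lower():
                      found.append(f"{name or manifest.name} ({manifest})")

          print("\n".join(found) if found else "no HDR-related Vulkan layer manifests found")
          ```

          If nothing shows up, the layer isn’t installed system-wide. And if I remember right, vk_hdr_layer also only kicks in when ENABLE_HDR_WSI=1 is set for the application, so that’s worth checking too.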

          • artyom@piefed.social · +2 · 23 hours ago

            I was getting the issue with things looking washed out after the latest Bazzite update until I manually installed VK_HDR_LAYER

            Ah, that may be it then.

    • Dyskolos@lemmy.zip (OP) · +10/-1 · 1 day ago

      Yes, you seem to be missing actual HDR :-) It looks washed out and desaturated if you view SDR content while HDR is enabled. Or the monitor can’t actually do HDR. Or any number of other things. I even have problems getting Jellyfin right on Windows; that thing needs a separate app to actually work. So HDR is about the only good thing about Win11, as there it mostly works.

      I really wanna finally ditch that horrorshow, but going back to SDR feels like going back from 4K to 480p.

      • brucethemoose@lemmy.world · +1 · 1 day ago (edited)

        I suggest you strip Windows to the bone (including Defender), dual-boot Linux, and relegate Windows to being an “HDR media OS”.

        This is what I do, and it works well. Sufficiently neutered, Windows is really quick and out of the way.

        • Dyskolos@lemmy.zip (OP) · +3 · 1 day ago

          It’s already neutered, but dual-booting really isn’t an option. As long as Windows remains a bootable option, why even add another one? I see no benefit in running both and wasting time switching regularly. Soon I probably wouldn’t even bother switching and would just ditch HDR :)

          • brucethemoose@lemmy.world · +2 · 1 day ago (edited)

            I don’t really see the logic to that, as switching is near effortless. It takes a couple of seconds to reboot and select the other OS. Ditching HDR, on the other hand, is painful.

            Each to their own though.

            • Dyskolos@lemmy.zip (OP) · +2 · 1 day ago

              I want to leave Windows. If I still had to keep it (and not just as a very specific VM), why go through the hassle of booting anything else when everything is already working now? Windows would still be there and would still have to be maintained. I would gain nothing but double the work.

              • brucethemoose@lemmy.world · +1 · 1 day ago (edited)

                You’d gain HDR!

                Windows is nearly effortless to maintain if you only use it for entertainment.

                Maybe we just have different priorities, but right now, I’d be miserable and wasting so much time if I were stuck on Linux only, even though I use Linux like 90% of the time. Some media and some games just won’t look right.

                And to emphasize, it would take sooo much time to massage this issue on Linux. Dual booting saves me a ton of maintenance and tweaking.

                • Dyskolos@lemmy.zip (OP) · +1 · 1 day ago

                  Err, I already main Windows (on my rig, that is, not the servers). I have HDR and everything else; I would not gain anything I don’t already have by dual-booting Linux. Except the illusion of having ditched Windows while being on Linux :)

                  Everything else I could somehow cope with losing, replace, or code from scratch for Linux. Except HDR :(

    • Overwrite7445@lemmy.ca · +7/-1 · 1 day ago

      Washed out and desaturated is the opposite of what HDR should do. Sounds like you may be looking at SDR (non-HDR) content with HDR enabled?

        • moody@lemmings.world · +1 · 1 day ago

          You may have to enable HDR for Linux on each monitor individually from the display settings, and then enable HDR for the game itself from within its own settings.

          • artyom@piefed.social · +1 · 1 day ago (edited)

            I only have 1 (large) monitor. I toggle HDR on and everything looks like butt; then I toggle it off and it looks normal again. It’s an HDR OLED display.

            • Domi@lemmy.secnd.me · +2 · 9 hours ago

              If you’re on KDE, you can use the “sRGB color intensity” slider to quickly test whether your content (e.g. via mpv or Proton) is really in HDR.

              If the content changes while going from minimum to maximum on “sRGB color intensity”, it’s SDR; if it does not change, it’s HDR.

              I also have an OLED monitor, and HDR looks fantastic on KDE.

        • Overwrite7445@lemmy.ca · +1 · 1 day ago

          With my configuration, it does. Though it likely also depends on GPU drivers, the monitor, etc. I’m just speculating, as I have no clue beyond “works for me”.

          Any games that have an HDR option in their graphics settings should support it, I hope. Maybe if the display is already in HDR while in desktop mode, launching the game isn’t able to toggle it off?

    • brucethemoose@lemmy.world · +2 · 1 day ago (edited)

      Completely disagree.

      Even setting gaming aside, I’ve started taking family/fun photos in HDR instead of JPEG, and on things that can render them (like smartphones and TVs), they are gorgeous. I can’t imagine going back now.

      I took this on a walk this week, completely unedited:

      [HDR HEIF]

      If your browser doesn’t render that, here’s my best attempt at an AVIF conversion:

      [HDR AVIF]

      And JPEG XL:

      [HDR JXL]

      On my iPhone or a TV, the sun is so bright it makes you squint, and as yellow-orange as real life. The bridge in shadow is dark but clear. It looks just like I’m standing there, with my eyes adjusting to different parts of the picture.

      I love this! It feels like a cold, cozy memory.

      Now if I crush it down to an SDR JPEG:

      It just doesn’t *look* right. The sun is a paper-white blob, all washed out. And while that is technically not the fault of encoding it as SDR, simply being an 8-bit JPEG crushed all the shadows into blocky grey blobs.
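
      If you want to put a rough number on that bit-depth point, here’s a back-of-envelope sketch (my own illustration, not anything from the camera pipeline; it uses the standard sRGB curve and the SMPTE ST 2084 / PQ curve, and assumes SDR white at 100 nits). It counts how many distinct code values each encoding spends on the darkest 0 to 1 nit range:

      ```python
      # Back-of-envelope: how many distinct code values 8-bit sRGB vs 10-bit PQ
      # spend on deep shadows (0 to 1 nit). sRGB OETF and SMPTE ST 2084 (PQ)
      # constants are the standard ones; SDR reference white assumed at 100 nits.
      import numpy as np

      def srgb_oetf(l):  # linear light (0..1 relative to SDR white) -> sRGB signal
          l = np.clip(l, 0.0, 1.0)
          return np.where(l <= 0.0031308, 12.92 * l, 1.055 * l ** (1 / 2.4) - 0.055)

      def pq_oetf(nits):  # absolute luminance in cd/m^2 -> PQ signal (0..1)
          m1, m2 = 2610 / 16384, 2523 / 4096 * 128
          c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
          y = np.clip(nits / 10000.0, 0.0, 1.0) ** m1
          return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

      nits = np.linspace(0.0, 1.0, 100_000)  # the deep-shadow range
      codes_srgb8 = np.unique(np.round(srgb_oetf(nits / 100.0) * 255)).size
      codes_pq10 = np.unique(np.round(pq_oetf(nits) * 1023)).size
      print(f"8-bit sRGB codes below 1 nit: {codes_srgb8}")  # ~26
      print(f"10-bit PQ codes below 1 nit:  {codes_pq10}")   # ~150
      ```

      Roughly six times as many steps in the deep shadows for the 10-bit PQ encode, which is exactly where the 8-bit JPEG falls apart into those blocky grey blobs.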


      …This is the kicker with HDR. It’s not that it doesn’t look incredible. But the software/display support is just not there.

      I bet most browsers viewing this post aren’t rendering those images right, if at all.

      Lemmy, a brand new platform, doesn’t even support JXL, HEIF, or AVIF! It doesn’t support any HDR format at all; I had to embed them from catbox.

      • brucethemoose@lemmy.world · +1 · 1 day ago

        Here’s an interesting test:

        Say what you will about Safari and iOS, but it rocks with image format support. HDR JPEG XL and AVIF render correctly, and look like the original HEIF file from the camera.

        Helium (a Chrome fork) is on the left, Firefox on the right, running CachyOS Linux with KDE on a TV, HDR enabled from AMD output.

        Firefox fails miserably :(

        Chrome sorta gets the AVIF right, though it seems to lose some dynamic range with the sun.