I’m so done with win11, and currently 12 of my 15 machines are linux anyway, but AFAIK HDR (on nvidia gpu) is still impossible? Are you guys all on AMD or just not using hdr for gaming/media? So instead of relying on outdated info, just asking the pros :)
I’m running 4k HDR 120Hz on NVidia 5070 in KDE Plasma on CachyOS.
To enable in Proton games it requires an extra environment variable.
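For reference, a sketch of what that launch option commonly looks like, set per-game under Properties → Launch Options in Steam (the exact variable can differ between Proton and gamescope versions, so verify against your own setup's release notes):

```shell
# Hypothetical Steam launch option for a Proton game.
# DXVK_HDR=1 asks DXVK to expose an HDR10 swapchain to the game;
# %command% is Steam's placeholder for the game's own command line.
DXVK_HDR=1 %command%
```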
The AMD part is actually the opposite, since AMD drivers on Linux can’t do HDMI 2.1, but NVidia can.
That's not quite true: you can do HDR at 4k @ 120Hz over HDMI 2.0, but you'll be limited to 8 bits per channel, which exhibits pronounced chroma banding, especially noticeable in sky gradients. If you lower either the resolution or the refresh rate, you can get 10-bit back too.
HDMI 2.0 can also support 4k 120Hz, but it will be limited to 4:2:2 chroma subsampling. That's fine for the typical TV viewing distance and 2x HiDPI scaling, but sucks for desktop usage, especially without HiDPI scaling.
You can also get a DP 1.4 to HDMI 2.1 adapter and get full HDR 10-bit color and 4:4:4 chroma at 4k@120Hz at the same time, no problem. The trouble is usually VRR, which tends to be very finicky or not work at all… :(
The point still stands that it’s harder/less-supported on AMD.
And for the record, VRR also works on my setup, but I have it disabled due to flickering caused by the TV-side.
Would’ve preferred a Debian-based distro, but that’s certainly a point for Cachy. Plus it’s German/EU. Does it do multiple monitors stress-free too? Thanks!
Does it do multiple monitors stress-free too? Thanks!
I’m not the same guy you were talking to, but if you use Wayland, multi-monitor should work without any issues.
Great, thanks!
Debian-based systems are usually a few years behind on package versions, which is probably why it doesn’t work there.
We’re definitely not all on AMD but most of us are.
Personally I don’t understand what all the hubbub is about HDR anyway. It always makes the picture look all washed out and desaturated so I keep it off. I’m obviously missing something.
It always makes the picture look all washed out and desaturated
This is a typical symptom when part of the HDR pipeline is working but not all of it. The HDR image is getting (poorly) converted to SDR before it’s being displayed.
Actual HDR is richer colors than SDR. Note that you basically need an OLED monitor to display it properly. On most LCD monitors that advertise “HDR” support, it won’t look very different.
I have an OLED.
Then you definitely have some settings wrong.
Make sure the monitor is set to HDR mode (in the monitor’s built-in settings), and the OS is set to treat the monitor as HDR. Depending on the OS there may be other things to play with. E.g. I was getting the issue with things looking washed out after the latest bazzite update until I manually installed VK_HDR_LAYER
Here is a site I usually use to test that HDR is working correctly: https://www.wide-gamut.com/test (you may need a chrome based browser, Firefox doesn’t always render it correctly)
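If you'd rather check from the terminal whether the Vulkan side of the pipeline is exposing HDR at all, something like this can help (assumes `vulkaninfo` from vulkan-tools is installed; the exact layer and colorspace names vary by driver, so treat the filter terms as a starting point):

```shell
# List Vulkan layers and surface color spaces, filtering for HDR-related
# entries. An HDR-capable setup typically shows an HDR WSI layer and/or
# HDR10 (ST2084) / BT2020 color spaces; no matches suggests the
# compositor isn't offering HDR to Vulkan apps.
vulkaninfo 2>/dev/null | grep -i -E 'hdr|st2084|bt2020'
```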
I was getting the issue with things looking washed out after the latest bazzite update until I manually installed VK_HDR_LAYER
Ah that may be it then
Yes, you seem to be missing actual HDR :-) It looks washed out and desaturated if you view SDR content while HDR is enabled. Or the monitor can’t do it. Or whatever else. I even have problems getting Jellyfin right on Windows; that thing needs a separate app to actually work. So HDR’s the only good thing about Win11, as it mostly works there.
I really wanna finally ditch that horrorshow, but going back to SDR feels like going back from 4K to 480p.
I suggest you strip Windows to the bone (including Defender), dual boot Linux, and relegate Windows to an “HDR media OS”
This is what I do, and it works well. Sufficiently neutered, Windows is really quick and out of the way.
It’s already neutered, but dual-booting really isn’t an option. As long as Win remains a bootable option, why even add another one? I see no benefit in running both and wasting time switching regularly. Soon I wouldn’t even bother switching and would just ditch HDR :)
I don’t really see the logic to that, as switching is near effortless. It takes a couple of seconds to reboot and select the other OS. Ditching HDR, on the other hand, is painful.
Each to their own though.
I want to leave Win. If I’d still have to keep it (and not just as a very specific VM), why go through the hassle (everything is already working now) of booting anything else? Win is still there and still has to be maintained. I would not gain anything but double the work.
You’d gain HDR!
Windows is nearly effortless to maintain if you only use it for entertainment.
Maybe we just have different priorities, but right now, I’d be miserable and wasting so much time if I was stuck on Linux only, even though I use Linux like 90% of the time. Some media and some games just won’t look right.
And to emphasize, it would take sooo much time to massage this issue on Linux. Dual booting saves me a ton of maintenance and tweaking.
Err, I already main Windows (for my rig, that is, not the servers). I have HDR and everything else; dual-booting Linux would not gain me anything I don’t have right now. Except the illusion of having ditched Windows while being on Linux :)
Everything else I could somehow cope with losing, replace, or code from scratch for Linux, except HDR :(
Washed out and desaturated is the opposite of what it should do. Sounds like you may be looking at SDR (non HDR) content with HDR enabled?
Is Linux not able to switch HDR on and off as necessary?
You may have to enable HDR for Linux on each monitor individually from the display settings, and then enable HDR for the game itself from within its own settings.
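On KDE Plasma 6 this can also be scripted per output with `kscreen-doctor`; a sketch (the output name `HDMI-A-1` is an assumption here, check the list command for yours):

```shell
# List outputs and their capabilities; HDR-capable outputs show an HDR line.
kscreen-doctor -o

# Toggle HDR for a specific output (Plasma 6; 'HDMI-A-1' is hypothetical,
# substitute the name reported by the command above).
kscreen-doctor output.HDMI-A-1.hdr.enable
kscreen-doctor output.HDMI-A-1.hdr.disable
```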
I only have 1 (large) monitor. I toggle it on and everything looks like butt. Then toggle it off and it looks normal again. It’s an HDR OLED display.
If you’re on KDE, you can use the “sRGB color intensity” to quickly test if your content (e.g. via mpv or Proton) is really in HDR.
If the content changes while going from minimum to maximum in “sRGB color intensity”, it’s SDR, if it does not change, it’s HDR.
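As a command-line cross-check, mpv can be asked to pass HDR through to the compositor rather than tonemapping it down itself; a sketch (the file path is a placeholder, and a recent mpv with `--vo=gpu-next` is assumed):

```shell
# Play an HDR file and let mpv signal the content's colorspace to the
# Wayland compositor instead of converting it to SDR internally. If the
# compositor's HDR pipeline works, this should look noticeably different
# from playback with the hint disabled.
mpv --vo=gpu-next --target-colorspace-hint=yes some-hdr-video.mkv
```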
I also have an OLED monitor and HDR looks fantastic on KDE.
With my configuration, it does. Though it likely also depends on GPU drivers, Monitor, etc. I’m just speculating as I have no clue beyond “works for me”.
Any game that has an HDR option in its graphics settings should support HDR, I hope. Maybe if the display is already in HDR mode on the desktop, launching the game isn’t able to toggle it off?
Completely disagree.
Even setting gaming aside, I’ve started taking family/fun photos in HDR instead of JPEG, and on things that can render them (like smartphones and TVs), they are gorgeous. I can’t imagine going back now.
I took this on a walk this week, completely unedited:
If your browser doesn’t render that, here’s my best attempt at an AVIF conversion:

And JPEG-XL:
On my iPhone or a TV, the sun is so bright it makes you squint, and as yellow-orange as real life. The bridge in shadow is dark but clear. It looks just like I’m standing there, with my eyes adjusting to different parts of the picture.
I love this! It feels like a cold, cozy memory.
Now if I crush it down to an SDR JPEG:

It just doesn’t *look* right. The sun is a paper-white blob, and washed out. And while this is technically not the fault of encoding it as SDR, simply being an 8-bit JPEG crushed all the shadows into blocky grey blobs.
…This is the kicker with HDR. It’s not that it doesn’t look incredible. But the software/display support is just not there.
I bet most browsers viewing this post aren’t rendering those images right, if at all.
Lemmy, a brand new platform, doesn’t even support JXL, HEIF, or AVIF! It doesn’t support any HDR format at all; I had to embed them from catbox.
Here’s an interesting test:


Say what you will about Safari and iOS, but it rocks with image format support. HDR JPEG XL and AVIF render correctly, and look like the original HEIF file from the camera.
Helium (a Chrome fork) is on the left, Firefox on the right, running CachyOS Linux with KDE on a TV, HDR enabled from AMD output.

Firefox fails miserably :(
Chrome sorta gets the AVIF right, though it seems to lose some dynamic range with the sun.
I do know that Valve have been working closely with KDE to get it working there. So you should check if you’re on the latest Plasma desktop and you probably need Wayland.
Beyond that I have no idea. I don’t have any HDR capable device.
With no device it sure won’t faze you at all :) I do really regret having made the HDR switch back then.
I have the same issue with HDR on Linux I had on Windows - support. But, on KDE Plasma it works well enough that I usually forget it’s working. Before I switched to KDE Plasma though I was on Gnome and it was a tad more difficult.
Sounds great, but…AMD, right?
Oh yes, sorry. I’ve been team AMD for so long I forget lol.
lol, me too, but only for CPU and only recently :)
I’m on a Sony OLED with a 3090. I game some, and color grade photos/videos in HDR.
…And I can’t get HDR to look right in KDE, even with the display running off my AMD IGP. It has basically zero options for me to tweak.
So I use Windows for that.
Honestly, it’s hard enough on Windows. It’s a coin flip as to whether apps work or not, and the TV needs adjustments for some, lest they crush black or blow out highlights/colors. Many games, specifically, need configurable mods to look right.
One of my saddest video workflows is transcoding on Linux, and downloading the result to my iPhone to see if it looks right.
Damn. Nah, as long as I’d still need Windows, I see totally no benefit in dual-booting. I could live with a VM for the banking stuff or so, but dual-booting? Meh :( And yes, it’s already sucky enough on Win. Though Win11 made it better.
Thanks for your reply!
Dual booting is not bad!
What I do is share an NTFS partition between Windows and Linux for bulk data. If they’re DRM free, you can literally run the same games off the same drive.
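A sketch of what that shared-partition mount can look like in `/etc/fstab` on the Linux side (the UUID and mount point are placeholders; the kernel `ntfs3` driver is assumed, otherwise `ntfs-3g` works too):

```shell
# /etc/fstab entry for a Windows-shared NTFS data partition.
# uid/gid map the files to your Linux user; windows_names rejects
# filenames Windows can't read back. Replace the UUID with your own
# (see `lsblk -f`).
UUID=XXXX-XXXX-XXXX-XXXX  /mnt/shared  ntfs3  uid=1000,gid=1000,windows_names,noatime  0 2
```

One caveat worth mentioning: Windows Fast Startup should be disabled, otherwise the partition is left in a hibernated, dirty state that Linux will refuse to mount read-write.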
Something goes wrong? I can just delete the Windows partition and start over in 30 minutes, hardly losing anything. It’s so much better as a “disposable” OS.
I also use two EFI partitions (the default Windows one and a new one for Linux) so there is zero possibility of the OSes interacting.
To be blunt, I would never do banking on Windows if you can do it on Linux. It’s just too much of a risk.
I use Macrium Reflect (which I would dearly miss on Linux), so undoing any mistake is just seconds away, and a complete restore takes mere minutes. But as long as I HAVE to use Win for HDR in games/media, I do have to use Win, so dual-booting saves none of the risk unless Win goes into a VM. Wouldn’t even need another one; my domain controllers (and DNS and such) are already Win VMs.
My screen only does HDR600, but it does work.
It looks a little nicer than with it off, so I do keep it on. SDR content does not suffer.
I’m on KDE wayland with an AMD GPU.
Sounds great, besides the AMD part. So it might work, but not with my nvidia :( Thanks!
https://wiki.archlinux.org/title/HDR
Arch wiki says NVIDIA should now work, too.
As of Mesa 25 it should just work. Even if you’re on a distro that doesn’t have it working yet, it seems HDR on NVIDIA is not far off, and can be made to work right now on Arch if you know how.
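If you want to check where your own system stands before fiddling, the relevant versions are easy to query (these commands assume the NVIDIA proprietary driver and a Plasma desktop; adjust for your DE):

```shell
# NVIDIA driver version (HDR support landed in relatively recent drivers)
nvidia-smi --query-gpu=driver_version --format=csv,noheader

# Compositor version (Plasma 6 is where KDE's HDR support matured)
plasmashell --version

# The session must be Wayland for desktop HDR; X11 won't do it
echo "$XDG_SESSION_TYPE"
```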
For my setup, I literally just enable the setting in the KDE display settings.
Nah, I don’t wanna spend more time fiddling than actually doing something, so no Arch for me :) Although CachyOS is based on Arch IIRC. Either way, good to know!
Thanks!
It is.
If you’re on Cachy you should be able to apply any relevant changes the same as on Arch.
But if I’m reading the wiki right, you should already be good to go, provided you’re on a DE that supports HDR.
I’ve got HDR working with an Nvidia card on Bazzite, but the current workaround means I can’t use HDR and Steam Input at the same time. This is using the GNOME variant. I think the situation may be better with KDE.
From what I’ve gathered here so far, KDE seems to be more HDR-friendly, yes. But good to hear it kinda works, even with Nvidia!
15 machines? Do you use a rack? Seriously though, much like the other responder, HDR and most video things are not important to me, and I tend to adopt them only when they reach the most affordable options. Granted, there are certain things important to me that I will go for earlier: energy efficiency, or anything more environmentally friendly disposal-wise (I haven’t yet encountered anything that significantly is, but it would sway me). Also, I’ve wanted to get OLEDs, because having very black blacks is a big deal for me. So basically I guess I’m saying I mostly don’t care about that stuff.
5 physical, rest VMs. I’m not crazy :-) Wasn’t important to me either, until I switched just because the new monitor could do it. Now I wouldn’t want to go back, as it would feel like going from 4K to 480p. Couldn’t care less about energy efficiency though; I already use an 8-person average (according to my provider) for my hobbies alone :)
I’m so bad I get annoyed when I can’t stream things at standard def and have to do at least 720p, as I want to limit the bandwidth I’m using.
lol, ok that’s a solid reason. Though I now have 5 x 1Gbit, I started with 3600 baud back in the day. Bandwidth IS an issue :-)