So I’ve been trying to install the proprietary Nvidia drivers on my homelab so I can generate my fine-ass art with Automatic1111 & Stable Diffusion. I installed the Nvidia 510 server drivers, everything seemed fine, then when I rebooted: nothing. WTF Nvidia, why you gotta break X? Why is X even needed in a server driver? What’s your problem, Nvidia!
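
Edit: for anyone else hitting this, the headless packages look like the intended route. A minimal sketch, assuming Ubuntu Server 22.04 and the 510 branch (nvidia-headless-510-server should carry the kernel module without the X/desktop pieces that the full nvidia-driver-510-server metapackage pulls in; nvidia-utils-510-server just provides nvidia-smi):

    # Sketch, assuming Ubuntu Server 22.04 and the 510 driver branch.
    # nvidia-headless-510-server: kernel module only, no X/desktop components.
    # nvidia-utils-510-server: provides nvidia-smi for a sanity check.
    sudo apt update
    sudo apt install nvidia-headless-510-server nvidia-utils-510-server
    sudo reboot
    nvidia-smi    # should list the GPU once the module loads

If nvidia-smi lists the card, Automatic1111 can use the GPU with no display server involved at all.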

  • WasPentalive@beehaw.org · 1 year ago

    Nvidia does not ‘hate’ Linux; Nvidia simply never thinks about Linux. They need to keep their secrets so people can’t buy the cheap card and, with a little programming, turn it into the expensive one.

  • fx_@feddit.de · 1 year ago

    Nvidia doesn’t hate Linux; it just doesn’t care. The Linux community, on the other hand, hates Nvidia.

  • GenderNeutralBro@lemmy.sdf.org · edited · 1 year ago

    Linux is their bread and butter when it comes to servers and machine learning, but that’s a specialized environment and they don’t really care about general desktop use on arbitrary distros. They care about big businesses with big support contracts. Nobody’s running Wayland on their supercomputer clusters.

    I cannot wait until architecture-agnostic ML libraries are dominant and I can kiss CUDA goodbye for good. I swear, 90% of my tech problems over the past 5 years have boiled down to “Nvidia sucks”. I’ve changed distros three times hoping it would make things easier, and it never really does; it just creates exciting new problems to play whack-a-mole with. I currently have Ubuntu LTS working, and I’m hoping I never need to breathe on it again.

    That said, there’s honestly some grass-is-greener syndrome going on here, because you know what sucks almost as much as using Nvidia on Linux? Using Nvidia on Windows.