Since the shutdown of SD on Colab, is there any option for running SD without disposable income?

I know about StableHorde, but it doesn’t seem to really… well, work. Not for people without GPUs to earn Kudos with, at least. It always gives me a 5+ minute queue and then errors out before the wait is over.

EDIT: It took me a while to set up, but as it turns out, my best option is in fact my 10-year-old computer with a 2GB AMD card. Using the DirectML fork of the WebUI with --lowvram runs pretty damn well for me. It’s not as fast as Colab was, but it’s not slow by any means. I guess the best advice in the end is: even if you’re on a shitbox, try it; your shitbox might surprise you. Take note, though, that running on 2GB VRAM doesn’t work for everyone; only the luckiest of broke mfs can pull it off, it seems.
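For anyone copying this setup, the flags go in the WebUI’s launch script. A sketch of `webui-user.bat`, assuming the DirectML fork’s default layout; `--lowvram` is the flag I used, `--medvram` is the faster middle ground on bigger cards:

```bat
REM webui-user.bat -- a sketch, not the only working config.
REM --lowvram splits the model and offloads pieces to system RAM: slow, but
REM it's what let this fit in 2GB. Try --medvram first on 4GB+ cards.
set COMMANDLINE_ARGS=--lowvram
call webui.bat
```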

  • BareHandedPoopScoop@waveform.social

    There are a few.

    Draw Things: https://drawthings.ai

    DiffusionBee: https://diffusionbee.com

    Both are apps available on macOS. Not sure about other platforms.

    Other than that, a more complex approach is installing AUTOMATIC1111’s WebUI, a web interface for SD that runs locally: https://www.youtube.com/watch?v=kqXpAKVQDNU

    Just had a little look at ^ this guy’s YouTube channel and he has guides for installing stuff on Windows too if that’s needed for ya.

    • Ganbat@lemmyonline.comOP

      I struggled to replace a dying HDD this month, there’s no way I can afford a SD-capable GPU.

        • Ganbat@lemmyonline.comOP

          I never knew it was possible to run on CPU. Well, thanks for the idea, but that doesn’t seem usable. SD outputs are terrible 9 times out of 10, and at ten-plus minutes per generation…

          • Domi@lemmy.secnd.me

            If you use ComfyUI you can queue as many images as you want, so you can let it run overnight. Not ideal, but probably the best solution without a GPU.
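The overnight queue can also be scripted: ComfyUI exposes a local HTTP API on port 8188, and POSTing to /prompt adds a job to the queue. A sketch, assuming a stock instance on the default port; the one-node workflow dict in the test is a stand-in for a real graph exported via “Save (API Format)” in ComfyUI:

```python
# Sketch: queue a night's worth of jobs against a local ComfyUI instance.
# Assumes ComfyUI is running on its default port.
import copy
import json
import urllib.request

COMFY_URL = "http://127.0.0.1:8188/prompt"

def build_payload(workflow: dict, seed: int) -> bytes:
    """Return a /prompt request body with a fresh seed in every sampler node."""
    wf = copy.deepcopy(workflow)  # don't mutate the caller's workflow
    for node in wf.values():
        if node.get("class_type") == "KSampler":
            node["inputs"]["seed"] = seed
    return json.dumps({"prompt": wf}).encode("utf-8")

def queue_overnight(workflow: dict, n_images: int) -> None:
    """Submit n_images jobs; ComfyUI drains its own queue while you sleep."""
    for seed in range(n_images):
        req = urllib.request.Request(
            COMFY_URL,
            data=build_payload(workflow, seed),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)
```

Each POST only enqueues a job; generation speed is still whatever the CPU manages, so the win is purely not having to babysit it.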

            • Ganbat@lemmyonline.comOP

              TBH, it would take a long time to come up with a prompt that didn’t produce trash even when generating quickly on Colab. Trying to do this on my CPU sounds like it could end up taking weeks or even months just to get a few good images.

              EDIT: Well, it took me a long while to get set up, but as it turns out, I’m one of the lucky ones who can run on 2GB VRAM. I generated this in about a minute.

              • Domi@lemmy.secnd.me

                Good to hear you got it working.

                If you want to speed it up even further and you’re willing to boot Linux from a USB, ROCm is much faster than DirectML right now.

                edit: Also, you can run without the UI, saving even more VRAM.
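Running without the UI might look something like this with Hugging Face diffusers (my assumption; it’s a different stack from the WebUI entirely and needs `pip install diffusers transformers torch`, and the model id is just one common choice):

```python
# Sketch of a UI-less run via the diffusers library -- an assumed
# alternative stack, not the WebUI's own headless mode.

def pick_device_and_dtype(cuda_available: bool) -> tuple:
    """float16 halves memory on GPU; CPU inference wants float32."""
    return ("cuda", "float16") if cuda_available else ("cpu", "float32")

def generate(prompt: str, out_path: str = "out.png") -> None:
    import torch
    from diffusers import StableDiffusionPipeline

    # "cuda" also covers ROCm builds of PyTorch on AMD cards.
    device, dtype = pick_device_and_dtype(torch.cuda.is_available())
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # one common SD 1.5 checkpoint
        torch_dtype=getattr(torch, dtype),
    )
    pipe.enable_attention_slicing()  # lower peak VRAM at some speed cost
    pipe.to(device)
    pipe(prompt).images[0].save(out_path)

# generate("a lighthouse at dusk, oil painting")  # downloads weights on first run
```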

                • Ganbat@lemmyonline.comOP

                  Interesting about ROCm, I’ll have to look into that. As for running without the UI, I honestly don’t think I know enough to do that right now, lol.