I’d like to set up a local coding assistant so that I can stop feeding complex questions to Google and digging through search results.

I really don’t know what I’m doing or if there’s anything that’s available that respects privacy. I don’t necessarily trust search results for this kind of query either.

I want to run it on my desktop: Ryzen 7 5800XT, Radeon RX 6950 XT, and 32 GB of RAM. I don’t need or expect data-center performance out of this thing.

Something like LM Studio and Qwen sounds like what I’m looking for, but since I’m unfamiliar with what exists, I figured I’d ask for Lemmy’s opinion.

Is LM Studio + Qwen a good combo for my needs? Are there alternatives?

  • ryokimball@infosec.pub · 21 hours ago

    I’ve heard good things about LM Studio from professional coders and tinkerers alike. I haven’t tried it myself yet, but I might have to bite the bullet, because I can’t seem to get ollama to perform how I want.

    TabbyML is another thing to try.

    • wasp_eggs@midwest.social (OP) · 21 hours ago

      Thanks for the reply!

      I had noticed TabbyML, but something about their wording made me rethink it, and then the next day I saw a post on here about the same phrasing. I decided to leave it alone after that.

      • Scrubbles@poptalk.scrubbles.tech · 20 hours ago

        Yeah, I tried Tabby too, and they had a mandatory “we share your code” line, so I noped out. If you’re going to do that, I might as well just use Claude.