• 33 Posts
  • 765 Comments
Joined 2 years ago
Cake day: June 23, 2023



  • ikidd@lemmy.world to Selfhosted@lemmy.world · Selfhost an LLM
    English · 1 · 18 hours ago

    OpenWebUI is pretty much exactly what you’re looking for. It can start up an ollama instance that you can use for your other applications over the network, and chat with it as you see fit. If you have an API key from an outside subscription like OpenRouter or Anthropic, you can enter it and use the models available there if the local ones you’ve downloaded aren’t up to the task.
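
    For the “use it for your other applications over the network” part, here’s a minimal sketch of what that can look like from another app, assuming the ollama instance is reachable on its default port (11434) and exposes its OpenAI-compatible endpoint; the host, port, and model name are assumptions, not something from the comment above.

    ```python
    # Minimal sketch: another application talking to a local ollama instance
    # over its OpenAI-compatible API. Host/port and model name are assumptions;
    # adjust them to match your own setup.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # ollama's OpenAI-compatible endpoint
        api_key="ollama",  # placeholder; a local ollama instance doesn't check the key
    )

    response = client.chat.completions.create(
        model="llama3.1",  # any model you've already pulled locally
        messages=[{"role": "user", "content": "Summarize what OpenWebUI does."}],
    )
    print(response.choices[0].message.content)
    ```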