Wondering about services to test on either a 16 GB RAM “AI capable” arm64 board or on a laptop with a modern RTX card. Only looking for open source options, but curious to hear what people say. Cheers!

  • truxnell@infosec.pub · 5 months ago

    I run ollama and auto1111 on my desktop when it’s powered on. I use open-webui in my homelab, always on, and also connected to openrouter. This way I can always use open-webui with openrouter models; it’s pretty cheap per query and a little more private than using a big tech chatbot. And if I want local, I turn on the desktop and have local Llama and Stable Diffusion.
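
    For reference, a rough sketch of the “use local when the desktop is on, fall back to OpenRouter otherwise” idea, using the OpenAI-compatible endpoints both expose. The hostname (desktop.lan), model names, and the OPENROUTER_API_KEY env var are just placeholders for whatever you actually run, not my exact setup:

    ```python
    # Rough sketch: prefer the local Ollama endpoint when the desktop is up,
    # otherwise fall back to OpenRouter. Hostname and model names are placeholders.
    import os

    import httpx
    from openai import OpenAI


    def pick_backend() -> tuple[OpenAI, str]:
        """Return an OpenAI-compatible client plus a model name to use with it."""
        try:
            # Ollama answers /api/tags when it's running; treat that as "desktop is on".
            httpx.get("http://desktop.lan:11434/api/tags", timeout=1.0)
            # Ollama ignores the API key, but the client wants a non-empty string.
            client = OpenAI(base_url="http://desktop.lan:11434/v1", api_key="ollama")
            return client, "llama3.1"
        except httpx.HTTPError:
            client = OpenAI(
                base_url="https://openrouter.ai/api/v1",
                api_key=os.environ["OPENROUTER_API_KEY"],
            )
            return client, "meta-llama/llama-3.1-8b-instruct"


    client, model = pick_backend()
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Hello from the homelab"}],
    )
    print(reply.choices[0].message.content)
    ```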

    I also get bugger all benefit out of it; it’s a cute toy.

    • kiol@lemmy.world (OP) · 5 months ago

      How do you like auto1111? I’ve never heard of it.

      • L_Acacia@lemmy.ml · 4 months ago

        The project is a bit out of date for newer models, though older ones work great.

        I recommend ComfyUI if you want fine-grained control over the generation and you like to tinker.

        Swarm / Reforge / Invoke if you want a neat, up-to-date UI.