I'm using Ollama with the WebUI on my server. It has no GPU, so replies aren't fast, but they aren't too slow either.

I'm thinking about removing the VM since I just don't use it. Are there any good uses or integrations with other apps that might convince me to keep it?

  • bizarroland@fedia.io
    5 months ago

    I have a 4070 sitting around collecting dust that I got in a trade. I've been thinking about setting it up with Whisper and TTS so I have a way to talk to my house.

    I have a couple of smart home integrations, mostly air conditioning, light switches, security, and doors.

    What I would like is a few speakers on the walls connected to my server, so I could say something like, "Hey computer, turn on the lights in the dining room," and the dining room lights would turn on, without any of that audio being transmitted to Google or Amazon.
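
The pipeline described above (wake word → speech-to-text → intent → smart-home action) can be sketched roughly like this. This is just an illustration: `parse_command` and the wake word are hypothetical, and a real setup would feed it transcripts from Whisper and dispatch the result to your smart-home integration.

```python
# Hypothetical sketch of a local voice pipeline:
# wake word -> speech-to-text (e.g. Whisper) -> simple intent parsing.
# parse_command is illustrative, not from any real library.
import re

WAKE_WORD = "hey computer"

def parse_command(transcript: str):
    """Extract an (action, room) pair from a transcribed utterance.

    Returns None if the wake word is absent or no command matches.
    """
    text = transcript.lower().strip()
    if not text.startswith(WAKE_WORD):
        return None
    # Match phrases like "turn on the lights in the dining room"
    m = re.search(r"turn (on|off) the lights in the (.+)", text)
    if m:
        return (f"lights_{m.group(1)}", m.group(2).rstrip(". "))
    return None

print(parse_command("hey computer, turn on the lights in the dining room"))
```

In practice you'd replace the regex with whatever intent matching your smart-home platform provides; the point is only that every step stays on the local network.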

    • Possibly linux@lemmy.zip
      5 months ago

      I am really curious whether you can get the traditional smart-home functionality alongside an LLM. Maybe have some sort of keyword that prompts the AI. You could also write a custom, dynamically generated system prompt that includes the weather, time, and any other useful information.
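
A minimal sketch of that idea, assuming a local Ollama instance: a keyword gates whether the LLM is invoked at all, and the system prompt is rebuilt per request with the current time (weather or sensor data could be injected the same way). The endpoint and payload follow Ollama's documented `/api/generate` REST API; `llama3` is a placeholder for whatever model you have pulled.

```python
# Keyword-gated LLM query against a local Ollama server, with a
# system prompt regenerated on every call. Sketch only: the keyword,
# model name, and prompt wording are assumptions.
import json
import urllib.request
from datetime import datetime

KEYWORD = "hey computer"

def build_system_prompt(weather: str = "unknown") -> str:
    # Inject live context so the model can answer time-aware questions.
    now = datetime.now().strftime("%A %H:%M")
    return (
        "You are a local smart-home assistant. "
        f"Current time: {now}. Current weather: {weather}. "
        "Answer briefly."
    )

def ask_llm(utterance: str, host: str = "http://localhost:11434"):
    # Traditional keyword-spotting gate: only wake the LLM on the keyword.
    if KEYWORD not in utterance.lower():
        return None
    payload = {
        "model": "llama3",  # placeholder; any locally pulled model works
        "system": build_system_prompt(),
        "prompt": utterance,
        "stream": False,
    }
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

The gate keeps the cheap, deterministic smart-home commands out of the model entirely, so the LLM only handles the open-ended questions.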