Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine
fedoramagazine.org
petsoi@discuss.tchncs.de to Linux@lemmy.ml · 7 months ago
The Hobbyist@lemmy.zip · 7 months ago

Alternatively, you don’t even need podman or any containers, as open-webui can be installed simply using python/conda/pip, if you only care about serving yourself:

https://docs.openwebui.com/getting-started/quick-start/

Much easier to run and maintain IMO. Works wonderfully.
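For reference, a minimal sketch of the pip route described in the linked quick-start (the Python 3.11 requirement and the default port 8080 are as documented at the time of writing; check the docs for current details):

```bash
# Create an isolated environment (Open WebUI currently targets Python 3.11)
python3.11 -m venv ~/open-webui-env
source ~/open-webui-env/bin/activate

# Install and start the server; the UI is then served on http://localhost:8080
pip install open-webui
open-webui serve
```

Updating is then just `pip install --upgrade open-webui` inside that environment, and removing it is deleting the venv directory.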
Vincent@feddit.nl · 7 months ago

And llamafile is a binary you can just download and run, no installation required. “Uninstallation” is deleting the file.
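To illustrate the llamafile workflow (the filename below is just a placeholder for whichever .llamafile you download, e.g. from Mozilla's builds on Hugging Face):

```bash
# Make the downloaded file executable and run it; it starts a local chat UI in your browser
chmod +x model.llamafile   # placeholder name, use your actual download
./model.llamafile

# "Uninstall" by deleting the file
rm model.llamafile
```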