petsoi@discuss.tchncs.de to Linux@lemmy.ml · 7 months ago
Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine (fedoramagazine.org)
Vincent@feddit.nl · 7 months ago
And llamafile is a binary you can just download and run, no installation required. “Uninstallation” is deleting the file.