Technology fan, Linux user, gamer, 3D animation hobbyist

Also at:

[email protected]

[email protected]

  • 1 Post
  • 7 Comments
Joined 1 year ago
Cake day: July 24th, 2023


  • Probably better to ask on [email protected]. Ollama should be able to give you a decent LLM, and RAG (Retrieval-Augmented Generation) will let it reference your dataset.

    The catch is that you asked for a smart model, which usually means a larger one, and the RAG pipeline consumes additional memory on top of that, possibly more than a typical laptop can handle. Smaller models also have a higher tendency to hallucinate, i.e. confidently produce incorrect answers.

    Short answer: yes, you can do it. It’s just a matter of how much RAM you have available and how long you’re willing to wait for an answer. A rough sketch of the flow is below.
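    To make the RAG idea concrete, here’s a minimal sketch using the `ollama` Python package (installed with `pip install ollama`, with an Ollama server running locally). This isn’t from the original comment: the model names (`nomic-embed-text`, `llama3`), the toy dataset, and the brute-force cosine-similarity retrieval are all assumptions; a real setup would typically use a proper vector store.

    ```python
    # Minimal RAG sketch: embed documents, retrieve the closest one,
    # and stuff it into the prompt as context. Model names are placeholders.
    import math
    import ollama

    documents = [
        "Blender supports hardware ray tracing on recent GPUs.",
        "Ollama runs quantized LLMs locally and exposes a simple API.",
    ]

    def embed(text: str) -> list[float]:
        # nomic-embed-text is one commonly available embedding model (assumption).
        return ollama.embeddings(model="nomic-embed-text", prompt=text)["embedding"]

    def cosine(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(y * y for y in b))
        return dot / (norm_a * norm_b)

    doc_vectors = [embed(d) for d in documents]

    question = "What does Ollama do?"
    q_vec = embed(question)

    # Retrieval step: pick the document most similar to the question.
    best_doc = max(zip(documents, doc_vectors), key=lambda p: cosine(q_vec, p[1]))[0]

    # Generation step: answer using the retrieved context.
    reply = ollama.chat(
        model="llama3",
        messages=[{
            "role": "user",
            "content": f"Answer using this context:\n{best_doc}\n\nQuestion: {question}",
        }],
    )
    print(reply["message"]["content"])
    ```

    Note that both the embeddings and the retrieved context end up in memory alongside the model itself, which is where the extra RAM cost mentioned above comes from.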