• Showroom7561@lemmy.ca
    1·
    1 year ago

    that runs 100% offline on your computer.

    Goddamn, that’s wonderful!

  • xigoi@lemmy.sdf.org
    1·
    1 year ago

    “100% Open Source”

    [links to two proprietary services]

    Why are so many projects like this?

    • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP)
      01·
      1 year ago

      I imagine it’s because a lot of people don’t have the hardware that can run models locally. I do wish they didn’t bake those in though.

  • Pasta Dental@sh.itjust.works
    1·
    1 year ago

    Having contributors sign a CLA is always very sus, and I think it’s indicative of the project owners having plans to monetize it, even though it’s currently under the AGPLv3. Their core values of no dark patterns and whatnot read like a sales argument rather than an actual motivation/principle, especially when you see that they’re a bootstrapped startup.

  • Aria@lemmygrad.ml
    0·
    1 year ago

    So what exactly is this? Open-source ChatGPT alternatives have existed before and alongside ChatGPT the entire time, in the form of downloading oobabooga or a different interface and grabbing an open-source model from Huggingface. They aren’t competitive because users don’t have terabytes of VRAM or AI accelerators.

      • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP)
      01·
      1 year ago

      It’s basically a UI for downloading and running models. You don’t need terabytes of VRAM to run most models though. A decent GPU and 16 gigs of RAM or so works fine.

    • circuscritic@lemmy.ca
      0·
      1 year ago

      Depends. Are either of those companies bootstrapping a for-profit startup and trying to dupe people into contributing free labor prior to their inevitable rug pull/switcheroo?

      • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP)
        01·
        1 year ago

        Do explain how you dupe people into contributing free labor and do a switcheroo with an open-source project. All the app does is provide a nice UI for running models.

    • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP)
      01·
      1 year ago

      Depends on the size of the model you want to run. Generally, having a decent GPU and at least 16 gigs of RAM is helpful.
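      The rule of thumb behind “a decent GPU and 16 gigs of RAM works fine” can be sketched with back-of-envelope arithmetic (an illustrative sketch, not from the thread; `model_memory_gb` is a made-up helper name, and real usage also needs headroom for the KV cache and the OS):

      ```python
      # Rough memory estimate for holding a quantized LLM's weights in RAM.
      # Illustrative only: actual runtimes need extra room for the KV cache,
      # activations, and the rest of the system.

      def model_memory_gb(params_billion: float, bits_per_weight: int) -> float:
          """Approximate gigabytes needed just to store the weights."""
          bytes_total = params_billion * 1e9 * bits_per_weight / 8
          return bytes_total / 1e9

      # A 7B-parameter model quantized to 4 bits per weight:
      print(model_memory_gb(7, 4))   # 3.5 GB -> fits easily in 16 GB of RAM

      # The same model at 16-bit precision:
      print(model_memory_gb(7, 16))  # 14.0 GB -> already tight on a 16 GB machine
      ```

      This is why quantized small-to-mid-size models run fine on ordinary hardware, while terabytes of VRAM are only needed for the largest unquantized models.
      
      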