• bizarroland@lemmy.world
    3 months ago

    If you’re like me and you work with computers for a living but don’t really want to put in the hard work of fixing computers at home, you can do what I did: download an abliterated local AI, tell it what the problem is and what specs you’re working with, and it will almost always fix it for you in like five minutes.

    And when it doesn’t fix it in five minutes, it will destroy your operating system with whatever commands it tells you to paste in a terminal, and you were going to be wiping and reinstalling it anyway, so nothing lost.

    • Twongo [she/her]@lemmy.ml
      3 months ago

      all this time spent setting up a local llm and reinstalling a whole system instead of reading the documentation 😭😭😭

      • bizarroland@lemmy.world
        3 months ago

        There are apps for it now like LM Studio.

        It makes setting up and running an LLM super easy.
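        The workflow described here — hand a local model the problem and your machine's specs — can be sketched against the OpenAI-compatible server LM Studio runs locally (by default on localhost:1234). This is only an illustration: the endpoint, model name, and prompt wording below are assumptions, not anything from the thread.

```python
import json

# Assumed default for LM Studio's local OpenAI-compatible server.
ENDPOINT = "http://localhost:1234/v1/chat/completions"

def build_repair_request(problem: str, specs: str, model: str = "local-model") -> dict:
    """Package a problem description plus machine specs as a chat request.

    The system prompt asks the model to explain commands before suggesting
    them, since pasting unexplained terminal commands is how installs die.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "You are a careful sysadmin. Propose fixes step by "
                        "step and explain any command before suggesting it."},
            {"role": "user",
             "content": f"Problem: {problem}\nSpecs: {specs}"},
        ],
        "temperature": 0.2,
    }

# Illustrative problem/specs (hypothetical, not from the thread).
payload = build_repair_request(
    "Wi-Fi drops every few minutes after suspend",
    "Fedora 40, Intel AX210, kernel 6.9",
)
body = json.dumps(payload)

# To actually send it, the local server must be running:
# import urllib.request
# req = urllib.request.Request(ENDPOINT, data=body.encode(),
#                              headers={"Content-Type": "application/json"})
# print(urllib.request.urlopen(req).read().decode())
```

        Nothing here leaves the machine unless you uncomment the send step, which is the point of running the model locally.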

        And with how fucked search is, thanks to the enshittification of the last few years, it has become rather difficult to find specific fixes for specific issues.

        All of that information has been shoved into the LLMs, and since nobody profits from you running the open-weight models, it’s not stealing your personal information and sending it off somewhere.

        It’s just a leftover from some other process that they don’t want anymore, and they’re making it available to everyone else to help drum up interest in AI shit.

        I don’t like it.

        I’m just sharing a way to still make things work in 2025.