Surprised pikachu face

    • utopiah@lemmy.world
      2 months ago

      I like Ollama and recommend it for tinkering, but I admit this “LLM Explorer” is quite neat, thanks to sections like “LLMs Fit 16GB VRAM”.

      Ollama just works, but it doesn’t help you pick which model best fits your needs.

      • Knock_Knock_Lemmy_In@lemmy.world
        2 months ago

        pick which model best fits your needs.

        What need do I have that justifies the effort of installing all this locally? Websites win in terms of convenience.

        • utopiah@lemmy.world
          2 months ago

          I don’t think I understand your point. Are you saying there is no benefit to running locally, and that websites or APIs are more convenient?

        • morriscox@lemmy.world
          2 months ago

          I want to work on my stuff in peace and in private, without worrying about a company grabbing it, using it for themselves, and giving or selling it to other outfits, including the government. “If you have nothing to hide…” is bullshit and needs to die.