• IGuessThisIsForNSFW@yiffit.net · 1 month ago

    I had a roommate who reviewed and scored responses for Google Bard. A ton of it was people generating posts for whatever business/crypto/alpha-male grift they were running. The main thing, though, was really, really specific fetish stuff.

  • tehmics@lemmy.world · 1 month ago

    90% of mine is just programming syntax. The rest is shit that Google can’t answer anymore, and the last 1% is me trying to trick it into telling me about illegal stuff.

  • MonkderVierte@lemmy.ml · 1 month ago

    I asked it yesterday why touchscreen input is separate on Linux (and needs support from each application) and how to map it to mouse input. Can’t really google that.
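
    For reference, here is a minimal sketch of one way to do that mapping in user space, assuming the python-evdev library; the device path and event codes are placeholders you would look up with evtest, and creating a uinput device usually needs root or membership in the input group:

    ```python
    # Sketch: bridge a touchscreen to a virtual relative mouse with python-evdev.
    # Assumptions: /dev/input/event5 is the touchscreen (check with evtest) and
    # you have permission to read it and to create uinput devices.
    from evdev import InputDevice, UInput, ecodes as e

    touch = InputDevice('/dev/input/event5')           # assumed touchscreen node
    mouse = UInput({e.EV_KEY: [e.BTN_LEFT],
                    e.EV_REL: [e.REL_X, e.REL_Y]},
                   name='touch-to-mouse')              # virtual mouse device

    last = {}                                          # previous ABS_X/ABS_Y values
    for ev in touch.read_loop():
        if ev.type == e.EV_ABS and ev.code in (e.ABS_X, e.ABS_Y):
            if ev.code in last:                        # re-emit movement as deltas
                rel = e.REL_X if ev.code == e.ABS_X else e.REL_Y
                mouse.write(e.EV_REL, rel, ev.value - last[ev.code])
            last[ev.code] = ev.value
        elif ev.type == e.EV_KEY and ev.code == e.BTN_TOUCH:
            if ev.value == 1:
                last.clear()                           # new contact: avoid a jump
            mouse.write(e.EV_KEY, e.BTN_LEFT, ev.value)  # touch down/up -> click
        elif ev.type == e.EV_SYN:
            mouse.syn()                                # flush the event frame
    ```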

  • 1984@lemmy.today · 1 month ago

    There are models you can download and run at home that don’t have the politically correct censorship built in. It’s very nice not to have the artificial politeness, for example, and the models actually answer your questions.

    You need a powerful computer for some of them though.

      • Sabata@ani.social · 1 month ago

        Ollama is the software, and you can pick a model that does what you want. Mistral and Llama are currently the best IMO, but it changes often.
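
        Once Ollama is running locally, talking to it is just an HTTP call to its default local port; a rough sketch, assuming you have already pulled a model (e.g. ollama pull mistral):

        ```python
        # Sketch: one-shot prompt against a local Ollama server (default port 11434).
        # The model name is whatever you have pulled locally.
        import requests

        resp = requests.post(
            "http://localhost:11434/api/generate",
            json={
                "model": "mistral",              # any locally pulled model
                "prompt": "Explain GGUF quantization in two sentences.",
                "stream": False,                 # return one JSON object, not a stream
            },
            timeout=120,
        )
        resp.raise_for_status()
        print(resp.json()["response"])           # the generated text
        ```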

      • merari42@lemmy.world · 1 month ago

        For a user without much technical experience, a ready-made GUI like Jan.ai is probably a good start: it downloads models automatically and can run them via the ggml library on consumer-grade hardware like Mac M-series chips or cheap Nvidia or AMD GPUs.

        For slightly more technically proficient users, Ollama is probably a great choice for hosting your own OpenAI-like API for local models. I mostly run gemma2 or small Llama 3.1 models with it.
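
        Because Ollama exposes an OpenAI-compatible endpoint under /v1, existing OpenAI client code can usually just be pointed at it with a different base URL; a sketch, assuming you have pulled gemma2:

        ```python
        # Sketch: use the openai Python client against a local Ollama server,
        # which serves an OpenAI-compatible API under /v1. The api_key value is
        # ignored by Ollama, but the client requires something to be set.
        from openai import OpenAI

        client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

        chat = client.chat.completions.create(
            model="gemma2",                      # any locally pulled model
            messages=[
                {"role": "user", "content": "Summarize what a context window is."},
            ],
        )
        print(chat.choices[0].message.content)
        ```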

        • 1984@lemmy.today · 1 month ago

          I was also kind of blown away by Firefox Nightly, which has a new sidebar. In that sidebar you have buttons for opening ChatGPT if you want. But that’s not the impressive part: it also lets you choose other providers like Hugging Face, so anyone can try the open models and see what they’re like without installing anything.

          Very cool.