I’ve tried several kinds of artificial intelligence, including Gemini, Microsoft Copilot, and ChatGPT. A lot of the time I ask them questions and they get everything wrong. If artificial intelligence doesn’t work, why are they trying to make us all use it?

  • Feathercrown@lemmy.world

    I find that a lot of the discourse around AI is… “off”. Sensationalized, or oversimplified, or emotionally charged, or illogical, or simply based on a misunderstanding of how it actually works. I wish I had a rule of thumb for what you can and can’t trust, but honestly I don’t have a good one; the best thing you can do is learn how the technology actually works and what it can and can’t do.

    • Kintarian@lemmy.world (OP)

      For a while now, Google has been saying artificial intelligence will revolutionize search. That hasn’t been my experience. Someone here mentioned using it on the creative side instead, and that seems to be working out better for me.

      • Feathercrown@lemmy.world

        Yeah, it’s much better at “creative” tasks (generation) than it is at providing accurate data. In general it will always be better at “fuzzy” tasks: ones without a strict measure of success or failure, where the result is open to interpretation. It will also be better at tasks where the overall output matters more than the precise details. Generating images, text, etc. is a good fit.
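
        As a rough illustration (a toy sketch with made-up tokens and numbers, not any real model’s code or API): under the hood, a language model keeps sampling the next token from a probability distribution learned from its training data. Nothing in that loop checks whether the continuation is true, which is why open-ended generation works so much better than factual lookup.

        ```python
        import random

        # Toy next-token distribution for the prompt "The cheese on pizza is".
        # The tokens and probabilities are invented for illustration; a real
        # model learns its distribution from training data, good and bad
        # sources alike.
        next_token_probs = {
            "delicious": 0.40,
            "melted": 0.35,
            "glued": 0.15,   # a joke seen in training data still gets some weight
            "quantum": 0.10,
        }

        def sample_next_token(probs):
            """Pick the next token in proportion to its probability.
            There is no step that checks whether the result is accurate."""
            tokens, weights = zip(*probs.items())
            return random.choices(tokens, weights=weights, k=1)[0]

        print("The cheese on pizza is", sample_next_token(next_token_probs))
        ```

        Swap the made-up numbers for a distribution learned from a huge scrape of the web and you get fluent, plausible text with no built-in fact check.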

        • Kintarian@lemmy.world (OP)

          That sounds about right. I heard the AI recommendation to put glue on your pizza came from a Reddit joke about keeping the cheese from falling off. So obviously the AI can’t tell a good source of information from a bad one. But, as you say, something that’s fuzzy and doesn’t need to be 100% accurate apparently works pretty well. Also, my own logic is a little fuzzy once in a while.