• xthexder@l.sw0.com
    2 months ago

    The way you phrased that perfectly illustrates the current problem AI has: in a problem space as large as natural language, there is a nearly infinite number of ways it can be wrong. So no matter how much data we feed it, there will always be some “brand new sentence” someone asks that breaks it and causes a wrong answer.

    • maniclucky@lemmy.world
      2 months ago

      Absolutely. It’s why asking it for facts is inherently bad. It can’t retain information; it’s trained to give output shaped like an answer. It’s pretty good at things that don’t have a specific answer (I’ll never write another cover letter, thank blob).

      Now, if someone were to have the good sense to add some kind of lookup that injects correct information between the prompt and the output, we’d be cooking with gas. But that’s really human-labor-intensive, and all the tech bros are trying to avoid that.
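
      The “lookup between the prompt and the output” idea can be sketched roughly as below. This is a toy illustration, not any real system: the knowledge base, the keyword-overlap scoring, and the prompt format are all hypothetical stand-ins for a proper retrieval pipeline.

      ```python
      # Toy sketch of retrieval before generation: look up a known fact,
      # then inject it into the prompt so the model isn't guessing.

      # Hypothetical knowledge base; a real system would query a database
      # or search index instead of a hard-coded dict.
      KNOWLEDGE_BASE = {
          "boiling point of water": "Water boils at 100 °C at sea-level pressure.",
          "speed of light": "Light travels at 299,792,458 m/s in a vacuum.",
      }

      def retrieve(question: str) -> str | None:
          """Crude keyword lookup: return the fact whose key overlaps most."""
          q_words = set(question.lower().split())
          best_key, best_score = None, 0
          for key in KNOWLEDGE_BASE:
              score = len(q_words & set(key.split()))
              if score > best_score:
                  best_key, best_score = key, score
          return KNOWLEDGE_BASE[best_key] if best_key else None

      def build_prompt(question: str) -> str:
          """Inject any retrieved fact between the user's question and the model."""
          fact = retrieve(question)
          context = f"Known fact: {fact}\n" if fact else ""
          return f"{context}Question: {question}\nAnswer:"

      print(build_prompt("What is the speed of light?"))
      ```

      The point of the sketch: the model only ever sees a prompt that already contains the correct fact, so it paraphrases rather than recalls. Real deployments replace the dict with curated data, which is where the human labor comes in.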