• wizardbeard@lemmy.dbzer0.com
    2 months ago

    But it is, and it always has been. Absurdly complexly layered statistics, calculated faster than a human could.

    This whole “we can’t explain how it works” is bullshit from software engineers too lazy to unwind the emergent behavior caused by their code.

    • doctordevice@lemmy.ca
      2 months ago

      I agree with your first paragraph, but unwinding that emergent behavior really can be impossible. It’s not just a matter of taking spaghetti code and deciphering it; ML usually works by learning weights in something like a decision tree, neural network, or statistical model.

      Assigning any sort of human logic to why particular weights ended up where they are is educated guesswork at best.
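      To make that concrete, here’s a toy sketch (my own illustration, not from the thread): even the smallest trainable model, a single perceptron learning logical AND, ends up as a handful of floats that “work” without encoding any human-readable rule.

      ```python
      import random

      random.seed(42)  # fixed seed so the run is reproducible

      # Toy task: learn logical AND from four examples.
      data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

      # The entire "model" is three numbers, initialized to noise.
      w = [random.uniform(-1, 1), random.uniform(-1, 1)]
      b = random.uniform(-1, 1)

      def predict(x1, x2):
          return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

      # Perceptron rule: nudge the weights a little on every mistake.
      for _ in range(100):
          for (x1, x2), y in data:
              err = y - predict(x1, x2)
              w[0] += 0.1 * err * x1
              w[1] += 0.1 * err * x2
              b += 0.1 * err

      print(w, b)  # three arbitrary-looking floats that happen to compute AND
      ```

      The trained values of `w` and `b` depend entirely on the random seed and the order of updates; two runs that both compute AND perfectly can hold completely different numbers. Scale that up to billions of weights and “why did this weight end up here” stops having a human-shaped answer.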