• doctordevice@lemmy.ca
    4 months ago

    I agree with your first paragraph, but unwinding that emergent behavior really can be impossible. It’s not just a matter of taking spaghetti code and deciphering it: ML typically works by learning numeric weights in something like a neural network or statistical model, or split thresholds in a decision tree, rather than explicit human-written rules.

    Assigning any sort of human logic to why particular weights ended up where they are is educated guesswork at best.
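    To make that concrete, here is a minimal sketch (a hypothetical toy example, not anyone’s production system): a tiny neural network trained on XOR with plain NumPy. It learns the task, but printing the weight matrices just shows arbitrary-looking floats with no human-readable logic behind them.

    ```python
    import numpy as np

    # Toy example: train a 2-4-1 sigmoid network on XOR, then inspect the weights.
    rng = np.random.default_rng(0)

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1 = rng.normal(size=(2, 4))  # input -> hidden weights
    b1 = np.zeros(4)
    W2 = rng.normal(size=(4, 1))  # hidden -> output weights
    b2 = np.zeros(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    lr = 1.0
    for _ in range(5000):
        # forward pass
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # backward pass: gradient of mean squared error
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)

    # The learned weights: rows of arbitrary-looking floats. Nothing here
    # says "this weight handles the exclusive-or case" -- the behavior is
    # emergent from all the weights together.
    print(np.round(W1, 2))
    print(np.round(out.ravel(), 2))  # the four predictions the weights produce
    ```

    Even on a four-example toy problem the weights are already opaque; scale that up to millions or billions of parameters and "why did this weight end up here" has no answer a human can verify.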