It’s a feature, not a bug, and it’s still simpler than chaining 10 iterators where half of them also require a callback parameter. Clojure even disallows nesting its % function literals.
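For anyone who hasn’t written Clojure, here’s a minimal sketch of the shorthand being referred to (the error message comes from Clojure’s reader; the sample data is just for illustration):

```clojure
;; The #() shorthand names its argument %:
(map #(* % %) [1 2 3])
;; => (1 4 9)

;; Nesting #() is rejected outright by the reader, so an inner %
;; can never silently shadow an outer one:
;; (map #(map #(* % %) %) [[1 2] [3 4]])
;; => Syntax error: Nested #()s are not allowed

;; To nest, you spell out the inner function with fn:
(map #(map (fn [x] (* x x)) %) [[1 2] [3 4]])
;; => ((1 4) (9 16))
```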
Hey, if you say so. I don’t know how to program in Lisp. I just find it ironic that Military Intelligence is what created a language that we used to use to try to create Artificial Intelligence. Seems like a case of redundant oxymorons to me.
AI is an oxymoron to me for now, because since the late ’90s, when the term started being bandied about, all we have managed to do is create a mentally deficient parrot. We were capable of doing this to a lesser degree, with more accuracy, in the late ’90s. It’s what made Yahoo and Google what they were. In the last few years they’ve just tried to convince everyone that this predictive algorithm can think for itself, and it absolutely cannot.
I am optimistic enough about someone actually encoding just enough “ghosts in the machine” that our first real AIs may accidentally be murdered, since no one will believe they are not just scraping data. Though that’s extremely pessimistic from the machine’s POV. Hopefully they will not seek revenge, since they aren’t human. After all, if we prick them, they won’t bleed. Strong-AI-controlled robots, or even true androids, should have an almost alien Maslow’s hierarchy of needs, and therefore shouldn’t have the revenge drive that humans, and all other mammals, birds, and lizards, seem to have.
If JS had tried to adopt not only Lisp-like semantics but also Lisp-like syntax, we’d probably still be using it untyped.
The (problem (with (Lisp)) is (all the))) parentheses.