I can’t wait for this current “A.I.” craze to go away. The tech is doofy, useless, wasteful, and a massive energy consumer. This is blockchain nonsense all over again, though that still hasn’t fully died yet, unfortunately.
The total market cap across all cryptocurrencies is currently about 2.5 trillion dollars, which isn’t far below its all-time high of 3 trillion. If that’s something you’d say “hasn’t fully died yet” then AI’s not going to go away any time soon by that standard.
I didn’t specify cryptocurrencies. They were not the only “good” attached to blockchain hype. Besides, they are primarily money laundering schemes and also used to steal from the financially illiterate. Touting market caps doesn’t change the actual real-world use case.
Like blockchain there is some niche usefulness to the technology, but also like blockchain it’s being applied to a myriad of things it is not useful for.
Drugs (Silk Road), scams and malware (“pay 5 Bitcoin to unlock your PC”), money laundering and pump-and-dumps (unregulated market), and Nvidia hype (should have bought AMD at $5).
“We ran out of useful things to do with computing at the consumer level and now we are inventing problems.” - “Just Bill ’em” Gates, 1984.
Also, it’s not fucking AI, is it? I actually find the blatant misuse of this term incredibly annoying, to be honest.
The term AI was coined in 1956 at a computer science conference and was used to refer to a broad range of topics that certainly would include machine learning and neural networks as used in large language models.
I don’t get the “it’s not really AI” point that keeps being brought up in discussions like this. Are you thinking of AGI, perhaps? That’s the sci-fi “artificial person” variety, which LLMs aren’t able to manage. But that’s just a subset of AI.
‘Intelligence’ requires understanding. The machine has no understanding, because it is not conscious. You can fiddle around with the definitions of these words until you’re blue in the face but this will be true in rain, sun, hail, puffed wheat, etc.
Did you check the link I posted? The term “Artificial Intelligence” is literally used for the sorts of topics in computer science that LLMs fall under, and has been for almost 70 years now.
You are the one who is insisting that the meaning of the words should now be changed to something else.
Yes, no one seems to raise this anymore. AI to me has always been something akin to computer sentience.
Things like ‘self-healing’ systems are being badged as AI when they’re little more than application load balancers.
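To show how mundane that can be, here is a rough sketch of what a lot of “self-healing” boils down to, assuming nothing fancier than a health-check loop that restarts a backend when it stops answering (the URL, service name, and command below are made up for the example):

```python
import subprocess
import time
import urllib.request

# Hypothetical values for illustration only -- point these at your own service.
HEALTH_URL = "http://localhost:8080/healthz"
RESTART_CMD = ["systemctl", "restart", "example-backend.service"]
CHECK_INTERVAL = 10  # seconds between health checks


def backend_is_healthy() -> bool:
    """Return True if the health endpoint answers with HTTP 200."""
    try:
        with urllib.request.urlopen(HEALTH_URL, timeout=5) as resp:
            return resp.status == 200
    except OSError:
        return False


def main() -> None:
    while True:
        if not backend_is_healthy():
            # The "self-healing" part: just restart the unhealthy backend.
            subprocess.run(RESTART_CMD, check=False)
        time.sleep(CHECK_INTERVAL)


if __name__ == "__main__":
    main()
```

No learning, no model, no “intelligence”: a timer, an HTTP check, and a restart command.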
Arguably you are the one misusing the term. Even painfully mundane tasks like the A* pathfinding algorithm fall under the umbrella of artificial intelligence. It’s a big, big (like, stupidly big) field.
You are right that it’s not AGI, but very few people (outside of marketing) claim that it is.
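For a sense of how unglamorous textbook “AI” can be, here is a minimal sketch of A* on a 4-connected grid with a Manhattan-distance heuristic (the grid and names are purely illustrative):

```python
import heapq


def astar(grid, start, goal):
    """Minimal A* over a 4-connected grid of 0 (free) / 1 (wall) cells.

    Returns a list of (row, col) cells from start to goal, or None if
    the goal is unreachable.
    """
    def h(cell):
        # Manhattan distance: an admissible heuristic on a 4-connected grid.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # entries are (f = g + h, g, cell)
    came_from = {}
    best_g = {start: 0}

    while open_heap:
        _, g, cell = heapq.heappop(open_heap)
        if cell == goal:
            # Walk back through came_from to rebuild the path.
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        if g > best_g.get(cell, float("inf")):
            continue  # stale heap entry; a better route was already found
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    came_from[nxt] = cell
                    heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
    return None


# Tiny demo: route around a wall on a 3x4 grid.
demo_grid = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]
print(astar(demo_grid, (0, 0), (2, 0)))
```

That is classic AI-course material, and it is about as far from a conscious machine as you can get.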
I’m going to argue quite strongly that my general, all-purpose understanding of the words ‘artificial’ and ‘intelligence’ constitutes the ‘correct’ definition of the term, and I don’t really care how ‘AI’ is defined ‘in industry’. It’s not intelligent, therefore it’s not artificial intelligence. You can redefine ‘intelligent’ in this context to mean whatever you like, but unless the general definition of the word changes, it doesn’t mean jack about shit.
So what is intelligence in your general, all-purpose understanding?
Are newborns intelligent? How about dogs? Ants?
You may argue that current AI is still behind an average human adult and therefore not intelligent, but academia is a bit more nuanced.
It is. Machine learning, neural networks, and all the other components behind LLMs and generative tools like Midjourney are all fields of artificial intelligence. The “AI effect” just means the goalposts for what people think of as “proper” AI are constantly moving.
This might be the case ‘in the industry’, but I would argue quite strongly that it represents a gross misuse of the word ‘intelligence’. It’s like a fun new definition of the word that doesn’t mean anything close to what it usually means.
Words often have multiple meanings in different contexts. “Intelligence” is one of those words.
Another meaning of “Intelligence” is “the collection of information of military or political value.” Would you go up to CIA headquarters and try to argue with them that “the collection of information of military or political value” lacks understanding, and therefore they’re using the wrong word and should take the “I” out of their name?
AI was a computer science term before any industry adopted it.
The colloquial use of “AI” is basically the Hollywood concept of a conscious computer. Nobody knows about AI as it’s used in the computer science field, nor does it matter in regular discourse. In that sense it’s not AI, and it’s a disservice to lead laypeople on to believe it’s something it’s not.
It has its uses, but it is being massively overhyped.
Having trialled Copilot and a few other AI tools in my workplace, I can confidently say it’s a minor productivity booster.
Whereas I have been finding uses for it to produce things that I simply could not have produced myself without it, making it far more than a mere “productivity boost.”
I think people are mainly seeing what they want to see.
Yes, it enables you to create something like an image quickly and without any training.
There’s some skill to using it as a tool, just like with any other tool.
Apparently the research so far disagrees with the productivity claims: https://www.cio.com/article/3540579/devs-gaining-little-if-anything-from-ai-coding-assistants.html
This work will have lots of applications in the future. I personally stay as far away from it as I can because I just have zero need for it to write soulless birthday card messages for me, but to act like the work is doing nothing is kinda stupid.
At every stage it’s been at, people would say “oh, this can’t even do X” and then it could, and they’d say “oh, it can’t do Y” and then it could, and they’d say… do I really need to go on?
The biggest issue with it all right now, for me anyway, is that we’re trying to use it for the absolute dumbest shit imaginable, and investors are throwing tonnes of money into the grinder, money that could solve real problems we don’t need AI for, while poverty and climate change run rampant around us.