Does AI actually help students learn? A recent experiment in a high school provides a cautionary tale.
Researchers at the University of Pennsylvania found that Turkish high school students who had access to ChatGPT while doing practice math problems did worse on a math test compared with students who didn’t have access to ChatGPT. Those with ChatGPT solved 48 percent more of the practice problems correctly, but they ultimately scored 17 percent worse on a test of the topic that the students were learning.
A third group of students had access to a revised version of ChatGPT that functioned more like a tutor. This chatbot was programmed to provide hints without directly divulging the answer. The students who used it did spectacularly better on the practice problems, solving 127 percent more of them correctly compared with students who did their practice work without any high-tech aids. But on a test afterward, these AI-tutored students did no better. Students who just did their practice problems the old-fashioned way, on their own, matched their test scores.
I mean, is it really that surprising? You're not analyzing anything; an algorithm just spits text at you. You're not gonna learn much from that.
You could always try reading the article.
In the study, they said they used a modified version that acted as a tutor, one that refused to give direct answers and gave hints toward the solution instead.
That’s like cheating with extra steps.
Ain't getting hints on your in-class exam.
So it’s still not surprising since ChatGPT doesn’t give you factual information. It just gives you what it statistically thinks you want to read.
Which, in a fun bit of meta, is a decent description of artificial “intelligence” too.
Maybe the real ChatGPT was the children we tested along the way