Has reliance on technology made us dumber?

I was driving home from Palo Alto to San Francisco, a trip I had made dozens of times. Only this time I faced a problem: a phone with no power, and a trip without GPS. I missed my exit and got hopelessly lost on streets less than a mile from my home. How embarrassing: I claim to love this city, and yet at that moment I felt I hardly knew it. Suddenly deprived of my technology, I couldn’t find my way because I had never really had to learn it.

I’m not arguing against using GPS. But I mention it to demonstrate that efficient technology can be a barrier to learning. Only through effort and repetition, without shortcuts, can we retain truly useful knowledge.

Much has been written about GPT-3, one of the world’s most advanced artificial intelligence systems. It can do things that would have been considered science fiction just a few years ago, such as creating realistic-sounding articles or translating between languages it has never seen before. It does this by learning from a large amount of text and then making predictions based on that data.
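That core trick, predicting what comes next based on patterns in text the system has already read, can be illustrated with a toy sketch. To be clear, this is not GPT-3, which uses a huge neural network trained on vast corpora; it is a minimal word-pair model over an invented corpus, showing only the same broad idea: learn from text, then predict.

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count which word follows which in a tiny
# "training" text, then predict the most frequent follower.
corpus = "the cat sat on the mat and the cat ate".split()

follow = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    follow[cur][nxt] += 1

def predict(word):
    # Return the word most often seen after `word` in the corpus.
    return follow[word].most_common(1)[0][0]

print(predict("the"))  # "cat" — it followed "the" twice, "mat" once
```

Real language models replace the raw counts with learned weights and look at far more than one preceding word, but the objective is the same: guess the next token.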

(It also wrote that last paragraph with only the prompt “Much has been written about GPT-3”. I’d like to think I’d never deign to use that writing cliché, “like science fiction.”)

This type of AI-generated text is making waves in academia. It is a tipping point, and we should be cautious about how we proceed. A recent Vice article detailed how a community of students used GPT-3 (and other similar AI writing tools) to do the grunt work of writing essays, add context, and save time. Because the text generated by the AI was “unique,” it allowed students to bypass anti-plagiarism detection software. “I only use AI to do the things I don’t want to do or that I find meaningless,” said one student.

Is the student cheating? You could argue it convincingly both ways. It may be easier to ask whether the student is cheating himself, to which the answer is certainly yes. The things students don’t want to do are precisely what reinforce learning: writing, revising, recalling, over and over again.

Practice makes perfect. We’ve all heard of the “10,000-hour rule” (the amount of intense practice it supposedly takes to master something), but we have many ways of saying the same thing: repetition is retention, and retention is learning and mastery.

Hermann Ebbinghaus, the psychologist who studied the benefits of repetition, illustrated this with his “forgetting curve”, showing how knowledge fades over time if it is not consciously recalled, and with “spaced repetition”, reviewing material at regular intervals. His work has influenced the way we learn for more than a century. It is the difference between becoming an expert and merely passing an exam. Does a student deserve an “A” if the algorithm does the legwork? He or she ends up no more knowledgeable about the subject than I was about those streets on my drive home.
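Ebbinghaus modelled retention as decaying roughly exponentially with the time since study, something like R = e^(−t/S), where S is a “stability” that grows with each review. A small sketch, using illustrative parameter values of my own choosing rather than Ebbinghaus’s data, shows why spaced review works:

```python
import math

def retention(t_days, stability=2.0):
    # Exponential forgetting curve: R = e^(-t/S).
    # The stability S (in days) is an illustrative assumption;
    # each successful review effectively increases S.
    return math.exp(-t_days / stability)

print(round(retention(1), 2))                   # 0.61 — one day later
print(round(retention(7), 2))                   # 0.03 — a week later, unreviewed
print(round(retention(7, stability=14.0), 2))   # 0.61 — a week later, after reviews
```

Without review, recall collapses within days; repeated review raises the stability so the same week’s gap barely dents it.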

Experts, too, bluntly warn about the limits of today’s AI. Nathan Baschez, creator of Lex.Page, a word processor that lets you invoke GPT-3 to complete your sentences, told me it should be used with great caution in “high stakes” settings like journalism or academia.

“GPT-3 can just make up facts that aren’t true and say other things that are nonsense,” he said. But it only gets better. It’s always learning. Are we?

Dave Lee is the FT’s San Francisco correspondent. Follow @FTMag on Twitter to find out about our latest stories first
