I’m a big fan of classic science fiction. I generally avoid dystopian themes, but some are just too good to ignore, from A Boy and His Dog to The Hunger Games. When ChatGPT started getting all that popular press a few years back, I was looking forward to the shiny future promised by Heinlein, Asimov, Clarke, and Roddenberry finally coming true, maybe even a flying car (the current prototypes still aren’t there yet, BTW). But the news of the last few years has had more Brave New World and 1984 vibes.
So when I read a recent NPR report on AI in schools, it felt like another example of how we are engineering frustration out of the human experience. The report describes software that is so sensitive to a student’s frustration that it pivots the curriculum before they even have a chance to get annoyed. On paper, it is a triumph of user experience; in practice, it might be a silent deletion of the very thing that makes a mind grow.
The Lesson of the Eloi
When H.G. Wells sent his Time Traveller into the year 802,701, he didn’t find a high-tech utopia or a charred wasteland. He found the Eloi: beautiful, peaceful, and intellectually vacant creatures living in a world of total automation.
Wells’ speculation in The Time Machine hits quite close to home in the age of generative AI:
“Strength is the outcome of need; security sets a premium on feebleness.”
The Eloi weren’t born “slow” because of biology. They were essentially optimized into that state by an environment that removed every possible hurdle. They had won the game of civilization so thoroughly that they lost the ability to play it.
The parallel to AI-driven education isn’t that the technology is failing, but that it is succeeding too well. If the machine handles every productive struggle (sensing your confusion and immediately smoothing the path), it isn’t really teaching you; it is doing the mental heavy lifting on your behalf. You don’t get stronger by watching your trainer lift the weights, even if the trainer is a hyper-personalized LLM.
The Mirror of “Useful” Atrophy
It isn’t just about the classroom; AI is becoming a universal solvent for friction. History suggests that when we remove friction, we usually lose the muscle that was meant to overcome it.
- The GPS Effect: We traded the frustration of paper maps for a blue dot that tells us where to turn. The result is that our internal spatial awareness is basically a legacy system. We can get anywhere, but we often have no idea where we are.
- The Calculator Trade-off: We offloaded long division to a chip. This was a fair trade for most, but it established the precedent: if a machine can do it, the human brain is officially off the clock for that specific skill.
- The Infinite Search: We stopped memorizing facts once we could treat our devices as an external hard drive for our memories.
Not all of that has been a bad thing, unless we end up living through one of those post-EMP stories (which I avoid reading, lest I remember they aren’t that far-fetched). I, for one, am glad Einstein said “Never memorize something that you can look up,” because rote memorization is a struggle for me. But I really do enjoy exercising mental muscle memory, and that is where using AI the wrong way leads to an atrophy that doesn’t need a major solar event to make us realize things went too far. AI doesn’t just provide answers; it simulates the thinking.
The Verdict: Designing for Resistance
We should be optimistic about AI’s potential to amplify us, but we have to be wary of the passenger mindset. If we use these tools to abolish difficulty, we aren’t empowering ourselves. Instead, we are prepping for a very comfortable life as Eloi.
The challenge for educators, and for anyone using an AI “intern” in their daily workflow, is to intentionally design productive friction back into the system. We need AI that makes the work more meaningful and not just more invisible.
Mastery requires resistance. If the road is perfectly flat and the bike pedals itself, you aren’t traveling; you are just being delivered.
© Scott S. Nelson