Does use of artificial intelligence make us “dumber”? More to the point, does it diminish thinking skills? And, if so, in what instances? And how much depends on human agency: on how people decide to use tools?
To be sure, any augmentation of human capabilities by technology (muscles, thinking, sight, hearing, speech) has effects. When calculators became widespread, arithmetic proficiency plausibly declined.
Perhaps use of GPS navigation has weakened our spatial reasoning and mental mapping abilities.
Steam power and electrification didn't just replace muscle; they eliminated entire categories of physical labor and the skills associated with them.
But then human effort climbed the abstraction ladder. Instead of knowing how to mill grain by hand, we developed agricultural engineering, supply chain logistics, and food safety science. The cognitive overhead that would have gone into mastering physical crafts redirected toward managing increasingly complex systems.
The core issue is the overall impact on human agency.
AI use for writing is arguably about thinking, not just output. Working to articulate an argument is itself an act of thinking. Writing clarifies muddled ideas, reveals contradictions and forces precision, which is why, even before AI text generation became widespread, I used to argue that people who do not write well do not think well.
Doing long division by hand doesn't help you understand mathematical concepts better than using a calculator. It's just slower.
But working to structure an essay or find the right word arguably is learning, not just a means to it.
So human agency is what matters. If we simply outsource composition itself to AI, we lose a chance to develop thinking skills.
On the other hand, if AI handles the mechanical aspects (grammar, basic structure, first-draft generation) while humans focus on higher-order concerns, we might see a net gain (creative insight, critical thinking).
Our use of personal computers, word processors and cloud-based information arguably led to:
A decline in handwriting quality and knowledge of cursive
Easier revision, but also
Less ability to compose “on the fly,” in real time, without extensive revision
Those might be called “costs” or “losses,” balanced by potential gains in other areas:
Easier or more-ambitious experimentation with style and form
Easier information access and depth
Less mechanical burden and more emphasis on ideas
The key variable was *how people used the tools*. Someone who uses a word processor as a crutch to avoid thinking hard about structure produces worse writing. Someone who uses it to create clearer structure or incorporate research more fluidly can produce better work.
With calculators, PCs, the internet and steam engines, humans retained clear agency over what problems to solve and how to tackle them. The tools executed human-designed solutions.
AI writing tools can obscure where human thinking ends and machine generation begins. If you prompt an AI to “write an essay arguing X,” you've outsourced not just execution but problem identification, reasoning and idea development.
But AI also can be used to brainstorm, explore ideas or generate alternative ways of expressing a concept. That arguably is more analogous to how we used previous cognitive tools.
The “repurposing toward higher-level skills” argument might be highly contextual, depending on:
Whether lower-level skills are truly separable from higher-level ones (more true for calculation, less true for writing-as-thinking)
Whether users maintain agency over problem definition and solution strategy
Whether we repurpose time and effort toward outcomes AI can't easily replicate
Human attention freed from drafting mechanics could enable deeper research, more creative synthesis, better audience analysis and more ambitious intellectual projects.
In my own work, AI lets me ask bigger questions in areas outside my existing domain, questions I wouldn’t have bothered to pursue in the past because the research would have taken too long.
Which outcome prevails likely depends less on the technology itself and more on how we use it.