One frequently hears worries that the use of artificial intelligence will diminish human critical thinking skills. The fear is at least partly warranted, to the extent that critical thinking is said to involve the ability to evaluate sources and assess the reliability, truthfulness, or bias of information.
And, at least for the moment, users often do not have access to the full range of sources or reasoning that an AI engine uses to derive an answer to a question.
But many of us might counter that, in real life, humans often do not seem to use much critical thinking to begin with.
If artificial intelligence emerges as a general-purpose technology that transforms work and the economy in major ways, then AI arguably will also change what we need from our education systems.
If "learning how to learn" once meant mastering the acquisition, organization, and synthesis of information through traditional human effort, that focus must change if AI can handle most of those tasks, much as search now replaces “going to the library.”
In the age of AI, "learning how to learn" shifts from content mastery to process mastery and other skills, such as vetting sources and determining the reliability or bias of information.
The core goal of human education is no longer to create a knowledge repository but to learn how to think critically. Increasingly, AI handles the heavy lifting of information retrieval and drafting, freeing people to frame questions, critique answers, and create something new.
The most valuable skill is no longer solving a textbook problem, such as how to optimize a supply chain, but evaluating novel, ambiguous, and human-centric problems, such as the environmental costs, social costs, and other externalities that AI models overlook.
Many will argue that the capacity for empathy, leadership, collaboration, and trust-building remains essential for nearly all high-value work and social functioning; that might mean, for example, teaching students empathetic communication skills and practices.
On the other hand, content mastery still has value, particularly for critical thinking. It is widely accepted that some content mastery is required for sophisticated prompt engineering, since the quality of an AI's output is limited by the quality of the human's input.
In other words, domain-specific knowledge still matters.
Beyond that, if people do not wish to think, AI won't make much of a difference.