Using GenAI could lead to changes in critical thinking, says new study.
The paper adds to a wealth of other research that looks at the effects of GenAI on cognitive processes such as memory and creativity, raising further questions about how we employ GenAI both in the workforce and in education.
Researchers recruited 319 white-collar workers (termed “knowledge workers”) who regularly used generative AI. Participants were asked to describe three real examples of using generative AI to complete a task, and to share how they used critical thinking in the process.
Researchers found that the more confident a user was in the GenAI tool and/or the less confident they were in their own skills and abilities, the less critical thinking they employed.
Critical thinking was defined according to an existing framework, Bloom’s taxonomy, which describes six types of learning: knowledge, comprehension, application, analysis, synthesis and evaluation.
The paper also found that when a task was considered “low stakes”, respondents reported using less critical thinking.
“While AI can improve efficiency, it may also reduce critical engagement, particularly in routine or lower-stakes tasks in which users simply rely on AI, raising concerns about long-term reliance and diminished independent problem-solving,” reads the paper.
“Knowledge workers’ trust and reliance on GenAI doing the task can discourage them from critically reflecting on their use of the tools. Users often adopt a mental model that assumes AI is competent for simple tasks… This mental model, however, can lead to overestimating AI capabilities.”
Researchers found that users who had previously had a positive experience with a GenAI tool, such as receiving an accurate or high-quality answer, were more likely to trust GenAI and therefore less likely to employ critical thinking skills.
Worryingly, some respondents believed information provided by GenAI was always accurate and of high quality.
Some respondents who reported high trust in the GenAI also reported low self-confidence.
“This self-doubt led them to accept GenAI outputs by default.”
The paper also noted that although respondents perceived GenAI to be competent at low-stakes tasks, and therefore to require less oversight in these instances, “it is risky for users to only apply critical thinking in high-stakes situations.
“Without regular practice in common and/or low-stakes scenarios, cognitive abilities can deteriorate over time, and thus create risks if high-stakes scenarios are the only opportunities available for exercising such abilities.”
Additionally, GenAI was found to affect how users employed critical thinking. In the knowledge and comprehension domains, critical thinking shifted from gathering information to verifying it.
When applying knowledge, users employed critical thinking to integrate AI responses appropriately, rather than to solve the problem themselves.
For the analysis, synthesis and evaluation aspects of critical thinking, users engaged in task stewardship, or oversight of the task, rather than executing these aspects directly.
Researchers posited that one possible explanation is that using GenAI leads users to “underinvest” in critical thinking. However, the study did not establish causation (i.e. that using GenAI necessarily leads to diminished critical thinking), and the authors note that “it is possible that fostering workers’ domain expertise and associated self-confidence may result in improved critical thinking when using GenAI.”
Another interpretation of the results is that GenAI use may be a form of “cognitive offloading”, where users depend on AI for tasks in which they lack confidence.
“Confidence in AI is associated with reduced critical thinking effort, while self-confidence is associated with increased critical thinking effort.”
This year, UNESCO dedicated the International Day of Education to the theme of artificial intelligence, with the aim of generating conversations about AI and its uses in education.
In high-income countries, more than two-thirds of secondary school pupils are using GenAI to produce work, and teachers are using the technology to help with marking, lesson preparation, and administrative tasks like sending emails. Still, according to a UNESCO survey, only 10 percent of schools and universities have an official framework for using AI. Additionally, more schools and countries are implementing technology ban policies, like New Zealand’s recent mobile phone ban in classrooms.
Although there is widespread hope that AI could improve the education sector, for example by providing classroom tools such as chatbots that offer targeted learning support and by lightening teacher workload, there are also valid concerns about the ethics of using AI in the classroom and its implications for teaching and learning.