Does it matter if students do tests on computers or on paper?
Australian researchers ask whether online tests make a difference to student performance - could technology be influencing exam results?
Australian students are increasingly taking tests on computers. This includes major tests used to check national progress on literacy and numeracy.
The idea is this prepares students “for the future”, because “technology is everywhere”.
But as our two recent studies suggest, the way students respond to test questions on computers may not be the same as on paper.
This is a particular issue amid concern over the latest round of NAPLAN results, which appear to show too many Australian students are not learning basic skills in English and maths. NAPLAN (for Years 3, 5, 7 and 9) has been fully online for two years.
Our research
In our recent study, we reviewed 43 studies comparing tests on computer and paper. This included research from 18 different countries (including Australia, the United States, Germany and the United Kingdom). Fourteen of these studies focused on school-aged children.
In general, the studies showed that for younger school students (who had less developed computer skills), test scores tended to be higher when the tests were done on paper. This effect dropped off as students got older.
We also found that when it came to computer testing, scores were lowest when students needed to answer complex questions involving multiple steps.
This is due to the demands placed on working memory (the part of your thinking that allows you to hold onto multiple pieces of information at one time – for example, a list of names and coffee orders). When working memory has too many pieces of information at once, we experience “high cognitive load”.
Students may experience this if they are unfamiliar with a particular computer, program, testing platform or browser.
Students may also experience high cognitive load when the questions they answer become more complex. Not only are they working out the answer, but they are working out how to use the computer (or reminding themselves how to use it) at the same time.
Comparing students on paper and on a computer
We also saw this phenomenon at work in our own 2023 study, even when students were well into high school and familiar with the computers used in a science test.
We compared test scores on computer-based and paper-based tests for Year 9 students. This study involved 263 science students from two schools in Perth, where students learn using their own devices. Within this sample, there were 14 individual classes taught by seven different teachers.
Students completed one test on their own computer and another (featuring very similar questions) on paper. We categorised the questions in each test as “easy” or “hard”.
When students answered easy questions, they achieved higher scores (by about 7%) on the computer-based assessment. When students completed hard questions, they performed better (by about 12%) on the paper-based assessment.
This suggests the computer mode adds to the cognitive load students experience when answering questions. This is a bit like the way a computer’s memory might become overloaded if you run too many programs at the same time, and it slows down and doesn’t perform as well.
This finding is similar to that of a 2018 study that looked at the verbal skills of a group of children aged between four and 11.
What about working memory capacity?
In our study on Year 9 students, we also tested students’ working memory capacity, by giving them increasingly long lists of numbers to remember.
We then controlled for this using statistics. This allowed us to compare the computer and paper test scores as if all students had the same working memory capacity. Under these conditions, we found there was no difference in test scores between paper and computer.
This suggests students with lower working memory capacities are most disadvantaged by computer-based tests. People with attention-deficit hyperactivity disorder (ADHD) are one group that particularly struggles with working memory. We know there will typically be one or two students per classroom who have ADHD.
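For readers curious what "controlling for" working memory looks like in practice, the sketch below shows a generic covariate-adjusted comparison using an ordinary regression. The data, variable names and effect sizes are entirely made up for illustration; this is a common way of implementing such an adjustment, not the study's actual model or dataset.

```python
# Illustrative only: simulated data showing how a covariate-adjusted
# (ANCOVA-style) comparison works. None of these numbers come from the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 260

# Hypothetical variables: working-memory capacity and test mode (0 = paper, 1 = computer)
wm = rng.normal(5, 1.5, n)
mode = rng.integers(0, 2, n)

# Simulate scores where the computer mode costs more for students with lower working memory
score = 60 + 4 * wm - 10 * mode + 1.5 * mode * wm + rng.normal(0, 5, n)
df = pd.DataFrame({"score": score, "mode": mode, "wm": wm})

# Raw comparison: average score by test mode, ignoring working memory
print(df.groupby("mode")["score"].mean())

# Adjusted comparison: the 'mode' coefficient estimates the paper-versus-computer
# difference for students with the same working memory capacity
model = smf.ols("score ~ mode + wm", data=df).fit()
print(model.params)
```

Holding the covariate fixed in the model is what is meant above by comparing scores as if all students had the same working memory capacity.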
What can we do differently?
Computers of course have an important role to play in education, and are powerful learning tools. But our research shows taking a test on a computer is not the same as taking the same test on paper. Schools should consider:
giving students extra working time when completing complex tasks or tests on a computer
teaching students word processing skills from an early age to increase their ability to type and navigate computer programs
minimising any digital distractions, either during tests or during class work. This includes pop-ups, multiple tabs and online games.
Additionally, families should think about providing everyday opportunities at home for younger children to learn to type (such as emails, messages and shopping lists). This will help to build their skills and confidence with keyboards and computers.
Peter Whipp also contributed to the research on which this article is based.
James Pengelley, Adjunct Lecturer, School of Education, Murdoch University; Anabela Malpique, Senior Lecturer in Literacy, Edith Cowan University, and Nina Rovis-Hermann, Lecturer in Education Psychology, Murdoch University