By Henry Kronk, eLearningInside News. This article was originally posted at news.elearninginside.com
A recently published study found that 10-year-old students tended to score worse on reading comprehension tests when they took them on a screen rather than on paper. The effect, surprisingly, was most pronounced among high-achieving girls.
Researchers Hildegunn Støle, Anne Mangen, and Knut Schwippert from the University of Stavanger in Norway tested over 1,000 students on their reading comprehension using both digital and paper methods. Their study, “Assessing children’s reading comprehension on paper and screen: A mode-effect study,” was published in Computers & Education on February 28.
Some researchers are beginning to agree that learners do better with print versus screens
Investigators have been looking into the effects of screens versus paper and other traditional learning materials for decades at this point. But most research has focused on older students and young adults. What’s more, it has for the most part delivered inconsistent results. Many independent studies have found learners to perform better on paper. Others have found more positive results with screens. Many studies have observed no statistically significant differences.
But a 2018 meta-analysis provided clarity on the issue. Not only did this group find that most students tended to test better with paper, they actually witnessed the effect of paper over screens increase between 2000 and 2017. In other words, learners testing more recently benefited more from paper than their peers did fifteen years ago.
Although learners aged 10 and younger frequently study on screens and digital devices, fewer studies have examined the mode effect (paper versus screen) in this age group. Those that exist have also arrived at conflicting conclusions.
To further investigate the subject, Støle et al. delivered Norway’s 2015 National Reading Test to a group of 1,139 students via both paper and computer. The national standardized test was officially digitized in 2016.
More learners perform better on paper than screens
When the results were in, the researchers had a detailed picture of how a large body of students performed on a reading comprehension test in both modes.
Using the test’s point system, 347 students scored at least one point higher on the paper test, and 26 of them scored two points higher or more. In contrast, 155 learners scored at least one point higher on the computer test, with 12 scoring two points higher or more. Meanwhile, 599 students performed the same on both.
Breaking the learners down into achievement groups revealed a further effect. The researchers had hypothesized that the difference between paper and screens would be smaller among higher-achieving students and more pronounced among lower achievers. “It is conceivable that good readers are relatively better at understanding texts presented in diverse media than poor readers are,” they write.
The effect was felt most by high-performing girls
However, this hypothesis was not supported. While more learners in every quartile performed better on paper, the effect was most pronounced among the top 25% of test takers. When comparing the mode effect in boys versus girls, there was no statistically significant difference—except, again, among top-performing girls.
The researchers have a few ideas about why paper might be preferable to screens. On paper, the position of text is fixed. On a screen, reading can involve scrolling, making it harder to relocate information that was previously read.
As the researchers write, “In the user test, we observed a number of children tracing text lines with a finger on their PC screens. This reading behavior was also found among college students of c. 19 years of age taking a reading comprehension test on computers … For some children in our user test, this behavior carried over to a medium (computer) for which it may not be very efficient, as it likely slows down reading. It might further indicate that some children find it difficult to perform concentrated reading on screens.”
As the authors conclude, “Our results show that 10-year old children across levels of reading competence, in average performed significantly better on a reading test presented on paper than on screen. We did not find that [computer-based assessment] affected poor readers more than advanced readers. On the contrary, our results revealed that the mode effect (in favor of paper) was largest for high-performing girls. This finding occurs in spite of Norwegian children’s plentiful access to and experience with digital devices and the internet.”
eLearning Inside News announces Op-Ed Section
Starting today, eLearning Inside News will be accepting submitted opinion articles from eLearning and EdTech stakeholders.
Who can submit?
We welcome Op-Ed submissions from anyone who has a stake in education technology and eLearning initiatives. Maybe you’re the CEO of an established EdTech development company. Maybe you’re a 9th grade student with a unique take on your history teacher’s use of your school’s LMS. Maybe you’re a parent, teacher, EdTech researcher, administrator, IT expert, community organizer, or non-profit leader. If you have interacted with education technology in some way, we want to hear from you.
The ideal eLearning Inside News Op-Ed:
- Takes a strong stance on a relevant topic relating to EdTech or eLearning
- Backs up this strong stance with some form of reliable evidence
- Is between 500 and 1,000 words in length
- Prioritizes opinion over self-promotion
It’s fine to include links to sites and initiatives (and we would love it if you linked to our post as well), but these articles should first and foremost engage with ideas, topics, trends, and the EdTech community.
Send your Op-Ed submissions to [email protected] with the subject line: Op-Ed Submission.