In 2018, the OECD put 15-year-olds from its 36 member countries (not all of them, more on that later) plus 43 invitees through a test measuring math, reading and science. One year later, the results of the 7th run in 20 years are in, and not many are happy. Perhaps with the exception of Estonia.
Countries, educators and the media are obsessed with the rankings. In a way, this is by design, as PISA is run with the express intent of leveraging peer pressure. But underneath these figures, an interesting landscape of social contexts and politics makes for a rich and layered story of educational attainment, even if it informs policy to a much lesser extent than desired.
So what is PISA, and what does it mean for your country and your practice?
PISA scores are an indicator (and we all know what happens with indicators)
“Some people argued that the PISA tests are unfair (…) But then life is unfair”
— PISA Director Andreas Schleicher
Past LMSPulse readers will remember Mr. Goodhart and the law that bears his name. In education, an indicator is almost always a symptom of larger, complex phenomena. Focusing on increasing an indicator often leads to effective efforts at manipulating the indicator, to the point where it no longer realistically represents the phenomenon.
PISA was built as an enhancement of a previous indicator: Years of schooling. As the Insights and Interpretations brochure recalls, schooling years “are not reliable indicators of what people actually know and can do.”
Perhaps the takeaway is that indicators have a shelf life. In the case of PISA, the question is whether, 20 years later, it is still relevant or should give way to a more encompassing metric.
Singapore still an international reference, but Estonia gets all the laurels
But which one offers the best lesson on how to deliver top-worthy innovation? Hard to say, unless the answer is increasing per-student spending. (And you don’t need PISA to know that.)
In fairness, the Singapore example does appear replicable, and in fact many countries have sought to adopt its “Mastery Method.” There’s a bit of a fight over whether the Singapore way is originally the Shanghai way. In any case, it’s still worth wondering why countries that adopted these methods haven’t risen to the top yet.
Disappointed, embarrassed, heartbroken, unsurprised: Kids in The Philippines, Dominican Republic, Kosovo half as capable
I would not advise you to check out local social media reactions in the worst-performing countries. (Spoiler alert: Outcry, politicians rationalizing it.) There’s no denying there are acute educational problems in most countries, but these do not account for the apparent flaws in how the tests are administered.
The Philippines is at the absolute bottom in per-student expenditure, which explains its low scores but highlights issues in the Dominican Republic, whose spending is similar to that of decent performer Ukraine. It’s advisable to make sure your students don’t go hungry. Beyond that, spending efficacy is worth considering.
The report recognized Albania, Moldova, Peru and Qatar as the countries with the largest gains. Some countries failed to keep an upward trend, with at least 8 of them showing poorer performance over the years. Korea, the Netherlands and Thailand have “increasingly negative” trends; Australia, Finland, Iceland and New Zealand show “steadily negative” trends; and Sweden is also worsening, though less than before.
Estonia pulled a top ranking despite spending 3% of GDP on education. Australia spends 3.9%, and its spending per student is higher, as are its teachers’ salaries. What gives? Estonia does spend more than Australia on education on a GDP per capita basis, but then again so do Korea and the UK.
Some countries test all their children, some countries pick their testers
You might think this form of selective sampling is an attempt by low performers to inflate their scores. You are not wrong. Yet it is, in a way, justifiable, as it tries to compare educational efforts while staying aware of, but ultimately not prey to, the socioeconomic realities in which they are designed and implemented. If you remain unconvinced, I don’t blame you.
A requirement to take the test as a 15-year-old is to have been enrolled in school since age 7. This alone imposes a great restriction on countries with low early enrollment. However, the selective sampling does not appear to manipulate a country’s standing significantly, perhaps with the exception of China. The most populous country on earth only submits results from its wealthiest cities, and there may be evidence of screening out low performers even within those metro areas.
It’s really, really hard to make past outcomes comparable
The PISA test undergoes significant changes every time it’s administered, casting serious doubt on how much past rankings can tell us about change over time. For starters, this round the test was computer-based in most cases, many of them for the first time.
The problem gets worse when we consider the many issues you encounter when trying to measure entire cultures against one yardstick. You will inevitably end up making a lot of arbitrary decisions.
- The most poignant of all is trying to establish a reading comprehension assessment methodology for dozens of languages with different structures and uses. Evaluating the same passage across languages does not really help matters, and it adds a good deal of translation burden.
- As national curricula are not aligned, to some extent PISA performance is really PISA topic alignment. And no, the lesson should not be to reshape your country’s education system around PISA. Of course, many politicians are already on the task.
And because PISA is built by expert consensus, it is not a direct metric but rather a reflection of its time and the concerns that were prevalent when it was designed.
While some national efforts move the needle, PISA is ultimately a reflection of national values, institutional strength and economic circumstance
Surprised that countries with the highest HDI score better? Neither are we.
This is problematic if we remember that PISA is meant to lead to innovation in educational policy. What is Estonia and Singapore’s secret sauce? Being rich and corruption-free. So if you’re in a country near the lower end of the ranking, you might be better off asking Slovenia, Latvia, Russia or Belarus how they manage to stay above the median, beating the likes of Italy, Israel or Luxembourg. Vietnam is also a good example of scoring above its income level, but it seems to be absent from this year’s ranking.
Considerations for elearning professionals
With such a complicated (problematic, some would claim) ranking, the significance and implications to draw from PISA depend directly on the narrative skills of researchers, policy makers, and elearning professionals like you. At the end of the day, countries will likely try to measure each other up through any means available, and PISA might not be the worst one possible.
- Take the scores with a grain of salt. Drawing policy advice from standardized testing is complicated, despite that being PISA’s main goal.
- Realize that the results are heavily a byproduct of the institutions in which the test taker is embedded. (Economically speaking, culture is an institution.)
- Do use low or falling scores in your country to support your learning project or initiative. In my experience, officials (especially in government) like hearing how your intervention will contribute to a tangible outcome.