Measuring Engagement: Learning Analytics in Online Learning [paper by Griff Richards]


If you’re interested in learning more about measuring student engagement in your Moodle course, this is a great paper/primer to start with. The paper was written by Griff Richards of Thompson Rivers University in Canada and highlights key findings from recent studies of Moodle and Blackboard, including how the researchers went about establishing a metric for student engagement in online courses.


Get a copy of the paper here: http://goo.gl/YDEVR

I’ve been thinking a lot about engagement in courses at StraighterLine (which offers self-paced online college courses via Moodle) and there were some really great quotes and other papers cited that gave me a broader lit review to explore. A few of my favorite quotes:

Some measures of engagement can be misleading. For example, Beer, Clark and Jones (2010) plotted the number of learning management system (LMS) page visits (“hits”) by academic achievement for several thousand students (see Figure 2). They noted a solid correlation between the number of LMS hits and the student grade. Did this mean that students who clicked more web pages were more engaged? Perhaps, but the explanation might simply be that they had to click on the web pages to take their tests and submit their assignments, and students who completed their work were more likely to succeed. Beer et al. also noted that students using the Blackboard LMS had much higher hits than students using Moodle LMS. Does that mean Blackboard is a more engaging LMS? No, it reflects that Moodle has a more efficient access architecture. Learning Analytics is not simply about counting hits or mapping discussions, it is about intelligent and thoughtful interpretation of data in the context of human activity.
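The caution above is easy to demonstrate: two variables can correlate strongly even when the link between them is mechanical rather than causal. Here is a minimal sketch with entirely invented hit counts and grades (not data from Beer et al.) showing how a hits-vs-grades correlation might be computed:

```python
# Hypothetical data: LMS page hits and final grades for a small cohort.
# All numbers are invented for illustration. A strong correlation here
# does not imply that clicking pages causes learning -- students must
# click pages just to submit the work that earns their grade.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

hits   = [120, 340, 560, 90, 410, 600, 150, 480]   # made-up page views
grades = [55,  68,  81,  49, 74,  88,  58,  79]    # made-up final grades

r = pearson(hits, grades)
print(f"correlation between hits and grades: r = {r:.2f}")
```

The point of the paper stands: the number alone says nothing about *why* the association exists, which is where the thoughtful interpretation comes in.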

On the question of who the information is useful to:

We have not yet learned to interpret all the data we see, and thus we are just beginning to develop a set of appropriate interventions. Early analytics focuses on information such as “learner X is at risk” that can be sent to faculty or staff so that they can do a better and more timely intervention in remedial teaching or personal counseling. However the same information could be sent directly to the learner. Figure 3 portrays a hypothetical display or “dashboard” that shows the learner how fast they are moving through the course compared to others, and how well they are achieving the course outcomes. The dashboard shows the personal display for a learner who is progressing ahead of the class, but who is just keeping up in terms of achievement.

We see by this example how a student logging into the learning management system could be presented with a display that shows their progress and achievement in comparison to their classmates, or possibly to all students who have ever taken the course. In the case of a self‐paced or independent study course, the comparison could be made to the historical progress trends of all previously enrolled students. Of course, not all learners might want to know this information; they might prefer to remain blissfully in the dark rather than face undue stress and competition that plague high need‐achievement personalities, and learners at risk for failure. However, early detection of risk should be the first line of defense in providing remedial action. The ethical issue should not be “Should we inform learners of their standing?” but “What form of remedial action should we counsel them to take?” Is it possible to have a similar dashboard for engagement – perhaps one that shows the lack of interaction with other students in group work and the consequences of the lack of social capital on future income?

[Figure 3: a hypothetical learner dashboard showing progress and achievement relative to the class]
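For a self-paced course like those the paper describes, the dashboard comparison boils down to a percentile against historical cohorts. A minimal sketch, with invented cohort data and a hypothetical `percentile` helper (not anything from the paper or from Moodle):

```python
# Hypothetical dashboard backend: compare one learner's pace and
# achievement against the distribution of previously enrolled students.
# All cohort numbers are invented for illustration.

def percentile(value, cohort):
    """Percentage of the cohort at or below `value`."""
    at_or_below = sum(1 for v in cohort if v <= value)
    return 100.0 * at_or_below / len(cohort)

past_progress = [20, 35, 40, 55, 60, 70, 80, 90]   # % of course completed
past_scores   = [50, 55, 62, 66, 70, 75, 82, 90]   # average assessment score

learner = {"progress": 75, "score": 68}

print(f"pace: at or ahead of {percentile(learner['progress'], past_progress):.0f}% "
      f"of past students")
print(f"achievement: at or above {percentile(learner['score'], past_scores):.0f}% "
      f"of past students")
```

With these invented numbers the learner lands ahead of most past students in pace but mid-pack in achievement, which is exactly the profile the quoted Figure 3 description portrays: progressing ahead of the class while just keeping up on outcomes.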

What do you think? Are you providing students with information about their engagement when they log in? What metrics are you using to track student engagement, or to establish a baseline “reading” against which to measure their work online?