Ed Foster of Nottingham Trent University, UK, is an associate partner of the LACE project. In this guest post he asks what it will take for institutions to benefit from the use of learning analytics to support student retention and engagement.
If you wanted to invest in software that could accurately predict where a student was going to be in 7, 14 or 140 days, you probably shouldn’t buy the latest piece of learning analytics kit. For most institutions, it would be better to use timetabling software. If it’s set up properly, you ought to be able to accurately predict where a student will be for the remainder of the academic year. Clearly not every student attends all classes, but studies in the UK show that attendance is generally good enough to make a better than 50/50 guess. Attendance in any given UK university class follows fairly predictable patterns: at the start of the year attendance is typically 80-100%, but it quickly falls away; by the final term it can be as low as 40% (Colby, 2004; Burd & Hodgson, 2006; Newman-Ford et al., 2008). With relatively little work, you can probably predict fairly accurately the chances of any given student being in any given classroom at any given time, probably even for students who haven’t yet enrolled.
I know the argument’s flawed. I know that just looking at a cohort and taking a guess on which 80% will be in a class next Tuesday morning is of very limited use. Attendance is likely to be extremely important for any analytics system, for example Farsides & Woodfield (2003) reported that “application, specifically seminar attendance was by far and away the strongest and most consistent predictor of academic success” (p1239). However, that’s not the point I’m seeking to make.
I know that some people will be excited by the prospect of the latest update to their timetabling system. For these people, high points in their year will include that list of new features described in vendor press releases. For most students and staff though, good timetabling software is only a hygiene factor. We tend to notice when timetabling doesn’t work, but it’s pretty invisible when it does. Moreover, no one expects that timetabling software is going to change the face of higher education. Timetabling software enables students and staff to be in the right seminar rooms, lecture theatres and computer labs at the right time so that learning can take place. Given the scale of our universities and the pressures on space, it has become an important enabling tool, but that’s all it is.
Yet when we start talking about analytics, we get excited that potentially we can start to spot those students at risk of underperforming or even failing. We engage in heated conversations about methodology, data sources, accuracy, predictive power and so on. I do wonder if we’re so wrapped up in the excitement of the new that we’re missing the fact that we may just discover that, like the humble timetabling software, learning analytics tools are just enablers, just hygiene factors. The really important part, I would suggest, is what happens once learning analytics have identified the students at risk.
I’m confident that in the next five years, we will see brilliant, inspired developments in the field of learning analytics and within the same period, most universities will be using at least one learning analytics resource. It seems likely therefore that learning analytics will become as ubiquitous as timetabling software is now. However, will institutions actually benefit from having learning analytics to support student retention and engagement?
I think that depends on how we embed and operationalise it. Timetabling software makes it possible for learning to take place, nothing more. Learning still requires skilled teachers, good curriculum design, effective resources and engaged learners. Perhaps we should remember this when we’re building learning analytics platforms. I believe that learning analytics will lead us to a place where we can help steer students towards better outcomes. But that’s perhaps all it will do. Actually changing students’ outcomes will still require skill, resources and motivation from both the institution and the student.
Author: Ed Foster
Ed Foster is an expert in the fields of student transition and retention. He is the Student Engagement Manager at Nottingham Trent University, where he leads the development of the NTU Student Dashboard using the Solutionpath StREAM tool. Ed is the lead for the Erasmus+ ABLE Project (Achieving Benefits from LEarning analytics) 2015-2018, in partnership with colleagues from KU Leuven and U Leiden. The project focuses on what happens once students have been identified as being at risk. He is organising the Learning Metrics, Learning Analytics symposium at Nottingham Trent University on 10th December 2015. Please don’t ask him how timetabling software works.