An article appeared in the Times Higher Education online magazine recently (April 3, 2014) under the heading “More data can lead to poor student choices, Hefce [Higher Education Funding Council for England] learns”. The article was not about learning analytics, but about the data provided to prospective students with the aim of supporting their choice of Higher Education provider (HEp). The data is accessible via the Unistats web site, and includes various statistics on the cost of living, fees, student satisfaction, teaching hours, and employment prospects. In principle, this sounds like a good idea; I believe students are genuinely interested in these aspects, and the government and funding council see the provision of this information as a means of driving performance up and costs down. So, although this is not about learning analytics, there are several features in common: the stakeholders involved, the idea of choice informed by statistics, and the idea of shifting a cost-benefit balance for identified measures of interest.
Given these features in common with learning analytics, it is interesting to note the conclusions of the research that was conducted for Hefce and published on their website: “UK review of the provision of information about higher education: Advisory study and literature review”. The key findings of the report are summarised as:
The decision-making process is complex, personal and nuanced, involving different types of information, messengers and influences over a long time. This challenges the common assumption that people primarily make objective choices following a systematic analysis of all the information available to them at one time.
Greater amounts of information do not necessarily mean that people will be better informed or be able to make better decisions.
There is, I think, a lot of literature from different sources that supports these points. I am reminded of “regret theory”, which comes from the decision theory community (see “Decision Theory – a brief introduction” – pdf – by Sven Ove Hansson for an accessible account). This captures the idea that we put a higher cost on missed opportunities, such that it may be perfectly rational, if we accept that it is rational to account for feelings, to opt for a course of action with less than optimal objective utility. For example, the decision to join a work-place lottery syndicate is rarely based on the expectation of reward but, so long as the outlay is small, is likely to be influenced by the thought of not sharing a winning ticket.
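The syndicate example can be made concrete with a toy calculation. This is only a sketch: the probabilities, stakes, and the regret weighting below are illustrative numbers I have invented, not anything from regret theory's formal literature.

```python
# Toy illustration of regret-influenced choice (all numbers are assumed).
# A workplace lottery syndicate: tiny chance of a large shared win.
p_win = 1e-7             # probability the syndicate ticket wins (assumed)
prize_share = 1_000_000  # each member's share of a win (assumed)
stake = 2.0              # weekly outlay (assumed)

# Objective expected monetary value of joining is negative, so an agent
# maximising expected money alone would decline.
ev_join = p_win * prize_share - stake

# Regret theory adds a penalty for anticipated regret: here, the thought
# of watching colleagues share a win you opted out of.
regret_weight = 25.0  # how strongly anticipated regret is felt (assumed)
value_stay_out = -(regret_weight * p_win * prize_share)
value_join = ev_join

print(f"EV of joining: {ev_join:.2f}")               # negative
print(f"Value of staying out, with regret: {value_stay_out:.2f}")
print("Join" if value_join > value_stay_out else "Stay out")
```

With these made-up figures, joining loses money on average, yet once the cost of anticipated regret is attached to staying out, joining becomes the better option, which is the point of the example: accounting for feelings can make a "less than optimal" choice rational.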
The question we should ask is: so what can be done to address these findings?
The Times Higher report quotes a Hefce representative as suggesting that different levels of detail might be one solution. The “advisory study and literature review” contains much that characterises the problem, but I think it falls into a trap of approaching the problem almost exclusively from an academic perspective. At this point, I would like to add that I vigorously approve of academic thinking; indeed, one of the threats to successful learning analytics is to see it as a fundamentally engineering response to management interests. We do need thoughtful approaches that embed diverse social and psychological theorisation. I do not dispute the problem description, the account of behavioural factors, or the idea that different levels of description might help. The missing piece is, I think, that the principles given in section 7 do not provide the scaffolding for action.
The Unistats example and, I argue by extension, real-world implementation of learning analytics, are cases where further research is less likely to be effective than attending to the processes of design. By this, I mean that the method by which the software is designed is more important than knowing facts and principles of behaviour. Modern practices in software design recognise two related factors that are useful, and of which the report writers appear to have no awareness: iterative process and user-centred design. Rather than attempting to characterise and describe the “known unknowns” up-front, the idea is to leave these as partially-latent requirements and to have a design process that accounts for them as it happens. User research becomes a continuous part of design. This approach is clearly described in the UK Government Digital Service manual (I remain pleasantly surprised at the way UK government IT has been transformed in recent years).
The problem with the Unistats website is that it does not match the decision-making process as it really is for prospective students. The design feels too data-centric and insufficiently user-centric. I get the same kind of feeling when I see many analytics dashboards. In this situation, I believe user-centred design, coupled with iterative/agile action and informed by analytics on what people actually do on the site, would help to more accurately:
- Match the web site content, structure, and pathways to the prior knowledge of its users. Applicants to Higher Education typically have a very poor model of both the subject matter and the overall lifestyle/experience.
- Balance those aspects where objective, rational choice is supportable against those where intuition, sentiment, or taste comes into play.
- Capture the actual enquiry and decision-making process, finding out which parameters are likely to be known at the start (e.g. likely grades), and those which will be refined or identified during the enquiry process.
OK, this is only an anecdote, and to be fair the Unistats site was built with an imposed short deadline, but still, this is a story to learn from and bear in mind when large-scale initiatives predicated on data-informed decision-making are proposed. It is telling, too, that Hefce does not envisage changes until 2017. That kind of stepwise approach is not, I am sure, the future either of web development or of effective learning analytics in practice.
An aside: an interesting idea, which I heard about in the aftermath of the recent LAK14 conference, is “fast and frugal heuristics”. The essential observation is that decision-making which tries to take in more aspects of a multi-variate situation, in order to increase its objective rationality, may be less successful than simple rules of thumb. This provides an alternative lens through which to look at how decisions may be supported through analytics, a particular way of understanding “less is more”. What this might mean for Learning Analytics would be an interesting discussion point.
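One of the best-known fast-and-frugal heuristics is "take-the-best": compare two options one cue at a time, in decreasing order of cue validity, and decide on the first cue that discriminates, ignoring everything else. A minimal sketch follows; the course-choice cues and values are hypothetical, invented purely to connect the idea back to the Unistats setting.

```python
# Minimal sketch of the "take-the-best" fast-and-frugal heuristic.
# Cues are examined in order of validity; the first one that
# discriminates between the options decides, and the rest are ignored.

def take_the_best(option_a, option_b, cues_by_validity):
    """Return 'a', 'b', or 'tie' based on the first discriminating cue."""
    for cue in cues_by_validity:
        a, b = option_a.get(cue, 0), option_b.get(cue, 0)
        if a != b:            # this cue discriminates: stop searching
            return 'a' if a > b else 'b'
    return 'tie'              # no cue discriminated

# Hypothetical course-choice example: binary cues, most valid first.
cues = ['meets_grade_offer', 'good_employment_rate', 'high_satisfaction']
course_x = {'meets_grade_offer': 1, 'good_employment_rate': 0, 'high_satisfaction': 1}
course_y = {'meets_grade_offer': 1, 'good_employment_rate': 1, 'high_satisfaction': 0}

print(take_the_best(course_x, course_y, cues))  # 'b': employment rate decides
```

Note that satisfaction is never consulted: the second cue already discriminates. That deliberate disregard of available information is exactly the "less is more" point, and it sits in sharp contrast to a dashboard that presents every statistic at once.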