Learning Analytics Watchdog – a job description of the future for effective transparency?


We should all be worried not simply when data about us is used, but when the purpose for which it is used and the methods employed are opaque. Credit ratings and car insurance are long-standing examples we have got used to, and for which the general principles are widely known. Importantly, we believe that there is sufficient market-place competition that, within the limits of the data available to the providers, the recipes used are broadly fair.

Within both educational establishments and work-place settings, an entirely different situation applies. There is no equivalent of market competition, and our expectations differ: of what constitutes ethical (and legal) practice, of what the data should be used for, and of who should use it. The range of data that could be used, and the diversity of methods that could be harnessed, are so enormous that it is tempting not to think about the possibilities, and to bury one's head in the sand.

One of the ideas proposed to address this situation is transparency, i.e. that we, as the subjects of analytics, can look and see how we are affected, and, as the objects of analytics, can look and see how data about us is being used. Transparency could be applied at different points, and make visible information about:

  • the data used, including data obtained from elsewhere,
  • who has access to the data, as raw data or derived to produce some kind of metric,
  • to whom the data is disclosed/transferred,
  • the statistical and data mining methods employed,
  • the results of validation tests, both at a technical level and at the level of the education/training interventions,
  • what decisions are taken that affect me.
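To make the dimensions in the list above concrete, here is a minimal sketch of how a provider's disclosure for a single analytic might be recorded as a data structure. The class name, field names, and the `undisclosed_fields` helper are all hypothetical illustrations, not part of any real scheme proposed in the article.

```python
from dataclasses import dataclass

@dataclass
class TransparencyRecord:
    """Hypothetical disclosure record for one analytic (illustrative only)."""
    data_sources: list        # the data used, including data obtained from elsewhere
    access_roles: list        # who has access, to raw data or derived metrics
    disclosed_to: list        # third parties the data is disclosed/transferred to
    methods: list             # statistical and data mining methods employed
    validation_results: dict  # technical and intervention-level validation outcomes
    decisions: list           # decisions taken that affect the data subject

    def undisclosed_fields(self):
        """Names of dimensions left empty, i.e. still opaque to the subject."""
        return [name for name, value in vars(self).items() if not value]
```

Even such a simple structure hints at the inspection burden: a record with every field populated is still only raw material, not understanding.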

Frankly, even speaking as someone with some technical knowledge of databases, statistics and data mining, it would make my head hurt to make sense of all that in a real-world organisation! It would also be highly inefficient for everyone to have to do this. The end result would be little, if any, change in analytics practice.

I believe we should consider transparency as being not only about the freedom to access information, but also about the ability to utilise it. Maybe “transparency” is the wrong word, and I am risking an attempt at redefinition. Maybe “openness for inspection” would be better: not just open, but open for inspection. The problem with stopping at making information available in principle, without also considering its use, applies to some open data initiatives, for example where public sector spending data is released; the rhetoric from my own (UK) government about transparency has troubled me for quite some time.

It could be argued that the first challenge is to get any kind of openness, and that the tendency towards black-box learning analytics should first be countered. My argument is that this could well be doomed to failure unless there is a bridge from the data and technicalities to the subjects of analytics.

I hope that the reason for the title of this article is now obvious. I should also add that the idea emerged in the Q&A following Viktor Mayer-Schönberger’s keynote at the recent i-KNOW conference.

One option would be to have Learning Analytics Watchdogs: independent people with the expertise to inspect the way learning analytics is being conducted, to champion the interests of those affected, both learners and employees, and to challenge the providers of learning analytics as necessary. In the short term, this will make it harder to roll out learning analytics, but in the long term it will, I believe, pay off:

  • Non-transparency will ultimately lead to a breakdown of trust, with the risk of public odium or being forced to take down whole systems.
  • A watchdog would force implementers to gain more evidence of validity, avoiding analytics that is damaging to learners and organisations. Bad decisions hurt everyone.
  • Attempts to avoid being savaged by the watchdog would promote more collaborative design processes, involving more stakeholders, leading to solutions that are better tuned to need.

Watchdog image is CC-BY-SA Gary Schwitzer, via Wikimedia Commons.



About Author

Adam works for Cetis, the Centre for Educational Technology and Interoperability Standards, at the University of Bolton, UK. He rather enjoys data wrangling and hacking about with R. He is currently a member of the UK Government Open Standards Board, and a member of the Information Standards Board for Education, Skills and Children’s Services. He is a strong advocate of open standards and open system architecture. Adam is leading the workpackage on interoperability and data sharing.



  2. Hi Adam,

    Wholeheartedly agree with the objective, but wondering if we can afford such a watchdog right now? So if that’s just one approach to tackling an agreed goal, what are the others we might consider, so we can trade off their pros and cons? A worthy matrix for the LACE Project to catalyse 😉

    For instance:

    Define bronze/silver/gold criteria for analytics projects to benchmark themselves at different levels of the system – they are encouraged to demonstrate their transparency. Maybe it becomes a funding requirement, like open access to project reports has become. Conventions emerge for evidencing the different levels.

    “Bronze level: student insight into an analytic: student is informed that the analytic is being run on their data, with contact name for further details”

    “Silver: as above, plus student has a user interface onto the analytic”

    “Gold: as above, plus student was involved in the co-design of the analytic, and has had time to feed back views”
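    The levels quoted above are cumulative (“as above, plus …”), which can be sketched as an ordered enumeration. The names and criteria strings below simply restate the commenter's suggestion; nothing here is a real benchmark scheme.

    ```python
    from enum import IntEnum

    class OpennessLevel(IntEnum):
        """Illustrative benchmark levels from the comment (names hypothetical)."""
        BRONZE = 1  # student informed the analytic runs on their data, contact name given
        SILVER = 2  # as above, plus a user interface onto the analytic
        GOLD = 3    # as above, plus involvement in co-design, with time to feed back

    # Criteria are cumulative: a level inherits all requirements of lower levels.
    CRITERIA = {
        OpennessLevel.BRONZE: ["informed analytic is run on their data",
                               "contact name for further details"],
        OpennessLevel.SILVER: ["user interface onto the analytic"],
        OpennessLevel.GOLD: ["involved in co-design of the analytic",
                             "time to feed back views"],
    }

    def requirements(level: OpennessLevel) -> list:
        """All criteria a project must evidence to claim the given level."""
        return [c for lvl in OpennessLevel if lvl <= level for c in CRITERIA[lvl]]
    ```

    A funding body could then ask projects to self-declare a level and evidence each requirement, rather than employ a watchdog to inspect everything.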



    • Simon – thanks for the comment. Yes, we should definitely consider a range of options, not all of which would be realistic to establish, even if potentially effective.

      I like the idea of levels of openness, and I agree that involvement/co-design is an important aspect of establishing trust as well as fostering good design decisions. The difficulty I see is that there are some issues which require a degree of technical and conceptual sophistication to understand, so co-design participants must still be trusting.

      Maybe a Watchdog is an over-strong response to the question of trust. Maybe a level system with a “right to query” would be more pragmatic …

      Cheers, Adam
