What are the main barriers to making use of learning analytics?


The most commonly discussed issues can be grouped into six barriers:

  1. Availability of data. It has often been pointed out that the data generated by learners in institutional systems is not ‘big’ data, and that techniques for ‘small’ data may be more appropriate for education. However, in some cases the barrier is not so much that educational data is small, but rather that it is non-existent. The LEA’s Box project reported on a pilot at their recent summer camp [1] in which teachers were using analytics software to prepare reports on their students, but were entering the data by hand because the wireless network was not good enough to connect the tablets. If this is the case for an enthusiastic school in the developed world, then connectivity and software infrastructure can be expected to be major problems in less favourable contexts.
  2. Accessibility of data. When data is available (as is increasingly the case), it is not necessarily accessible to analytics applications. Institutions and/or their employees may have ethical concerns which prevent their participation; an overview is available in [2]. In the UK, Jisc has recognised the importance of this issue and has recently (June 2015) developed a draft Code of Practice for Learning Analytics which seeks to address some of these concerns [3]. The laws governing the gathering and storage of data generated by users (and particularly by children) also vary between countries, which may mean that approaches which have been successful in one context are not replicable elsewhere.
  3. Interoperability. Cooper, in the LACE briefing Learning Analytics Interoperability – The Big Picture in Brief [4], points out that the diversity of IT systems brings with it the challenge of analysing data which is stored in different formats in multiple locations (a sketch of this normalisation problem follows this list). This problem can be addressed by limiting the scope to a single method of capturing data (which may imply a single system), but this may not be practicable in all cases. Cooper concludes that in addressing this barrier “progress may be made by adopting common approaches to generic concepts but useful learning analytics will usually require common approaches to domain-specific concepts, which require definition by communities of practice”.
  4. Resistance from users. The intended users of learning analytics applications, particularly teachers and parents, may have personal concerns about privacy and political concerns about surveillance. These concerns can lead to boycotts or campaigns, as in the well-known case of InBloom [5] and the current discussions about ClassDojo [6].
  5. Impact on professional roles. Learning analytics initiatives have long had an explicit link to data-driven management techniques. Early examples include [7] (2001) and [8] (2006), while today the Michael and Susan Dell Foundation promotes performance-driven education, arguing that its benefits include ensuring that “district administrators have access to school performance data to facilitate decisions about supports, rewards and consequences” and the use of “Balanced scorecard processes”. These developments may be resisted by professionals who fear an extension of managerial control and accountability, and a corresponding constraint on their professional autonomy, and they would find support for their concerns in the critique by Seddon [9], among others. These concerns lead to barriers which range from a generalised mistrust of learning analytics innovations to well-formulated critique and active resistance.
  6. Hype. It is clear that analytics methods have great potential to bring about change in research, pedagogy, and educational management, and that they offer substantial business opportunities. Most learning analytics applications seek to provide actionable insights based on evidence, so one would hope that there would be plenty of evidence available to guide users in planning their use of learning analytics. However, while there are plenty of reports of adoption, solid evidence of the positive impact of learning analytics is thin on the ground, as we have found in the LACE Evidence Hub (http://evidence.laceproject.eu/). As with many new technologies, adoption has often been driven by stressing the enticing opportunities while skirting around the potential obstacles, leading to a recommendation from attendees at the LACE Spring Briefing higher education workshop: “Address the problem of over-claiming and mis-selling by vendors”. While the opportunities may be real, exaggerated expectations can lead to exaggerated cynicism if the promised benefits do not materialise, creating a barrier to further deployment.
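
To make the interoperability barrier in point 3 concrete, here is a minimal sketch in which two hypothetical log formats (a VLE export and a quiz-engine event stream, both invented for illustration) are normalised into a single xAPI-style actor/verb/object structure. This is not a method prescribed by Cooper or the LACE project, just one illustration of what adopting a common approach to generic concepts can look like:

```python
# Minimal sketch only: the field names and both source formats are invented
# for illustration; a real system would follow a specification such as xAPI.
from datetime import datetime, timezone

def from_vle_row(row):
    """Map a (hypothetical) VLE activity log row to a common statement."""
    return {
        "actor": row["student_id"],
        "verb": "completed" if row["status"] == "done" else "attempted",
        "object": row["activity"],
        "timestamp": row["when"],  # already an ISO 8601 string in this source
    }

def from_quiz_event(event):
    """Map a (hypothetical) quiz-engine event to the same common statement."""
    return {
        "actor": event["user"],
        "verb": "answered",
        "object": event["question_id"],
        # this source uses Unix timestamps, so convert to ISO 8601
        "timestamp": datetime.fromtimestamp(event["ts"], tz=timezone.utc).isoformat(),
    }

# Two sources, two formats, one analysable stream of common statements.
vle_rows = [{"student_id": "s42", "status": "done",
             "activity": "module-1", "when": "2015-06-01T09:00:00+00:00"}]
quiz_events = [{"user": "s42", "question_id": "q7", "ts": 1433149200}]

statements = ([from_vle_row(r) for r in vle_rows] +
              [from_quiz_event(e) for e in quiz_events])
for s in statements:
    print(s["actor"], s["verb"], s["object"], s["timestamp"])
```

Once both sources emit the same fields, a single analytics routine can consume them; the domain-specific part (what ‘completed’ means for a given activity, for instance) is exactly where Cooper notes that definition by communities of practice is required.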

References
[1] LEA’s Box 1st Learning Analytics Summercamp. http://css-kmi.tugraz.at/mkrwww/leas-box/summercamp15.html
[2] S. Slade and P. Prinsloo, “Learning Analytics: Ethical Issues and Dilemmas,” Am. Behav. Sci., vol. 57, no. 10, pp. 1510–1529, Mar. 2013.
[3] N. Sclater, “Code of Practice for Learning Analytics,” Jisc, 2015. https://www.jisc.ac.uk/sites/default/files/jd0040_code_of_practice_for_learning_analytics_190515_v1.pdf
[4] A. Cooper, “Learning Analytics Interoperability – The Big Picture in Brief,” 2014. http://www.laceproject.eu/blog/learning-analytics-interoperability-briefing/
[5] O. Kharif, “Privacy Fears Over Student Data Tracking Lead to InBloom’s Shutdown,” Bloomberg Businessweek, 2014. http://www.bloomberg.com/bw/articles/2014-05-01/inbloom-shuts-down-amid-privacy-fears-over-student-data-tracking
[6] N. Singer, “Privacy Concerns for ClassDojo and Other Tracking Apps for Schoolchildren,” New York Times, pp. 4–6, 2014. http://www.nytimes.com/2014/11/17/technology/privacy-concerns-for-classdojo-and-other-tracking-apps-for-schoolchildren.html
[7] M. A. Lachat, “Data-Driven High School Reform: The Breaking Ranks Model,” 2001. http://www.brown.edu/academics/education-alliance/publications/data-driven-high-school-reform-breaking-ranks-model
[8] J. A. Marsh, J. F. Pane, and L. S. Hamilton, “Making Sense of Data-Driven Decision Making in Education: Evidence from Recent RAND Research,” RAND Corporation, Santa Monica, CA, 2006. http://www.rand.org/pubs/occasional_papers/OP170.html
[9] J. Seddon, “Systems Thinking in the Public Sector: The Failure of the Reform Regime and a Manifesto for a Better Way.” Axminster: Triarchy Press, 2008.