Evidence of the month: learning analytics meets measurement theory


The LACE Evidence Hub brings together evidence about learning analytics from around the world and relates it to four propositions:

  • Learning analytics improve learning outcomes.
  • Learning analytics improve learning support and teaching, including retention, completion and progression.
  • Learning analytics are taken up and used widely, including deployment at scale.
  • Learning analytics are used in an ethical way.

Everyone is welcome to add evidence for or against these propositions to the Hub site.

The ‘Evidence of the Month’ on the site for April 2015 is a paper from this year’s Learning Analytics and Knowledge (LAK15) conference, ‘Crowd-sourced learning in MOOCs: learning analytics meets measurement theory’.

This paper reports findings from a study of MOOC log file data relating to a large University of Melbourne MOOC that ran in 2013. The study investigated a series of hypotheses: (i) there is skill involved in using forums to learn; (ii) MOOC participants use forums differently as they progress from novice to expert in this skill; (iii) this progression is reflected in log file data; and (iv) log file data can be used to measure a learner’s skill in learning through forums. The study provides provisional support for each of these hypotheses.

The paper also sets out how learners’ skill in the use of forums can develop. The proposed framework identifies how learners at each level are likely to view knowledge, how they are likely to view forums in the context of learning, and how they are likely to use MOOC forums. The framework has five levels:

  • Level 1: Novice, dependent learner
  • Level 2: Beginning, independent learner
  • Level 3: Proficient, collegial learner
  • Level 4: Competent, collaborative learner
  • Level 5: Expert, learning leader

For example, at Level 1: ‘Learning is about consuming stable knowledge in a domain, comprised mainly of cognitive understanding or skill; seeks efficient transfer from authoritative or reputable sources; follows procedural guidance from teachers, relinquishing responsibility for learning process; calibrates performance on formal assessments and accepts standards inherent in them. Forums are essentially social adjuncts to courses; the information is possibly unreliable and misleading. Never visits forums.’

The relationship between achievement levels on the course studied and level on this scale was apparently strong. Of those rated at expert level 5, 78% received at least a pass score and 67% a distinction score. However, of those rated at level 1, less than 1% received a pass score.

Citation: Milligan, Sandra (2015). Crowd-sourced learning in MOOCs: learning analytics meets measurement theory. Paper presented at the Learning Analytics and Knowledge conference (LAK15), Poughkeepsie, NY, USA. | URL: http://dl.acm.org/citation.cfm?id=2723596




About the Author

Rebecca is a lecturer at The Open University in the UK, focused on educational futures, learning analytics, MOOCs, augmented learning and online social learning. She is a member of the steering committee of the Society for Learning Analytics Research (SoLAR) and was Workshops Chair of the second Learning Analytics and Knowledge conference (LAK 2012). She co-chaired the 1st and 2nd International Workshops on Discourse-Centric Learning Analytics, held in Belgium and the US, as well as the first UK SoLAR Flare (a national learning analytics event). Her most recent publication is the book ‘Augmented Education’, published by Palgrave in May 2014. Rebecca is working on the LACE work package relating to learning analytics in higher education.
