In this guest post Niall Sclater, a consultant at Sclater Digital and currently doing research on Learning Analytics for Jisc, reports on the Workshop on Ethics & Privacy Issues in the Application of Learning Analytics organized by LACE and SURF in Utrecht, the Netherlands, on October 28, 2014. The post first appeared in Effective Learning Analytics by Jisc.
I’m back from yesterday’s excellent Workshop on Ethics & Privacy Issues in the Application of Learning Analytics in Utrecht organised by LACE and SURF. Hendrik Drachsler from the Open University of the Netherlands kicked off the session by presenting a background to learning analytics and some of the resulting ethical and privacy issues. He mentioned the situation in the Netherlands, where universities are now partially funded on the basis of how many students they graduate, and concerns that this gives them an incentive not to accept students who are predicted to fail.
He also discussed the InBloom debacle in the US – “a perfect example of not taking care about privacy issues”. There was another situation in the Netherlands where an app used on tablets in schools collected data on which further analysis was carried out. Problems arose because this analysis wasn’t described in the terms and conditions of use.
Hendrik mentioned that his call for ethical and privacy issues in the application of learning analytics had produced over 100 issues. These were then put into four categories: privacy, ethics, data and transparency. The aim of the day was to discuss these issues and begin to look for solutions to them.
The group decided that there are often no clear boundaries between these categories. Certainly I’ve found it artificial to try to split legal issues from ethical ones when carrying out my recent literature review of the area. Much of the law is based on ethics – and sometimes an ethical stance has to be applied when interpreting the law in particular situations.
The workshop wasn’t populated solely by Hendriks, but a second Hendrik, Hendrik vom Lehn, then gave an informative presentation on practical considerations around some of the legal issues arising from learning analytics. Much of what he said, and of the subsequent discussions during the day, related to the EU Data Protection Directive. Hendrik noted a common misconception about the Directive: the scope of “personal data” is much broader than most people think, covering anything that makes data personally identifiable.
Another interesting concept in the Directive is that data can potentially be processed without consent from individuals if it is in the “legitimate interest” of the organisation to do so. However, in practice it’s likely to be better to inform students and obtain their consent for data collection and the resulting analytics. Hendrik also discussed the US concept of “reasonable expectation”: people whose data is being processed should have reasonable expectations of what is being done with it. Thus if you start using it in new ways (e.g. the recent Facebook mood-altering experiment) you’re on dangerous ground.
Anonymisation is often proposed as an alternative to obtaining consent, but this can be difficult to achieve. It’s particularly problematic in small groups where behaviours can easily be attributed to an individual.
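The small-group problem can be made concrete with a quick k-anonymity check. The sketch below is my own illustration rather than anything presented at the workshop; the field names and the toy activity log are invented. A dataset is k-anonymous with respect to a set of quasi-identifiers (attributes like course and tutorial group that are not names but can still narrow someone down) if every combination of their values is shared by at least k records. Any combination held by fewer than k records is a re-identification risk:

```python
from collections import Counter

def k_anonymity_violations(records, quasi_identifiers, k=5):
    """Return the quasi-identifier combinations shared by fewer than
    k records; each such group risks being traced to an individual."""
    counts = Counter(
        tuple(record[field] for field in quasi_identifiers)
        for record in records
    )
    return {combo: n for combo, n in counts.items() if n < k}

# A toy "anonymised" activity log: names removed, but course plus
# tutorial group together can still single someone out in a small cohort.
log = [
    {"course": "STATS101", "group": "A", "logins": 42},
    {"course": "STATS101", "group": "A", "logins": 17},
    {"course": "STATS101", "group": "A", "logins": 30},
    {"course": "PHIL205",  "group": "B", "logins": 5},  # sole member of its group
]

risky = k_anonymity_violations(log, ["course", "group"], k=2)
# The PHIL205/B record stands alone, so its behaviour is attributable
# to one identifiable student despite the absence of a name.
```

Real checks would of course run over far more attributes (timestamps and free-text fields are notorious quasi-identifiers), which is why anonymisation is harder to achieve than it first appears.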
Hendrik felt that where grading is based on learning analytics or can in some way affect the career of the student, this could have legal implications. Another issue he mentioned, which I hadn’t come across before, was the subordinate position of students, and that they might feel obliged to participate in data collection or learning analytics activities because they were being graded by the same person (or institution) that was analysing them. Would any consent given be regarded as truly voluntary in that case?
A member of the audience then asked if there was a difference between research and practice in learning analytics. Hendrik suggested that ethically our approach should be the same but from a legal perspective there may be a difference.
So what happens if a student asks for all the data that an institution has about them? Hendrik thought that the Directive implied that we do indeed need to make everything we know about students available to them. However there might be a possible conflict between full data access and the wider goals of learning analytics – it might make it easier for students to cheat, for example. Also it may be difficult to provide meaningful access for an individual while excluding other students’ data.
Another potentially difficult area is outsourcing and data transfers to third parties. This is particularly problematic of course when that data is being transferred outside the European Union. For students the process of understanding what is happening to their data – and accessing it – can then become more complex and they may have to go through several steps. Ownership of the data is not complete in this situation for any party (though in a later discussion it was proposed that “ownership” is not a helpful concept here – more relevant are the EU concepts of “data controller” and “data processor”).
We then split into groups and had the benefit of some great input from Jan-Jan Lowijs, a privacy consultant from Deloitte. He described nine general themes in the Directive, which we found a useful framework for proposing answers to some of the 100 issues that had been submitted. These are:
- Legitimate grounds – why you should have the data in the first place
- Purpose of the data – what you want to do with it
- Data quality – minimisation, deletion, etc.
- Transparency – informing the students
- Inventory – knowing what data you have and what you do with it already
- Access – the right of data subjects to access their data: when they can access it and what they can see
- Outsourcing – and the responsibilities of your institution as data controller and the third party as data processor
- Transport of data – particularly problematic if outside the EU
- Data security
Attempting to answer some of the questions submitted, using the themes as guidance, resulted in the following:
Who has access to data about students’ activities?
Students themselves, plus certified access for teachers, researchers etc., based on theme 2 above (purpose of data)
What data should students be able to view?
All data on an individual should be provided at any time they request it – that’s the situation to aim for, based on theme 6 (access)
Should students have the right to request that their digital dossiers be deleted on graduation?
Yes, so long as there are no other obligations on the institution to keep the data e.g. names, date of birth, final grades, based on theme 3 (data quality)
What are the implications of institutions collecting data from non-institutional sources (e.g. Twitter)?
Consent must be obtained from the students first, based on theme 4 (transparency). A case in Finland was noted in which two students sued their university for re-using their Twitter data.
Something interesting that Jan-Jan also mentioned was that there are differences in data subjects’ attitudes to privacy, and that a number of studies have shown a fairly consistent split of:
- 25% “privacy fundamentalists” who don’t want to share their data
- 60% pragmatists who are happy to share some of their data for particular purposes
- 15% who “don’t care” what happens to their data
An organisation therefore needs to make an active decision as to whether it attempts to cater for these different attitudes or finds some middle ground.
Some of the conclusions from the day in the final session were:
- It was noted that students were absent from the discussions and should be involved in the future.
- It was suggested that we fully articulate the risks for institutions of learning analytics. What are the showstoppers? Are they reputational or based on a fear of loss of students?
- “Privacy by design” and user-centred design with much better management of their data by users themselves were thought to be vital.
- InBloom was suggested as an “anti-pattern”, to be studied further to establish what we shouldn’t be doing.
- If you think something’s dodgy then it probably is. I have to admit being slightly concerned to hear that one university has equipment in its toilets to ensure that you’re not using your mobile phone to cheat if you have to nip out during an exam. A good rule of thumb proposed by Jan-Jan is that if you feel uneasy about some form of data collection or analysis then there’s probably something to be worried about.
The outcomes from the day were being written up by a professional writer, much more rigorously than I have done above, and will be fed into subsequent workshops held by LACE with the aim of producing a whitepaper or longer publication in the area.