The COVID-19 pandemic shifted in-class, face-to-face learning to remote learning, but the consequences for children’s education are yet to be properly evaluated. Suddenly, education became dependent on fair and meaningful access to digital resources. In turn, this brought risks to children’s privacy, given the data collected by EdTech. In this blog, Prof Sonia Livingstone discusses the Digital Futures Commission’s new report, “Governance of Data for Children’s Learning in UK State Schools,” launched tomorrow (register for the launch). The report urges policymakers to be more attuned to the child rights issues at stake when education relies on the digital environment, as we emerge into the new normal of post-pandemic learning, whether offline, online or hybrid.
At a recent event on the position of EdTech in relation to the Digital Services Act, I observed that, although the European Commission has committed to mainstreaming children’s rights in all of its policies and actions, this is no simple task. Indeed, there remains a lack of joined-up policy regarding children’s needs and rights in a digital world, and an urgent need for minimum standards for providers of digital products and services that impact on children, managed by a trusted body sufficiently resourced to ensure coordination and compliance.
Children’s rights in relation to the digital environment were set out in General Comment 25 on the UN Convention on the Rights of the Child, adopted in March 2021. This includes attention to how children’s right to education, among their other rights including non-discrimination and the prioritisation of their best interests over the commercial interests of others, should be realised by states and other duty bearers, including businesses. It also recognises that, in a digital world, the right to privacy and data protection increasingly mediates children’s other rights: in effect, without privacy it is hard for children to learn, participate without being exploited, stay safe, and thrive.
Ideally, there would be no need to advocate for children’s rights in each and every domain. Yet, in relation to data collected from children at school, the argument is barely underway. UNICEF’s data governance manifesto recently demanded that children’s rights be addressed by and integrated into data governance policies and regulation, including in relation to education. Public consultation with children, globally and in the UK, reveals similar and urgent demands. It seems straightforward to us that EdTech providers should undertake robust child rights due diligence and, following General Comment 25, that children’s education should not contribute to their profiling for commercial benefit. But that’s not the situation on the ground.
Observing the high hopes held for the advances of big data, learning analytics and AI to benefit children’s education, the Digital Futures Commission’s workstream on beneficial uses of education data aims to generate recommendations for child-rights-respecting data governance mechanisms that can unlock the potential of education data. Building on defenddigitalme’s recent report, The State of Data 2020, which reveals the challenges children and their parents face in exercising their rights as data subjects, we have begun by examining the blurred responsibilities for children’s privacy and other rights between EdTech companies and schools.
Our forthcoming report, Governance of Data for Children’s Learning in UK State Schools, by Emma Day, unpacks the considerable regulatory uncertainties that complicate schools’ data protection due diligence, including in EdTech decision-making and procurement. Schools have few mechanisms, and insufficient resources, to hold EdTech companies accountable for the processing of children’s data. EdTech providers, on the other hand, have considerable latitude to interpret the law, and to access children’s real-time learning to test and develop their products. Meanwhile, children have little say, and little opportunity to opt out of the data processing associated with the digitisation of their education.
Does this matter? Concern about exploitation and harms from unreliable algorithms that drive decisions about children’s learning and educational outcomes currently crowds out attention to the opportunities for processing and sharing children’s education data in the public interest, including in children’s best interests. Examples of such constructive uses of education data include the development and deployment of education technologies for learners with disabilities, personalised learning and open education.
At present, public and expert debate appears polarised – focusing on either the harms or the benefits – and too rarely weighs their combined significance within a child rights framework. Our report, Governance of Data for Children’s Learning in UK State Schools, is a key step in this direction, and we invite you to the report launch and to engage with the next steps of our work.
This text was originally published on the Digital Futures Commission blog and has been re-posted with permission.
The post gives the views of the authors and does not represent the position of the LSE Parenting for a Digital Future blog, nor of the London School of Economics and Political Science.
Image credit: Photo by Mary Taylor, from Pexels