
Velislava Hillman

Ioanna Noula

Mitzi László

June 30th, 2021

EdTech users cannot consent. The EU must act fast and take charge of the digitalisation of education.

Estimated reading time: 5 minutes

Technology is being increasingly used in education, particularly as schools closed around the world as a result of the pandemic. EDDS (Education Data Digital Sovereignty) recently convened a panel of experts to discuss the role of the EU’s imminent Digital Services Act legislation in regulating the Education Technology sector (EdTech). In this post, Mitzi Laszlo, Head of Strategy at NextCloud, is in conversation with the panel organisers Velislava Hillman and Ioanna Noula, on the importance of regulating and paying close attention to the digital transformation of education.

Introduction

The clear consensus that emerged during the panel discussion on the need for independent auditing of EdTech providers raises hope that this will be addressed through a holistic approach across all of the EU’s new legislative packages: the Digital Services Act, the Digital Markets Act, the Artificial Intelligence Act, and the Data Governance Act.

While the EU’s latest Green Deal produced an elaborate taxonomy of over 70 different activities and aims indicating to investors what is green and what is not, we have yet to see a taxonomy of what counts as an ethical and accountable EdTech business. The EdTech market is soaring, with a 16.3% CAGR and total global expenditure estimated to reach $404 billion by 2025. Even though this would constitute only 5.5% of the $7.3 trillion global education market, there is plenty of money in EdTech, yet no clear outline of which companies are ‘green’, ‘ethical’ and transparent about their business models. Moreover, legislation barely addresses this behemoth, which crucially targets one of the most vulnerable segments of society: children.

Velislava Hillman: “Investors are all too familiar with the rise of Tesla”; its shares are nine times what they were in 2019 and that’s “not an exception”, ran an Economist piece this month on the ‘green boom’ – the growing investment in climate-friendly products and services. One can only wonder what steers venture capitalists’ choices when investing in EdTech, a market that has grown 32-fold since 2010 to reach $16.1 billion in 2020. The asset management industry, according to the Economist, is increasingly taking into account environmental, social and governance (ESG) factors when considering investment in “green companies”. I couldn’t help but think that we need a ‘Greta Thunberg’ for the EdTech industry, because something has to steer investments in EdTech by considering cyber, social and governance (CSG) factors – or any others for that matter – that define and guarantee an ethical, accountable and sustainable EdTech service provider. Is ‘ethical’ tech not cool yet for investors and policy makers? Do venture capitalists consider CSG factors before they pump money into a start-up platform?

Mitzi László: A lot has happened in the last few years. Apart from the implementation of the EU’s General Data Protection Regulation (GDPR), which has become a roll-off-the-tongue acronym well beyond digital expert circles, it used to be that working in Silicon Valley was the most impressive job. Today there is even a whiff of reluctance among people working for Silicon Valley companies to mention it, as we saw in the mainstream Netflix film The Social Dilemma. On the flip side, work on delivering digital solutions that strive to be better for society is blossoming, as we heard at the European Union Digital Assembly last week.

What ethical tech really looks like is still a moving target, in part, because we have only just started to understand what makes us uneasy about digital solutions, especially in the public debate. It is clear that large technology corporations have lied to us, the public sector has allowed these corporations to dominate, and somehow the general public have been made to feel individually responsible for using apps that are designed to take advantage of our human vulnerabilities.

The question of what technology is suitable for children opens up a very important debate about going beyond transparency and consent. Children cannot consent, so we have to design a framework of what would be desirable technology and set minimum standards below which tech cannot fall with or without consent. With kids being forced to learn online during the pandemic the question is more pertinent than ever.

Today, we actually do have multiple technology companies that deliver privacy-friendly tools to paying customers. It’s no longer a theoretical debate. Many of these companies are relatively small in comparison with the giants. One of the big challenges is trust. Because so many people use big tech, if it fails there is less of a sense of responsibility for the individuals and organisations who use it; it is more of a collective failure. With smaller, newer tools, by contrast, there is a fear that if something fails or there is a problem, the people using the tools will suffer a marketing and public-relations blow while explaining why they opted for a less well-known solution.

Ioanna Noula: Educational technologies are procured almost exclusively from the private sector. This means that the digitalisation of education, which has been expedited by the pandemic, has significantly increased the stakes of businesses in education. At the same time, this new market is dominated by a handful of companies with two distinct features: a) their main business model and operations have not been developed with the institution of education or young people in mind (i.e. Google, Microsoft, Amazon) and b) most of them are based in the United States (the only country which has not ratified the UN Convention on the Rights of the Child). In your view, what are the challenges, the opportunities and the ways forward in the light of the “inevitability” of collaboration between the public and private sectors? Specifically, what is the current role and future prospect of European EdTech SMEs, and what are the implications of the terms of competition in the EdTech market for end-users?

Mitzi László: There are elements of the computational infrastructure that need to become public utilities. Companies have a role in delivering public utilities, and the exact relation with the public sector needs to evolve to ensure that the public interest is represented. The business model is a big factor. Privacy violations are inherent to any tool that is funded by a surveillance capitalism business model. Advertising is still possible but only if you do it on a single data point. As soon as you start to build a profile around a person and make predictions on a pattern of behaviour it starts to get very difficult to maintain privacy. As well as the business model the source of the financing plays a big role in the values and interests represented in the technology design. It’s simple, follow the money.

There are lots of little technology companies delivering great products and services in Europe today. One of the challenges is knowing where and how to speak with the public sector. In Europe the public sector is also a large selection of relatively small institutions. If we don’t manage to interface, only big tech will have the big picture.

Velislava, what have you learned from your recent research? Are parents and educators aware of the risks relating to edtech in education? Do they know how to switch to tools which do not pose privacy risks and how can we make it more urgent or emphasise that we need to think about sustainability and real diversity and home-grown products?

Velislava Hillman: More parents and teachers today are becoming aware of things like ‘data collection’, ‘dataveillance’ and tech that seems to ‘know you’ and ‘make decisions for you’ (parents’ and children’s comments). On the one hand, there seems to be a resignation around this – ‘that’s how things are’. Such inevitabilism sets up powerful marketing narratives: ‘Edtech helps with this’, ‘solves that’; it’s here to stay, evolve and reform schools. These narratives are led by the big tech companies.

On the other hand, many parents and teachers actively try to resist this inevitabilism and the colonisation through constant, ubiquitous data capture happening in their children’s schools. There are numerous online forums, blogs, coalitions, lectures and activities discussing the risk of privacy loss from dataveillance and how to actively resist the profiling of children in schools.

I’ve been doing research since last February on student tracking in vocational education and have interviewed parents and children who are deeply concerned about how a powerful integration of legislation, corporate influence and enhanced data systems begins to determine children’s futures from an ever-younger age. By legislation I mean the growing push for learning-to-earning; the push for integrating data systems; and corporate participation through curriculum design.

The research looked at American education, where corporate influence is really strong – you don’t just study computer science, you train on AWS or Cisco systems; you don’t enrol in environmental studies, you’re inculcated by Shell or Dominion Energy. This may not reflect how things are in Europe and the UK – although parents have shown resistance against corporate influence through the academisation of public schools. The point is that we must consider the long-term influence that the EdTech industry in particular can have on one’s future. With advanced data-generating systems interweaving every sector, including education, the power structures change. Those who hold the data systems hold the power, too.

This is why issues surrounding the digitalisation of education don’t stop with privacy but concern the labour market and one’s right to their own future. Not only do we need a Greta Thunberg to create more awareness about the data extraction happening in education and to demand sustainable, accountable and ethical tech, but we also need lawmakers to provide the instruments required to set the necessary standards.

This conversation follows the event organised at the LSE Department of Media and Communications on June 8 titled ‘The Education Technologies and the colonisation of our digital future: The role of EU’s Digital Services Act in regulating EdTech and putting humanity in charge.’ The event was chaired by Professor Nick Couldry with presenters Dr. Velislava Hillman, Dr. Ioanna Noula, Member of the European Parliament Eva Kaili, Dr. Desmond Bermingham, Professor Sonia Livingstone, Dr Ben Wagner and Mitzi László.

This article gives the views of the authors and does not represent the position of the Media@LSE blog, nor of the London School of Economics and Political Science.

Featured image: Photo by Kelly Sikkema on Unsplash

About the author

Velislava Hillman

For the past ten years Dr Hillman has researched at the intersection of learning, digital technologies, children and young people with focus on their personal perspectives, uses and experiences. She has researched in the area of learning and creativity through the use of digital tools in formal and informal learning environments. During her fellowship in Berkman Klein Centre for Internet and Society at Harvard University between 2018 and 2019, Dr Hillman investigated the implications of data collection in K12 education on children’s basic rights and freedoms to self-expression. More recently, Dr Hillman’s interests are in the integration of AI systems in schools, data-driven decision-making and the role and participation of children and young people in increasingly digitised learning environments. As a Visiting Fellow she is focusing on identifying what kind of AI systems are used in compulsory education; learners’ experiences, perceptions and attitudes towards AI systems; the role of such systems in the instruction, learning and assessment processes; and the rights and freedoms of children and young people in relation to AI systems’ diagnostics and learner profiling.

Ioanna Noula

Dr Ioanna Noula is a childhood and education expert holding a PhD in Citizenship Education (University of Thessaly) and an MA degree in Sociology of Education (UCL Institute of Education). Her research interests include citizenship education, critical pedagogy and digitalisation. She is Head of Research and Development and co-founder of the Internet Commission, a non-profit organisation focused on advancing digital responsibility. She has worked as a teaching and research fellow at the UCL Institute of Education, the University of Leeds and the University of Thessaly. Ioanna has conducted research for award-winning projects on global citizenship education and active citizenship at the UCL Institute of Education and LSE’s Department of Media and Communications. Her current research focuses on citizenship, critical literacy and digital responsibility.

Mitzi László

Mitzi László is an advocate for equity enhancing computational infrastructures. She is Head of Strategy at Nextcloud and previously worked for Sir Tim Berners-Lee’s company Inrupt/Solid as well as being an independent expert for the European Commission. Read more on https://www.mitzilaszlo.org
