
Velislava Hillman

December 9th, 2022

The state of cybersecurity in education: the responsibilities of the EdTech sector towards children


The growing dependence of primary and secondary education on digital technologies has led to increased cybercrime in UK schools. Unlike other information and communication technology sectors, the EdTech industry tends to escape critical research enquiry with regard to its state of cybersecurity. EdTech businesses work in a fast-paced, relatively unregulated environment, and their cybersecurity measures remain largely unknown. In a new Media@LSE Working Paper on which this post is based, Velislava Hillman focuses on the state of cybersecurity in education by addressing EdTech businesses, mapping the challenges and identifying the needs for safety and security in education.

As discourse around digitalised education continues to gloss over issues of data collection, those demanding data on one end (policymakers) and those collecting it on the other (EdTech providers) tend to remain unchallenged. Talking about ‘education data’ or ‘data privacy loss’ as a thing in its own right, without addressing the businesses that create the problems in the first place, only deepens the absence of individual and collective responsibility for children’s privacy in a digitalised education. Solely reiterating the problems risks desensitising audiences to the exploitative practices of the businesses driving data capitalism, and legitimating their dominance and power further still. As education grows progressively dependent on EdTech, there has been a drastic shift away from meaningful discourse around the purpose of education and the growing cybercrime in schools. This has gone hand in hand with an explosion of unquestioned, industry-led designs for children’s education, without any clear code of good practice or benchmarks, or indeed any mandates or external validation of who meets them.

This research was an attempt to address directly the issues emanating from digitalising education by meeting its providers: the EdTech businesses. Addressing them by talking about cybersecurity – rather than, say, pedagogy – allowed them to point concretely to present and tangible issues emerging as a result of their proliferation in schools, and even to discuss actionable solutions, many of which were proposed by the businesses themselves. This research was therefore not about the pedagogic value of EdTech or its contribution to learning outcomes, curriculum, or children’s wellbeing. Instead, it aimed to start an honest conversation with EdTech companies about what they do and how they protect children’s privacy and basic rights. Simultaneously, it aimed to exchange knowledge and align their priorities with children’s benefits and fundamental rights.

The cost of cyber insecurity

Cybercrime in education continues to grow in frequency and magnitude. Between 2021 and 2022, around 41% of primary schools and 70% of secondary schools in the UK experienced cyber breaches. Given schools’ reliance on digital technology, cybercrime disrupts education and deprives children of their fundamental right to it, while increasing other risks such as loss of privacy. Cybercrime leads to the loss of sensitive information about children and teachers and leaves systems unusable. The cost of cybercrime in primary and secondary education is estimated to exceed the cost of cyber breaches in any other sector globally.

To identify the roles, responsibilities, and gaps that lead to cyber insecurities in education, this research used two methodologies. First, in-depth interviews were conducted with industry representatives – EdTech founders, security and software experts, chief executives and/or technology officers. Interviews were also conducted with representatives of the National Cyber Security Centre (NCSC) and the IASME Consortium which provides the CyberEssentials cybersecurity framework, the National Institute of Standards and Technology (NIST) which provides the NIST cybersecurity framework (NIST CSF) and other frameworks (NIST 800-171/800-53), and the National Initiative for Cybersecurity in Education (NICE). And second, publicly available literature (written in English) was analysed to identify the dominant discourse surrounding cybersecurity in digitalised education.

Summary of the findings

  1. There is a lack of clear guidance for EdTech businesses about what, how, when, and why they can – and should – implement cybersecurity controls that ensure children’s data safety and privacy.
  2. The existing generic cybersecurity frameworks tend to be “tedious” and “bureaucratic” – difficult to implement for most EdTech start-ups.
  3. There are virtually no strict mandates imposed on EdTech businesses to implement cybersecurity controls. The existing cybersecurity frameworks (e.g., CyberEssentials in the UK, NIST CSF in the US) are voluntary and not easy for EdTech companies to navigate and implement.
  4. The costs and resources required to meet cybersecurity standards are typically high, which makes it nearly impossible for start-ups to level up. This only increases the risks for children.
  5. Some start-ups have a limited understanding of the nature and extent of harm from cybersecurity risks. Some small vendors see security around education data as not as “close to the bone” as, say, health data.
  6. There is willingness among EdTech companies to have a dedicated cybersecurity standard and a dedicated independent body that can guide them to maturity.
  7. An ideal cybersecurity standard would be tailored to address the needs and vulnerabilities of primary and secondary education. However, for the EdTech sector to move towards maturity and good practice, such a standard should be mandated.
  8. Investors don’t see cybersecurity controls and validation as a deal breaker. Most of the time, investors don’t ask questions about cybersecurity risks (or their costs). If they do, it is something “little on the side”, not a factor influencing investment decisions.
  9. There is growing awareness of cybersecurity matters among the school education community. School leaders, teachers, and EdTech procurement teams demand to see some form of external validation that EdTech providers have what it takes to protect students’ data.
  10. There is a substantial gap in the literature on cybersecurity in the primary and secondary school EdTech sector. Most existing literature addresses what the education community should do about cybersecurity; little is said about the role of the EdTech sector or governments.
  11. Moving from lax regulation to self-regulation is not the way to govern the growing EdTech industry – “trust but verify”, as one vendor put it; “marking one’s own homework is not ideal”.

The report findings and a blueprint proposal for cybersecurity underpinning data privacy protection in primary and secondary education globally will be covered during a special event on 8 February 2023, from 10:00-11:00am GMT, via Zoom. For more information, contact Velislava Hillman.

This article represents the views of the author and not the position of the Media@LSE blog, nor of the London School of Economics and Political Science.

Featured image: Photo by Annie Spratt on Unsplash

About the author

Velislava Hillman

For the past ten years Dr Hillman has researched at the intersection of learning, digital technologies, children and young people, with a focus on their personal perspectives, uses and experiences. She has researched in the area of learning and creativity through the use of digital tools in formal and informal learning environments. During her fellowship at the Berkman Klein Centre for Internet and Society at Harvard University between 2018 and 2019, Dr Hillman investigated the implications of data collection in K12 education for children’s basic rights and freedoms to self-expression. More recently, Dr Hillman’s interests are in the integration of AI systems in schools, data-driven decision-making, and the role and participation of children and young people in increasingly digitised learning environments. As a Visiting Fellow she is focusing on identifying what kind of AI systems are used in compulsory education; learners’ experiences, perceptions and attitudes towards AI systems; the role of such systems in the instruction, learning and assessment processes; and the rights and freedoms of children and young people in relation to AI systems’ diagnostics and learner profiling.

