The internet has quickly become a necessity rather than a luxury in children’s lives. Parents, educators and governments are struggling to keep up with the risks and opportunities it brings. Sonia Livingstone takes a critical look at the plans for an Internet Safety Strategy. She argues that digital literacy needs to be taught in a way that extends far beyond the concept of ‘e-safety’ and limiting screen time. Websites and apps should be designed with children in mind by default. Children and adults alike need a single point of contact for complaints and concerns – not just an ombudsman for under-18s.
The evidence of the risk to children online
The announcement of a new Internet Safety Strategy for the UK offers much to be welcomed. Can it guide industry, schools and parents in a way that is sufficiently responsive to children’s diverse needs and rights, empowering them as well as protecting them in the digital environment in ways that are informed by robust and specific evidence? On behalf of the UK Council for Child Internet Safety (UKCCIS) Evidence Group, my colleagues and I were commissioned by DCMS to review the recent, relevant, UK-based research on children’s online risks and safety to inform the Strategy.
Our key findings reveal the degree of online risk of harm encountered by UK children, also showing that children’s age and gender, digital literacy and resilience all affect their online experiences and wellbeing. However, our review also identified lots of evidence gaps – research funding simply isn’t keeping up with the changing array of risks facing children. Consider some of the latest findings from the 2017 Ofcom report on children’s media use and attitudes:
- 53% of UK 3-4 year olds go online, with YouTube their favourite (if problematic and risky) app.
- 23% of 8-11 year olds have a social media profile, though for most apps the lower age limit (and associated industry-provided protections) is 13.
- Only one in eight 12-15s who saw something worrying or nasty online reported it to an online service designed (or not so well designed) to help them.
- Parental anxiety about their children’s internet use is growing.
- The top websites accessed by 6-14 year olds are heavily commercial and mostly designed for adults, not child audiences.
Are young people ‘digital natives’, as page 8 of the Green Paper claims? Yes and no. Research shows that children and young people are often at home in the digital environment.
- They are often pioneers in relation to digital and social media, using new services before the adults meant to be caring for them have woken up to the new opportunities or risks.
- They regard it as ‘their space’, often because adults have imposed too many constraints already on their offline lives (whether walking to school or going out in the evening or even going anywhere by themselves).
- They are keen to learn, know more and engage with the latest aspects of the digital environment.
But children and young people are not highly skilled online, especially in relation to social norms, creative opportunities and the critical evaluation of misinformation, persuasion, exploitation or self-protection. We should not call them digital natives if the consequence is to withdraw resources from their empowerment and protection online – or offline.
Indeed, the agencies that currently provide for children and young people’s needs offline should now also encompass the risk of harm the online world can pose to them. Advising these agencies – including, but not restricted to, teachers and parents – is crucial. The new Internet Safety Strategy therefore requires resources, and these must dovetail with improved resources for children’s mental health services. By many accounts, those services are in crisis, both because of systematic underfunding and because of increasing incidence and/or awareness of youth mental health difficulties.
The safest place to be online
The Government’s ambition, as the Secretary of State announced, is to make Britain “the safest place in the world to be online”. We might usefully contrast this with the ambition of the Lords’ Communications Committee, which headlined its new report Growing up with the internet: “The internet must be made a better place for children.”
It is vital that the pursuit of making the UK the safest place to be online does not overwhelm these wider ambitions. In striving for the ‘best’ (or, at least, a ‘better’) internet, I urge us to pay attention to educational, democratic, creative and community ambitions – in the public interest. The Green Paper tends to pit technological and market innovation against safety, but it does not propose a strategy for enhancing the benefits of the internet for the British public, including children. This is a weakness, and a missed opportunity – it should not be left to the private sector to fill this crucial gap.
Children, especially, must not be kept safe primarily through highly restrictive measures that constrain both their (online) opportunities and indeed their rights. They need to gain resilience through a measured exposure to risk. If adults impose heavy restrictions on children, then the evidence shows that children will not turn to adults when they encounter a difficulty, but will instead try to evade adult scrutiny. Evidence also suggests that this, in turn, makes adults monitor children’s online lives all the more insistently and insidiously. The result is a negative spiral of distrust between child and parent, student and teacher.
If the media response to the Green Paper is anything to go by, introducing any kind of regulation will be controversial. On the one hand, it seems to many that the tech giants increasingly act more as publishers than platforms, by moderating, editorialising and banning content. In some quarters, this is highly welcome – after all, they must take responsibility for the adverse consequences when the public abuse their services, and many certainly generate enough profit with which to do so.
On the other hand, while the public does not want companies to escape responsibility, nor does it wish them to take responsibility in an unaccountable way, led by commercial rather than the public interest. Hence the importance of independent or regulatory oversight. As The Spectator put it, “it is time for a parliamentary debate on laying out and limiting the powers that politicians ought to have over the new digital publishing houses.”
For at least a decade there have been calls for a reporting mechanism that would inform the public of how the industry has dealt with their concerns or complaints. Little has come of any of these. So it’s great that the government proposes to “provide better information about how startups can deliver safety by design.” But better than what? The only information I am aware of is the UKCCIS Practical Guide for Social Media Companies, on which some of us expended considerable effort. Is it lacking? What is needed? Does it work? We don’t know.
Nonetheless, there have been sporadic and often amnesic efforts to introduce self-regulation and good practice. But the implementation and effectiveness of existing codes and guidance remain unknown, having never to my knowledge been examined through robust independent means. There has never even been a clear definition of scope (which companies are included? All of them?). This matters considerably when new entrants to the market often lack sufficient safety and response mechanisms, as do – it must be recognised – some of the big players, who could and should do better.
The Green Paper emphasises the Lords’ recommendation for safety-by-design. This is crucial, since we cannot teach young children everything about the fast-changing complexities of the internet. Society cannot continue to be reactive, discovering too late that services for “everyone” are used by children, sometimes literally at their own risk. This reactive approach is inefficient and expensive, and it damages the reputation of businesses and the trust of parents and civil society. We can anticipate many of the risks based on existing research and prior experience.
The House of Lords develops this recommendation, asking what it would mean to design for a child-rights-friendly internet:
“The Government should establish minimum standards of design in the best interests of the child for internet products. For the avoidance of doubt this is for all products that might reasonably be expected to attract a large proportion of children, not only those designed with children in mind” (para 299) [including] “Minimum standards for child-friendly design, filtering, privacy, data collection, terms and conditions of use, and report and response mechanisms for all businesses in the internet value chain, public bodies and the voluntary sector” (para 366).
But as they add, tellingly:
“We have found that there is resistance to providing services which incorporate the support and respect for rights that would enable a better internet experience for all children as they explore the wider internet” (para 298).
So a code of conduct which brings all this together will be crucial, as will mechanisms of implementation and compliance.
Educating children, parents and adults
The Green Paper emphasises “the crucial role that education will play.” Indeed. I recall the Royal Society’s highly critical evaluation of the state of computing education in UK schools. Putting digital literacy in the computing curriculum is important but insufficient, because it tends to focus on the technical rather than the social and ethical or, most importantly, the critical literacy and understanding required of (young) citizens in a digital age. As Lord Best said, when introducing the Lords’ report in Parliament:
‘We called digital literacy the “fourth pillar” of a child’s education, alongside reading, writing and arithmetic. This is quite different from the important education in computer literacy required in the modern world, and we emphasised the need for teacher training to cover the skills which teaching digital literacy demands. We advocated making this a core ingredient in personal, social and health education, or PSHE, which, we said, should be a statutory subject, inspected by Ofsted, and should cover compulsive use, privacy of data, obsession with body image and the rest, not just the e-safety agenda of risks.’
An integrative, cross-cutting approach is vital, from the early years onwards. This will demand teacher training and support, rethinking the school curriculum, sustained updating and universal implementation and evaluation. It must also be child-friendly. Faced with an increasingly complex, often opaque and illegible digital environment, the burden on children to anticipate and cope with online risks is often too great.
But digital media education cannot prepare all children for every risk the online world serves up to them, and adding ‘digital literacy’ to the 3Rs will be doomed unless the task of understanding and managing the online world is made easier for all. The less responsibility industry takes, the more falls to children and their parents and teachers to figure out for themselves.
How can we reach parents and the wider public? Evidence increasingly shows that what works is an ambitious programme of digital literacy that goes beyond narrow conceptions of e-safety.
It’s regrettable that the Green Paper (p.33) refers to outdated notions of the harm done by screen time. These have insufficient empirical support, and fail to grapple with the central role screen media now play in work, leisure, family and personal relationships. As currently proposed, the strategy risks burdening parents with yet more rules and guilt without positively supporting their role in the digital age.
The UK Council for (Child) Internet Safety
The Green Paper proposes to reform UKCCIS “to improve its accountability, strategic direction and responsiveness to the rapidly changing online landscape.” This is welcome, urgently needed and important. UKCCIS has hitherto fallen short on all three criteria, especially the first two. Sporadic efforts to evaluate its effectiveness (e.g. led by Anna Payne in 2015, with the Evidence Group’s input) have little to show for them.
Does this mean it should be remodelled as the UK Council for Internet Safety? No.
- Children are a special case in relation to the internet, with more than most to gain, especially over the long-term – and more to lose, given their relative vulnerability. They need special attention and provision, and this takes specialised expertise and dedicated services.
- Time and again, we see that when children’s needs are mixed in with those of the general population, they are crowded out, neglected, or rendered simply invisible in the tacit assumption that ‘people’, ‘users’ or ‘the public’ are always adults (who, it is also assumed, are responsible for their own decisions: hence ‘caveat emptor’).
- Now that the UK is deciding that, in the Data Protection Act, children will be able to use information society services from the age of 13 rather than 16, the default age in the General Data Protection Regulation, young children could be at risk without the requirement for parental knowledge or supervision. If the government takes it upon itself to disempower the parents of teenagers in this way, it must certainly ensure teenagers’ safety itself.
- On past performance, I must conclude that the only reason that UKCCIS has achieved anything is because it – and other stakeholders – recognise that children merit special consideration, to be met through the application of particular expertise. So if political expediency regrettably demands that the C should be dropped from UKCCIS, I would want to see guarantees that children’s interests are protected and advanced. Unfortunately, these are not set out, nor even sketched, in the Green Paper.
Independent ombudsman, commissioner or digital champion?
Regrettably, the Green Paper does not propose an independent ombudsman or children’s digital champion, as Growing up with the internet called for. Quoting the Children’s Commissioner’s recent report, the government instead recommends that the UK should either obtain ‘a commitment from industry to build and fund an arbitration service for young people’ (para 244) or establish
‘a Children’s Digital Ombudsman to “mediate between under-18s and social media companies over the removal of content. It should operate in a similar way to the UK Financial Ombudsman Service and be funded by social media companies themselves but be completely independent of them.”
This would enable children to challenge “any content that they have accessed via common social media platforms that they are able to report”, for example pornography or hate speech.’ (para 238)
As the research with children conducted for the Royal Foundation Taskforce on Cyberbullying suggested, children and parents want a single, accessible and independent body to which they can direct their concerns and complaints about social media and internet companies. Their voices have been ignored. Will the Internet Safety Strategy also leave them unsupported?
This post represents the views of the author and not those of the LSE.
Sonia Livingstone OBE is Professor of Social Psychology in the Department of Media and Communications at LSE.