As policy makers in the UK and elsewhere consider how to tackle the spread of misinformation and the problems it causes, the LSE Truth, Trust and Technology Commission published its report earlier this week, ‘Tackling the Information Crisis: A Policy Framework for Media System Resilience.’ Professor Sonia Livingstone, Chair of the Commission, explains here some of the problems the report seeks to address.
When we began the consultation process for the Commission, it was curious to observe how the widespread perception of “a problem” quickly dispersed, on closer examination, into many problems. Each of these problems was of a very different kind – and not all of them turned out to be so very new, despite the sense of urgency over so-called “fake news.” What was really new, we concluded, was the present critical juncture at which we find ourselves.
This critical juncture is marked by:
- The predictable but also the unintended consequences of profound shifts in the globalised information and communication ecology
- The rapid rise to dominance of huge digital platforms which tend to escape state oversight and regulation and, in parallel, threaten the already fragile business models of established news producers
- Damage caused by a few particularly bad actors, and many more who should have known better
These factors, together with the surprising electoral outcomes of 2016, have meant that this critical juncture suddenly made itself felt in the public sphere. Here it gained the problematic but instantly recognisable label “fake news”.
In the middle of all this – positioned simultaneously as victim and perpetrator – are ordinary people: the general public, the news audience, the social media users. As victims, they are seen as vulnerable to misinformation, easily nudged and manipulated, and lacking the critical media literacy they need.
As perpetrators, they are seen as culpable for sharing fake news, for making bad choices, and for preferring emotional drama and entertaining content over responsible journalism.
In analysing the struggles of the general public faced with today’s crisis of information, we decided to call up the memory of Liberal economist William Beveridge, who in 1942 grounded the argument for the Welfare State in an indictment of five “Giant Evils” in society: squalor, ignorance, want, idleness, and disease.
This metaphor of giant evils prompted us to gauge the scale and reach of the different kinds of evidence relevant to the different kinds of harm, and to counter or question some of the many claims being made about people as victims or perpetrators. For depending on how one evaluates these claims, different solutions come into play.
The report identifies five evils of the information crisis – confusion, cynicism, fragmentation, irresponsibility and apathy.
Confusion – the public is less sure about what is true and who to believe.
- Confusion is being generated by rapid media change, bringing a superabundance of sources across a plurality of platforms that can leave individuals disoriented.
- That confusion is exacerbated by an advertising model that hard-wires the continuous targeting of hyper-partisan views that play into people’s fears and prejudices.
- It is increased by ‘information pollution at a global scale’, as the Council of Europe’s report on Information Disorder put it last year.
Cynicism – citizens are losing trust, even in trustworthy sources. This is a global trend.
- In the US, survey results indicate that the average American saw at least one fake news story in the months leading up to the 2016 election, with more than half of those who saw such stories saying they believed them.
- In Europe, evidence from a large-scale survey indicates that young people are less trusting of the news media and less likely to think the news media are doing a good job in their key responsibilities.
- Cynicism is amplified by the deliberate exploitation of system vulnerabilities through information warfare and the spread of false information, destabilising public confidence and fomenting social antagonism.
Fragmentation – although citizens have access to potentially infinite information, the pool of agreed facts on which to base societal choices is diminishing.
- There is evidence that citizens are becoming more divided into ‘truth publics’ with parallel realities and narratives online.
- Yet the most enthusiastic users of social media have been shown to have a wider range of information sources than people who rarely go online.
Irresponsibility – power over meaning is held by organisations that lack a developed ethical code of responsibility and that operate outside clear lines of accountability and transparency.
- The use and abuse of platforms is amplifying the reach of misinformation in politics, health, education and more.
- The absence of transparent standards for moderating content and signposting quality can undermine confidence in authorities and erode public trust in science and research.
Apathy – citizens begin to disengage from society as they lose faith in democracy.
- The Reuters Institute Digital News Report suggests that in the UK there is declining trust that either government or the technology companies will act in the public interest.
- A well-established tactic of information warfare is to sap morale by continuous attrition through the propagation of misinformation.
In combination, these ‘evils’ constitute a threat at multiple levels and in different spheres – from individual decision-making to democratic government. We argue that they should be addressed through a systemic and robust response which is both multidimensional and coordinated.
Media literacy
Yet it is often hoped, particularly by those who are wary of the complications of regulation, that we could deal with all this by teaching people to work out what’s fake and what’s real, to understand the digital environment, and to take responsibility for their own news choices and the decisions they make as a result.
But education is no silver bullet solution, and nor should it be called on merely because other solutions are seen as too difficult, as the policy of ‘last resort.’ This is not to argue against educating the public: our report calls for an urgent, integrated, new programme in media literacy, both for children and for adults. This cannot be done in a one-shot awareness raising campaign – it will take investment, but it should repay dividends.
However, we cannot teach what is unlearnable, and much of today’s digital environment defies the power of teachers to explain it. In order for people to know what to trust, they need markers of credibility, information about sources, ways to check the information and forms of recourse when they’ve been had. We need the codes, standards and policies that build capacity for individuals to act.
Without intervening in the information environment through policy and regulation, we risk tasking the individual with dealing with the complexities and problems of today’s information crisis. Since the individual can hardly succeed where governments cannot, relying on media literacy alone risks not only burdening but also blaming the individual for the problems of the digital environment.
Beveridge demanded that proposals for improvements should transcend sectional interests, build a comprehensive policy for social progress, and enjoin the state and the individual in a cooperative vision. Going forward, we must also enjoin the efforts of the private sector, by one means or another. And while I do not think we are proposing an equivalent of the NHS, I do think that this Commission is part of an equally far-reaching process – shaping the digital infrastructure for democratic society.
I do not think we have all the answers in our report, but I take heart from observing that many other actors in many countries are contributing their expertise to this larger debate, and many of those voices are beginning to converge on an emerging way forward.
This article gives the views of the author, and not the position of the LSE Media Policy Project nor of the London School of Economics and Political Science.