Metric Power. David Beer. Palgrave Macmillan. 2016.
My initial reaction to Metric Power was that, for an essay on the challenges of big data, there was remarkably little data in it. As David Beer shied away from drawn-out case studies, figures and concrete examples, I found myself getting more and more irritated. How could I trust what he was saying without evidence to back it up? But as I progressed, I realised that this is probably part of his point. As a ‘neoliberal subject’, Beer argues, people like me have a ‘cultural interest in numbers, and a culture that is shaped and populated with numbers’ (149). If something cannot be quantified, our trust and interest in it diminishes.
There is indubitably more data in the world than there has ever been before. In my field, medicine, genomes are being sequenced at an exponential rate, and many institutions have legitimate concerns about where this data will be stored – indeed, some scientists are trying to store data in DNA itself. Companies like Apple and Google have moved into health metrics and are storing tens of thousands of physiological measurements about people who use their health apps. The entirety of Scotland’s computer-based medical records is being made accessible to researchers, who are using machine learning to try to decipher patterns in the relationship between demographics and disease.
Can, or will, this data be used to increase our understanding of basic biological processes, improve health or reduce inequality? And is the metricisation of everyday life changing the ways in which we construct social values, live our everyday lives and even relate to our bodies? Beer would argue that it is, and he spends Metric Power dissecting the challenges that big data poses both methodologically to the social sciences and to us as individuals.
Beer frames his analysis about the causes and effects of metric power around three key themes: ‘Measurement’, ‘Circulation’ and ‘Possibility’. In ‘Measurement’, we learn about the history of assessing social entities quantitatively. Beer argues that this mode of thinking shapes the social world, quoting Heidegger: ‘calculation refuses to let anything appear except what is calculable’ (235). ‘Circulation’ examines the processes by which metrics about social subjects circulate and a ‘social life of data’ is created. In ‘Possibility’, metrics are related to power, and how they are used to ‘maintain, strengthen, or justify new types of inequality, to define value or worth, and to make the selections central to affording visibility or invisibility’.
Image Credit: Number Drum (zenilorac CC BY 2.0)
One objection to big data is that the ways it is used in research are mostly inferential: observational data is used to try to infer causal relationships in the real world. Rarely does it form part of a controlled experiment; and, as any epidemiologist will tell you, correlation does not demonstrate causation: no volume of data can substitute for a mechanistic demonstration.
This may seem pedantic, but there is something genuinely seductive about the sheer volume of data available to us as researchers, and consumers, today. Volume has become a proxy for value. As an example of the risks of this approach, the US judiciary now apparently routinely uses algorithms to judge the risk of recidivism of suspects in the criminal justice system. Northpointe has a 137-question assessment, populated by references to a suspect’s criminal record and a questionnaire answered by suspects themselves. Gradually, according to a ProPublica investigation, there has been mission creep, so that results from the program (which give a prediction of future offences) are used to guide the length of sentences rather than the probation support provided to individuals after their custodial sentences are completed.
Apart from the Orwellian overtones of judgements based on potential future actions rather than demonstrated crimes, these software programs suffer from two fundamental problems. Firstly, their lack of predictive power: there are few studies which look at how well they perform in practice, but ProPublica’s own analysis using the software showed it was accurate 61 per cent of the time in predicting future crimes – as they put it, slightly more reliable than a toss of a coin. Secondly, because the questionnaire collects information unlikely to be related causally to the risk of re-offending, the score is as likely to reflect underlying socio-economic factors as anything else. ProPublica found the scores given to black Americans were much higher than those given to other ethnicities, even when similar factors were taken into account.
Northpointe were cagey when asked to reveal the code underlying their algorithm; companies like Facebook or Google are similarly reticent when requested for details of their software. Given they are commercial entities, their argument that this would undercut the value of their brand seems unexceptionable. Beer would likely situate these examples in the context of what Frank Pasquale has called a ‘black box society’. According to Pasquale, as individuals become increasingly visible, the data infrastructure in which we figure becomes increasingly invisible: important information has passed out of reach, available only to insiders.
These black boxes are also becoming commonplace in healthcare. The Dublin-based company Medtronic provides a wearable device called Enlite, which allows individuals with diabetes to control their blood glucose using a wearable pump that makes decisions based on a real-time glucose sensor. However, whilst patients are able to access their blood glucose at any given moment, they cannot access the entirety of the data; the same is true for patients who use smartphone apps to communicate pacemaker measurements to their clinicians. This trend is unlikely to change in the near future as companies like Google and Apple poach clinicians to spearhead health metrics projects and start marketising the health data available to them.
It is not hard to see why governments and large corporations would seek to use metrics to understand, predict and control the behaviour of individuals. In Metric Power, Beer, with frequent reference to Michel Foucault, co-locates the evolution of increasingly detailed government records and the development of statistics in a nineteenth-century Europe transformed by the Industrial Revolution. However, he argues that metricisation today is more involved and insidious than this: it is shaping who we are and how we see ourselves.
With regard to social media, Beer wonders if it limits our ability to imagine ourselves socially as we create profiles from a limited repertoire and relatively stereotyped materials (91). He sees friendship and the self increasingly imagined in quantitative terms, and social interactions reduced to differently ranked individuals within a ‘reputation economy’. Whilst an anthropologist might take exception to the idea that a hierarchical approach to social interactions is anything new, there seems a particularly toxic quality to the way social media is shaping ideas of selfhood in teenagers, especially girls. As the scale of the problem becomes apparent, clinicians working with young people are starting to wonder whether there is a connection between this and rising rates of self-harm in the United Kingdom.
Big data science may lead to breakthroughs that improve the lives of millions of people. However, it is also clear that large amounts of money are being spent on what Nobel Prize winner Sydney Brenner has called an ‘orgy of data extraction’ without a clear idea of what is to be done with it. It also seems highly likely that big data will allow large corporations to target consumers in increasingly specific ways, and to benefit financially from the patients whose data they are analysing.
To me the most interesting question is that of agency, one that Beer touches on but does not engage with in detail. Which individuals, rather than anonymous governments or corporations, are benefiting from this data revolution? Who is driving the metricisation agenda? Why are people choosing to metricise their lives more than ever before, and why are they so relaxed about providing so much information about themselves to digital black boxes? Beer himself tells us that the ‘economic, political and cultural agendas behind black boxes are hard to unravel’ (108), and describes a big data false consciousness in which people paradoxically express themselves in public more than ever, yet face a simultaneous ‘potential closing down of what might look like opportunities for resistance’ (98). It is on this issue that Metric Power might benefit from some concrete case studies in addition to Beer’s musings on his time working in a call centre or how his output as an academic is metricised. Metric power is insidious, pervasive and frightening, but how can resistance start if we don’t know what we are resisting?
- This blog post appeared originally at LSE Review of Books.
- The post gives the views of its author, not the position of LSE Business Review or the London School of Economics.
- Featured image credit: Number Drum (zenilorac CC BY 2.0)
- Before commenting, please read our Comment Policy.
Thomas Christie Williams is a Clinical Lecturer at the University of Edinburgh in the field of Evolutionary and Molecular Genetics. Prior to taking up this post, he was a Specialist Registrar in Neonatal Medicine. Thomas has a long-standing interest in how our evolutionary past is relevant to human health and disease today.