
Marie Oldfield

May 9th, 2024

Code Dependent: Living in the Shadow of AI – review


Estimated reading time: 5 minutes


In Code Dependent, Madhumita Murgia considers the impact of AI, and technology more broadly, on marginalised groups. Though its case studies are compelling, Marie Oldfield finds the book lacking in rigorous analysis and a clear methodology, inhibiting its ability to grapple with the concerns around technology it raises.

Madhumita Murgia spoke at an LSE event, What it means to be human in a world changed by AI, in March 2024 – watch it back on YouTube.

Code Dependent: Living in the Shadow of AI. Madhumita Murgia. Picador. 2024.


Code Dependent is a collection of case studies about people from marginalised groups in society who both work in and are negatively affected by technology. However, the book’s arguments pertain to subjects such as worker and refugee rights and global economies rather than the artificial intelligence (AI) of its title. It lacks a unifying thread, and the initial chapters do not set up the purpose or main theme of the book. A clearer view is eventually provided on page 267: “the pattern that has emerged for me is the extent of the impact of AI on society’s marginalised and excluded groups; refugees, migrants, precarious workers, socioeconomic and racial minorities and women”.

Beyond algorithms, aggregated data and interconnected databases are one of the most concerning and problematic ways to use data.

Beyond algorithms, aggregated data and interconnected databases are one of the most concerning and problematic ways to use data. This suggests that unfit-for-purpose predictive analytics may be used for incorrect policing and manipulation of the public. We see social media manipulation of the public openly stated in manifestos from governments to world organisations and defence bodies under the auspices of “keeping people safe” or “protecting resources”. The author touches on this in the chapter “Your Rights”, which discusses nefarious uses of facial recognition software and how Meta was sued over its social media algorithm potentially facilitating murders in Ethiopia. The case study illustrates the dark side of technology, showing how easily it can be used for oppression. However, this chapter, like many of the others, feels light in detail and analysis when its subject matter could easily warrant its own book.

The book contains a number of fundamental flaws that detract from the compelling nature of its case studies. The lack of a clear methodology, of justifications for the choice of subjects examined and of an outline of the book’s purpose immediately limits the reader’s ability to access the material effectively. A lack of prerequisite knowledge of the philosophical and technical principles inherent in AI development inhibits the author’s capacity to grasp the human experiences discussed or connect them to AI in a meaningful way. Among the more concerning failings are several statements about technology that are either incorrect or unexplained, as well as strong contradictions within the material itself. For example, the concept of “algorithm” is never defined, despite being key to the text, and the term “clean data set” is misinterpreted. The description of machine learning models (9) is technically incorrect, displaying unfamiliarity with the nature of models and algorithms. Poor data is not necessarily a driver of algorithmic bias, as Murgia suggests.

The book also lacks balance and a solid research grounding. There is a seeming intention to guide readers to specific, strong views, supported by cherry-picked research and stories that are not all suitably justified. This has the potential to be misleading. The positioning of this book in a small ecosystem of media-friendly personalities in AI leads to a myopic view of the industry and omits more robust research on recent issues and developments in AI, such as dehumanisation, funding, technical development, lack of education around algorithms and risk, and studies of weaknesses in AI implementation. The author admits to sourcing references by browsing papers from a few media-friendly AI personalities. This absence of a rigorous research methodology casts doubt on the credibility of the conclusions drawn from the case studies.

Disciplines such as philosophy, sociology and psychology are commented on, but without in-depth research and discussion on their relevance to AI, such as in the context of anthropomorphism, morality, human thought and decision making. Thus, the topic of algorithms “hiring and firing” workers lacks a deeper discussion around why this is different to a human performing the same action. The description of “data labelling facilities” (19) to refer to data warehouses of thousands of people sifting images for low pay is confusing to the reader, especially when these workers are referred to as “slaves” with little choice over their own exploitation (30). The wages are discussed as being low, but not contextualised. Murgia cites vast warehouses full of non-technical people classifying images to then be fed into an algorithm, a description which reveals the author’s lack of knowledge of the algorithmic design process. A possible reason for this apparent level of “data labelling” could be that we cannot represent human experience in an algorithm.

The author avoids a nuanced discussion of the simultaneous positive and negative aspects of technologies.

The author avoids a nuanced discussion of the simultaneous positive and negative aspects of technologies. In the chapter on health, the technology taking and using your x-ray data is acceptable (no mention of consent), but in the facial recognition case it is an invasion of privacy. Aside from informed consent, this ignores the key questions of motivation, purpose and ethics. Through optimism bias, the book overlooks both the potential nefarious uses of technology, ie, uses within health that take data without consent or for profit, and the positive uses of the technology behind deepfake pornography, which is also used to make avatars and animated films. This latter issue around pornography is certainly concerning, but Murgia refrains from presenting any of the remedies and current work in this area. There is a much deeper discussion to be had here. The issues are not always black and white; they are conceptually complex and require unpacking.

If Murgia had limited the book’s scope to case studies on the extent of the impact of technology and AI on marginalised and excluded groups […] or even on data transparency it would be far more coherent.

If Murgia had limited the book’s scope to case studies on the extent of the impact of technology and AI on marginalised and excluded groups – refugees, migrants, precarious workers, socioeconomic and racial minorities and women – or even on data transparency it would be far more coherent. As it is, the book is a long, meandering read that weaves through complex concepts and issues as if they are already understood by the reader. In order to position the book under the banner of AI, it tries to accomplish too much with too little rigorous, in-depth research, ultimately limiting its capacity to engage with pressing concerns posed by the rapid technological development of our times.


Note: This review gives the views of the author, and not the position of the LSE Review of Books blog, or of the London School of Economics and Political Science.

Image credit: whiteMocca on Shutterstock


About the author

Marie Oldfield

Dr Marie Oldfield is the Chair and Founder of the IST AI Special Interest Group and Women in Tech Group, CEO of Oldfield Consultancy, and a Senior Lecturer at LSE. With a foundation in Mathematics and Philosophy, she is a distinguished figure in AI and Ethics, serving as a trusted advisor across government, defence, and legal sectors.

Posted In: Book Reviews | Contributions from LSE Staff and Students | LSE Event | Science and Tech


Creative Commons Attribution-NonCommercial-NoDerivs 2.0 UK: England & Wales
This work by LSE Review of Books is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 2.0 UK: England & Wales.