In Algorithms of Oppression: How Search Engines Reinforce Racism, Safiya Umoja Noble draws on her research into algorithms and bias to show that online search results are far from neutral, instead replicating and reinforcing the racist and sexist beliefs that circulate in the societies in which search engines operate. This timely and important book sheds light on the ways that search engines shape our modes of understanding, knowing and relating, writes Helen Kara.
This post originally appeared on LSE Review of Books. If you would like to contribute to the series, please contact the managing editor of LSE Review of Books, Dr Rosemary Deller, at firstname.lastname@example.org.
Algorithms of Oppression: How Search Engines Reinforce Racism. Safiya Umoja Noble. New York University Press. 2018.
Google search results are racist and sexist. Perhaps you know this already or maybe it comes as a surprise. Either way, Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble will have something to offer. Noble is an associate professor at UCLA in the US, and her book is based on six years of research into algorithms and bias.
As the book’s subtitle suggests, algorithmic bias is not unique to Google. However, Google is the dominant search engine, such that its name is now in common use as a verb. Pretty much everyone with internet access uses search engines, and most of us use Google. Many people regard Google as neutral, like a library (which, of course, wouldn’t be neutral either, though that is a different discussion). However, Google is not neutral: it is a huge commercial corporation which is motivated by profit. The ranking system used by Google leads people to believe that the top sites are the most popular, trustworthy and credible. In fact, they may be those owned by the people most willing to pay, or by people who have effectively gamed the system through search engine optimisation (SEO).
Ten years ago, a friend of Noble’s suggested that she google ‘black girls’. She did, and was horrified to discover that all the top results led to porn sites. By 2011 she thought her own engagement with Black feminist texts, videos and books online would have changed the kinds of results she would get from Google – but it had not. The top-ranked information provided by Google about ‘black girls’ was that they were commodities for consumption en route to sexual gratification.
When problems such as those that Noble experienced are pointed out to Google representatives, they usually say either that it’s the computer’s fault or that it’s an anomaly they can’t control. This reinforces the misconception that algorithms are neutral. In fact, algorithms are created by people, and we all carry biases and prejudices which we write into the algorithms we create.
Interestingly, Noble reports that Google’s founders, Sergey Brin and Larry Page, recognised that commercial agendas had the potential to skew search results in an article they wrote while they were doctoral students at Stanford. At that very early stage, they suggested that it was in the public interest not to have search engines influenced by advertising or other forms of commercialism.
Porn makes money. If you have money, and you pay Google, you can generate more hits for your website. The internet has a reputation as a democratic space, yet the commercial interests that influence what we can find online are largely invisible. The internet is far from being a space that represents democracy, and what is more, people are now using the internet to affect offline democracy.
Image Credit: (Pixabay CC0)
After the book’s first two long chapters about searching online in general, and the results of Noble’s internet search for ‘black girls’, there is a short and vivid chapter in which Noble forensically implicates Google in the radicalisation of Dylann Roof. Roof is a young white American who carried out a terrorist attack on African Americans who were worshipping at their Christian church, killing nine people. Noble is careful in this chapter to use the word ‘alleged’: Roof allegedly searched for information to help him understand the killing of Black teenager Trayvon Martin by a white neighbourhood watch volunteer who was acquitted of murder. Allegedly, using the term ‘black on white crimes’, he found conservative, white-run websites preaching white nationalism for the US and Europe, and encouraging racial hatred, particularly of Black and Jewish people. Noble used the same term and found similar results under different research conditions. She notes that Roof’s alleged search term did not lead to FBI statistics which show that violent crime in the US is primarily an intra-racial, rather than an inter-racial, problem. Presumably white nationalist sites are willing to pay Google more, and/or put more time into SEO, than the FBI. Noble concludes that search engines ‘oversimplify complex phenomena’ and that ‘algorithms that rank and prioritize for profits compromise our ability to engage with complicated ideas’ (118).
There are three more chapters: ‘searching for protections from search engines’, ‘the future of knowledge in the public’ and ‘the future of information culture’. Towards the end of the second of these comes a striking statement:
Search does not merely present pages but structures knowledge, and the results retrieved in a commercial search engine create their own particular material reality. Ranking is itself information that also reflects the political, social, and cultural values of the society that search engines operate within… (148)
Technology and the internet don’t simply inform or reflect society back to us in a perfect facsimile, as if the screen were a mirror. They change how we understand ourselves and our cultures and societies. One particularly complicated idea, which Noble expresses succinctly, is that ‘we are the product that Google sells to advertisers’ (162). Moreover, as we interact with technology and the internet, they change us, our networks and our surroundings. This applies to journalism too; yet while there is plenty of room to critique journalism for ‘fake news’, bias and so on, there are also journalistic codes of ethics against which journalists can be held to account. There is no equivalent for those who write algorithms, create apps or produce software or hardware.
As Noble underscores, Google is so massive that it’s hard to comprehend. It set up its parent company, Alphabet, in 2015, apparently to ‘improve transparency’, among other things. Alphabet is growing rapidly: at the end of 2017 it employed 80,110 people, and one year later it had 98,771 staff. Its revenue in 2018 was $136.8 billion, and its users account for around half of the world’s internet users. Google’s stated aim is to ‘organize the world’s information’. It seems a little sinister, then, at least to me, that Alphabet has recently expanded into drone technology and behavioural surveillance technologies such as Nest and Google Glass. The organisation of information to which Google aspires appears to lack almost any moral or ethical dimension. The BBC recently reported that online platforms use algorithms built to reward anything that goes viral, which means the more outrageous the content, the more revenue it generates. For example, I recently saw in several places on social media a link to a website selling a woman’s skirt bearing a printed photographic image of the gas chambers at Auschwitz, which was later reported in the mainstream media. This is the kind of phenomenon we invite when we allow organisations to become monopolies driven by commercial rather than public interests.
Noble doesn’t try to offer an exhaustive account of all the ways in which the internet affects inequalities. That would certainly require a much longer book and would probably be impossible. She does address a number of power imbalances that intersect with her core focus on race and gender, such as antisemitism, poverty, unemployment, social class and radicalisation. She writes that her ‘goal is […] to uncover new ways of thinking about search results and the power that such results have on our ways of knowing and relating’ (71). I would judge that she has achieved her aim in this important and timely book.
Dr Helen Kara has been an independent researcher since 1999, focusing on social care and health, partnership working and the third sector. She teaches and writes on research methods. Her most recent full-length book is Research Ethics in the Real World: Euro-Western and Indigenous Perspectives (Policy, 2018). She is also the author of the PhD Knowledge series of short e-books for doctoral students. Helen is a Visiting Fellow at the National Centre for Research Methods, and a Fellow of the Academy of Social Sciences.
Note: This article gives the views of the authors, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our comments policy if you have any concerns on posting a comment below.