October 31st, 2016

Considering the monstrous in digital methods can inform researchers’ ethical decision making


Monsters stop people in their tracks, make us (re)consider the route we are taking. Consideration of imagined horror is a useful ethical tool for those who use digital methods. In this Halloween-themed post, Naomi Barnes provides historical and literary examples of the association between horror and information technology, challenging users of digital methods to consider whether their practices are rendering the informants of their research subhuman.

The conflation of people and monsters has a long literary tradition, and as our supermarkets fill with the spooky sugar and plastic fantastic of Halloween, it is timely to remember the relationship between monsters and humans, especially regarding digital methods. To consider the monstrous in digital methods means questioning assumptions about who the monsters are. Digital researchers are leaders in future methods and should model best practice. If we digital researchers relax about where, how and why data is accessed, we actively reinforce the well-established socio-political attitudes which allow the teaching profession to be likened to The Blob, university adjuncts to zombies, and internet trolls to make death and rape threats.

At present, digital methods are intent on enriching information and addressing challenges from the post-positivist world of audits, standardisations, replicability and reproduction. There is a further, less spotlighted challenge for digital methods, which comes from those destabilising what it means to be human. This is an ethical challenge which asks researchers to remember that the digital data they use is attached to an organic, living, breathing human. Concentrating on what is written and clicked on in digital worlds – as information analysed through software packages and algorithms – risks rendering the creators of that information subhuman. As those of us who use the digital world to inform our scholarship reflect on, adjust and reapply digital methods, I posit that considering horror as a metaphor is a useful exercise in a researcher’s ethical decision making.

Image credit: (HMM) Happy Halloween Little Ghost by aotaro. This work is licensed under a CC BY 2.0 license.

Across information ages, technology has been used to regulate populations by conflating socio-political agendas with monsters. What Asma terms political horror has worked to engineer anti-social feeling towards Others over time, instigating an emotional reaction, and hence greater buy-in, to the regulatory socio-political manoeuvrings which occurred alongside the technological advancements.

The information age sparked by the invention of the printing press is most often remembered in the public consciousness for the mass production of Bibles. What the new technology was also, and most often, used for was the production of pamphlets. Asma’s example of political horror in early-modern Europe is one of the better-known pamphlets used to regulate early modern populations: the Malleus Maleficarum, a demon-hunting guide, showed readers how to identify a witch. The pamphlet alleged that midwives were most likely to be witches, as witches were known to eat babies or offer them to evil spirits. Child devouring was also used to demonise Jews. The campaign sought to regulate those whom the Christian faith considered particularly blasphemous because they had been exposed to the Gospels yet had chosen to reject them.

The scientific revolution saw another information age in which horror was conflated with technological advancement. Shildrick refers to the classification systems of Linnaeus and Bacon, which sought to demystify the monstrous by formalising exclusion through hierarchical categories that placed rational white males at the apex. The conflation of these systems with the eventual popularisation of Darwinian thought in the late 19th century became the socio-political regulatory system known as social Darwinism, which relegated those who did not fit the normative imagination to the status of subhuman and disposable.

By the early 19th century, writers were beginning to theorise technological advancements through literary monsters. Blake’s warnings of the dark Satanic mills have sparked debate about whether he was referring to industrial advancement or to the universities of Oxford and Cambridge. Both interpretations are warnings of a future in which information and technology are problematic. Furthermore, at the time of the application of electricity and the height of what Robertson and Travaglia describe as the first age of “big data”, Mary Wollstonecraft Shelley published Frankenstein. To the critical digital literacy reader, the novel comes across as a warning not only about combining the magic of alchemy with the rationality of mathematics, but also about the ethical responsibilities universities and scientists have towards their creations and how those creations act on the world.

Similar warnings are found in posthumanist readings of historical events, such as Barad’s reading of the relationship between Werner Heisenberg and Niels Bohr amid the development of the nuclear weaponry that would unleash its horror on the world. Nobody knows what was said in that meeting between two of the top nuclear physicists of World War II. Despite Heisenberg working for the Nazis and Bohr later joining the Allied effort, the two were still friends. After the meeting, however, one scientist’s programme successfully developed nuclear weaponry while the other’s did not.

The questions arising from these readings of Shelley and Bohr are important to digital researchers. Consideration of the horror of unleashing a naive but powerful cyborg, or a nuclear winter, brings into focus the potential vulnerability of methods which seek to explain and represent the socio-political. A horror imagination can help researchers consider whether they are treating the creators of information as subhuman. Digital researchers have a duty of care to their informants and to the future application of their findings. Simply because something appears freely accessible on the internet does not mean it is ethically accessible. By using it without due ethical consideration of those who created it, even if it is only a short tweet, digital researchers risk enabling another monstrous Othering.

Haraway also uses the monstrous (in the form of a cyborg) as an ironic metaphor to guide serious consideration of the political intersection of humanity and technology. In taking this stance, she resists the age-old fear conditioning which has long associated the monstrous with technology by embracing the techno-organic cyborg. Monsters are liminal; they resist definition, and in doing so remind digital researchers that it is difficult to be certain about the digital.

Like monsters, digital methods are dangerous. They work in a realm that is constantly on the move: evolving, augmenting, surprising. It is important not to get stuck trying to pin the digital down. The task of the social science researcher is not simply to increase knowledge of society. Social scientists are also charged with resisting the antisocial markers created by imaginaries like political horror, by ensuring that digital practices do not re-enact some form of cyborgic social Darwinism between human researchers and digital data.

Note: This article gives the views of the author, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our comments policy if you have any concerns on posting a comment below.

About the author

Naomi Barnes, an adjunct at Griffith University in Australia, has investigated methods for understanding how conversations in and about the digital shape both the medium and the user. Dr Barnes has situated her research in Facebook, and academic and news media blogging platforms. Naomi is on Twitter @DrNomyn.
