A recent encounter at a London airport leads Veronica Barassi to reflect on the increasingly pervasive nature of facial recognition technology and on what such data capture implies for our children’s privacy and surveillance. Veronica Barassi is an anthropologist and Faculty Member in the Department of Media and Communications at Goldsmiths, University of London, where she convenes the BA Anthropology and Media degree and teaches undergraduate and postgraduate students.
A few months ago I was travelling through London Heathrow airport with my daughters. As soon as we reached security, an officer looked at our passports and asked P (4 years old) to look up and not to smile for the camera. After she dutifully obeyed, he took a small black device and scanned the face of A (9 months). I felt uncomfortable and violated. I asked him why he was scanning my baby’s face. He looked annoyed and explained that it was for security purposes: facial recognition was being used as an anti-terror measure to prevent passengers travelling on international flights from swapping their boarding cards with passengers on domestic flights.
Obviously this is not a new security measure. Facial recognition technologies were introduced at Heathrow more than seven years ago. Yet this year the pervasiveness of these technologies felt more evident to me than in previous years, perhaps because British Airways has introduced facial recognition to speed up the boarding process.
As I walked through security, I felt a chill in my bones and wondered whether I would have felt worse if the officer had taken my children’s fingerprints or scanned their irises. Maybe I would have been more outraged. Yet the result is very similar: the face, like the fingerprint, the iris, or even the ear canal, is now used to carry out ID checks, and all of these are classified as biometric data.
The experience made me think about an article I read on openDemocracy titled “Our Data Doubles: How Biometric Surveillance Ushers in New Orders of Control”. The article explores the use of biometric data for border control and policing. It highlights the biases and risks of biometric data, and it also shows how we have started to believe that the collection and processing of biometric data at the border is an inevitable process, one over which we have very little control.
I found my experience at Heathrow particularly revealing because it reminded me not only that we are surrounded by technologies that collect our facial features, but also that the collection of biometric data is extraordinarily problematic. At times biometric data is collected and used in very positive ways (e.g. for medical purposes or to find missing children), but often its collection is a political act, one that is connected to the control and governance of people. In fact, as Christine Rogers argues in the article mentioned above, “One problem with biometrics is that the databases go along with a whole system of categorisations and classifications […] a person’s iris can provide a kind of password for access or denial without a person’s control over that process.” The truth is that we live in data environments, like Heathrow airport, where we have very little control over the collection and use of our biometric data.
But how did we end up in a world that scans the face of a 9-month-old? According to Gates (2011), since the 1960s our societies have been engaged in the effort of ‘teaching machines’ to read faces and hence identify individuals. This has created the basis for the development of multiple facial recognition projects that go well beyond issues of security and criminal activity.
Today we are seeing a variety of examples that reveal both the pervasiveness of facial recognition and the ease with which we are embracing it. Over the last couple of days, for instance, it emerged in the press that summer camps in the U.S. have started to use facial recognition technologies to allow parents to search for photos of their children.
The question at the heart of all this is: what happens to our biometric data? I don’t think we have an answer as yet. What we know for sure is that, at present, biometric data laws are being negotiated piece by piece. The extent of this negotiation can be seen in a variety of examples. In the EU, for instance, the GDPR focuses extensively on biometric data, and the private and public sectors are trying to comply with these new regulations; in the US, meanwhile, we are witnessing a challenge to the Illinois Biometric Information Privacy Act, which has regulated the collection and processing of biometric data in Illinois since 2008.
Within these debates, facial recognition has become a highly contested terrain on which to reflect on the uses of our biometric data. Not only are we witnessing the first lawsuit against Facebook’s use of facial recognition, but over the last few days Microsoft president Bradford L. Smith has called for facial recognition regulations. His call followed growing discontent amongst the tech giant’s employees about the ways in which facial recognition technologies were being sold to border enforcement organisations. One critical aspect emerging from these debates is not only that facial recognition technologies can severely impact human rights, but also that these technologies can be inaccurate (see the debates about the use of facial recognition by the police in the UK).
As I walked through Heathrow airport crushed by a dystopian feeling, I came to the conclusion that if the future of our biometric identities is being negotiated piece by piece, we need to start from the beginning and question the role of children’s biometric traces and the social and political implications that these traces may have in the future.
Notes
This text was originally published on the Child, Data, Citizen website and has been re-posted with permission.
This post gives the views of the author and does not represent the position of the LSE Parenting for a Digital Future blog, nor of the London School of Economics and Political Science.