by Lotte Wolff
Automated technology is increasingly part of the decision basis for refugee status. Alongside this, many countries in the ‘Global North’ grant individuals asylum if they have been persecuted in their country of origin for their gender identity or sexuality. In this article, Lotte Wolff explores how the highly subjective process of deciding someone’s gender or sexual identity in Australia could soon be shifted onto technology. Can a machine better determine someone’s sexual or gender identity? Does the introduction of algorithms in deciding refugee status counter or entrench existing biases in the current system?
Image by rawpixel via freepik.com
Can a machine learn to detect whether someone is not heterosexual or not cisgender? This might seem like an interesting thought-experiment. But for queer individuals seeking asylum, this question might impact their chance at receiving refugee status in the future.
In many countries in the ‘Global North’, individuals can seek asylum if they have been persecuted in their country of origin for their sexuality or gender identity. To be eligible, either the UN Refugee Agency (UNHCR) or a government department makes an assessment of whether their claimed identity is genuine and whether they will face persecution upon return.
Persecution, which is relatively easy to prove, can take the form of official persecution (criminalisation or violence by the State and its authorities) or societal persecution (family violence/rejection, discrimination, pervading climate of stigma).
The assessment of genuine LGBTI+ status, on the other hand, is significantly more complicated. It is mostly based on the applicant’s story – narrative evidence. This narrative evidence can include questions such as “When did you first realise you were not straight/cisgender?”, “When did you come out?”, “How did your family react?” or “Have you had any same-sex relationships?”. The answers to these questions are then used to make a credibility assessment about whether or not someone is a credible refugee who will face persecution for their identity if they are forced to return.
The first refugee claims based on sexuality and gender identity in Australia were in the early 1990s. Research documenting refugee claims in Australia between 1994 and 2000 showed that whilst only 7% of lesbian claims succeeded, 26% of gay men were granted refugee status.
Since then, the landscape of LGBTI+ rights has changed significantly in Australia – with many changes occurring in the last five years. The age of consent was equalised nationally in 2016, same-sex couples were granted the right to marry in 2017, and same-sex adoption became possible nationwide in 2018.
Has this wave of improvements in LGBTI+ rights extended to LGBTI+ refugee applicants? Although a currently unpublished report by Equality Australia shows higher success rates, decision-making is still plagued by prejudice and stereotypical assumptions about how queer people look and behave. Claims labelled gay and lesbian were roughly equally likely to succeed (~40%), but claims regarding bisexuality had only an 18% success rate. A misunderstanding of the fluidity of sexuality, and narratives that are less familiar (such as a bisexual person with past relationships with various genders or currently in a heterosexual marriage), contribute to this inequity. Other research has found ongoing issues with the misgendering of applicants and the conflation of sexuality and gender identity in decision-making.
Last year I conducted research with the legal representatives of LGBTI+ refugee applicants in Australia. The interview with the government department responsible for refugee decisions is an extremely high-stakes environment in which an applicant must prove their personal identity through detailing intimate and often traumatising personal experiences. Decisions are generally made by one public servant and there appears to be a lack of understanding of the cultural differences between expressions of LGBTI+ identity. I have been astounded by how much the outcome is largely at the discretion of the decision-maker and the biases that persist despite changing attitudes towards LGBTI+ people in Australia.
While it is not the central question of my research, this has left me thinking about how this decision-making process will change in the future. What will the role of technology be in this highly subjective and personal process? It is evident that the landscape of refugee protection, migration management and aid is already changing rapidly in relation to technology. In 2015, the UNHCR started using an iris scanning system to identify refugees eligible for cash assistance and this is currently being rolled out on a larger scale.
How will decision-making processes regarding who is granted refugee status, and who is excluded, be influenced by these technologies? Canada is already testing the use of algorithms and automation to make decisions and sort applications in an immigration and refugee context. Lie detector tests have been piloted at European airports.
Theoretically, these automated technologies could be used to supplement or replace individual decision-makers in the future. They could be employed to sort applications into various categories or assess whether refugee applicants are telling the truth. Employing technology to help with the credibility assessment of the narrative evidence in LGBTI+ cases could reduce the prevalence of personal bias in decision-making that I have found in my research. It also has the potential to speed up a process that sees applicants sometimes waiting more than three years for a decision.
However, these technologies are not neutral. Algorithms and databases have been well documented to have gender and racial biases built into them. They are also subject to change for political or other reasons. In the United States, Immigration and Customs Enforcement (ICE) has received criticism for changing its risk assessment algorithm to recommend detention for all apprehended immigrants instead of just those with a criminal background. The physical movements and tone associated with deception (micro-expressions), which are used in lie detection technologies, have not been proven to be universal, presenting a major flaw in the databases used to train these technologies.
Although, in most countries, the use of automated technologies in refugee decision-making remains purely speculative, developments in Canada demand we pay attention to the potential implications. Human rights and research groups have called for independent oversight mechanisms for these technologies, including an ability to question any automated decisions.
If these technologies were employed in refugee decision-making, how could possible bias and lack of oversight impact marginalised applicants? They will undoubtedly carry multiple forms of normative bias – including cultural, class and cis/heteronormative bias. For lie-detection technology to be effective, for example, it would have to take into account applicants’ inability to give coherent or consistent narrative evidence due to trauma or cultural differences in communication.
For LGBTI+ applicants specifically, would they be able to account for the diversity in the expression of sexual and gendered identities found across the world? Researchers have mapped the diversity of gender identities worldwide, showing that Western conceptualisation and labelling of identity into distinct sex, gender identity, gender roles and sexuality categories is not consistent globally or historically.
Researchers at Stanford University created a ‘Gaydar Machine’ to identify people’s sexuality from photographs with between 71 and 91 percent accuracy. However, they acknowledged that the algorithms were trained on white, American, openly gay men, and thus the findings would not generalise beyond this limited group. If similar technology were to be employed in refugee decision-making, the algorithms would have to be trained on all possible expressions of sexual and gender identity globally in order to have any degree of accuracy.
The use of modern technologies in decision-making certainly looks promising as a way to curb the discretionary power of decision-makers and process applications faster. Yet we must make sure that these technologies do not entrench the biases of the current systems, and that sufficient oversight mechanisms exist which can account for the genuine diversity of sexual and gendered identities and expressions.
Lotte Wolff is a Dutch/Australian recent graduate of the Graduate Institute of International and Development Studies in Geneva. She has conducted research on LGBTI+ people seeking asylum in both the Netherlands and Australia, combining queer theory with an interdisciplinary analysis of domestic migration policies and practices. She is particularly interested in the intersections of colonial history, postcolonial, gender and queer theory, and migration governance.