September 10th, 2021

How surveillance technologies and neighborhood watch apps are capturing and reflecting communities’ prejudices 


Alongside the rise of social media, the last decade has seen significant growth in home surveillance technologies and community surveillance apps. Stefano Bloch takes a close look at one such app, Nextdoor, finding that it worsens neighborhood paranoia around imagined threats, which are often racialised, and operates as an unsanctioned surveillance tool for many US police forces.

Surveillance technologies, once the domain of heavily guarded private spaces and oppressive public places, are now almost as ubiquitous a feature of people’s homes as welcome mats and weathervanes. Amazon’s Ring, which Lauren Bridges of the Annenberg School for Communication at the University of Pennsylvania describes as “the largest civilian surveillance network the US has ever seen,” contains an embedded camera in an equally futuristic-looking “doorbell.” This best-selling of all Ring products connects to a homeowner’s cell phone, continuously capturing scenes of everyday domestic life such as uneventful family visits and the routine delivery of packages.

Ring footage can also be uploaded to the Internet in an effort to capture “porch pirates” and “suspicious characters” in action. Far more often than not, those caught on camera turn out to be kids selling magazine subscriptions or neighbors looking for lost dogs.

Nextdoor: community surveillance and security politics

While Ring has received the most attention from journalists and academics alike, it is actually the lower-tech forms of community surveillance that are promoting a Wild West-style return to rampant racial profiling and vigilantism. A case in point is the Nextdoor app: a popular place-based community networking and virtual neighborhood watch platform that currently has over 27 million registered users and a valuation of over $2 billion.

As described by the company, a “Founding Member” logs onto Nextdoor and takes the initiative to determine the geographical boundaries of a particular neighborhood, as well as how many houses and residents will be invited to join. Once launched, members share information about local activities ranging from yard sales to upcoming street repairs, but the conversation more often than not descends into a security politics of neighborhood watch activity, without any regard to the actual presence of crime or observed criminality.

Exacerbating paranoia about neighborhood safety, Nextdoor, like other similar platforms on the market, renders the face-to-face interactions traditionally needed to engage in community building even less necessary. What results is a spatial politics of exclusion, enabled by imaginaries of who belongs where. The anonymity made possible by such platforms also demands far less knowledge of place, as residents can distill their understanding of how a particular neighborhood functions by scrolling through a collection of largely uninformed complaints about incivility, suspicious persons, and other perceived nuisances that apparently do not need to be seen to be believed.

While sold in the name of community building, apps like Nextdoor are making much of the footwork needed to actually build and engage in community obsolete.

Reflecting our own prejudices with technology

To be fair to the platform and its creators, Nextdoor is more of a clearinghouse than a facilitator for what already lies in the hearts and minds of some of its users. Nextdoor merely allows for anonymous perspectives on “safety” to be articulated in more racially charged ways. That is, the platform makes space for the articulation of what anthropologist Dána-Ain Davis calls “muted racism,” whereby paranoid neighbors couch their suspicions of “outsiders” in language that evokes images of racialized others gleaned from outdated renderings of black men in popular media and bigoted social imaginaries.

As Quartz journalist Hanna Kozlowska puts it, “People have always been curious about crime, fearful for their safety, and yearned for community. But today, technology can supercharge these feelings, and sometimes helps people give into their worst inclinations.” While the facilitation of bigoted community protectionism is bad enough, it is actually Nextdoor’s usage by police that raises the most serious ethical and constitutional questions.

Ring, CC BY-SA 4.0 via Wikimedia Commons

Added to the highly localized and untrained community surveillance efforts facilitated by apps like Nextdoor, police agencies are now signing onto such platforms themselves, collecting data and joining in on the conversation about who does and does not appear to belong. As CityLab’s Sarah Holder describes it, “Nextdoor wants to be a one-stop shop for police [despite] facilitating vague, racially coded, or racist posting.” Over 1,500 police departments across the US now rely on Nextdoor, enabling the build-up of what Rahim Kurwa of the University of Illinois at Chicago calls “digitally gated communities.” And the data that police collect from sources such as Ring and Nextdoor is gathered without the need for a warrant and, as Yesenia Flores argues in the California Law Review, may be undermining Fourth Amendment protections against unreasonable searches.

How Nextdoor is tackling racism

One of the steps Nextdoor has had to take in the face of widespread community-instigated racial profiling and exclusion is the creation of an “Anti-Racism Hub,” which provides resources for abating racism, as well as politicized statements that speak to the current racial reckoning occurring in the United States. As the site reads:

We stand in solidarity with black neighbors. Nextdoor supports the Black Lives Matter movement. All Lives Matter and Blue Lives Matter content is explicitly prohibited when used to undermine racial equality or the Black Lives Matter movement. Support for White Lives Matter is prohibited on Nextdoor.

Additionally, through its Anti-Racism Hub, Nextdoor provides a tutorial on what constitutes platform racism and unconscious bias. In addition to forbidding “threats, insults, and hate-speech,” the platform reminds users not to “assume that someone is engaged in suspicious activity or criminal behavior because of their race or ethnicity.” The need, it seems, for an “Anti-Racism Hub” on what some may otherwise see as an innocuous community engagement platform may be evidence enough to illustrate how muted racism and profiling have become endemic to the politics of “community” building and aesthetics of “safety.”

As I argue in my research on aversive racism (a form of insidious and implicit racism), community-instigated policing, and the spatial politics of Nextdoor, ubiquitous at-home surveillance technologies and neighborhood watch apps such as Nextdoor and Ring may actually be better at reflecting our own community biases than they are at helping us catch criminals and maintain a sense of “safety.”


Note: This article gives the views of the author, and not the position of USAPP – American Politics and Policy, nor the London School of Economics.

Shortened URL for this post: https://bit.ly/2XcevbI


About the author

Stefano Bloch – University of Arizona
Stefano Bloch is Assistant Professor in the School of Geography, Development, and Environment at the University of Arizona. He is the author of Going All City: Struggle and Survival in LA’s Graffiti Subculture, published by the University of Chicago Press.


This work by LSE USAPP blog is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported.