
Seeta Peña Gangadharan

June 22nd, 2019

Technologies of control: we have to defend our right of refusal


It’s about time that we renewed a conversation about what it means to be an active digital citizen in the 21st century. The current conditions of our data-driven economy demand that we explore ways to deny technologies the power to control us.

Unfortunately, the Broadband Technology Opportunities Program, which was the Obama Administration’s signature digital inclusion policy effort, marks the last time the US government paid close attention to digital citizenship. Funded by the American Recovery and Reinvestment Act (2009), the program channeled $4.7 billion toward driving economic prosperity, raising educational attainment, and creating a more inclusive digital citizenry.

But the history of the program reveals a cruel irony. While many — myself included — celebrated this policy as a means to meaningful participation in digital society (see here and here for examples), the toxic subprime loan industry and the historic financial crisis — which led to the Recovery Act and subsequent establishment of the Broadband Technology Opportunities Program — provided a clue of what was to come in the country’s next chapter of the information economy: surveillance, targeting, nudging, and manipulation.

As I’ve written elsewhere, the toxic subprime financial crisis relied on a thriving network of actors who helped enable demand for an inherently risky financial product. Digital technology and data lay at the back end of this enabling environment. Online marketers created innocuous-seeming websites where people could calculate the kinds of loans they might qualify for. Lenders mined demographic data sources and turned to data brokers to identify areas — a majority of them Latino and African American — to target with subprime offers, the so-called “ghetto loans.”

Today, targeting, nudging, and manipulation extend beyond the financial industry into nearly all areas of personal consumption and core services — like education, public safety, housing, workplace hiring and administration, and health services. These areas are undergoing a transformation in which hyper-surveillance and hyper-personalised surveillance reign supreme, thanks to advances in data processing and storage. As a result, people are more likely to experience digital technologies as control rather than opportunity.

At Our Data Bodies, a research collective I jointly lead, we find, as have other scholars, that the people most affected by the proliferation of technological systems that track, profile, and target are those caught in cycles of disadvantage. That’s because people caught in such cycles have fewer support structures to help them rebound from hardship. Take, for example, digital welfare systems like Los Angeles’ Coordinated Entry System, which dictates whether someone is going to have a roof over her head. When the reasons for denial are unclear, being technologically connected or digitally included can feel like being kept in one’s place. But it doesn’t have to be that way: as any self-respecting person knows, we deserve better than that. People deserve to be treated as dignified human beings.

Figure 1. Mapping your data self
Source: Our Data Bodies, Digital Defense Playbook. This work is under a CC-BY-4.0 licence.

Right now, elite insiders (researchers and political leaders) argue that the surveillance business of technology is bad for democracy and deepens injustice. But for the moment, public discourse focuses on two main remedies, and both leave the ordinary, tracked-and-targeted person ill-equipped to express the intensifying frustration-cum-outrage that is their due. We are, after all, the ones who have had to deal with companies’ and governments’ inexcusably bad governance strategies. Elites draw attention to how companies could design and deploy fairer or more ethical technologies, or to what regulators could do to break up network monopolies. But these solutions have limits, because they concentrate power and choice in tech elites and political elites. Neither gives people the power to contest and reject the technologies of control that govern our modern lives.

For people to be able to exercise a right to refuse, we need to reinvent what it means to practice civil disobedience in an era of networked and so-called intelligent technologies. And fortunately, models for individuals and organised populations to refuse technological control do exist. Recently, in Los Angeles, Stop LAPD Spying Coalition — a group that partnered with Our Data Bodies at the outset of the project — was instrumental in suspending Operation Los Angeles Strategic Extraction and Restoration (LASER), a data-driven program designed to target and root out criminal offenders, and in pressing pause on the use of PredPol, a related “predictive policing” effort.

The Coalition’s civic action spanned years. When the news media started reporting on high-tech policing, the Coalition began researching, organising, and fighting back. By 2016, its members were showing up every week to shut down Los Angeles Police Commission meetings and make their grievances about hyper-surveillance (read: over-policing) impossible to ignore. To find out the precise nature of the city’s predictive policing programs, the Coalition filed public records requests. When these were denied, it sued the city. Just last year, it demanded a government audit of the programs. It also allied with local groups and academics, who raised concerns with the Police Commission about conflicts of interest and the reliability of the research used to justify PredPol’s effectiveness.

Though some might see this example as setting a high bar, I see it as a type of civic action — a type of digital citizenship — that deserves greater consideration and cultivation. Based on Our Data Bodies’ interviews and workshops, we know that individuals practice acts of technological refusal on a daily basis. Channeling that energy into something more organised will take great effort, but the potential is there. Ordinary users wised up to the problems of technologies of control long before elites caught on. Let’s strengthen that awareness and safeguard our right of refusal to prevent governing technologies from getting the better of us.

Professor Gangadharan’s recent TEDxLondon talk:

♣♣♣

Notes:

  • This blog post appeared originally on LSE USAPP.
  • The post gives the views of its authors, not the position of LSE Business Review or the London School of Economics.
  • Featured image by Seeta Peña Gangadharan, under a CC-BY-SA-4.0 licence
  • Before commenting, please read our Comment Policy

Seeta Peña Gangadharan is an assistant professor in the department of media and communications at LSE. Before joining the department in 2015, she was senior research fellow at New America’s Open Technology Institute, addressing policies and practices related to digital inclusion, privacy, and “big data.” Before OTI, she was a postdoctoral associate in law and MacArthur fellow at Yale Law School’s Information Society Project.

 
