
Blog Admin

August 12th, 2013

Book Review: Regulating Code: Good Governance and Better Regulation in the Information Age by Ian Brown and Chris Marsden

1 comment

Estimated reading time: 5 minutes


In issues from online surveillance to social media ethics and piracy, questions of internet governance surround us. In Regulating Code, authors Ian Brown and Chris Marsden make a case for multi-stakeholder co-regulation based around the function of code rather than national geographic boundaries. Alison Powell reviews the argument made through the authors’ five case studies of where regulation meets code.

Regulating Code: Good Governance and Better Regulation in the Information Age. Ian Brown and Chris Marsden. MIT Press. March 2013.


This summer’s revelations about the extent of the US PRISM surveillance program and the similar programs in place in the UK put an end to any notion that the internet is an unregulated space. Instead the question becomes one of how it will be regulated, and by whom. In the PRISM case, this means that US-based private companies share the personal details of their customers (in an aggregated, anonymous form, they attest) with government entities, who then share those details (again, in the aggregate, one hopes) with other governments. The fundamental human rights of the individuals whose data are collected and shared are not considered in this equation. The political economy of security, combined with the ease with which data can now be collected and transferred, seems to trump these issues of principle.

Similarly, other questions of politics online easily get sidetracked into calls for technical fixes. When campaigner Caroline Criado-Perez experienced threats of sexual violence on Twitter, the outraged public looked first not to the police, but to the company, and instead of asking for more robust policing, thousands signed a petition asking Twitter for a “report abuse” button, a very blunt technical solution.

These two examples illustrate how regulation online is deeply complex, involving an interrelationship between law and code. They also show how solutions anchored in either law or code can underestimate or overlook issues of moral principle. In part this is because trying to fix norms, laws or policy in code can backfire, but there is also the fact that the management of “regulation-by-code” is delegated to automated systems, corporate entities, governments and a range of other institutions. Clearly, we need to understand these political economic relationships in order to come up with a better way to govern the internet.

A new book, Regulating Code: Good Governance and Better Regulation in the Information Age, by Ian Brown and Chris Marsden, attempts to provide recommendations for establishing what they call “holistic” governance and regulation of the internet and other code-based environments, based on a series of “hard cases” that illuminate the relationships between code and regulation. The book is timely and important. It is also not always easy to approach. Co-authored by a computer scientist (Brown) and an academic lawyer (Marsden), the book argues that regulating the internet is inevitable, and that instead of self-regulation by companies or government regulation, the best option from both an economic and human rights perspective is multi-stakeholder co-regulation, anchored in the function of code rather than in the geography of national government.

This is an excellent and provocative suggestion, because it acknowledges both that contemporary information flows are hard to arbitrarily constrain, and that much regulation these days happens through changes to the code that the internet is built on, at various layers in the technical protocol stack. In itself, this is an exciting proposition for a book, but the authors go further, using five “hard cases” to test the development of what they call a “unified framework for research into Internet regulation” (p. 20), including principles for regulatory intervention that balance “due process, effectiveness and efficiency, and respect for human rights” (p. 20). These principles are referred to as prosumer law.

Addressing these multiple aims in just under 200 pages results in a book that compresses a great deal of complexity into sometimes muddled prose. In an effort to explain the relationships between law, code, types of regulation, political economics and legal frames, the authors leap between multiple examples of similar phenomena: for example, varieties of network architecture or privacy policy. The aim is clearly to highlight the many relationships between coded architecture, governance and regulation, but the strong normative claims that the authors make about human rights are not carried through the book, which leaves some sections missing an analytic force behind their descriptions.

This is a pity, because if any writing team can make the argument for the simultaneous consideration of law, code, and regulation, it is computer scientist Brown, a Senior Research Fellow at the Oxford Internet Institute and an expert in security and privacy, and lawyer Marsden, a Professor at the University of Sussex and a proponent of internet co-regulation. Their combined expertise shows in the multiple modes of analysis they bring to bear on each of the five case studies that make up the core of the book.

The five “hard cases” are privacy, copyright, censors, social networking sites, and smart pipes. The cases are selected to represent the current relationships in play between code and regulation, but it’s interesting to note that among the five, one is a social construct, one a legal construct, two are fully socio-technical (censors and social networking sites) and one encompasses what we might call “the concept formerly known as net neutrality” – normally framed technologically. As an interdisciplinary team, Brown and Marsden shy away from using any theoretical frame or explanatory context.

The cases make tough reading for non-experts. Dense and detailed, they outline the institutional political economy that applies to each of their cases and the outcomes of current policy frames. They would be useful to researchers working in the areas covered by each of the case studies, as they present a few novel ways of organizing the current state of the art, and they do illustrate numerous instances of the relationships between code and law. The final two chapters are the most interesting of the book, since they come closest to suggesting a guiding theoretical or analytic frame. In the final chapter, the authors step up and argue for the protection of consumer, human, and speech rights, drawing on a normative tradition that articulates code as law, in the tradition of Lessig. At the same time, they position the same normative values in relation to standards, including open source standards. Especially in the final chapters, the book slips back and forth between these positions, introducing a tension between the functional liberalism of Lessig and the cultural critique provided by the open source movement. These normative positions have their own intellectual histories, but the authors’ holistic view makes it difficult to examine such fine distinctions.

In some ways, it is a great pity that this book is not longer. I would like to think that with more time and space, the authors might be able to do some really useful work explaining to non-experts how code and regulation have come to be entwined, and where and to what extent multi-stakeholder processes intervene in those entanglements.

What the authors have done is to devise a systemic way of analyzing design and governance of large and complex systems that have technical, economic and political implications. They try very hard to come up with a framework that takes account of individual and collective rights without placing them totally in opposition to economic demands. This provides a very useful framework for moving beyond relying only on the market, or on code.

This review was originally published on the LSE Media Policy Project blog.

——————————————————-

Alison Powell is a Lecturer at the Department of Media & Communications at the London School of Economics. Before arriving at the LSE in 2010, Alison was an SSHRC postdoctoral research fellow at the Oxford Internet Institute, where she studied grassroots technology development and digital advocacy and their impact on new media technologies and policies. She has a PhD from Concordia University in Montreal, Canada. Read more reviews by Alison.
