
Helena Vieira

March 22nd, 2018

Facebook and Cambridge Analytica: let this be the high-water mark for impunity


The last few days represent more than just the most recent and inevitable controversy emanating from Facebook’s beleaguered offices. The scandal over Cambridge Analytica’s participation in electoral manipulation and gross breaches of privacy has resonated more widely with users than the earlier allegations about fake news and Russian connections.

On an individual level, Facebook users have to contend with the fact that, through no fault of their own, their personal information was harvested and weaponised by the fractious company that provided analytics for the Brexit campaign and Donald Trump’s presidential campaign. On a societal level, we are only beginning to understand how our democratic institutions are being manipulated by unethical technological practices that transform a loose amalgamation of interests into targeted advertising. This meddling could have disastrous consequences, facilitating the rise of a new Radio Rwanda or a new form of agitprop for the Web 2.0 world. We are now in the era of bespoke, personalised propaganda.

There is still time, however, to reverse the trend of declining privacy and technological elitism. This latest Facebook scandal can be a catalyst for regulatory change, ensuring that 2018 becomes the high-water mark for impunity in Silicon Valley (and all of its various incarnations around the globe). It is time to apply pressure to ensure that Facebook can no longer turn a blind eye to the actions of third-party software developers and the other clients to which it grants access to user data. The company may not have intended to become synonymous with electoral manipulation and fake news, but by failing to address key issues (including having been aware of this particular privacy breach since 2015), it must now accept responsibility for its inaction. This article therefore offers a number of suggestions for how we can ensure this scandal results in real change in how social media companies and their clients do business.

First, when trying to rectify this massive breach of trust, one must prioritise transparency. The famous US Supreme Court Justice Louis Brandeis once opined that “Sunlight is said to be the best of disinfectants; electric light the most efficient policeman.” Unfortunately, for too long, social media companies have been allowed to keep users and regulators in the dark, shielding their practices behind claims of proprietary technology and excessive secrecy. Without transparency, would-be critics and regulators struggle to collect information, often to the detriment of the everyday user who would benefit from their intervention. This may seem like an obvious place to begin, but it is the only way to start.

Second, we must create systems of oversight so we are not entirely reliant on the actions of whistle-blowers. It cannot be emphasised enough that for every Christopher Wylie, there are hundreds (or even thousands) of people who work in tech who would have been aware of some of the underlying issues in this scandal but did nothing. One notable feature of the current controversy is how much the parties involved tried to “pass the buck” when discussing who was really at fault. The Guardian reported, for example, that “Cambridge Analytica said that its contract with GSR stipulated that Kogan should seek informed consent for data collection and it had no reason to believe he would not.” Instead of allowing data to be traded in wilful blindness, we need to ensure that we have systems of oversight that act as interlocking chains, where each party that passes along data must be diligent in ensuring that the receiving party adheres to certain data protection principles.

So, for example, Facebook must take an active interest in the actions of third-party companies like Global Science Research (GSR) who, in turn, must contract with companies like Cambridge Analytica in a legal and ethical manner. These chains need to begin at the governmental level and must include tech experts who can audit the practices of these companies and report back to policy-makers. In the last twenty years, our lives have become profoundly affected by two major systems that laymen cannot understand (the global financial market and the tech industry) and we need specialist watchdogs who possess the expertise to identify risks to the public.

Finally, we must empower users to become true guardians of their personal data. This stewardship can only occur if the requirements of informed consent are strictly enforced. One of the most disturbing aspects of the Cambridge Analytica scandal has been the complete disregard for informed consent. While users participating in the original GSR quiz did provide their consent for their data to be used in academic settings, they were not informed that their data would eventually be packaged and sold to commercial companies like Cambridge Analytica. Even more disturbingly, the GSR app also harvested the data of all of the quiz-takers’ Facebook friends, without even the thinnest veneer of consent.

We need the equivalent of the warning label on cigarette boxes for personal data exchanges through social media. Users need to be explicitly informed that by maintaining a profile, and especially by completing quizzes or installing third-party apps, their data is being collected and can be transferred between many companies, rendering future attempts at erasure difficult.

It should be noted, of course, that self-help remedies must only ever be a complement to regulation and corporate change. There are commentators online querying whether, in light of the Cambridge Analytica scandal, users should simply delete their social media accounts, since these companies seem to act with such impunity. This argument side-steps the fact that, while deleting a Facebook account may be a viable option for some users, it cannot be the only solution. It is possible to have social media services that are transparent, audited, and prioritise informed consent.

The problems we are witnessing at social media companies today are not inextricably entwined with the services they offer; rather, they are the by-product of a laissez-faire approach to regulation that has permitted these risks to fester into the mess we are in today.

♣♣♣

Notes:

  • This blog post is based on the author’s current PhD research on social media at LSE’s Department of Law.
  • The post gives the views of its author, not the position of LSE Business Review or the London School of Economics.
  • Featured image credit: Facebook beachfront, by mkhmarketing, under a CC-BY-2.0 licence
  • When you leave a comment, you’re agreeing to our Comment Policy.

About the author

MacKenzie F. Common is a PhD student at LSE’s Department of Law. She holds a B.A. (Honours) in Political Science from the University of Guelph (Canada), where she graduated with Distinction in 2011. She earned her LLB (Graduate Entry) from City University in 2013 and her LLM from the University of Cambridge in 2015, where she was a blog editor on the Cambridge Journal of International and Comparative Law. MacKenzie has worked at the Conduct and Discipline Unit, a specialised unit in the United Nations Department of Field Support which handles criminal complaints against peacekeepers and civilian staff working on peacekeeping missions. While at the CDU, she drafted a handbook on investigation procedure and evidentiary standards to be disseminated to all of the peacekeeping missions around the world. In 2013, MacKenzie worked in the Office of the Prosecutor (OTP) at the International Criminal Tribunal for the former Yugoslavia (ICTY). MacKenzie also worked for the Nanaimo Crown Attorney’s Office in Nanaimo, British Columbia and the Law Society in London, England.

