The Internet has created seemingly limitless opportunities, but it also offers a platform for violent, hateful, and antisocial behaviour. Drawing on his recent book, Raphael Cohen-Almagor considers how to strike a balance between the free speech principle and social responsibilities. He proposes that deliberative democracy mechanisms could be used to promote content net neutrality and encourage Net-users to think and act like citizens in the online world.
In the late 1990s, the internet seemed a haven for business: a facilitator of unlimited economic opportunity, unencumbered by regulatory constraints. Cases like Yahoo! in France and Google in Italy (discussed below) mark the beginning of the end of that illusion. They demonstrate that Internet Service Providers (ISPs) must respect domestic legislation in order to avoid legal risk. Both cases show that companies need to strike the right balance between freedom of expression (and of business) on the one hand, and social responsibility on the other.
Image credit: Pete Markham CC BY-SA 2.0.
In business, responsibility is defined in terms of the obligations employers accept towards their employees, and suppliers towards customers and clients. In both cases there is often a basis in law, but many responsibilities are customary or subject to negotiation according to the interests and balance of power of the parties involved. The acceptance and fulfilment of responsibilities by business actors is determined mainly by long-term self-interest and the maintenance of good customer relations, although ethical principles may also play a part, and moral responsibilities exist alongside legal obligations.
The Yahoo! saga started in February 2000. Yahoo!’s auction site contained pages upon pages of Nazi-related paraphernalia: swastika armbands, SS daggers, concentration camp photographs, striped prisoner uniforms, and replicas of Zyklon B gas canisters. France has strict laws against selling or displaying anything that incites racism, and Section R645-1 of the French Criminal Code prohibits the display of Nazi symbols. So while in the United States these auctions were legal, in France they were not. The Paris court found that Yahoo! Inc. had committed “a manifestly illegal disturbance” under the French New Code of Civil Procedure, which in turn rested on the Criminal Code and the offence of distributing Nazi memorabilia. Yahoo! Inc. was ordered to “take all measures of a nature to dissuade and to render impossible all consultation on Yahoo.com of the online sale of Nazi objects and of any other site or service that constitutes an apology of Nazism or a contestation of Nazi crimes”. The company appealed the decision in an American court, but in the end it had to abide by the French court order.
The case highlighted that states may enforce their laws on the internet as well. In February 2010, three Google executives were convicted of violating Italy’s privacy laws after three minutes of footage of a disabled boy being bullied was posted on Google Video. The boy, who had Down’s syndrome, was shown being punched and kicked by four teenagers at a Turin school. Google removed the video and cooperated with the authorities investigating the clip. However, the Italian prosecutor noted that the video had been viewed 5,500 times over a period of two months, had reached the top of Google Italy’s “most entertaining” video list, and that the company had ignored Netcitizens’ appeals to remove it. Only after it was notified by the authorities did Google take active steps. The conviction was later overturned, but the affair made Google far more vigilant and attentive both to the posting of such problematic content and to public complaints.
My new book, Confronting the Internet’s Dark Side, considers these and other cases, aiming to strike a balance between the free speech principle and the responsibilities of the individual, corporation, state, and the international community. It argues that freedom of expression is of utmost importance and value but it needs to be weighed against an equally important consideration: social responsibility. It is the first comprehensive book on social responsibility on the internet.
Countering the internet’s dark side is particularly challenging and requires a concerted effort by all stakeholders. The responsibility of ISPs and web-hosting companies is arguably the most intriguing and complex issue. With the advancement of technology, responsibility for gaining and maintaining trust in the Net increasingly falls on those who operate it, namely ISPs and Web Hosting Services (WHSs). Some of these companies act responsibly, in the spirit of Corporate Social Responsibility (CSR), making an effort to provide a safe environment for their Net-users in the belief that this policy is good for their reputation and business. For instance, Google formally prohibits content that promotes or condones violence against individuals or groups based on race or ethnic origin, religion, disability, gender, age, nationality, veteran status, or sexual orientation/gender identity, or whose primary purpose is inciting hatred on the basis of these core characteristics. Other companies uphold internet neutrality and conduct their business purely according to direct monetary considerations. The main question is whether internet intermediaries should be proactive, i.e. not only cooperate upon receipt of information from various sources but also scrutinise their own sphere for problematic, antisocial and potentially harmful material, in order to promote trust among their subscribers. Here I discuss the concepts of net neutrality, perfectionism and discrimination, and distinguish between three different meanings of neutrality:
(1) Net neutrality as a non-exclusionary business practice, highlighting the economic principle that the internet should be open to all business transactions.
(2) Net neutrality as an engineering principle, enabling the internet to carry the traffic uploaded to the platform.
(3) Net neutrality as content non-discrimination, accentuating the free speech principle.
I call (3) content net neutrality. While endorsing the first two meanings of net neutrality, I argue that internet gatekeepers should adhere to the ‘promotional approach’ rather than to neutrality. The promotional approach accentuates ethics and social responsibility: ISPs and web-hosting services should promote the basic ideas of respecting others and not harming others. They should scrutinise content and discriminate against illegal content (child pornography, terrorism). Facebook, for instance, deploys a variety of technology tools, including easily available reporting links on photos and videos. But Facebook should also be alert to content that is morally repugnant and hateful. I argue that some value-based screening of content may be warranted, and that affording the internet the widest possible scope can have very harmful implications. It is emphasised that only cyberbullying and hate speech fall within this category.
The book concludes by proposing to establish a new browser for liberal democracies called CleaNet©. Through mechanisms of deliberative democracy, Netusers would agree on what constitutes illegitimate expression to be excluded from the browser. As a result, the browser would facilitate safer and more responsible surfing of the internet. In a sense, CleaNet© would be an enhanced, citizen-based form of server filtering, grounded in detailed Terms of Fair Conduct. Only material deemed problematic by at least 80% of the votes would be listed for exclusion. A separate list, “under review”, would contain debatable speech to be considered and debated periodically until a resolution is reached: either to permit it, or to filter it from CleaNet©. The “under review” list would also include problematic material held behind restricted access, for which Netusers would have to sign up. It would be the responsibility of ISPs and web-hosting companies to maintain the list and to cooperate with law enforcement whenever required.
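The 80% exclusion threshold and the “under review” holding list described above amount, in effect, to a simple classification rule over Netusers’ votes. As a purely illustrative sketch (all function and variable names are hypothetical; only the 80% figure comes from the proposal), the core logic might look like this:

```python
# Illustrative sketch of a CleaNet-style exclusion vote.
# Only the 80% threshold comes from the text; everything else is assumed.

EXCLUSION_THRESHOLD = 0.80  # material is excluded only if >= 80% vote to exclude

def classify(votes):
    """Classify an item as 'excluded' or 'under review' from Netuser votes.

    votes: list of booleans, where True is a vote to exclude the item.
    Items with no votes, or below the threshold, stay under review
    for periodic debate, as the proposal describes.
    """
    if not votes:
        return "under review"
    share_to_exclude = sum(votes) / len(votes)
    if share_to_exclude >= EXCLUSION_THRESHOLD:
        return "excluded"
    return "under review"
```

For example, eight exclusion votes out of ten would meet the threshold, while seven out of ten would leave the item on the “under review” list. Any real deliberative mechanism would of course need far more than a tally: eligibility rules, appeal procedures, and the periodic re-debate the proposal calls for.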
My aim with this book is to prompt readers to think about and debate concerns relating to freedom of expression, privacy, security, trust and responsibility. The solutions it proposes are likely to provoke discussion and debate, in the spirit of deliberative democracy mechanisms that involve the public. In light of the detailed stories concerning hate sites (directed at groups or at humanity in general), webcam viewing of actual suicides, the exponential growth of child pornography, and internet-based terrorism and crime, it is hard to fall back on knee-jerk First Amendment responses. The book encourages Netusers to think and act like citizens in the online world, insisting that we have a moral obligation to confront those who abuse the technology. Confronting the Internet’s Dark Side is intended to serve as a wake-up call, and it will challenge readers to reconsider their views of free expression in the internet age. You may agree. You may disagree. You can hardly remain indifferent.
This article was originally posted on Democratic Audit UK and is based on Raphael Cohen-Almagor, Confronting the Internet’s Dark Side: Moral and Social Responsibility on the Free Highway.
Note: This article gives the views of the author, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.
Raphael Cohen-Almagor (D. Phil., Oxon) is an educator, researcher, human rights activist and Chair in Politics, University of Hull, UK. He has published extensively in the fields of political science, law, ethics and philosophy. He was Visiting Professor at UCLA and Johns Hopkins, Fellow at the Woodrow Wilson Center for Scholars, Founder and Director of the Center for Democratic Studies, University of Haifa, and Member of The Israel Press Council. He is the Founder and Director of the Middle East Study Group at the University of Hull. Confronting the Internet’s Dark Side is his seventeenth book.