Privacy has usually been defined in terms of an individual’s space, necessary for meeting the person’s vital interests. The central concept in these discussions is autonomy, a core value in our society. The best way to connect the two concepts is to consider privacy as a tool that fosters and encourages autonomy. Privacy contributes to the demarcation of a personal sphere, which makes it easier for a person to make decisions independently of other people.

A violation of privacy undermines autonomy, particularly when one additional condition is met: the observing (privacy-violating) person is in one way or another exercising control over the other person. For instance, the person involved might feel pressure to alter her behaviour simply because she knows she is being observed. Or a person who is unaware of being observed may be manipulated. This occurs very often in the digital age, characterised by persistent surveillance and invisible algorithms. When privacy is under threat, the independence of individual decisions is typically also compromised.

This observation is striking in Western societies when we consider that they focus on the individual person, whose autonomy is esteemed very highly. For instance, in consumer and advertising ethics, the consumer’s free choice is the moral cornerstone. In the online world, this ethical value is scarcely met. The common solution to this problem is that people are asked to agree to terms and conditions. But, as we all know, very few people even read them, and most probably do not understand them. A strange paradox seems to be at work. Autonomy can only be protected when people understand all information, are fully aware of risks and consequences, and are independent-minded beings, i.e. when people are fully autonomous. We already presuppose the capacity that must be protected! Given this complication, it is valuable to consider another justification of privacy.

The social approach

According to the social approach, privacy guarantees boundaries that help to maintain the variety of social environments. As the poet Robert Frost remarked: ‘Good fences make good neighbours.’ To answer the question of how this concept of privacy manifests itself in the digital age, we turn to the famous ‘contextual privacy model’ of Helen Nissenbaum: ‘What people care most about is not simply restricting the flow of information but ensuring that it flows appropriately.’ The storage, monitoring, and tracking of data are allowed insofar as they serve the goals of the context. As examples of contexts, Nissenbaum mentions health care, education, religion, and family. Privacy protection means that data flows are limited to these contexts and do not transgress boundaries.

However, it is not always clear what a context is. To get a grip on this notion we need clarification of the norms and values that guide contexts. And that requires a notion that plays an important role in professional ethics: the ‘substantial goods’ that characterise practices. These are immaterial qualities that people conceive and create in the course of their actions. Their achievement is relevant for the whole community whose members participate in the practice; they can be shared in the full sense of the word. Consider, for instance, goods such as security, education, and health.

The notion of substantial goods is not as abstract and vague as the general values that are often used in policy and judicial documents, such as dignity, justice, respect, and integrity. Substantial goods can be helpful in giving these concepts a more precise (contextual) meaning. For instance: an accurate description of the meaning of the notion ‘respect’ in education (i.e. respect for the student or the teacher) differs from respect as understood in healthcare (i.e. respect for the patient). This is not a superfluous luxury in the digital age. For example, in healthcare, explicit awareness of the meaning of ‘respect’ for the patient helps to determine the appropriate flow of information that benefits the patient’s health. It is helpful in protecting the interests of the patient from institutional pressures or pressures from special interest groups.

Nor is the ‘substantial goods’ notion too narrow. A more explicit articulation of the goods at stake enables us to describe activities from a normative perspective while remaining open to the possibility that new developments lead to new interpretations of the goods involved, which, in turn, facilitate innovation. Take, for instance, education. Under the umbrella of a good education, a wide variety of patterns of education can be developed, and new trends can be incorporated.

In addition to these merits, an articulation of the substantial goods delivers a welcome intervention in an otherwise awkward debate about the different roles that privacy can play. Privacy is not exclusively positive. It can, for instance, be used to conceal poor practices. For instance, feminists have stated that privacy is the enemy of equality . . . placing ordinary people at the mercy of powerful people. For criminals, privacy is a cover-up for their activities. Relating privacy to the substantial goods it serves is helpful in these debates, in which privacy seems to be a double-edged sword. When it is clear which kinds of goods privacy serves (e.g. goods of particular interest groups; emancipation; the common good), a context-specific discussion on the value of privacy is possible.

Finally and importantly, the notion of goods contains a normative orientation, which is distinguished from, for instance, economic imperatives. After all, commercial interests increasingly hamper privacy. Admittedly, discussions about substantial goods, which often have a philosophical tone, may be difficult and even uneasy. But we need them to protect the quality of our lives in the digital era.

♣♣♣

Notes:


Marcel Becker is associate professor of philosophical ethics at Radboud University Nijmegen, the Netherlands. He has written books about the ethics of public administration and ethics in the digital age. He participates in Radboud University’s interdisciplinary research hub on security, privacy, and data governance (www.ru.nl/ihub).