
Blog Administrator

January 24th, 2017

The EU Must Ensure Data Protection & Digital Marketing Safeguards for both Children & Adolescents (and that includes you too, UK!)

Estimated reading time: 5 minutes

The European General Data Protection Regulation (GDPR) will, among its efforts to safeguard children’s data, require parental consent for young people under the age of 16 who use digital services such as social media. Jeff Chester, executive director of the US-based Center for Digital Democracy, played a lead role in establishing COPPA, the US children’s online privacy law, including spearheading its 2012 update by the Federal Trade Commission. Here he applies lessons from that experience to the implementation of the GDPR, due to come into force in May 2018. Jeff will be chairing a panel on this topic at the CPDP 2017 conference in Brussels this Friday, along with LSE’s Sonia Livingstone.

Influencers, native advertising, programmatic targeting, predictive algorithms, cross-device identification, immersive advertising, hyper-geolocation tracking, Big Data—all of these terms are part of the lexicon at the foundation of our digital lives. Behind them, led by Google and Facebook but embedded deeply throughout the EU and the rest of the world, is a pervasive and far-reaching commercial surveillance complex.

The potent and purposeful combination of contemporary consumer data gathering and analytic practices—so-called “Big Data”—with an ever-growing array of digital marketing practices has been deployed globally to monitor, analyze, and influence our behaviors. The very digital devices and services we rely on—and which marketers openly claim we are often “addicted” to—have been expressly designed to facilitate continuous data collection on all “screens,” including desktop computers, mobile devices and (now) television. Information about our online and offline lives is fed into supercomputers, where it becomes a potent mix of insights and other “actionable” intelligence as our profiles are enhanced with a plethora of added details readily obtained from giant one-stop-shopping “marketing data clouds.”

The data collected on individuals today include a far-reaching array of information: how much money we make; what our health concerns are; what our race or ethnicity is; how we spend our time online and offline and what we do there; and much more. In milliseconds, decisions are made about our lives. Should we get an ad or offer for a credit card or payday loan? Are we likely to have cancer or some other serious illness, and are we looking for treatments? What video game, film, or sporting event ad, or what editorial content (whether “fake” or from legitimate news sources), should we receive? The data-driven digital marketing system is now also a major way citizens determine the fate of their nations, as the vote on “Brexit” and the U.S. presidential election (to cite just two examples) have shown.

The pursuit of user monetization—getting our thumbs to buy, as Facebook puts it—is just one goal. So is using all the clout the digital giants and their allies can muster to get individuals and groups to embrace brands and products as part of their identity. Marketers are increasingly deploying new ways to influence our actions and emotions, including through artificial intelligence, virtual reality, 3D video, cognitive computing (such as IBM’s Watson), and the expanded use of neuromarketing.

The largely unchecked role that data-driven digital marketing plays in the lives of adults is troubling enough. But we should all be concerned about its impact on young people. From undermining their privacy, to encouraging them to buy junk food, “pester” their parents to spend money, and promote products and brands to their friends by serving as influencers, to stealthily molding their identity and social development to promote life-long brand loyalty, marketers and media companies are playing an important role in shaping this and future generations of young people. One has only to see how digital marketing is being used by fast food companies to peddle junk food around the world—despite the youth obesity epidemic—to witness how these forces can undermine a person’s health and potential. Market researchers are constantly studying how best to use the Internet, mobile devices, and social media to ensure that ads and brand messages play an important—and unavoidable—role in children’s lives. While digital media offer children and adolescents important ways to learn, play, and communicate, they have also unleashed forces that require scrutiny, corporate responsibility, and regulation.

One critical safeguard that both children and teens require is to have their privacy rights respected. The European Union has a historic opportunity, in implementing its new General Data Protection Regulation (GDPR), to ensure that there are meaningful controls over data gathered on youth. (The new law comes into force in May 2018.) Throughout the EU, companies are using mobile phones, apps, social media, music streaming channels, YouTube, and more to entice “digital natives” to turn over their information. Data is the new “gold,” and in the case of young people it is worth some $1.2 trillion yearly in buying and influencing power.

But different rules are required for young children and for adolescents. For young children, affirmative consent from a parent or caregiver before any data can be collected is one important safeguard. That’s because the business model of the digital industry depends significantly on a person not being able to effectively “opt out” of the data-profiling process. The default is collection. Google and other digital ad firms make their vast revenues ($178 billion and growing) by being able to seamlessly gather our information without having to ask first, or being required to candidly explain what they do. (To see a glimpse of how Google or Facebook really work to help advertisers, see here and here.)

Requiring truly informed consent up front breaks the process established by commercial sites (the “read our privacy policy” routine, if you can even find or understand it). As one of the two people (along with Prof. Kathryn Montgomery) who led the campaign that resulted in the enactment of the Children’s Online Privacy Protection Act (COPPA) in 1998, I know that in the U.S. the range and nature of the digital marketing and data practices that kick in once someone turns 13 (and is no longer covered by COPPA’s parental consent requirement) are starkly different from online marketing directed at children. You just don’t see on sites targeting children what you unfortunately discover nearly everywhere else.

Although COPPA serves as a critical safeguard (and we were able to significantly expand its coverage in 2012 to include mobile, apps, gaming, and other platforms, as well as geolocation and other increasingly used tracking data), it’s not enough. The marketing practices designed to elicit data from children and to encourage parents to say “yes” also need to be addressed by public policy.

In addition, adolescents require a different approach. While well-meaning, the new GDPR provision raising the age of required parental consent from 13 to 16 is not the best way to address the problem. It’s true that teens are a key target for digital marketers in the EU and globally, both for their data and for the marketing opportunities that result. But young people should be able to participate effectively in the digital culture without having to ask permission first.

The answer, I believe, is not merely encouraging Member States to roll back the GDPR provision so they can set the age of consent back to 13. What’s required is the development of rules that empower teens to make their own decisions regarding data collection—rules under which they have the right to “opt in” after being told how their data will actually be used. (Such disclosure, if honestly made, would serve as a cautionary tale for many young people, I believe.) The EU and Member States also need to craft an enforceable set of Fair Marketing Practices for the Digital Era that protects children and teens and ensures their rights are respected (and which would be useful for the rest of us as well!).

As we all know, the online marketplace targeting young people is booming, with companies such as Google specifically creating sites (e.g., YouTube Kids) where even the youngest children (those aged five and under) are encouraged to spend time. Growing investment in efforts to lure children into digital environments through games, video, and social media, along with ongoing technological innovation (such as immersive AI environments), will foster an even more ubiquitous and effective online marketing apparatus. Given their historic commitments to data protection, human rights, and the rights of minors, the EU and Member States should play a leadership role in ensuring that the commercial marketplace engages young people in ways that truly respect their privacy and enhance their well-being.

This post gives the views of the author and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics and Political Science. Throughout this blog series, we have been decoding the implications of the GDPR for children, noting that a range of provisions within the GDPR are of particular importance to their rights to participation and protection, to the balance between parental responsibility versus teen autonomy, and to the challenging media literacy task set for both parents and children by the opaque, ‘black-boxed’ algorithmic operation of the internet industry. We also looked at available evidence that can be used to unpack GDPR’s implications for children online, and discussed the GDPR from both a parents’ and a youth perspective.

 

Posted In: Children and the Media | Data Protection | LSE Media Policy Project
