
Sharanya Shanmugam

December 16th, 2022

What Singapore’s Proposed Online Safety Laws Mean for the Youth: Finding a Balance Between Protection and Autonomy


Estimated reading time: 5 minutes


Sharanya Shanmugam, a Research Associate at the Centre for AI and Data Governance at Singapore Management University's Yong Pung How School of Law, explains the implications of Singapore's proposed online safety legislation for children and young people.

Legal frameworks related to children tend to be protectionist in nature. Children and teenagers are often seen as careless, myopic and more vulnerable than adults to the predatory forces they may encounter online. While minors do face heightened vulnerabilities, this view leads to regulations that impose top-down evaluations of ‘safety’ onto them and corresponding limits on their Internet use.

Singapore’s recently proposed online safety laws, which seek to reduce user exposure to harmful online content, reflect just such a paternalistic approach to protecting Internet users, particularly young users: the State speaks for them about their safety and assumes they lack the maturity to make sense of their own online experiences. The State’s promise of a consultative approach invites the public to provide feedback on the proposed measures, and offers us the opportunity to reconsider whether protection should be pitted against other fundamental rights that support the ‘best interests’ of the child, as outlined in the UN Convention on the Rights of the Child. Legal and policy frameworks aimed at creating safe digital spaces for minors should not only uphold their safety, but also promote their holistic development into adulthood by protecting their dignity, right to privacy, right to participation, and access to information.

The State knows best?

To protect young users, the proposed code grants Singapore’s Infocomm Media Development Authority (IMDA) the power to direct any social media service to disable user access to what the Government deems ‘extremely harmful content’, defined as content related to (but not limited to) suicide and self-harm, sexual harm, public health, public security, and racial or religious disharmony or intolerance, and to disallow specified online accounts from communicating with users in Singapore. Opinions on what is harmful can differ greatly, and the views of young people often diverge from those of adults. Such divergence cannot simply be reconciled by a small group of elites defining the parameters of harm and ‘public interest’ in the content we can access.

If young people are not involved in the decision-making processes to improve online experiences, they may see these regulations as disconnected and authoritarian restrictions on their social media use and respond by accessing such content unlawfully, possibly via alternative channels, to satisfy their curiosities.

Legislating a narrow view of ‘online safety’

By focusing only on content moderation, the proposed bills also risk legislating a narrow view of ‘online safety’, omitting other online harms such as the rise in privacy intrusions by commercial companies through the secondary use of personal data commodified for advertising purposes. Young users are often unaware of third-party data access and of how much data they ‘voluntarily’ share online, which undermines their privacy and their freedom to make informed choices.

The proposed bills are inadequate on personal data protection when compared to the European Union’s Digital Services Act (DSA), which explicitly seeks to ban targeted advertising on online platforms that minors frequent. The Act further restricts presenting advertisements to any user based on profiling that relies on special categories of personal data, such as ethnicity, residence, religious or political views, or sexual orientation. The Act also seeks to enhance transparency for all advertising on online platforms by ensuring that users can see more information about the advertisements they are shown, such as why an advertisement targets them specifically and who paid for it.

When legislating online safety, it is crucial for Singapore to go beyond content moderation and consider other privacy risks online, as only the State might have the capacity and the will to level the power imbalances between profit-hungry social media giants and their users.

The ‘vulnerability’ of the young Internet user

The first reading of the proposed bills in the Singapore Parliament on 3 October 2022 cited a survey conducted in June, which found that an overwhelming majority of respondents felt that harmful online content affects children and youth the most, and that sexual content, cyberbullying and violent content were what the young most needed protection from. The survey, however, was not specifically directed at young social media users.

Child-specific regulatory initiatives tend to define children only by their vulnerability, seeing them as ‘passive victims’ of potential risks and harms on digital platforms, without consideration of their capacity and potential to navigate these spaces. This results in overly protectionist agendas, which can be damaging to the development of responsible autonomy.

Understanding young people as active agents in their social media use is crucial when educating them about risk and how to make informed judgements. The Singapore Children’s Society, for instance, conducts body-safety programmes with pre-school children to teach them to protect themselves from sexual abuse by developing a healthy respect for their bodies, distinguishing good touches from bad, and knowing what to do if they have been touched inappropriately. The organisation believes that children can be empowered and equipped with these preventative skills through age-appropriate sexuality education, rather than leaving adults to deal with the aftermath of such incidents.

Expert and institutional voices are heard more often in public consultations than the intended beneficiaries of the policies themselves. While adults may find it helpful to establish boundaries around young people’s behaviour and limit what they are exposed to in order to protect them, regulations on young people’s Internet use need to consider and build upon their desires for independence, empowerment and the capacity to manage their own behaviour.

The need to balance between protection and autonomy

Protecting the ‘best interests’ of the child when regulating social media requires moving away from overly protectionist approaches and towards empowerment. Young people should be made more aware of the consequences of producing and sharing personal data, and be involved in decision-making processes to determine more tailored notions of online risk.

In response to public feedback on the proposed bills, which included input from youth, the Singapore government revealed that it is working to balance user safety against the other concerns raised, such as privacy and freedom of expression online. The government also noted calls for social media services to make safety information easily accessible to young users so that they can better protect themselves from harmful or age-inappropriate content, a step in the right direction towards giving them the confidence to navigate digital spaces safely.

Approaches that balance protecting minors from online harms with empowering them to safeguard their digital rights, by developing their capacity to make informed decisions online, should be welcomed and promoted as a progressive alternative to top-down controls on their behaviour.

Sharanya Shanmugam is a Research Associate from the Centre for AI and Data Governance, SMU Yong Pung How School of Law (sharanyas@smu.edu.sg). Her research is supported by the National Research Foundation, Singapore under its Emerging Areas Research Projects (EARP) Funding Initiative. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not reflect the views of National Research Foundation, Singapore.

This article represents the views of the author and not the position of the Media@LSE blog, nor of the London School of Economics and Political Science.

Featured image: Photo by Hu Chen on Unsplash


Posted In: Children and the Media | Internet Governance
