Protecting children from harmful content online – a global perspective

With little regulation in place, controlling children's access to online content has largely been left to parents and carers. But lawmakers around the world are now moving to put responsibility into the hands of websites and online service providers.

Recent years have seen concerted action aimed at controlling the collection and use of children's personal data online. Established laws such as the GDPR in Europe and the UK, and COPPA in the US, mean the bad old days when mainstream platforms could freely share children's contact details with third parties are long gone.

But what has happened to attempts to regulate what children can view online and on their devices?

UK – The Online Safety Bill

This proposed law, which commands broad public support, is currently making its way through the UK Parliament. It seeks to introduce new rules for organisations that host user-generated content or provide search services, such as social media platforms, search engines, online forum providers, and some online games publishers. As well as being responsible for removing illegal material such as child sexual exploitation and terrorism content, regulated services must assess the risks of children accessing legal but harmful content, such as material relating to suicide and self-harm, and design their services to prevent such access. Platforms that fail to comply once the bill takes effect could face fines from Ofcom of up to 10% of their annual global revenue, and risk being blocked in the UK.

US – The Kids Online Safety Act

In 2022, the US Congress introduced the Kids Online Safety Act, which aims to impose new safeguards and requirements for children under the age of 17. The latest Senate draft, published in early December, focuses on limiting the harmful activities and negative content that children are exposed to online. It has to cover more ground than the UK and EU legislation (for example, controls on geolocation data) because the US lacks comprehensive federal privacy laws as a backdrop. The central proposition is that in-scope providers “shall act in the best interests of a user that the platform knows or should know is a minor”. If passed, the act will be enforced by the Federal Trade Commission and state attorneys general.

EU – The Digital Services Act

Unlike the UK's Online Safety Bill and the proposed US legislation, this is already law. The Digital Services Act will require very large online platforms and search engines to “take measures to protect minors from content that may impair their physical, mental or moral development and provide tools that enable conditional access to such information”. In addition, very large online platforms and search engines must take “targeted measures to protect the rights of the child, including […] tools aimed at helping minors signal abuse or obtain support”. It starts applying in practice from February 2024, though providers of online platforms must begin publishing user numbers a year before that. Its duties are more vaguely defined than the UK's, but the fines are still substantial: up to 6% of global turnover.

So why hasn’t this happened already?

Given the unquestionable harm caused by children's exposure to unfiltered online content over at least two decades, why have laws taken so long to follow?

The answer lies partly in the practical difficulty of establishing the true age of internet users without requiring the disclosure of personal data in ways that may do more harm than good.

These developments also raise difficulties common to efforts to regulate adult content. There is the ever-contentious balancing act between personal freedom and state control. Content moderation is also expensive, and smaller platforms are likely to find it harder to comply with the rules. Some fear this will drive further consolidation among regulated players, exactly the outcome that parallel pro-competition legislation such as the EU Digital Markets Act is designed to prevent.

The earlier platforms are required to catch offending content, the greater the risk of a challenge to end-to-end encryption, which is widely seen as a cornerstone of freedom of speech on the internet. There are concerns that once “backdoors” into private content are established to enable platforms to comply with new rules, those same backdoors become available to governments and bad actors alike.

Finally, such regulation can place disproportionate power in the hands of regulators, who are usually not democratically elected, to make judgement calls on the harmfulness of content.

Given the concerns raised by voters everywhere, attention is likely to turn increasingly to online content, and laws like these will become a more common feature of the online landscape.

Disclaimer

This information is for general information purposes only and does not constitute legal advice. It is recommended that specific professional advice is sought before acting on any of the information given. Please contact us for specific advice on your circumstances. © Shoosmiths LLP 2024.
