In the UK, there is currently heightened regulatory scrutiny of, and increased public interest in, children’s data protection and online harm, with a raft of new guidance and regulation relating to children’s safety online from both the ICO and Ofcom, the lead regulator under the Online Safety Act.
Since the introduction of the ICO’s Children’s Code of Practice in 2021, the ICO has been working with online services to improve privacy protections for children and has made children’s privacy a top priority for 2024/2025. The new Children’s Code Strategy aims to build on this progress and sets out the priority areas social media and video-sharing platforms need to improve on. In particular, the Children’s Code Strategy will focus on: default privacy and geolocation settings; profiling children for targeted advertisements; and using children’s information in recommender systems.
Similarly, Ofcom has issued a new Children’s Safety Code, which imposes new duties on services that can be accessed by children, including social media sites and apps and search engines. Firms must assess the risk their service poses to children and then implement safety measures to mitigate those risks. In addition, Ofcom has published its consultation on protecting children from harms online, which focuses on proposals for how internet services that enable the sharing of user-generated content (‘user-to-user services’) and search services should approach their new duties relating to content that is harmful to children.
Both regulators recognise that there are strong synergies and also tensions between online safety and privacy and have therefore committed to work together where their remits intersect. It will be fundamental to effective regulation that comprehensive protection is afforded to online users and clarity is given to providers of online services to ensure they can navigate the changing legal landscape with “regulatory clarity and free from undue burden.”[1]
In this vein, both regulators issued a joint statement earlier this month setting out “collaboration themes” where they will “work together on areas of mutual interest to achieve a coherent approach to regulation”. These themes will be continuously monitored and subject to change as new priorities emerge, but the joint statement has highlighted the following illustrative examples of such collaboration themes:
- Age assurance – the range of approaches used to estimate or establish the age of users, so that age-appropriate protections can be put in place and the risk of harms arising from processing children’s personal data can be mitigated.
- Recommender systems – the use of algorithms to curate and determine the ranking of content to online users based on their characteristics, inferred interests and behaviour.
- Proactive tech and relevant AI tools – a range of technologies, including tools that analyse content to assess whether it is of a particular kind, tools that assess patterns of online behaviour, and user profiling technology.
- Default settings and geolocation settings for child users – For children’s data, settings must be high privacy by default with geolocation switched off, unless there is a compelling reason to do otherwise.
- Online safety privacy duties – duties under the OSA require online service providers to have regard to protecting users from a breach of statutory provisions or rules of law concerning privacy when deciding on safety measures.
- Upholding terms, policies and community standards – Services must publish clear and accessible terms of service and must uphold them.
The statement also outlines proposed ways of working together, including the identification of companies of mutual interest. Where companies or services are subject to both the online safety and data protection regimes, and are of current regulatory interest to both the ICO and Ofcom, the regulators may routinely share with each other: generic information about information requests on online safety matters (but not the content of those requests); details of stakeholder meetings; and publicly available information which may be of interest to each other.
With increasing numbers of children spending time online, children’s safety will continue to be a focus for regulators, both in the UK and globally. The trigger for the application of both regimes is not whether an online service actively allows children onto the site or actively collects children’s data, but whether, in practice, the service is likely to be accessed by children. Two key principles are common to both regimes: transparency – being clear to users about whether and how their information is processed, and about the kinds of potentially harmful content a service may allow; and accountability – conducting risk assessments, implementing safety measures and safeguards to protect children, and keeping those assessments under periodic review. It is therefore critical for organisations to establish whether their online services may fall within the scope of data protection law, the Online Safety Act, or both, and to develop appropriate compliance frameworks accordingly.
Should you wish to discuss any matter contained within this article, please get in touch with your usual DLA Piper contact.
[1] From the previous joint statement of the ICO and Ofcom, published in November 2022.