The UK Government has published a White Paper outlining its approach towards regulating the internet to tackle online harms.

The White Paper cites a study carried out by the UK’s communications regulator (Ofcom) and its data protection regulator, the Information Commissioner’s Office (ICO). The study found that nearly one in four British adults suffered harm from either online content or their interactions online. The UK Government concluded that the regulatory and voluntary initiatives currently dealing with online harms do not go far enough and are inconsistently enforced.

Online harm

The White Paper broadly identifies what would be considered online harms. These include activities and content involving:

  • child sexual exploitation and abuse (CSEA)
  • terrorism
  • harassment
  • disinformation
  • encouragement of self-harm and/or suicide
  • online abuse of public figures
  • interference with legal proceedings
  • cyber-bullying
  • children accessing inappropriate content

Scope

The White Paper proposes regulating companies that enable users to share or discover user-generated content, or to interact with each other online. Companies offering the following services are likely to find themselves within the scope of the White Paper:

  • social media platforms
  • file-hosting sites
  • public discussion forums
  • messaging services
  • search engines

Regulatory framework

A key proposal in the White Paper is the introduction of a new statutory duty of care. This duty will be enforced by an independent regulator, which will set out compliance requirements in new codes of practice. The regulator will take a risk-based approach and, in particular, will prioritise action in response to CSEA and terrorism-related content and activity.

The UK Government is currently consulting on powers that would allow the regulator to:

  • disrupt the business activities of a non-compliant company
  • impose liability on individual members of the senior management of non-compliant companies
  • block companies from providing non-compliant services

Requirements for companies

Companies can ensure they are compliant by referring to the forthcoming codes of practice. Key steps for companies to take to ensure compliance are likely to include:

  • having clear and accessible terms and conditions, especially for children and other vulnerable users
  • publishing annual transparency reports explaining what harmful content exists on their platforms and what measures they are taking to address this
  • preventing known terrorist or CSEA content from being made available to users
  • having an effective and easy-to-access user complaint function
  • ensuring that prompt, transparent and effective action is taken following user reporting
  • investing in safety technologies to reduce the burden on individual users to avoid harm
  • ensuring that users who have suffered harm are offered support
  • taking steps, where relevant, to cooperate with UK law enforcement and other public agencies

Comment

Trolling, threats, and online grooming are, unfortunately, commonplace online activities that have a significant impact on many people. The proposed framework set out in the White Paper is broad and captures a wide array of activities. The UK Government has opened a consultation on the next steps for tackling online harms; if you want to take part, the consultation closes on 1 July 2019.

In the meantime, keep an eye on this blog for further updates on this and other developments in privacy regulation.