
UK Information Commissioner Publishes Guidance on Data Privacy in Online Content Moderation


On February 16, 2024, the UK’s Information Commissioner’s Office (ICO) announced the release of its first guidance on content moderation. The guidance aims to clarify how data protection law, namely the UK GDPR and the Data Protection Act 2018, applies to content moderation processes, including those carried out to meet obligations under the Online Safety Act, and how these processes can affect individuals’ data subject rights.

The guidance defines content moderation as “the analysis of user-generated content to assess whether it meets certain standards and any action a service takes as a result of this analysis. For example, removing the content or banning a user from accessing the service.”

According to the ICO’s press release, content moderation “involves using people’s personal information and can cause harm if incorrect decisions are made.” The guidance therefore aims to help organizations moderate content in compliance with data protection law: “decisions based on the wrong information could lead to a user's content being incorrectly identified as being illegal or people being kicked off online platforms without reason. Under data protection law, people have the right to have inaccurate personal data corrected.”
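To make that rectification point concrete, the following is a minimal, hypothetical sketch (our illustration, not taken from the ICO guidance) of a moderation decision record that retains the personal data a decision relied on, so that inaccurate data can be corrected and the decision re-taken. All names, fields, and the `redecide` hook are assumptions made for illustration only.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable, Optional

@dataclass
class ModerationDecision:
    """A single moderation decision and the personal data it relied on."""
    content_id: str
    user_id: str
    action: str                                   # e.g. "remove", "ban", "no_action"
    inputs: dict = field(default_factory=dict)    # personal data used to decide
    decided_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    superseded_by: Optional["ModerationDecision"] = None

def rectify_and_redecide(
    decision: ModerationDecision,
    corrected_inputs: dict,
    redecide: Callable[[dict], str],
) -> ModerationDecision:
    """Correct inaccurate personal data and re-run the moderation logic,
    keeping the old decision linked to the new one for audit purposes."""
    new_inputs = {**decision.inputs, **corrected_inputs}
    new_decision = ModerationDecision(
        content_id=decision.content_id,
        user_id=decision.user_id,
        action=redecide(new_inputs),              # caller-supplied moderation logic
        inputs=new_inputs,
    )
    decision.superseded_by = new_decision
    return new_decision

# Hypothetical usage: a user disputes a removal that was based on an
# inaccurate age record; correcting the record changes the outcome.
original = ModerationDecision(
    content_id="c-42", user_id="u-7", action="remove",
    inputs={"age": 15, "flags": ["age_restricted"]},
)
revised = rectify_and_redecide(
    original,
    corrected_inputs={"age": 25},
    redecide=lambda data: "no_action" if data["age"] >= 18 else "remove",
)
print(revised.action)  # "no_action" once the inaccurate age is corrected
```

Keeping the inputs alongside the action is what makes a rectification request actionable in this sketch: once the inaccurate data is corrected, the decision itself can be revisited rather than standing on the wrong information.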

The guidance was developed by the ICO in collaboration with Ofcom as part of their shared commitment to data protection and online safety. Representatives of the two regulators issued the following statements:

Content moderation decisions shape what we see and who we interact with online. It’s crucial that data protection is designed into these processes so that people have confidence in how their information is being used and can get redress if the wrong decisions are reached.

Stephen Almond, ICO Executive Director for Regulatory Risk 

Effective content moderation will play a crucial role in creating a safer life online for people in the UK. Last year, Ofcom proposed how tech firms can protect their users from illegal content, and we’re working closely with the ICO to make sure companies protect people’s personal data in the process. 

Gill Whitehead, Ofcom Group Director for Online Safety 

The guidance addresses how data protection law applies to content moderation technologies and processes. It is designed to aid compliance with the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018 (DPA 2018), offering practical advice rather than serving as a comprehensive compliance guide. It emphasizes the importance of adhering to the Children’s code for services likely to be accessed by individuals under 18 and notes that it will be adjusted in line with the Data Protection and Digital Information Bill once enacted.

Organizations are advised on what they "must," "should," and "could" do to align with legal requirements and good practice, emphasizing legislative mandates, expected actions for compliance, and optional strategies for effective compliance. The guidance is set to be updated to reflect technological developments and Ofcom’s finalized online safety codes of practice.


Overview of the Guidance

    • Purpose: Explains how data protection law applies to content moderation, offering practical advice on complying with the UK GDPR and the DPA 2018; it is not a comprehensive compliance guide.
    • Audience: Aimed at organizations using or considering content moderation and at providers of such services, particularly those with obligations under the Online Safety Act 2023.
    • Content covered:
      • Defines content moderation and its relevance to user-generated content.
      • Focuses on legal compliance for content moderation processes, both manual and automated.
    • Exclusions: Does not cover specific Online Safety Act obligations, behavior identification, user profiling, on-device moderation, or child sexual exploitation and abuse (CSEA) content reporting requirements.
    • Usage: Distinguishes between "must" (legal requirements), "should" (good practice expectations), and "could" (optional strategies) for compliance.
    • Relationship to the Online Safety Act (OSA):
      • Following the guidance supports data protection compliance but does not guarantee compliance with the OSA, which is overseen by Ofcom.
      • Emphasizes the distinct compliance paths for data protection and online safety regulation.
    • Future updates: The ICO plans to update the guidance to reflect technological developments and Ofcom’s finalized online safety codes of practice.