On February 16, 2024, the UK’s Information Commissioner’s Office (ICO) announced the release of its first guidance on content moderation. The guidance aims to clarify how data protection law, including the UK GDPR and the Data Protection Act 2018, applies to content moderation processes, including those carried out under the Online Safety Act, and how these processes can affect individuals’ data subject rights.
The guidance defines content moderation as “the analysis of user-generated content to assess whether it meets certain standards and any action a service takes as a result of this analysis. For example, removing the content or banning a user from accessing the service.”
According to the ICO’s press release, content moderation “involves using people’s personal information and can cause harm if incorrect decisions are made.” The guidance therefore aims to help organizations moderate content in compliance with data protection law, because “decisions based on the wrong information could lead to a user's content being incorrectly identified as being illegal or people being kicked off online platforms without reason. Under data protection law, people have the right to have inaccurate personal data corrected.”
The guidance was developed by the ICO in collaboration with Ofcom as part of their shared commitment to data protection and online safety. Representatives of the two regulators issued the following statements:
Content moderation decisions shape what we see and who we interact with online. It’s crucial that data protection is designed into these processes so that people have confidence in how their information is being used and can get redress if the wrong decisions are reached.
Stephen Almond, ICO Executive Director for Regulatory Risk
Effective content moderation will play a crucial role in creating a safer life online for people in the UK. Last year, Ofcom proposed how tech firms can protect their users from illegal content, and we’re working closely with the ICO to make sure companies protect people’s personal data in the process.
Gill Whitehead, Ofcom Group Director for Online Safety
The guidance addresses how data protection law applies to content moderation technologies and processes. It is designed to aid compliance with the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018 (DPA 2018), offering practical advice rather than serving as a comprehensive compliance guide. It also emphasizes the importance of adhering to the Children's code for services likely to be accessed by individuals under 18 and notes that it will be adjusted in line with the Data Protection and Digital Information Bill once enacted.
The guidance distinguishes between what organizations "must" do (requirements set out in legislation), "should" do (what the ICO expects in order to comply), and "could" do (optional measures that support effective compliance). It is set to be updated to reflect technological developments and Ofcom’s finalized online safety codes of practice.