UK's Online Safety Act Takes Shape: Ofcom Publishes Final Guidelines for Tech Firms

Jordan Vega

December 16, 2024 · 3 min read

The UK's internet regulator, Ofcom, has published the first set of final guidelines for online service providers subject to the Online Safety Act, marking a significant milestone in the implementation of the sprawling online harms law. The guidelines, which come into effect on March 16, 2025, outline measures for tech firms to protect their users from illegal harm, including terrorism, hate speech, child sexual abuse, and fraud.

The guidelines follow a consultation process and parliamentary approval, with Ofcom stating that more than 100,000 tech firms could fall within the scope of the law's duties to protect users from a range of illegal content types. Failure to comply risks fines of up to 10% of global annual turnover or £18 million, whichever is greater. The regulator has emphasized that it is ready to take enforcement action if providers do not act promptly to address the risks on their services.

The guidelines cover measures for user-to-user and search services to reduce risks associated with illegal content, including risk assessments, record-keeping, and reviews. Ofcom has also published a summary document outlining the key requirements for tech firms, including having a content moderation system, mechanisms for users to submit content complaints, clear and accessible terms of service, and removing accounts of proscribed organizations.

Rather than taking a one-size-fits-all approach, the UK law imposes more obligations on larger services and platforms, reflecting the higher risks associated with their scale. Smaller, lower-risk services do not receive a carve-out, however: all services must assess how the law applies to their business. For larger platforms with engagement-centric business models, greater operational changes may be required to avoid falling foul of the law's duties to protect users from myriad harms.

A key lever to drive change is the law introducing criminal liability for senior executives in certain circumstances, meaning tech CEOs could be held personally accountable for some types of non-compliance. Ofcom CEO Melanie Dawes has stated that 2025 will see significant changes in how major tech platforms operate, including changes to algorithms and testing to ensure illegal content is not displayed on user feeds.

While the guidelines mark a significant step forward in implementing the Online Safety Act, Ofcom is still working on further measures and duties in relation to other aspects of the law. The regulator plans to introduce wider protections for children, including age checks and rules on pornography, suicide, and self-harm material, in the new year. Additionally, Ofcom is reviewing risks associated with emerging technologies such as generative AI and may further evolve requirements on service providers.

The regulator is also planning crisis response protocols for emergency events, proposals for blocking the accounts of those who have shared child sexual abuse material, and guidance for using AI to tackle illegal harms. As the Online Safety Act continues to take shape, tech firms operating in the UK will need to carefully assess their compliance with the guidelines and prepare for the changes ahead.

Copyright © 2024 Starfolk. All rights reserved.