UK's Online Safety Regulator Unveils Draft Guidance to Protect Women and Girls from Online Harms

Elliot Kim

February 25, 2025 · 6 min read

The UK's internet safety regulator, Ofcom, has published new draft guidance aimed at helping in-scope firms meet their legal obligations to protect women and girls from online threats. The guidance, part of the implementation of the Online Safety Act (OSA), focuses on four major areas where women and girls are disproportionately affected by online harm: online misogyny, pile-ons and online harassment, online domestic abuse, and intimate image abuse.

The government has emphasized that protecting women and girls is a priority for its implementation of the OSA, with certain forms of misogynist abuse, such as sharing intimate images without consent or using AI tools to create deepfake porn, explicitly set out in the law as enforcement priorities. However, the online safety regulation has faced criticism that it's not up to the task of reforming platform giants, despite containing substantial penalties for non-compliance.

Child safety campaigners have also expressed frustration over the time it's taking to implement the law, as well as doubts about whether it will have the desired effect. Even the technology minister, Peter Kyle, has called the legislation "very uneven" and "unsatisfactory." Nevertheless, the government is sticking with the approach, which requires parliament to approve Ofcom compliance guidance.

Ofcom's latest package of practice recommendations won't become fully enforceable until 2027 or later. However, enforcement is expected to start soon in relation to core requirements on tackling illegal content and child protection; other aspects of OSA compliance will take longer to implement. According to Ofcom's Jessica Smith, who led development of the guidance focused on women's and girls' safety, "the first duties of the Online Safety Act are coming into force next month," and Ofcom will enforce some of the law's core duties before this guidance itself becomes enforceable.

The new draft guidance is intended to supplement earlier broader Ofcom guidance on illegal content, which also provides recommendations for protecting minors from seeing adult content online. Ofcom has previously produced a Children's Safety Code, which recommends online services dial up age checks and content filtering to ensure kids are not exposed to inappropriate content such as pornography.

The latest guidance was developed with help from victims, survivors, women's advocacy groups, and safety experts, and covers the four major areas where the regulator says women and girls are disproportionately affected by online harm. Ofcom's top-line recommendation urges in-scope services and platforms to take a "safety by design" approach, encouraging tech firms to "take a step back" and "think about their user experience in the round."

Examples of "good" industry practices highlighted in the guidance include removing geolocation by default, conducting "abusability" testing, taking steps to boost account security, designing in prompts that nudge users to think twice before posting abusive content, and offering accessible reporting tools. However, not every measure will be relevant for every type or size of service, and in-scope companies will need to work out what compliance means in the context of their own product.

When asked whether Ofcom had identified any services currently meeting the guidance's standards, Smith suggested it had not. "There's still a lot of work to do across the industry," she said. She also tacitly acknowledged that the challenge may be growing given retrograde steps taken by major industry players, such as Elon Musk's rebranding of Twitter as X and Meta's ending of its third-party fact-checking contracts.

In response to such high-level shifts, Ofcom plans to use its transparency and information-gathering powers to illustrate impacts and drive user awareness. The regulator will produce a market report on who is using the guidance, which steps they are following, and what outcomes they are achieving for women and girls, aiming to shine a light on the protections in place on different platforms so that users can make informed choices about where they spend their time online.

One type of online harm where Ofcom is explicitly beefing up its recommendations is intimate image abuse. The latest draft guidance suggests the use of hash matching to detect and remove such abusive imagery, whereas earlier Ofcom recommendations did not go that far. According to Smith, there was more deepfake intimate image abuse reported in 2023 than in all previous years combined, and Ofcom has also gathered more evidence on the effectiveness of hash matching to tackle this harm.
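The guidance does not prescribe a particular implementation of hash matching. As a rough illustration only: the technique compares a fingerprint (hash) of an uploaded image against a database of hashes of known abusive images, so previously identified content can be blocked without storing the images themselves. The sketch below uses exact SHA-256 matching for simplicity; real deployments typically use perceptual hashes (such as PhotoDNA or PDQ) that tolerate resizing and re-encoding. The hash set and image bytes here are hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known abusive images.
# In practice this would be populated from a vetted industry hash list.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if the uploaded image's hash is in the known-abuse set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

A service would run such a check at upload time and route matches to removal and reporting workflows; the exact-match approach shown here fails if the image is altered even slightly, which is why perceptual hashing is the industry norm for this harm.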

The draft guidance will now undergo consultation, with Ofcom inviting feedback until May 23, 2025, after which it will produce final guidance by the end of the year. A full 18 months after that, Ofcom will produce its first report reviewing industry practice in this area. Responding to criticism that Ofcom is taking too long to implement the OSA, Smith said it is right that the regulator consults on compliance measures. However, with the first measures taking effect next month, she noted that Ofcom anticipates a shift in the conversation surrounding the issue, too.

As the UK's online safety regulator continues to implement the Online Safety Act, it remains to be seen how effective these measures will be in protecting women and girls from online harms. Nevertheless, Ofcom's latest draft guidance is a step in the right direction, and its emphasis on "safety by design" and transparency may help to drive real change in the industry.

Copyright © 2024 Starfolk. All rights reserved.