Ireland's Media Regulator Cracks Down on Meta Over Terrorist Content Takedowns

Alexis Rowe

December 16, 2024 · 3 min read

Ireland's media regulator, Coimisiún na Meán, has issued a decision against social media giant Meta, ordering the company to take "specific measures" to prevent its services from being used for the dissemination of terrorist content. The regulator has also requested that Meta report back on the measures taken to address the issue.

This decision follows a similar determination by Coimisiún na Meán against Meta-owned Instagram in November, as well as against TikTok and X. The Irish authority plays an outsized role in policing tech giants' compliance with a range of digital rule books, largely because many of them locate their regional headquarters in Ireland.

The rule Coimisiún na Meán is enforcing in today's decision is the EU's Terrorist Content Online Regulation, a pan-EU law agreed by the bloc's lawmakers back in 2021. It requires hosting service providers, such as social media platforms, to remove terrorist content within one hour of receiving a removal order from a national competent authority. Penalties under the regime can reach up to 4% of global annual turnover.

According to the Irish regulator, "Under the Terrorist Content Online Regulation, hosting service providers which receive two or more final removal orders from EU competent authorities within the last 12 months may be determined as being exposed to terrorist content." The regulator reached this decision following the notification of two or more final removal orders in respect of Meta-owned Facebook and subsequent engagement with the provider.

It remains unclear exactly what type of terrorist content was found on Facebook and notified to the regulator. The regulator has been asked for more detail, and Meta has been contacted for a response to the Coimisiún na Meán decision.

The decision highlights the ongoing challenge social media platforms face in balancing free expression against the need to curb the spread of harmful content. As the EU continues to strengthen its online safety framework, tech giants will have to adapt their moderation strategies to keep pace with the evolving regulatory landscape.

With fines of up to 4% of global annual turnover at stake, Meta will need to move quickly to address the regulator's concerns and head off further non-compliance. As the EU continues to shape its digital policy, the decision is a reminder of how much rides on cooperation between regulators and tech companies in building a safer online environment.

Copyright © 2024 Starfolk. All rights reserved.