Ireland's Media Regulator Orders Meta to Tackle Terrorist Content on Facebook
Coimisiún na Meán determines that Facebook is exposed to terrorist content, ordering Meta to take specific measures and report back on its compliance
Alexis Rowe
Ireland's media regulator, Coimisiún na Meán, has issued a decision against social media giant Meta, ordering the company to take "specific measures" to prevent its services from being used for the dissemination of terrorist content. The regulator has also requested that Meta report back on the measures taken to address the issue.
This decision follows a similar determination by Coimisiún na Meán against Meta-owned Instagram in November, as well as against TikTok and X. The Irish authority plays a significant role in policing tech giants' compliance with a range of EU digital rulebooks, largely because many of them locate their regional headquarters in Ireland.
The relevant piece of Ireland's online safety framework that Coimisiún na Meán is enforcing in today's decision is the Terrorist Content Online Regulation, a pan-EU law on terrorist content takedowns agreed by the bloc's lawmakers back in 2021. The law requires hosting service providers, such as social media platforms, to remove terrorist content within one hour of receiving a removal order. Penalties under the regime can reach up to 4% of global annual turnover.
According to the Irish regulator, "Under the Terrorist Content Online Regulation, hosting service providers which receive two or more final removal orders from EU competent authorities within the last 12 months may be determined as being exposed to terrorist content." The regulator reached this decision following the notification of two or more final removal orders in respect of Meta-owned Facebook and subsequent engagement with the provider.
It remains unclear exactly which type of terrorist content was found on Facebook and notified to the regulator. Requests for more details have been made, and Meta has been contacted for a response to the Coimisiún na Meán decision.
The decision underlines the compliance pressure the EU's online safety framework is putting on large platforms as they balance free expression against the need to curb harmful content. With penalties of up to 4% of global annual turnover on the table, Meta will need to move quickly to satisfy Coimisiún na Meán that it has put effective measures in place to prevent its services from being used to disseminate terrorist content.