EU Investigates Social Media Platform X for Content Moderation Reductions

By Theo Kostas

European Union Investigates Social Media Platform X under Digital Services Act

The European Union is investigating social media platform X after the company cut its content moderation resources by almost 20% and scaled back its linguistic coverage within the EU from 11 languages to seven. The cuts have raised concerns about X's compliance with the Digital Services Act (DSA), both on content moderation and on risk assessments for generative AI. The inquiry builds on the formal infringement proceedings the EU opened against X in December 2023 over the platform's handling of disinformation and hate speech. Violations of the DSA can carry fines of up to 6% of a company's global annual revenues.

Key Takeaways

  • The European Union is probing social media platform X under the Digital Services Act due to reductions in content moderation resources.
  • X has downsized its content moderation team by nearly 20% and reduced its linguistic coverage within the EU from 11 languages to seven.
  • The EU is seeking further information from X about risk assessments related to generative AI's impact on electoral processes, dissemination of illegal material, and protection of fundamental rights.
  • X is required to provide the requested information by May 17, with remaining answers due by May 27.
  • The information request marks a significant step in the formal probe into X's alleged breaches of the EU's Digital Services Act; for non-compliance, the company could face fines of up to 6% of its global annual revenues.

Analysis

The European Union's investigation into X signals a stricter stance on DSA enforcement and may push other tech companies to prioritize compliance rather than risk fines of up to 6% of global annual revenues. The scrutiny could also force X to revise its content moderation strategy, with knock-on effects for user experience and trust. Over the longer term, regulation and enforcement across social media platforms are likely to tighten, touching on free speech, privacy, and digital innovation. The outcome of this investigation could shape how other countries and regions approach regulating tech giants and AI.

Did You Know?

  • Content moderation resources: The personnel, technology, and processes a social media platform uses to monitor content and enforce community guidelines and legal requirements. A reduction in these resources typically means fewer human moderators, greater reliance on automated tools, or both.
  • Digital Services Act (DSA): This EU regulation, enacted in 2022, aims to create a safer online environment by imposing rules on digital services to ensure transparency, user protection, and fair competition. Non-compliance can result in substantial fines, up to 6% of global annual revenues.
  • Generative AI: This type of artificial intelligence can create new content based on patterns learned from existing data, raising concerns about its potential impact on electoral processes, dissemination of illegal material, and fundamental rights protection within social media platforms.
