The European Union is stepping up its scrutiny of social media platform X (formerly Twitter) amid growing concerns over disinformation, hate speech, and right-wing political bias, a new report reveals. The EU’s increased focus on the platform comes as part of its broader efforts to enforce transparency and accountability among digital platforms operating within its borders.
The report, released by the European Commission’s Digital Services Oversight Group, states that authorities are “energetically” investigating how X handles misinformation, with particular attention to its alleged amplification of far-right narratives. The findings suggest that the platform may not be doing enough to combat harmful disinformation or to address the persistent presence of extremist content.
Rising Concerns Over Disinformation and Bias
Since its acquisition by billionaire Elon Musk in 2022, X has faced mounting scrutiny over content moderation practices, platform transparency, and the rise of politically motivated misinformation. European lawmakers have been particularly concerned about how X handles content related to elections, public health, and hate speech, areas where disinformation has historically had a significant impact.
The report highlights that X’s content moderation algorithms have been linked to the preferential treatment of right-wing accounts and narratives, particularly in the run-up to major political events. Studies have shown that X’s recommendation system often promotes content from far-right influencers, conspiracy theorists, and extremist groups, while moderating left-wing or progressive viewpoints more aggressively.
“It’s essential that social media platforms take concrete action to ensure they are not amplifying harmful, misleading, or biased content that could polarize public opinion or undermine democratic values,” said Věra Jourová, the European Commissioner for Values and Transparency. “The EU will continue to hold platforms like X accountable for their role in shaping online discourse.”
The EU’s Digital Services Act and Growing Pressure
The scrutiny of X comes as the EU moves to enforce its landmark Digital Services Act (DSA), which mandates stricter regulations for large tech companies. The DSA requires platforms to take swift action against illegal content and provides a framework for the transparent handling of disinformation, misinformation, and political bias.
X, along with other tech giants, is also facing pressure to comply with the EU’s Code of Practice on Disinformation, which encourages platforms to adopt stricter measures to combat misinformation, particularly in the context of elections and public health crises. The European Commission has previously warned that companies found in violation of these rules could face hefty fines or restrictions on their ability to operate within the EU.
According to the latest report, the EU has found that X has been “inconsistent” in its application of content moderation policies, with certain political viewpoints often facing greater scrutiny. The investigation points out that while X has made efforts to reduce hate speech, there are notable gaps in enforcing its own guidelines on disinformation, particularly in areas related to populist or nationalist political movements.
X Responds
In response to the EU’s growing concerns, X issued a statement emphasizing its commitment to improving content moderation and transparency. The company noted that it has made significant updates to its platform, including enhanced reporting features and a renewed focus on combating harmful content.
“We are fully committed to adhering to EU regulations and ensuring that our platform remains a space for open and respectful dialogue,” said X spokesperson Lauren Alvarado. “While we continue to refine our systems and policies, we are confident in our ability to comply with the Digital Services Act and other European guidelines.”
However, critics argue that X’s efforts to curb disinformation and bias remain insufficient. A coalition of European civil society groups has called for stronger enforcement of the DSA, particularly in ensuring that platforms like X are held accountable for the content they host.
“The digital space is being weaponized for political gain, and it’s not just about ensuring the removal of harmful content; it’s about reshaping how these platforms curate and amplify information,” said Marcela Jovanovic, a researcher at the European Digital Rights group. “The EU must ensure that companies like X do not simply pay lip service to these issues, but take meaningful action to prevent the spread of harmful political content.”
Moving Forward
As the EU’s investigation into X continues, the findings could pave the way for more stringent regulations and potential penalties for the platform. With elections in several European countries coming up in the next few years, the pressure on social media platforms to safeguard democratic processes is expected to intensify.
In the meantime, the European Commission is calling on other tech giants to be more transparent about their content moderation practices, with a particular emphasis on how they handle political content. The broader push for a safer and more accountable digital space in the EU looks set to continue, with the aim of preventing the spread of divisive and harmful content that could undermine public trust and democratic values.
As the probe into X’s practices deepens, it’s clear that the European Union is sending a strong message: in the digital age, accountability is paramount, and platforms must answer for their role in shaping the information landscape.