The European Commission has accused Meta and TikTok of violating the European Union’s (EU) Digital Services Act (DSA) by restricting researchers’ access to public data and failing to provide users with simple ways to report illegal content.
In its preliminary findings released on Friday, the Commission said Facebook, Instagram, and TikTok may have put in place "burdensome procedures and tools" that make it difficult for independent researchers to examine how these platforms influence public life, health, and safety.
It described such access as “an essential transparency obligation under the DSA, as it provides public scrutiny into the potential impact of platforms on our physical and mental health.”
Meta and TikTok both denied wrongdoing; a Meta spokesperson told Reuters, “We have introduced changes to our content reporting options, appeals process, and data access tools since the DSA came into force and are confident that these solutions match what is required under the law in the EU.”
TikTok, for its part, maintained that while it supports transparency, overlapping regulations complicate compliance. "But requirements to ease data safeguards place the DSA and GDPR in direct tension," a company spokesperson said.
“If it is not possible to fully comply with both, we urge regulators to provide clarity on how these obligations should be reconciled.”
The DSA, which began applying to the largest platforms in August 2023, imposes strict obligations on "Very Large Online Platforms" such as Meta and TikTok. These platforms must give researchers access to public data, allow users to report illegal content such as hate speech or terrorist material, and disclose how their algorithms recommend content.
The Commission said Meta’s Facebook and Instagram failed to offer a “user-friendly and easily accessible” system for flagging harmful content, including child sexual abuse and terrorist material. It also accused Meta of using “deceptive interface designs” that could confuse or discourage users from reporting such posts.
TikTok's data-sharing framework was similarly criticised as unreliable and incomplete, limiting research into online harms.
If these violations are confirmed after further consultations, both companies could face fines of up to 6% of their global annual revenue, a penalty that could cost Meta more than $7 billion based on its 2024 earnings.
Despite the serious implications, the findings remain preliminary. The companies now have the opportunity to respond and address the alleged breaches before any final decision is made. The Meta spokesperson added that the company would "continue to negotiate with the Commission."
The probe forms part of the EU's broader scrutiny of Big Tech; the bloc has already opened investigations into X (formerly Twitter), Google, YouTube, and Amazon over issues ranging from disinformation to product safety.