
EU reviews possible Digital Services Act breach by X under Elon Musk’s ownership

Reading Time: 5 minutes
Posted: 27th November 2025
Susan Stein

The European Commission is considering sanctions against X for potential breaches of the Digital Services Act, focusing on verification systems, advertising transparency, and researcher data access.

A decision before the end of the year would mark one of the first major enforcement tests of the EU’s online safety framework.


EU Examines X’s Compliance With Digital Services Act Duties

The European Commission is assessing whether X, the social media platform owned by Elon Musk, breached obligations under the Digital Services Act.

The review concerns X Corp’s operations across the European Union and examines issues that surfaced publicly after the Commission advanced its investigation in mid-2024.

Regulators are evaluating whether X’s paid verification system, advertising transparency tools, and researcher-access policies comply with statutory requirements designed to manage systemic online risks.

The enforcement process involves the Commission as the primary authority for very large online platforms.

The case matters because it relates to user protection, accountability mechanisms for high-reach digital services, and the rights of EU residents who rely on social networks for information, communication, and political participation.


What We Know So Far

The Commission previously opened formal proceedings to determine whether X met its duties to manage systemic risks, provide clear verification signals, and maintain compliant advertising databases.

Preliminary findings issued in 2024 raised concerns that paid verification could mislead users about authenticity, potentially qualifying as a prohibited design practice.

Regulators also identified gaps in the platform’s advertising transparency tools, noting that public access to information about advertisers, targeting, and amplification may not meet the DSA standard.

A further area under assessment is whether X has unduly restricted data access for vetted researchers; the DSA requires such access so that disinformation, manipulation, and other online harms can be independently scrutinised.

Reports in 2025 indicate the Commission may now be nearing the stage of issuing a formal enforcement decision, with a potential fine under consideration before year-end.


The Legal Questions Raised

A central question is whether X’s paid verification model constitutes a misleading design that interferes with users’ ability to assess account authenticity.

The DSA restricts interface features that impair informed decision-making, particularly when they affect trust signals.

Another issue concerns whether X’s advertising database provides the required visibility into political, commercial, and targeted advertising. The law obliges large platforms to offer searchable, reliable tools that allow authorities and researchers to review potential risks.

Regulators are also assessing whether X’s policies on researcher data access align with provisions intended to support independent study of systemic risks.

Access restrictions may affect oversight of disinformation, public-order threats, or electoral interference.

The proceedings additionally raise questions about proportional enforcement, due-process rights for the company, and how the Commission evaluates compliance across multiple operational areas.


Risks to Public Understanding and Democratic Participation

International human-rights standards recognise that technology platforms can be regulated to address illegal content and serious online harms, provided safeguards for lawful expression are maintained.

The DSA operates within this framework, aiming to reduce risks without creating new speech offences.

From a safety perspective, verification signals, transparent advertising systems, and accessible research data help users distinguish reliable information from impersonation, manipulation, or covert influence.

Weaknesses in these areas can affect public understanding, emergency communication, and democratic participation.

The case also relates to broader community impacts. If identity cues are unclear or ad transparency is limited, users may struggle to evaluate sources or understand how targeted messages reach them.

Limited researcher access can reduce the ability of civil society and institutions to monitor emerging risks.


Role of Law Enforcement & Regulators

The European Commission serves as the lead regulator for very large online platforms under the DSA. It conducts information requests, technical assessments, and risk evaluations to determine whether compliance obligations have been met.

Investigative work may include reviewing platform documentation, examining system design, analysing ad library performance, and assessing how verification functions in practice.

National Digital Services Coordinators may support investigations where issues affect domestic users.

If regulators find non-compliance, the Commission can order corrective measures, impose financial penalties, or require structural changes.

Any administrative action occurs separately from criminal proceedings, though law-enforcement bodies may pursue parallel inquiries where illegal networks or coordinated activity are involved.


Systemic Risks and Public-Interest Consequences

One risk relates to user confusion if verification signals do not reliably indicate authenticity, particularly during fast-moving news events or public-safety emergencies.

Another concern is the potential impact on trust in digital governance. A clear enforcement outcome could set expectations for compliance across the sector, while prolonged uncertainty may raise questions about regulatory consistency.

For major platforms, the case highlights exposure related to design choices, data-access policies, and transparency requirements. Decisions in this matter could influence how similar services structure verification tools, ad libraries, and researcher interfaces.

The broader impact includes public confidence, risk mitigation around misinformation, and the effectiveness of mechanisms designed to protect EU residents’ rights online.


Key Questions People Are Asking

How large could a potential fine be?

Under the DSA, very large online platforms can face penalties of up to 6% of global annual turnover, depending on the gravity and duration of any confirmed breach. The final amount would depend on the Commission’s assessment.
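
For illustration, here is a minimal sketch of how that statutory ceiling works. The turnover figure is an assumed example, not X Corp’s actual financials:

```python
# Illustrative calculation of the DSA fine ceiling.
# ASSUMED_GLOBAL_TURNOVER_EUR is a hypothetical figure chosen for
# demonstration, not X Corp's reported global annual turnover.
ASSUMED_GLOBAL_TURNOVER_EUR = 3_000_000_000
DSA_MAX_PENALTY_RATE = 0.06  # statutory cap: 6% of global annual turnover

max_fine_eur = ASSUMED_GLOBAL_TURNOVER_EUR * DSA_MAX_PENALTY_RATE
print(f"Maximum possible fine: EUR {max_fine_eur:,.0f}")  # EUR 180,000,000
```

Any actual penalty would sit at or below this ceiling, since the Commission weighs the gravity, duration, and recurrence of a breach when setting the figure.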

Why is verification under scrutiny?

Verification historically signalled a confirmed identity, so regulators are examining whether paid verification could mislead users about authenticity and affect their ability to assess the credibility of accounts.

What are the transparency requirements for advertising?

Platforms must maintain a public database containing information about advertisers, targeting criteria, and ad content. The database must be searchable and accessible for public-interest scrutiny.
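
As a rough sketch of the kind of record such a repository is expected to expose, the structure below uses illustrative field names; neither the DSA text nor X’s ad library prescribes this exact schema:

```python
# A minimal, hypothetical model of one ad-repository record.
# Field names are assumptions for illustration, not X's actual schema;
# they loosely track the information categories the DSA describes.
from dataclasses import dataclass
from datetime import date

@dataclass
class AdRepositoryEntry:
    ad_content: str                  # the creative shown to users
    advertiser: str                  # on whose behalf the ad was presented
    payer: str                       # who paid for the ad, if different
    targeting_parameters: list[str]  # main parameters used to select recipients
    first_shown: date                # start of the display period
    last_shown: date                 # end of the display period
    total_reach_eu: int              # aggregate number of EU recipients
```

Each field corresponds to a category of information that regulators and researchers would need in order to review who is advertising, how messages are targeted, and how widely they spread.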

Why is researcher data access important?

Independent researchers play a role in monitoring systemic risks such as disinformation and targeted manipulation. The DSA requires large platforms to give vetted researchers access to the data needed for research into those risks.

Could this case influence other platforms?

Yes. As one of the earliest major DSA enforcement actions, the outcome may shape expectations and compliance practices for other large digital services operating in the EU.


Enforcement Process and Key Implications

X has the opportunity to respond to the Commission’s preliminary findings, provide written submissions, and request a hearing, while regulators continue reviewing technical evidence, assessing platform systems, and seeking clarification on any changes made by the company.

Once this process concludes, the Commission may issue a formal finding of non-compliance, order corrective measures, or impose a financial penalty, with any decision open to challenge before the EU courts.

Monitoring may continue regardless of outcome, as very large online platforms remain under enhanced supervision to ensure ongoing compliance with systemic-risk obligations.

The potential enforcement action against X centres on whether its verification, advertising transparency, and researcher-access systems meet the Digital Services Act’s risk-management standards.

The case is significant because it will influence how the EU applies systemic-risk rules to major platforms and shape future expectations for transparency, accountability, and user protection across the digital sector.


Frequently Asked Regulatory Questions

What is a “dark pattern”?

A design feature that may significantly impair users’ ability to make informed decisions, such as obscuring information or presenting misleading cues. The DSA restricts such practices.

How is digital evidence used in regulatory cases?

It can include technical documentation, screenshots, logs, design specifications, and system-behaviour analyses. Regulators evaluate these materials to determine compliance.

What defines illegal content in the EU?

Illegal content is whatever existing national or EU law makes unlawful, covering categories such as hate speech, fraud, terrorist content, or incitement. The DSA governs platform responsibilities; it does not alter those underlying definitions.

What is the role of international cooperation?

Where online activity intersects with cross-border networks, authorities may rely on existing frameworks for mutual legal assistance, information sharing, and enforcement coordination.


About the Author

Susan Stein
Susan Stein is a legal contributor at Lawyer Monthly, covering issues at the intersection of family law, consumer protection, employment rights, personal injury, immigration, and criminal defense. Since 2015, she has written extensively about how legal reforms and real-world cases shape everyday justice for individuals and families. Susan’s work focuses on making complex legal processes understandable, offering practical insights into rights, procedures, and emerging trends within U.S. and international law.