The popularity of ChatGPT in the European Union has reached a new milestone, and with it comes regulation.
OpenAI reported that the average number of monthly active recipients for the ChatGPT Search feature in the EU for the six-month period ending September 30, 2025, was approximately 120.4 million.
This milestone matters not only as an indicator of success; it automatically places the product within the scope of Europe’s strictest digital regulation requirements.
Growing Up Under Supervision
The EU regulatory framework is based on the Digital Services Act (DSA), a set of rules aimed at making large platforms more transparent and accountable for information dissemination, content moderation, and risks to civil rights.
For a number of specific obligations, the DSA introduces a “very large online platform” (VLOP) threshold.
If a service crosses this threshold in the EU, it is subject to expanded requirements, including the publication of detailed risk reports, independent audits, mandatory complaints mechanisms, modifications to algorithmic recommendations, and the possibility of large fines for non-compliance.
Why 120 Million Isn’t Just a Number?
The practical significance of the 120 million statistic is that it places ChatGPT on par with the largest online platforms in terms of regulation.
The EU assesses recipients based on a number of criteria — active engagement with the service’s features, not just account ownership — so OpenAI’s 120.4 million figure is an indicator of real reach.
Once a platform is classified as a VLOP, regulators gain the right to demand that it take detailed measures to mitigate systemic risks, transparently explain its personalization principles, and protect user rights.
What Specific Obligations Might OpenAI Face?
If ChatGPT is found to fall under the strictest DSA rules, OpenAI will be required to comply with several key obligations:
- Conduct and publish regular systemic risk assessments and mitigation plans;
- Ensure transparency of algorithmic decisions, including the ability for users to opt out of personalized recommendations;
- Allow independent audits and cooperate with European supervisory authorities;
- Establish mechanisms to respond to illegal content and user complaints, and report on them promptly.
Failure to comply with these obligations could result in severe sanctions: the DSA provides for significant fines and even temporary restrictions on activities in the EU in extreme cases.
Precedents and Context: Why Regulators Are Vigilant
European regulators have already demonstrated their willingness to hold OpenAI accountable: in 2024, the Italian supervisory authority Garante fined OpenAI €15 million for data protection violations, having earlier briefly suspended its service in Italy.
This precedent demonstrates that compliance issues with European law can quickly escalate into administrative action and be costly for the company.
OpenAI: Industry Response
OpenAI is not standing idle: the company cooperates with European authorities, has signed up to the EU Code of Practice, and publishes reports as part of its DSA obligations.
Yet negotiations continue over issues of competitiveness, the risk of excessive regulation, and the impact of the rules on innovation. European officials emphasise that the regulations are not aimed at hindering technology, but rather at protecting citizens’ rights and public interests.
Repercussions for Users and Businesses in the EU?
For users, this means potential benefits: greater transparency, the ability to manage personalisation, and strengthened complaint and data protection mechanisms.
For businesses and developers, this means new operational requirements: reporting, external audits, and possible changes to product solutions to comply with European standards. The regulatory burden may also encourage local alternatives and services built with an architecture focused on EU compliance.
ChatGPT’s growth to roughly 120 million active users in the EU is a significant milestone, transforming technological success into legal and institutional accountability. The European regulatory model strives to embed principles of transparency, accountability, and rights protection into the digital ecosystem.
How OpenAI adapts — from data publication to participation in codes of practice and compliance with DSA requirements — will be key for the entire generative AI industry.
Adaptability will determine whether the platform can combine scale, innovation, and adherence to European standards.