What the Commission found
The Commission said on Wednesday that Meta does not have effective measures in place to stop under-13s from using Facebook or Instagram, according to reporting by the Guardian. The preliminary findings follow formal Digital Services Act (DSA) proceedings opened against the company in May 2024, which initially examined both child safety and addictive design features on Instagram.
The scope of the charge has since narrowed. The Commission's current finding focuses specifically on age-gating failures across both platforms, rather than the broader question of algorithmic design. In practical terms, the regulator concluded that Meta's self-declared age screens, which rely largely on users stating their date of birth at sign-up, do not constitute an effective barrier to access by minors.
The DSA requires very large online platforms, defined as those with at least 45 million average monthly active users in the EU, to take "appropriate and proportionate measures" to ensure a high level of privacy, safety and protection of minors. Meta's platforms comfortably exceed that threshold.
Financial exposure and next steps for Meta
The preliminary finding is not a final ruling. Meta now has the right to respond, and the Commission must issue a definitive decision before any penalty is imposed. Historically, preliminary findings under EU digital regulation have led to extended negotiation periods.
The theoretical ceiling, however, is significant. Under the DSA, fines can reach up to 6 per cent of a company's global annual turnover. Meta reported worldwide revenue of approximately $164 billion in 2025, according to the company's public filings. A maximum penalty on that basis would exceed $9.8 billion.
Actual fines in cases that begin with preliminary findings typically land far below the statutory maximum after dialogue between the platform and the regulator. The Commission can also impose periodic penalty payments of up to 5 per cent of average daily worldwide turnover for continued non-compliance, creating an ongoing financial incentive to remediate.
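The two penalty ceilings cited above can be sketched as simple arithmetic. The figures below assume the roughly $164 billion annual revenue figure reported in Meta's public filings, and approximate average daily turnover as annual revenue divided by 365:

```python
# Sketch of the two DSA penalty ceilings discussed above.
# Assumes ~$164bn worldwide annual revenue, per Meta's public filings.

ANNUAL_REVENUE_USD = 164e9

# One-off fine: up to 6 per cent of global annual turnover.
max_fine = 0.06 * ANNUAL_REVENUE_USD
print(f"Maximum one-off fine: ${max_fine / 1e9:.2f}bn")  # $9.84bn

# Periodic penalty: up to 5 per cent of average daily worldwide
# turnover, for each day of continued non-compliance.
avg_daily_turnover = ANNUAL_REVENUE_USD / 365  # crude annual/365 estimate
max_daily_penalty = 0.05 * avg_daily_turnover
print(f"Maximum periodic penalty: ${max_daily_penalty / 1e6:.1f}m per day")
```

On those assumptions, the one-off ceiling lands at $9.84 billion, and continued non-compliance could in theory accrue a further penalty in the low tens of millions of dollars per day.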
Meta has not yet publicly detailed its response to the findings. The company has previously stated that it uses age-verification technology and parental controls, though critics and regulators have questioned the efficacy of those tools.
Parallels with the UK's Online Safety Act
The timing matters for UK-based businesses. Ofcom's first set of enforceable codes under the Online Safety Act, covering age assurance and child access prevention, is expected to take effect in 2026. Those codes will impose obligations on any platform with UK users under 18, a broader age bracket than the under-13 focus of the Commission's case against Meta.
The regulatory architecture differs in detail but converges in direction. The DSA places duties on platforms designated as "very large"; the Online Safety Act applies a risk-based framework to a wider range of services, including smaller operators. Both regimes treat inadequate age-gating as a compliance failure rather than a design choice.
Ofcom has signalled that it expects platforms to move beyond simple self-declaration of age. Its consultation documents reference technologies such as age estimation, identity verification and open banking checks, though no single method has been mandated. The Commission's finding against Meta suggests that regulators on both sides of the Channel regard date-of-birth entry fields as insufficient.
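To see why regulators take that view, consider what a self-declaration gate actually checks. The following is a hypothetical minimal sketch (function and parameter names are illustrative, not drawn from any platform's code):

```python
from datetime import date

def passes_self_declared_gate(claimed_dob: date, minimum_age: int = 13) -> bool:
    """Naive age gate: trusts whatever date of birth the user types in.

    This is the pattern regulators consider insufficient. Nothing ties
    the claimed date to the actual person, so an under-13 can pass
    simply by entering an earlier year.
    """
    today = date.today()
    # Compute age in whole years from the claimed date of birth.
    age = today.year - claimed_dob.year - (
        (today.month, today.day) < (claimed_dob.month, claimed_dob.day)
    )
    return age >= minimum_age
```

A child who truthfully enters their date of birth is blocked; the same child entering 1990 passes unchallenged. The emerging regulatory expectation is for checks that bind the claimed age to the user, such as facial age estimation or identity verification.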
What UK platform operators should do now
For UK scale-ups and mid-market businesses running user-facing digital products, the Meta case is a leading indicator rather than an abstract headline about Big Tech.
Three practical points stand out.
First, any platform likely to attract users under 18 should audit its current age-assurance mechanisms against the standards emerging from Ofcom's draft codes. Waiting for final enforcement is a risk; implementation lead times for robust age-verification technology can stretch to several months.
Second, businesses that rely on Meta's advertising ecosystem should monitor whether the DSA proceedings lead to changes in how Meta collects or segments data on younger users. Restrictions on profiling minors could affect targeting options and campaign performance across the EU.
Third, the enforcement pattern matters. The Commission moved from opening proceedings to preliminary findings in under two years. Ofcom has indicated it intends to act with similar pace once its codes are in force. Boards and compliance teams at UK platform businesses should treat age assurance as a near-term operational priority, not a long-horizon regulatory risk.
The direction of travel is clear on both sides of the Channel: platforms that cannot demonstrate effective age-gating face regulatory action and, potentially, material financial penalties. The Meta case is the most prominent test so far. It will not be the last.