What Meta is threatening, and why

In a court filing this week, Meta stated that the operational reforms proposed by New Mexico's attorney general would be so onerous that withdrawing its platforms from the state entirely would be the more viable option, as first reported by the Guardian. Such a withdrawal would mark the first time a major social media company has blocked access to its services in a US state.

The case, New Mexico v. Meta, reached its first milestone earlier in 2026 when a court found the company liable for systemic child-safety failures on Instagram and Facebook, according to the Guardian's reporting. The state's department of justice argued that Meta's platforms had failed to protect underage users from harmful content and predatory behaviour. The court imposed a $375m fine.

The second phase of the lawsuit, known as the remedies phase, is scheduled to begin on 4 May 2026. It will determine the specific reforms Meta must implement. New Mexico's department of justice has proposed a series of product-level changes designed to make the platforms safer for children in the state. Meta has characterised those changes as unfeasible.

The company's position amounts to a stark ultimatum for the state: settle for the fine alone, or face the withdrawal of three of the world's most widely used communications platforms from an entire state. New Mexico has a population of roughly 2.1 million.

The $375m fine versus the cost of compliance

On paper, the financial penalty is modest relative to Meta's scale. The company reported global revenue of approximately $165bn in 2025. A $375m fine represents roughly 0.2% of annual revenue.
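The scale comparison can be checked directly from the figures cited above; this is a back-of-envelope sketch, not a statement about Meta's audited accounts:

```python
# Rough check of the fine's scale, using the figures cited in
# the article: ~$165bn global revenue in 2025, a $375m fine.
revenue = 165e9   # approximate 2025 global revenue, USD
fine = 375e6      # New Mexico penalty, USD

share = fine / revenue          # fraction of annual revenue
print(f"{share:.2%}")           # ~0.23%, i.e. roughly 0.2%
```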

Meta's filing makes clear, however, that the fine is not the element it considers prohibitive. The proposed remedies, which according to the Guardian would compel product-level reforms to Instagram and Facebook, are what the company frames as the real burden. The distinction matters. Fines are one-off costs that can be absorbed, provisioned against, or appealed. Operational remedies, by contrast, require sustained engineering investment, ongoing compliance monitoring, and potentially fundamental changes to how a product works.

This framing is deliberate. By casting compliance as existential rather than expensive, Meta shifts the debate from whether it can afford to protect children to whether regulators are asking for something technically impossible. It is a negotiating posture, but it is also a signal to every other jurisdiction considering similar enforcement action.

Meta has argued that these reforms are unfeasible and that it would be left with little option but to withdraw its services completely.

That language, drawn from the Guardian's reporting on the court filing, positions the company as a reluctant actor forced into an extreme response, rather than a firm resisting accountability.

Parallels with UK Online Safety Act enforcement

The timing is notable for UK observers. The Online Safety Act entered its enforcement phase in 2025, with Ofcom empowered to impose fines of up to 10% of qualifying global turnover for non-compliance. For a company of Meta's size, that theoretical maximum would be more than forty times the New Mexico penalty.
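The gap between the two regimes can be sketched using the article's ~$165bn revenue figure as a rough stand-in for qualifying global turnover (the statutory definition is narrower, so treat this as illustrative only):

```python
# Hypothetical comparison of Ofcom's statutory ceiling (10% of
# qualifying global turnover) against the New Mexico fine.
turnover = 165e9                 # stand-in figure, USD
ofcom_max = 0.10 * turnover      # ~$16.5bn theoretical ceiling
nm_fine = 375e6                  # New Mexico penalty, USD

ratio = ofcom_max / nm_fine      # ~44x the state-level fine
print(f"${ofcom_max / 1e9:.1f}bn ceiling, {ratio:.0f}x the NM fine")
```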

Ofcom's approach differs from the New Mexico model in important respects. The UK regime is primarily prospective: it sets out duties that platforms must meet and penalises failures going forward. The New Mexico case is retrospective, imposing remedies after a finding of liability for past conduct. But the underlying tension is the same. Regulators want product-level changes. Platforms argue those changes are disproportionate.

The Online Safety Act grants Ofcom broad powers to require platforms to conduct risk assessments, implement age-verification measures, and take proactive steps to limit children's exposure to harmful content. If a platform refuses to comply, Ofcom can, in theory, seek a court order to restrict access to the service in the UK. The mechanism is different from a voluntary withdrawal, but the end result could be similar.

Meta's New Mexico stance raises an obvious question: would the company adopt the same posture in the UK? The commercial calculus differs. New Mexico is a single state with a relatively small user base. The United Kingdom represents a significantly larger market. But the rhetorical strategy of framing compliance as technically unfeasible could travel across jurisdictions regardless of market size.

UK regulators will be watching closely. If Meta's brinkmanship succeeds in softening the New Mexico remedies, it establishes a template that could be deployed against Ofcom, the European Commission, or any other body pursuing platform-level child-safety reform.

What this means for boards with platform dependency

For UK businesses that rely on Meta's platforms for customer acquisition, communication, or commerce, the New Mexico case is a reminder that platform access is not guaranteed. A company that threatens to withdraw from a US state could, under different circumstances, withdraw from or restrict services in other markets.

Boards should consider several practical implications.

Concentration risk. Businesses that depend heavily on Instagram, Facebook, or WhatsApp for core operations carry a form of supplier risk that is rarely stress-tested. The New Mexico filing illustrates that regulatory disputes can escalate to the point where service continuity is openly questioned.

Regulatory trajectory. The Online Safety Act is still in its early enforcement phase. Ofcom has signalled that it intends to take a phased, proportionate approach, but the statutory powers available to the regulator are substantial. Companies that operate platforms, or depend on them, should be tracking enforcement actions and consultation outcomes as closely as they track tax or employment law changes.

Compliance as a cost of doing business. Meta's argument that operational remedies are more burdensome than financial penalties is instructive. For smaller UK platforms or digital businesses, the lesson is that building child-safety compliance into product design from the outset is likely cheaper than retrofitting it under regulatory pressure. The Online Safety Act's risk-assessment duties are designed to encourage exactly this approach.

Precedent effects. If the New Mexico court imposes remedies and Meta complies rather than withdrawing, the episode will have demonstrated that such threats can be negotiating tactics rather than genuine commitments. If Meta does withdraw, it will create a precedent with implications far beyond one US state. Either outcome will shape the regulatory environment in which UK businesses operate.

The remedies phase begins on 4 May 2026. The outcome will not bind any UK court or regulator, but it will influence how platform companies and governments approach child-safety enforcement for years to come. UK boards with any degree of platform dependency, which in practice means most of them, should be paying attention.