    Tech firms will have 48 hours to remove abusive images under new law

    By Ross Williams · 4 min read

    Last updated: February 24, 2026

    Matching CSAM and terror content standards

    The UK government has drawn a regulatory line in the sand: tech platforms will have 48 hours to remove non-consensual intimate images or face fines of up to 10% of global revenue. For context, that percentage would run into the tens of billions of pounds for companies the size of Meta or Alphabet based on their 2024 figures.

    What makes this proposal more than another piece of well-intentioned legislation is the enforcement mechanism. According to Prime Minister Keir Starmer, platforms already operate under similar obligations for child sexual abuse material and terrorist content. The government believes the infrastructure exists. The question is whether platforms can apply the same operational rigour to intimate image abuse at scale, and whether regulators have the teeth to make them.

    The amendment to the Crime and Policing Bill, currently winding through the Lords, represents a substantial escalation in legal liability for any company hosting user-generated content in the UK. But the devil, as always, sits in the detail of enforcement.


    Starmer's comparison to existing content moderation duties deserves scrutiny. Platforms do face legal obligations around CSAM and terror material, but those systems have been built over years with significant investment in hash-matching technology, human review teams, and cross-platform intelligence sharing. Microsoft's PhotoDNA, used by the US National Center for Missing & Exploited Children, lets platforms identify known CSAM through digital fingerprinting. Similar systems exist for terrorist propaganda.
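    PhotoDNA itself is proprietary, but the underlying idea, perceptual hashing, is straightforward to sketch. The snippet below is an illustrative difference hash ("dHash"), not PhotoDNA's actual algorithm; it assumes the Pillow imaging library, and the file paths are hypothetical.

```python
from PIL import Image

def dhash(image_path: str, hash_size: int = 8) -> int:
    """Perceptual hash: shrink to greyscale, then record whether each
    pixel is brighter than its right-hand neighbour."""
    img = Image.open(image_path).convert("L").resize(
        (hash_size + 1, hash_size), Image.Resampling.LANCZOS
    )
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Near-duplicates survive re-encoding and resizing, so a small bit
# difference counts as a probable match (the threshold is a tuning choice):
# if hamming_distance(dhash("reported.jpg"), dhash("upload.jpg")) <= 5: ...
```

    Unlike a cryptographic hash, a perceptual hash changes only slightly when an image is resized or re-compressed, which is what makes fingerprint matching robust against trivial evasion.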

    Intimate image abuse presents different challenges. Unlike CSAM, it has no centralised repository of images reported as non-consensual. Each complaint requires individual assessment of context, consent, and identity verification. A Parliamentary report published in May found that reports of intimate image abuse increased by 20.9% in 2024, suggesting the volume is already straining existing response systems.

    The proposed legislation attempts to solve this through a single-point reporting system. Victims would flag content once, triggering removal across platforms and blocking re-uploads. The operational complexity here is considerable. Platforms would need to share data about reported images across company boundaries whilst navigating privacy regulations. They would need to distinguish consensual adult content from abuse. And they would need to do all of this within 48 hours, repeatedly, at scale.
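    One way to picture the data-sharing problem is a shared hash registry: platforms exchange fingerprints of reported images, never the images themselves. The sketch below is purely illustrative, with hypothetical names and no basis in the Bill's text. It uses an exact SHA-256 hash, which only catches byte-identical copies; a production system would pair it with perceptual hashes like the one sketched earlier.

```python
import hashlib
import time
from dataclasses import dataclass, field

@dataclass
class SharedHashRegistry:
    """Hypothetical cross-platform registry: stores fingerprints of
    reported images, keyed by hash, never the image content itself."""
    entries: dict[str, dict] = field(default_factory=dict)

    def report(self, image_bytes: bytes, case_id: str) -> str:
        """Register a victim's report once; all platforms can then check it."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        self.entries[digest] = {"case_id": case_id, "reported_at": time.time()}
        return digest

    def is_blocked(self, image_bytes: bytes) -> bool:
        """Called by each platform's upload pipeline before accepting a file."""
        return hashlib.sha256(image_bytes).hexdigest() in self.entries

# registry = SharedHashRegistry()
# registry.report(reported_image, case_id="NCII-0001")
# registry.is_blocked(reported_image)  # True for exact copies; a
# recompressed copy would need a perceptual-hash lookup instead
```

    Even this toy version surfaces the questions the Bill leaves open: who operates the registry, how consent and identity are verified before a hash is added, and who arbitrates disputes inside the 48-hour clock.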

    The 10% revenue stick

    The penalty structure mirrors the Online Safety Act, which already allows fines of up to 10% of qualifying worldwide revenue. GDPR, the usual benchmark for heavyweight tech fines, caps out at 4% of annual global turnover, and even its headline penalties, such as Luxembourg's €746m fine against Amazon in 2021, sit well below the maximum. Most hover in the millions rather than billions. The precedent suggests regulators prefer negotiated compliance over maximum penalties.

    What's interesting here is the enforcement architecture Starmer outlined: a "combination of oversight bodies in relation to what's online" plus criminal enforcement. That language is notably vague. Ofcom holds enforcement powers under the Online Safety Act, but intimate image abuse spans multiple agencies including the Information Commissioner's Office, the Crown Prosecution Service, and potentially the new Victims' Commissioner. Coordination between these bodies on rapid enforcement decisions within a 48-hour window raises practical questions about who makes the final call.

    Technology Secretary Liz Kendall's rhetoric about tech firms no longer getting "a free pass" plays well politically, but the mechanisms for blocking access to rogue websites outside the Online Safety Act's reach remain unclear. Internet service provider blocking exists for copyright infringement and CSAM, but those systems move slowly. Applying them to intimate image abuse at speed would require ISPs to receive, verify, and act on blocking orders faster than current processes allow.

    Commercial calculations

    For platforms, this creates a new category of compliance cost. Content moderation teams would need to expand. Machine learning models would need training on intimate image detection, accurate enough to avoid false positives. Legal teams would need rapid response protocols. All of this requires funding at a time when Meta, Google, and others are already cutting trust and safety budgets.
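    The false-positive problem is worth making concrete. The back-of-the-envelope calculation below uses purely assumed figures, the upload volume, prevalence, and detector accuracy are all illustrative, but it shows why "sufficient accuracy" is so hard to achieve at platform scale.

```python
# Base-rate arithmetic: every figure here is assumed for illustration only.
daily_uploads = 100_000_000   # hypothetical image uploads per day
prevalence = 0.0001           # hypothetical share that is abusive (0.01%)
sensitivity = 0.99            # detector's true-positive rate
specificity = 0.99            # detector's true-negative rate

abusive = daily_uploads * prevalence          # 10,000 abusive images
benign = daily_uploads - abusive              # 99,990,000 benign images

true_positives = abusive * sensitivity        # ~9,900 correctly flagged
false_positives = benign * (1 - specificity)  # ~999,900 wrongly flagged

precision = true_positives / (true_positives + false_positives)
print(f"flags per day: {true_positives + false_positives:,.0f}")  # ~1,009,800
print(f"precision: {precision:.1%}")                              # ~1.0%
```

    Under those assumptions, roughly 99 in every 100 flags are false alarms, each a candidate for human review inside the 48-hour window. That is the arithmetic behind the expanding moderation teams.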

    Smaller platforms and emerging social networks face steeper challenges. The infrastructure for 48-hour compliance across global operations doesn't come cheap. That may be the quiet intention: raising the barriers to entry for platforms that can't or won't invest in robust moderation. The reference to targeting "rogue websites" suggests the government recognises some operators will simply ignore UK law. Whether blocking measures prove effective against sites hosted in non-cooperative jurisdictions remains an open question.

    The January confrontation with X over its Grok AI tool, which generated non-consensual sexualised images of real women, clearly accelerated this legislative push. The February deepfake law criminalised creation. This amendment targets distribution. The combination creates overlapping pressure points, but enforcement against a platform like X, which has systematically reduced its trust and safety operations, will test whether financial penalties alone can compel compliance from companies willing to fight regulators in court.

    The amendment now faces detailed scrutiny in the Lords. Industry responses will determine whether the 48-hour deadline proves workable or requires revision. The precedent this sets could influence regulatory approaches across Europe, particularly as the EU reviews the effectiveness of its Digital Services Act. Whether this becomes a model for other jurisdictions or a cautionary tale about ambitious timelines colliding with operational reality will depend on what happens when the first penalties land.

    Ross Williams

    Co-Founder

    Multi-award winning serial entrepreneur and founder/CEO of Venntro Media Group, the company behind White Label Dating. Founded his first agency while at university in 1997. Awards include Ernst & Young Entrepreneur of the Year (2013) and IoD Young Director of the Year (2014). Co-founder of Business Fortitude.
