When Alastair learned he might have to hand Discord his ID to keep running his 60,000-member server, his response was blunt: 'I really do not want to send Discord my ID given their track record — I do not trust them.' The US-based British streamer, known online as Eret to his one million Twitch followers, had good reason for concern. Just five months earlier, Discord had admitted that official ID photos of roughly 70,000 users had potentially leaked through a breach at a third-party verification firm.
That October 2025 incident has become the inconvenient backdrop to Discord's global rollout of age verification measures. Starting this March, the platform's 200 million users worldwide face a stark choice: submit to facial scans or upload government ID, or lose access to large portions of the service. For creators who've spent years building communities worth millions in potential revenue, this pits the duty to protect vulnerable members against the need to trust a company that has already demonstrated its verification partners can be compromised.
The verification conundrum
Discord's assurances sound solid in principle. According to the company, facial scans never leave users' devices, and uploaded IDs are used only to verify age before being deleted. Savannah Badalich, Discord's head of product policy, has described the measures as giving teens 'strong protections while allowing verified adults flexibility.'
The problem is that the October breach didn't involve Discord's own systems. A third-party vendor handling age verification was compromised. Discord severed ties with that firm, but the fundamental vulnerability remains: the platform relies on external partners to process sensitive data. When Discord says your facial scan stays on your device or your ID gets deleted, users are being asked to trust not just the company but its entire supply chain of verification providers.
The UK trial, which has been running since early 2025, uses technology from Persona — a firm backed by Founders Fund, the investment vehicle co-founded by Peter Thiel. For users familiar with Thiel's other venture, Palantir, which holds lucrative contracts with intelligence agencies and law enforcement, this connection does little to inspire confidence. Whether fair or not, the association between age verification infrastructure and a data firm known for government surveillance work has amplified concerns about where this information might ultimately flow.
The creator economy calculus
British streamer Toby, who commands 5.2 million Twitch followers and 2.7 million YouTube subscribers under the handle Tubbo, put the concerns in stark terms: 'I just think it's kind of a dangerous precedent for social media companies to request 3D scans of your face or official documents without there being any kind of knowledge of how that information is being protected or stored.'
That's not an abstract privacy concern. These creators have built businesses on Discord's infrastructure. Their servers function as community hubs, merchandise distribution channels, and direct lines to paying subscribers. Walking away means abandoning income streams and fracturing communities. Staying means asking followers, many of them young people in vulnerable circumstances, to surrender biometric data to a platform that cannot guarantee its security.
American streamer Katie, known as Pikachulita, articulated a broader worry that resonates beyond gaming communities: 'We live in a time when it's not too far-fetched to believe that companies like Discord could share this data with state or federal agencies — in the US or elsewhere — for their benefit.' Google search data suggests her scepticism is widely shared. Queries for Discord alternatives have spiked globally since the announcement, whilst some users report cancelling paid Nitro subscriptions in protest.
What's particularly telling is that these creators explicitly support child protection measures. Alastair emphasised that 'something must be done to protect children on Discord.' The objection isn't to age verification itself, but to the specific implementation requiring biometric data or government documents from a company whose verification infrastructure has already been breached.
The regulatory squeeze
Discord isn't implementing these measures out of enthusiasm for collecting sensitive data. The company faces mounting pressure from regulators worldwide, particularly following the UK's Online Safety Act, which places legal obligations on platforms to prevent minors from accessing harmful content. Similar legislative frameworks are advancing across Europe and gaining traction in various US states.
This creates a collision between two legitimate regulatory objectives: protecting children from age-inappropriate content and safeguarding personal data from misuse or breach. Platforms caught in the middle are making calculated decisions about which risk to prioritise. Discord has chosen child safety compliance over privacy concerns, gambling that regulatory penalties for inadequate age checks pose a greater threat than user backlash over data collection.
As reported by the BBC, Dr Peter Macaulay, a senior lecturer in psychology at the University of Derby, noted that the backlash demonstrates the challenge tech firms face in deploying child safety tools whilst preserving community trust. Meanwhile, Professor Carissa Véliz of Oxford's Institute for Ethics in AI highlighted a fundamental problem: 'Companies have broken their word before, facing little to no consequences.'
That observation cuts to the core issue. Discord can make whatever promises it likes about data handling. Users have no meaningful way to verify those claims, no recourse if the promises prove hollow, and limited alternatives if they refuse to comply. The creators leaving the platform, or considering it, recognise that trust, once broken, cannot be restored through assurances alone.
The rollout will proceed regardless. What bears watching is whether Discord's approach becomes the template other platforms adopt under regulatory pressure, or whether the backlash prompts development of verification systems that confirm age without requiring biometric data or identity documents. The answer will shape not just Discord's future, but the broader question of how much privacy the creator economy must sacrifice to meet child safety mandates.