A creator called RadialB has generated millions of views with AI-fabricated videos depicting Croydon as a dystopian hellscape, despite never visiting the borough
One video showing AI-generated "roadmen" in Parliament garnered eight million views in 24 hours
YouGov polling shows a majority of Britons believe London is unsafe, yet 81% of actual Londoners rate their local area as safe
Copycat accounts from Israel to Brazil now reshare the content, creating an international cottage industry of British urban decay videos
Britain has found its misinformation capital, and it's a place the chief architect has never even visited. A creator from north-west England is manufacturing AI-generated dystopian hellscapes of Croydon—complete with grimy water parks and balaclava-clad crowds—and racking up millions of views whilst real residents deal with the reputational fallout. The economics of "decline porn" have never been more lucrative, and platform safeguards have proven embarrassingly ineffective at stemming the tide.
RadialB represents something more insidious than your typical internet troll. He's part of an emerging class of "post-shame" content merchants who understand exactly what they're doing—manufacturing convincing deepfakes designed to deceive—yet maintain studied indifference to the consequences. When confronted about racist commentary his videos generate, he suggests platform filters handle the problem.
They manifestly don't. What's striking isn't just the audacity of the fabrication but the economics underpinning it. RadialB has discovered that decline porn functions as pure engagement arbitrage: feed AI tools a prompt about "roadmen" in taxpayer-funded facilities, add a perfunctory "AI-generated" label, then watch outraged commenters drive the algorithm wild.
The goldmine in manufactured outrage
Platform incentive structures have created a perverse feedback loop. RadialB insists his prompts don't specify race or ethnicity—he simply asks for "roadmen wearing puffer jackets, track suits, and balaclavas" because that makes the "funniest" characters. The fact that AI models consistently produce images of predominantly Black young men in these scenarios apparently merits no reflection.
His stance that this emerges organically from the technology rather than from embedded algorithmic bias is either disingenuous or remarkably naive for someone monetising these very outputs.
The copycat ecosystem demonstrates how quickly this model scales. Users from Israel to Brazil have begun resharing the Croydon videos, openly admitting they do so for engagement and potential monetisation on platforms like Facebook. Arabic-language accounts based in the Middle East have joined in, creating an international cottage industry of British urban decay content.
Several profiles masquerade as UK news accounts whilst exclusively posting AI-generated footage or decontextualised crime clips. These fabrications sit within a broader trend that figures like Elon Musk have amplified to his 230 million followers on X. The Tesla chief spoke at Tommy Robinson's Unite the Kingdom rally last year, warning of "rapidly increasing erosion of Britain with massive uncontrolled migration".
When labelling policies meet wilful ignorance
The supposed safeguard of AI disclosure labels has proven embarrassingly ineffective. TikTok, Instagram and X all mandate labelling of synthetic media, and several of RadialB's videos do carry small tags identifying them as AI-generated. Yet BBC interviews with commenters revealed many viewers believed the content was real anyway.
The label functions more as legal cover for platforms than actual deterrent to misinformation. When TikTok eventually banned RadialB's account for graphic and inappropriate content, he simply opened another. He's currently posting fresh videos of "roadmen" at grimy infinity pools and taxpayer-funded buffets.
C.Tino, a Black TikTok user actually from Croydon, posted a response highlighting how these videos falsely portray his neighbourhood as "ghetto" and make viewers think the fabrications represent real life. His concerns about the trend getting "out of hand" touch on something RadialB airily dismisses: real communities face reputational damage and racist abuse whilst he treats the whole enterprise as an artform that games recommendation algorithms.
The perception gap that makes this profitable
YouGov polling released in January offers disturbing evidence that this content ecosystem may be working. A majority of Britons now believe London is unsafe, yet only a third of people surveyed in the capital itself agree—and 81% of actual Londoners rate their local area as safe. That perception gap between those consuming content about London and those living there suggests a significant disconnect being actively exploited.
RadialB maintains his intention wasn't to become a decline porn influencer but rather to make people laugh. Yet he admits he wants viewers to initially believe his scenes are real: "if people saw it and they immediately knew it was fake, then they would just scroll".
The cognitive dissonance required to hold both positions simultaneously speaks to a deeper pathology in content creation economics. His suggestion that some angry comments might be "ironic" is particularly rich considering he acknowledges "50-year-olds and 60-year-olds in the comments raging and saying all this political stuff". Whether those commenters intend their racism ironically seems an academic distinction when real people in Croydon deal with the fallout.
The barrier to entry for this kind of misinformation manufacture has collapsed. RadialB credits the "huge jump" in AI tool quality and availability, noting it "hugely lowers the barrier for entry" for anyone wanting to make "fake stuff". Other creators have followed similar playbooks. South African YouTuber Kurt Caz, with over four million subscribers, recently faced accusations of using AI to alter a thumbnail for his "Avoid this place in London" video, adding Arabic shop signs and a balaclava to an image whose original footage showed English signage and a friendly cyclist.
The infrastructure is now in place for industrial-scale production of convincing urban decay narratives, divorced from evidence and optimised purely for engagement. As AI tools become more sophisticated and accessible, expect this model to spread beyond Croydon to other communities whose reputations can be strip-mined for views. RadialB and his copycats have demonstrated the playbook works—and that platform policies remain toothless in stopping the flood of AI-generated 'slop' that now dominates social feeds.
AI-generated misinformation is now profitable and scalable, with collapsed barriers to entry creating industrial-scale fabrication divorced from evidence
Multi-award winning serial entrepreneur and founder/CEO of Venntro Media Group, the company behind White Label Dating. Founded his first agency while at university in 1997. Awards include Ernst & Young Entrepreneur of the Year (2013) and IoD Young Director of the Year (2014). Co-founder of Business Fortitude.