A Gloucestershire-based photography charity has celebrated the reinstatement of its Facebook group after a battle lasting more than a month against an automated takedown, triggered when Meta's artificial intelligence systems mistakenly identified the organisation as promoting class-A drugs.
The Costly Confusion: Heroines Versus Heroin
Hundred Heroines, an organisation dedicated to celebrating female photographers, found itself at the mercy of Facebook's content moderation systems when its group was removed in September with a notification stating it "goes against our community standards on drugs".
This marked the second time in 2025 that the charity had faced removal from the platform for alleged breaches of community guidelines on drug promotion. The root cause appears to be Facebook's AI tools incorrectly linking the word "Heroines" in the charity's name to the opioid heroin.
Dr Del Barrett, the charity's founder and former president of the Royal Photographic Society, described the impact as "devastating" for an organisation that relies heavily on Facebook to attract visitors to its physical space in Nailsworth, near Stroud.
A Kafkaesque Appeal Process
After a second appeal within twelve months, the Hundred Heroines: Women in Photography Today group was quietly restored last week without explanation or apology from the tech giant.
"AI technology picks up the word heroin without an 'e', so we get banned for breaching community guidelines," Barrett explained. "Then no matter what you do, you can't get hold of anyone and it really affects us because we rely on Facebook to get our local audience."
The charity, whose collection of approximately 8,000 items focuses on women photographers throughout history, estimates that about 75% of its visitors come via Facebook.
Meta's Increased Vigilance and Its Consequences
The timing of these erroneous takedowns coincides with Meta's heightened efforts to combat drug-related content, particularly in response to the opioid crisis in the United States, where 80,000 people died from overdoses last year.
In a statement on its website, Meta emphasises that buying and selling drugs is "strictly prohibited" on its platforms and claims to have "robust measures" in place to detect and remove such content. The company states: "We recognise the significance of the drug crisis and are committed to using our platforms to keep people safe... and strict enforcement of our community standards."
However, when the company's automated systems incorrectly identify legitimate groups as violators, users describe the appeal process as Kafkaesque, with feedback forms often providing the only method to report errors.
Meta acknowledges that AI tools are "central to [its] content review process" and can "detect and remove content that goes against our community standards before anyone reports it". The technology sometimes flags content for human review teams, though Barrett confirmed that Hundred Heroines received no human interaction during its appeal process.
"We thought, 'should we change our name?' But why should we? Why have we got to mess with our brand just because of Facebook?" Barrett questioned, highlighting the difficult position organisations face when automated systems malfunction.
She added: "It sort of verges on scary and laughable. You think these bots are running the world and they can't tell the difference between a woman and an opioid. Heaven help us."
This incident follows earlier criticism directed at Meta this year over mass bans and suspensions of accounts on both Facebook and Instagram. While users blamed the company's AI moderation tools for the erroneous bans, Meta acknowledged only a "technical error" affecting Facebook Groups and denied any increase in incorrect enforcement across its platforms.
The company said it was addressing the issue, which emerged over the summer after multiple groups, including one for sharing memes about bugs, were reportedly told they had violated standards on "dangerous organisations or individuals".