How a Guardian Investigation Uncovered Child Sex Trafficking on Meta Platforms
A tipoff in 2021 sparked a Guardian investigation that exposed how child sex trafficking surged on Facebook and Instagram during the Covid pandemic. This reporting ultimately contributed to Meta losing a multimillion-dollar court case in March this year, marking a significant legal defeat for the tech giant.
The Investigation Begins with a Critical Tip
While journalists were reporting on migrant worker exploitation in the Gulf, a trusted source alerted them to a disturbing trend: child sex trafficking was increasing in the United States. As predators moved online during lockdowns, they began using Meta's platforms to buy and sell children. This information launched a collaborative investigation with human rights journalist Mei-Ling McNamara.
At the time, Meta was still known as Facebook, and there had been no prior reporting on how children were being trafficked through its services. Experts from anti-trafficking organizations and law enforcement officials described the crimes they were witnessing, revealing that much of the activity occurred in non-public areas like Facebook Messenger and private Instagram accounts.
Uncovering Shocking Evidence in Court Documents
The investigation relied heavily on Pacer, the federal courts records database, though finding evidence proved challenging due to sealed records and lack of text search functionality. Journalists spent hours examining Department of Justice press releases and court documents, eventually uncovering transcripts of sale negotiations for teen girls conducted through Facebook Messenger.
Exhibit documents contained pictures of trafficking victims advertised for sale using Instagram's Stories function, complete with discussions about money and logistics. In all cases examined, Meta had failed to detect or flag these criminal activities.
Former Moderators Reveal Systemic Failures
McNamara and colleagues interviewed former contract workers who had moderated content for Facebook and Instagram. These individuals described being traumatized by the harmful material they reviewed daily. All reported that their efforts to flag potential child trafficking on Meta platforms frequently went unaddressed, with harmful content rarely being removed by the company.
The moderators expressed feelings of helplessness and criticized Meta's narrow criteria for escalating potential crimes to law enforcement, suggesting the company prioritized other concerns over child safety.
Survivor Stories Highlight Human Cost
In July 2022, journalists visited Courtney's House, a Washington DC safe house for teen girls of color who are survivors of trafficking. Run by Tina Frundt, a trafficking survivor and former member of the United States Advisory Council on Human Trafficking, the organization provided crucial insights into how traffickers operate.
Frundt demonstrated how Instagram's Stories function was used to advertise girls for sex and described how vulnerable youth were targeted. She shared the tragic story of a 15-year-old girl, loved by her family and peers, who frequented Courtney's House and died after a sex buyer she met through Instagram gave her fentanyl-laced drugs.
Law Enforcement Confirms Growing Problem
During a visit to an assistant district attorney's office in Massachusetts, prosecutors revealed that child trafficking crimes on social media platforms were increasing by approximately 30% annually. The pandemic exacerbated the situation as children spent more time online for remote learning, with fewer opportunities for teachers and other adults to notice warning signs.
Prosecutors explained that traffickers could easily identify vulnerable children based on their online activity, with the digital nature of these crimes making them both lucrative and difficult to trace. An incarcerated sex trafficker even identified Instagram as his platform of choice for committing crimes.
Legal Repercussions and Meta's Response
The Guardian's investigation, published in April 2023 as "How Facebook and Instagram became marketplaces for child sex trafficking," initially seemed to have limited impact due to Section 230 protections that shield social media platforms from liability for user-generated content.
However, several months later, the investigation was cited in a Supreme Court amicus brief, and New Mexico's attorney general filed a lawsuit against Meta for failing to protect children from sexual abuse and human trafficking on its platforms. The complaint specifically referenced the Guardian's findings.
In March of this year, Meta lost the resulting jury trial, its first such legal battle, and was ordered to pay $375 million in civil penalties for violating New Mexico's consumer protection laws. The company announced plans to appeal, maintaining confidence in its record of protecting teens online.
Ongoing Revelations and Encryption Concerns
In the three years since the initial investigation, the Guardian has continued publishing revelations about child exploitation on Meta's platforms. These include reports that Facebook Messenger and Meta Pay were used to exchange money for child sexual abuse material, and the tragic story of Kristen Galvan, a Texas teenager groomed and sold for sex through Instagram who was later murdered.
Child safety experts have criticized Meta's December 2023 decision to encrypt Facebook Messenger, arguing that while it enhances user privacy, it also prevents scanning for inappropriate content and hinders law enforcement investigations. Meta has defended encryption as safe because users can report inappropriate interactions, but Instagram head Adam Mosseri testified that self-reporting tools are far less effective than the company's own detection technology.
Additional Legal Challenges for Meta
Just one day after the New Mexico verdict, Meta lost another trial in Los Angeles focused on platform features that allegedly harm children's mental health by being intentionally addictive and amplifying content promoting self-harm and body dysmorphia. Meta plans to appeal this ruling as well.
Further legal battles loom, with a coalition of 33 attorneys general preparing to sue Meta for allegedly "knowingly designing and deploying harmful features" that "purposefully addict children and teens." These cases represent growing scrutiny of how social media platforms impact vulnerable users.