Meta Confronts Reckoning Over Child Safety Practices in Landmark Trial
Meta is facing a monumental legal challenge as a landmark trial in New Mexico airs fresh allegations that the company prioritized profit and user engagement over protecting children on its platforms. The trial, now in its fifth week, has seen the state attorney general rest its case; proceedings are expected to continue for another week as Meta presents its defense before jury deliberations begin.
Internal Emails Reveal Exploitation Concerns
Central to the case are internal company documents obtained by the attorney general's office, including emails between Meta executives flagging urgent exploitation problems on Facebook and Instagram. One 2019 email to Instagram head Adam Mosseri, read aloud in court, stated: "Data shows that Instagram had become the leading two-sided marketplace for human trafficking," underscoring how serious the safety concerns were inside the company itself.
Prosecutors have presented evidence demonstrating what they describe as delays and deficiencies in Meta's ability to detect and report harms to children, including the distribution of child sexual abuse material and child trafficking. The trial has also scrutinized Facebook and Instagram features for their alleged impact on children's mental health, with plaintiffs claiming the platforms are intentionally addictive and amplify content promoting self-harm, suicidal ideation, and body dysmorphia.
Defense Arguments and Executive Testimony
Meta's defense has vigorously rejected the attorney general's allegations as "sensationalist, irrelevant and distracting arguments," asserting that the company makes significant efforts to ensure platform safety and continues to invest in protective features for teens. Company executives, including Mosseri and CEO Mark Zuckerberg, have defended Meta's safety track record, arguing that with billions of users worldwide, preventing all crimes and harms on the platforms is impossible.
"We do our best to keep Facebook safe, but we cannot guarantee it," said Mosseri during his testimony in Santa Fe. "Safety is incredibly important to us." Zuckerberg acknowledged in a video deposition that some users, including children, find Meta's platforms addictive, which is also the subject of a separate trial in Los Angeles.
Operation MetaPhile and Undercover Investigations
One of the main pillars of New Mexico's case is "Operation MetaPhile," an investigation by the attorney general's office in which undercover agents posing as girls under 13 were contacted by suspects soliciting them for sex through Facebook and Instagram features. Investigators testified that although one decoy account received hundreds of friend requests per day and accrued 7,000 followers within a month, Meta did not shut it down; instead, the platform sent the account information about how to monetize and grow its following.
The state also presented allegations that Instagram's algorithms connect pedophiles with one another or help them find sellers of child sexual abuse material, a characterization Mosseri labeled "unfair." Former Meta executives testified against the company, with Brian Boland, former vice-president of partnerships, stating, "I absolutely did not believe that safety was a priority, which is the primary reason that I left."
Encryption and Reporting Backlogs
The trial revealed how Meta's decision to encrypt Facebook Messenger has blocked access to crucial evidence of crimes, with the National Center for Missing & Exploited Children (NCMEC) calling the move a "devastating blow to child protection." Fallon McNulty, executive director of NCMEC's exploited children division, testified that Meta submitted 6.9 million fewer reports in 2024, after Messenger's encryption was implemented, than in the previous year.
Jurors heard that between May 2017 and July 2021, Meta accumulated a backlog of 247,000 cyber tip reports of potential harms and abuses, many of which were weeks or months old by the time they were sent to NCMEC. These backlogs may have meant that opportunities to prevent crimes or identify perpetrators were lost. McNulty testified that thousands of other reports were improperly classified as low priority, which NCMEC regarded as "a serious failing that affected child safety."
Mental Health Impacts and Internal Knowledge
Internal documents from Instagram revealed how much the company knew about its tween users despite its 13-and-over policy. A 2018 presentation stated, "If we wanna win big with teens, we must bring them in as tweens," while a 2015 estimate suggested that about 30% of 10-to-12-year-olds in the US used the app. Another document detailed goals to increase the time 10-year-olds spent on Instagram.
Ian Russell, whose daughter Molly died by suicide in 2017 after viewing harmful content on Instagram, testified about the platform's mental health impacts. "That inescapable stream of harmful content turned Molly from that bright, hopeful young person into someone who unbelievably thought she was a burden," he said.
Evidence included internal communications about augmented-reality filters on Instagram that allowed users to alter their appearance, with warnings that teens using these features would be at greater risk of self-image and mental health issues. An email from a former Meta employee to Zuckerberg warned about these risks, citing personal experience with teenage daughters affected by body dysmorphia.
Whistleblower Testimony and Global Implications
Arturo Béjar, a former Meta engineering director turned whistleblower, testified that the platform's recommendation system was "really good at connecting" predators with minors. When he reported issues to the company, he said he understood that executives including Zuckerberg already knew about the problem but chose not to act. "I don't think we can trust Mark Zuckerberg and Meta with our kids," Béjar stated.
The trial comes as Meta faces global regulatory scrutiny, with countries considering age restrictions similar to Australia's ban on social media for those under 16. The outcomes of the New Mexico and Los Angeles trials may influence lawmakers worldwide to implement stricter regulations, potentially cutting Meta off from the younger users it needs for growth.
As the trial continues, the fundamental question remains: Can Meta protect its next generation of users while balancing profit motives with safety responsibilities? The jury's decision could have far-reaching implications for social media regulation and child protection standards globally.