Family Sues OpenAI Over Tumbler Ridge School Shooting, Alleges AI Could Have Prevented Tragedy

The family of a child critically injured in one of Canada's worst mass shootings has filed a lawsuit against OpenAI, arguing that the company could have prevented last month's school attack. The suit comes shortly after OpenAI's CEO said he intended to apologize to the affected families in the remote Canadian community, where the violence has shaken a close-knit town.

Details of the Tragic Incident and AI Involvement

On February 10, 2026, an 18-year-old shooter, Jesse Van Rootselaar, killed eight people in the mountain town of Tumbler Ridge, British Columbia. The victims included five students, aged 12 to 13, and a 39-year-old teaching assistant. Van Rootselaar later died of a self-inflicted injury. Investigators found that, months before the attack, he had engaged with ChatGPT over several days in June, describing violent scenarios involving firearms. An automated review system flagged the content, but OpenAI determined it did not indicate "credible or imminent planning" and merely suspended his account without notifying Canadian authorities. The company later discovered a second account linked to the shooter after the initial suspension.

Legal Claims and Family's Pursuit of Justice

On Monday, Cia Edmonds filed a lawsuit against OpenAI on behalf of herself and her two daughters, Maya and Dahlia Gebala, both of whom were present during the shooting. The legal firm representing the family, Rice Parsons Leoni & Elliott LLP, stated that the lawsuit aims to uncover the full truth behind the tragedy, impose accountability, seek redress for harms, and help prevent future mass shootings in Canada. The allegations have not yet been tested in court.

Maya, aged 12, sustained three gunshot wounds: one bullet entered her head above her left eye, another struck her neck, and a third grazed her cheek and part of her ear. She remains hospitalized with a catastrophic traumatic brain injury, permanent cognitive and physical disabilities, right-sided hemiplegia, scarring, and physical deformities. Both Edmonds and her daughter Dahlia, who was not physically injured, have experienced post-traumatic stress disorder, anxiety, depression, and sleep disturbances.

The civil claim alleges that OpenAI rushed ChatGPT to market without adequate safety studies. The family is seeking unspecified punitive damages, asserting that the company's conduct is "reprehensible and morally repugnant" to the plaintiffs and the broader community.

Political and Regulatory Responses

Last week, OpenAI CEO Sam Altman held a virtual meeting with British Columbia Premier David Eby and Tumbler Ridge Mayor Darryl Krakowka, amid growing frustration over the tech giant's policies, which did not mandate reporting violent content to police. Eby emphasized that while an apology is necessary, it is insufficient, and collaboration is needed to avoid retraumatizing the community. A spokesperson for OpenAI described the shooting as an "unspeakable tragedy" and committed to working with officials to implement meaningful changes for prevention, though no timeline was provided.

Eby has emerged as a vocal critic of the regulatory framework governing AI companies in Canada, citing inadequate mental health support and access to firearms as contributing factors. He stressed that it is unacceptable for companies to decide on their own whether to report such content to police, and called for urgent policy changes. Under pressure from lawmakers, OpenAI has revised its procedures to better identify warning signs of serious violence. Canada's AI Minister Evan Solomon has asked the company to apply its new safety standards retroactively and review previously flagged cases so that any missed incidents are reported to the Royal Canadian Mounted Police.

While Eby acknowledged OpenAI's responsiveness, he warned that other companies with similar chatbots have not yet updated their policies, underscoring the urgent need for systemic change to prevent future failures.