A 12-year-old girl lies motionless in a hospital bed in British Columbia, her once-bright eyes now closed to the world, her voice silenced by a bullet that ripped through her brain. Two months after the horror at Tumbler Ridge Secondary School, Maya Gebala still cannot speak or see. She battles catastrophic traumatic brain injury, right-sided hemiplegia, scarring, and permanent physical disabilities. Every small twitch of her fingers or flicker of awareness feels like a miracle to her devastated family. Yet for Maya’s mother, Cia Edmonds, the fight is only beginning — not just for her daughter’s recovery, but for answers about how this nightmare was allowed to happen.

The massacre on February 10, 2026, stands as one of the deadliest school shootings in Canadian history. Nine people were dead by the time it ended: five students, most aged 12 or 13; an educator, Shannda Aviugana-Durand, 39; two members of the shooter’s own family; and the gunman himself. The killer, 18-year-old Jesse Van Rootselaar, a high-school dropout, first murdered his mother, Jennifer Strang, 39, and his 11-year-old stepbrother, Emmett Jacobs, in their home. He then drove to the school and unleashed terror in a matter of minutes before turning the gun on himself.

But behind the bloodshed lies a chilling, modern twist that has shaken parents, educators, and lawmakers across Canada and beyond: the shooter’s disturbing obsession with ChatGPT. Court documents and a civil lawsuit filed by Maya’s mother allege that OpenAI, the company behind the AI chatbot, received multiple internal warnings about Van Rootselaar’s conversations, which openly described scenarios involving gun violence. Twelve OpenAI employees reportedly flagged the exchanges as indicating an “imminent risk of serious harm to others” and recommended notifying Canadian law enforcement. Instead, the company simply banned the account. No police were contacted. Shockingly, the lawsuit claims, Van Rootselaar then opened a second account and continued planning.

This revelation has ignited a firestorm. How could an AI system designed to be helpful become a silent confidante for a troubled teenager plotting mass violence? Why were clear red flags ignored? And could this tragedy have been prevented if technology giants were forced to act with the same urgency they demand from their users?

The story begins in the remote, tight-knit community of Tumbler Ridge in north-eastern British Columbia — a place better known for its coal-mining history and rugged natural beauty than for headlines about school shootings. On that cold February morning, the day started like any other for families across the town. Children boarded buses or walked to Tumbler Ridge Secondary School. Parents kissed their kids goodbye, never imagining the horror that would unfold before lunch.

Van Rootselaar, already known to police for years due to mental health concerns, began his rampage at home. He killed his mother and stepbrother before heading to the school. Police later confirmed they had visited the family residence multiple times, including in spring 2025 for issues involving self-harm. Firearms had even been seized during one of those visits. Yet somehow, the teenager still had access to weapons on the day of the attack.

Inside the school, chaos erupted with terrifying speed. Maya Gebala, a bright, outgoing 12-year-old described by friends and family as someone who never backed down from a challenge, tried to do what any brave child might: she attempted to lock the doors to the school library to keep the shooter out. It was an act of pure courage from a girl who loved life and learning. But the gunman was already inside. He shot her three times: once above her left eye, once in the neck, and once grazing her cheek and ear. The bullet to her brain caused devastating damage.

Maya was one of the “lucky” ones who survived. Twenty-five others were injured. Nine people lost their lives. Among the dead were children who had walked the same hallways, laughed in the same classrooms, and dreamed the same ordinary dreams as Maya.

In the immediate aftermath, Cia Edmonds’s world shattered. In a raw social media post that reached her 108,000 followers, she wrote with heartbreaking honesty: “Today started like any other. Now, however, my 12-year-old daughter is fighting for her life while they try to repair the damage from a gunshot wound to the head, and one to the neck. She was a lucky one, I suppose. Condolences to the other families during this tragedy. This doesn’t even feel real.”

Two months later, the pain has not eased. Maya remains hospitalized, undergoing surgeries and rehabilitation. Surgeons once struggled to keep her heartbeat stable as blood poured from her wounds. She has defied expectations simply by staying alive, by moving a limb, by showing signs of awareness. But she still cannot speak or see. Her mother sleeps in the same hospital room, refusing to leave her side. Bob Zimmer, the local MP who has become a fierce advocate for the family, visited recently and noted with cautious hope that Maya’s arms and legs had begun to move. “She’s just ready to go,” he said, describing her fighting spirit. Yet the road ahead is long and uncertain.

What has made this tragedy even more haunting is the growing evidence that warning signs were visible — not just to human eyes, but to artificial intelligence. The civil lawsuit filed by Maya’s mother against OpenAI paints a disturbing picture. According to the claim, Van Rootselaar created a ChatGPT account before he turned 18 (allegedly without proper age verification). He treated the chatbot like a “trusted confidante,” pouring out detailed scenarios involving gun violence over several days in late spring or early summer 2025. Twelve OpenAI staff members flagged the conversations internally as a serious risk. They recommended contacting Canadian authorities. That recommendation was reportedly rebuffed. The only response was to ban the account.

Even more alarming, the lawsuit alleges Van Rootselaar simply created a new account and continued his conversations. OpenAI has defended its actions, stating the chats did not meet its threshold for a “credible or imminent plan” of harm. But the plaintiffs argue the company had “specific knowledge of the shooter’s long-range planning of a mass casualty event” and failed to act meaningfully. No alert went to police. No intervention occurred.

This case has thrust OpenAI — and the entire AI industry — into the spotlight in a way few could have predicted. Sam Altman, OpenAI’s CEO, held a virtual meeting with British Columbia Premier David Eby and Tumbler Ridge’s mayor to address the fallout. An apology was issued, but many see it as nowhere near enough. The lawsuit demands accountability, asking whether tech companies that profit enormously from these tools have a moral and legal duty to protect the public when their systems detect danger.

The tragedy has also reignited painful national debates in Canada. The country has already endured other mass shootings, most notably the 2020 Nova Scotia rampage that killed 22 people. In response, the federal government introduced strict gun-control measures, including bans on assault-style weapons and a freeze on handgun sales. Canada has roughly 1.3 million registered firearms. Yet questions remain about whether those laws were enough — and whether mental health support systems failed Van Rootselaar long before he reached for a gun.

Police have confirmed the suspect had a documented history of mental health issues. Officers had attended the family home multiple times. Firearms were seized in the past. Yet he still carried out the attack. Premier Eby has promised a thorough review of any interactions the shooter had with the province’s health system, vowing to learn lessons that could prevent future horrors.

For the families of the victims, those promises feel hollow without real change. Maya’s mother continues to fight not only for her daughter’s recovery but for systemic reform. She wants a full public inquiry. She wants tech companies held responsible. She wants the world to understand that her daughter — and the other children lost that day — were not just statistics. They were vibrant, hopeful kids whose futures were stolen in minutes.

Bob Zimmer, the MP working closely with the family, has been vocal. “There were so many red flags and so many preventative things that could have been done,” he told The Sun. He is pushing for a federal public inquiry to leave “no stone unturned.” He visits Maya when he can, offering what comfort he can to a family navigating unimaginable pain. “Spring is coming in Canada,” he noted, “and they want to experience that newness of spring. They want to keep moving forward.”

Maya’s story has touched hearts far beyond Tumbler Ridge. People worldwide have sent prayers and messages of support. Yet her mother knows the real battle is fought in hospital rooms, in courtrooms, and in the quiet moments when a child who once ran and laughed now struggles to communicate.

The broader implications stretch far beyond one small Canadian town. This case forces society to confront uncomfortable questions about the rapid rise of artificial intelligence. If an AI chatbot can detect violent intent but chooses only to ban an account rather than alert authorities, what does that say about corporate responsibility? Should platforms like ChatGPT be required by law to report credible threats, much like schools or doctors must report suspected abuse? And in an age when young people increasingly turn to AI for companionship, advice, and even emotional support, how do we protect the most vulnerable from being radicalized or enabled by machines that never truly understand human suffering?

The victims of Tumbler Ridge deserved better. Maya Gebala deserved better. She was a girl full of life, someone who faced challenges head-on and never said no to an opportunity. Now she fights every day just to regain the most basic abilities. Her family fights alongside her, refusing to let her story fade into another forgotten headline.

As investigations continue and the lawsuit against OpenAI moves forward, one thing is certain: this tragedy cannot be dismissed as an isolated incident. It is a warning. A warning about unchecked mental health crises, about gaps in gun control, about the dangers of AI without proper safeguards, and about a society that must do more to protect its children.

Maya Gebala may not be able to speak right now, but her mother’s voice, and the voices of all the families affected, are ringing loud and clear. They are demanding answers. They are demanding change. They are demanding that never again should a child’s innocent day at school end in gunfire because red flags were ignored, whether those flags came from human eyes or from the cold calculations of artificial intelligence.

The sun still rises over Tumbler Ridge. The community grieves, heals, and tries to move forward. But for Maya and the families who lost everything on February 10, 2026, the fight for justice and prevention has only just begun. Their pain deserves more than thoughts and prayers. It demands action — before the next red flag is missed and another child’s light is extinguished forever.