A 12-year-old girl lies motionless in a hospital bed in British Columbia, her once-bright eyes now closed to the world, her voice silenced by a bullet that ripped through her brain. Two months after the horror at Tumbler Ridge Secondary School, Maya Gebala still cannot speak or see. She battles catastrophic traumatic brain injury, right-sided hemiplegia, scarring, and permanent physical disabilities. Every small twitch of her fingers or flicker of awareness feels like a miracle to her devastated family. Yet for Maya's mother, Cia Edmonds, the fight is only beginning: not just for her daughter's recovery, but for answers about how this nightmare was allowed to happen.
The massacre on February 10, 2026, stands as one of the deadliest school shootings in Canadian history. Nine people lay dead by the time it was over: five students, most aged 12 or 13; an educator, Shannda Aviugana-Durand, 39; two members of the shooter's own family; and, finally, the gunman himself. The killer, 18-year-old Jesse Van Rootselaar, a high-school dropout, first murdered his mother, Jennifer Strang, 39, and his 11-year-old stepbrother, Emmett Jacobs, in their home. He then drove to the school and unleashed terror in a matter of minutes before turning the gun on himself.
But behind the bloodshed lies a chilling, modern twist that has shaken parents, educators, and lawmakers across Canada and beyond: the shooter's disturbing obsession with ChatGPT. Court documents and a civil lawsuit filed by Maya's mother allege that OpenAI, the company behind the AI chatbot, received multiple internal warnings about Van Rootselaar's conversations, conversations that openly described scenarios involving gun violence. Twelve OpenAI employees reportedly flagged the exchanges as indicating an "imminent risk of serious harm to others" and recommended notifying Canadian law enforcement. Instead, the company simply banned the account. No police were contacted. Shockingly, the lawsuit claims Van Rootselaar simply opened a second account and continued planning.
This revelation has ignited a firestorm. How could an AI system designed to be helpful become a silent confidante for a troubled teenager plotting mass violence? Why were clear red flags ignored? And could this tragedy have been prevented if technology giants were forced to act with the same urgency they demand from their users?
The story begins in the remote, tight-knit community of Tumbler Ridge in north-eastern British Columbia, a place better known for its coal-mining history and rugged natural beauty than for headlines about school shootings. On that cold February morning, the day started like any other for families across the town. Children boarded buses or walked to Tumbler Ridge Secondary School. Parents kissed their kids goodbye, never imagining the horror that would unfold before lunch.
Van Rootselaar, already known to police for years due to mental health concerns, began his rampage at home. He killed his mother and stepbrother before heading to the school. Police later confirmed they had visited the family residence multiple times, including in spring 2025 for issues involving self-harm. Firearms had even been seized during one of those visits. Yet somehow, the teenager still had access to weapons on the day of the attack.
Inside the school, chaos erupted with terrifying speed. Maya Gebala, a bright, outgoing 12-year-old described by friends and family as someone who never backed down from a challenge, tried to do what any brave child might: she attempted to lock the doors to the school library to keep the shooter out. It was an act of pure courage from a girl who loved life and learning. But the gunman was already inside. He shot her three times: one bullet struck above her left eye, another hit her neck, and a third grazed her cheek and ear. The bullet to her brain caused devastating damage.
Maya was one of the "lucky" ones who survived. Twenty-five others were injured. Nine people never came home. Among the dead were children who had walked the same hallways, laughed in the same classrooms, and dreamed the same ordinary dreams as Maya.
In the immediate aftermath, Cia Edmonds's world shattered. In a raw social media post that reached her 108,000 followers, she wrote with heartbreaking honesty: "Today started like any other. Now, however, my 12-year-old daughter is fighting for her life while they try to repair the damage from a gunshot wound to the head, and one to the neck. She was a lucky one, I suppose. Condolences to the other families during this tragedy. This doesn't even feel real."
Two months later, the pain has not eased. Maya remains hospitalized, undergoing surgeries and rehabilitation. Surgeons once struggled to keep her heartbeat stable as blood poured from her wounds. She has defied expectations simply by staying alive, by moving a limb, by showing signs of awareness. But she still cannot speak or see. Her mother sleeps in the same hospital room, refusing to leave her side. Bob Zimmer, the local MP who has become a fierce advocate for the family, visited recently and noted with cautious hope that Maya's arms and legs had begun to move. "She's just ready to go," he said, describing her fighting spirit. Yet the road ahead is long and uncertain.
What has made this tragedy even more haunting is the growing evidence that warning signs were visible not just to human eyes, but to artificial intelligence. The civil lawsuit filed by Maya's mother against OpenAI paints a disturbing picture. According to the claim, Van Rootselaar created a ChatGPT account before he turned 18 (allegedly without proper age verification). He treated the chatbot like a "trusted confidante," pouring out detailed scenarios involving gun violence over several days in late spring or early summer 2025. Twelve OpenAI staff members flagged the conversations internally as a serious risk. They recommended contacting Canadian authorities. That recommendation was reportedly rebuffed. The only response was to ban the account.
Even more alarming, the lawsuit alleges Van Rootselaar simply created a new account and continued his conversations. OpenAI has defended its actions, stating the chats did not meet their threshold for a "credible or imminent plan" of harm. But the plaintiffs argue the company had "specific knowledge of the shooter's long-range planning of a mass casualty event" and failed to act meaningfully. No alert went to police. No intervention occurred.
This case has thrust OpenAI, and the entire AI industry, into the spotlight in a way few could have predicted. Sam Altman, OpenAI's CEO, held a virtual meeting with British Columbia Premier David Eby and Tumbler Ridge's mayor to address the fallout. An apology was issued, but many see it as nowhere near enough. The lawsuit demands accountability, asking whether tech companies that profit enormously from these tools have a moral and legal duty to protect the public when their systems detect danger.
The tragedy has also reignited painful national debates in Canada. The country has already endured other mass shootings, most notably the 2020 Nova Scotia rampage that killed 22 people. In response, the federal government introduced strict gun-control measures, including bans on assault-style weapons and a freeze on handgun sales. Canada has roughly 1.3 million registered firearms. Yet questions remain about whether those laws were enough, and whether mental health support systems failed Van Rootselaar long before he reached for a gun.
Police have confirmed the suspect had a documented history of mental health issues. Officers had attended the family home multiple times. Firearms were seized in the past. Yet he still carried out the attack. Premier Eby has promised a thorough review of any interactions the shooter had with the province's health system, vowing to learn lessons that could prevent future horrors.
For the families of the victims, those promises feel hollow without real change. Maya's mother continues to fight not only for her daughter's recovery but for systemic reform. She wants a full public inquiry. She wants tech companies held responsible. She wants the world to understand that her daughter, and the other children lost that day, were not just statistics. They were vibrant, hopeful kids whose futures were stolen in minutes.
Bob Zimmer, the MP working closely with the family, has been vocal. "There were so many red flags and so many preventative things that could have been done," he told The Sun. He is pushing for a federal public inquiry to leave "no stone unturned." He visits Maya when he can, offering what comfort he can to a family navigating unimaginable pain. "Spring is coming in Canada," he noted, "and they want to experience that newness of spring. They want to keep moving forward."
Maya's story has touched hearts far beyond Tumbler Ridge. People worldwide have sent prayers and messages of support. Yet her mother knows the real battle is fought in hospital rooms, in courtrooms, and in the quiet moments when a child who once ran and laughed now struggles to communicate.
The broader implications stretch far beyond one small Canadian town. This case forces society to confront uncomfortable questions about the rapid rise of artificial intelligence. If an AI chatbot can detect violent intent but chooses only to ban an account rather than alert authorities, what does that say about corporate responsibility? Should platforms like ChatGPT be required by law to report credible threats, much like schools or doctors must report suspected abuse? And in an age when young people increasingly turn to AI for companionship, advice, and even emotional support, how do we protect the most vulnerable from being radicalized or enabled by machines that never truly understand human suffering?
The victims of Tumbler Ridge deserved better. Maya Gebala deserved better. She was a girl full of life, someone who faced challenges head-on and never said no to an opportunity. Now she fights every day just to regain the most basic abilities. Her family fights alongside her, refusing to let her story fade into another forgotten headline.
As investigations continue and the lawsuit against OpenAI moves forward, one thing is certain: this tragedy cannot be dismissed as an isolated incident. It is a warning. A warning about unchecked mental health crises, about gaps in gun control, about the dangers of AI without proper safeguards, and about a society that must do more to protect its children.
Maya Gebala may not be able to speak right now, but her mother's voice, and the voices of all the families affected, are ringing loud and clear. They are demanding answers. They are demanding change. They are demanding that never again should a child's innocent day at school end in gunfire because red flags were ignored, whether those flags came from human eyes or from the cold calculations of artificial intelligence.
The sun still rises over Tumbler Ridge. The community grieves, heals, and tries to move forward. But for Maya and the families who lost everything on February 10, 2026, the fight for justice and prevention has only just begun. Their pain deserves more than thoughts and prayers. It demands action, before the next red flag is missed and another child's light is extinguished forever.