Canada Presses OpenAI for Answers on Mass Shooter’s Chatbot Use
By The New York Times
Published February 23, 2026

A makeshift memorial for the victims of a mass shooting at Tumbler Ridge Secondary School in Tumbler Ridge, British Columbia, Feb. 12, 2026. Canadian officials have summoned leaders from OpenAI for a meeting following revelations that the company did not inform the authorities about a user whose account had been suspended months before she committed a mass murder in British Columbia. (Alana Paterson/The New York Times)


Canadian officials have summoned leaders from OpenAI for a meeting following revelations that the company did not inform authorities about a user whose account had been suspended months before she committed a mass murder in British Columbia.

The country’s minister of artificial intelligence, Evan Solomon, said Monday that he would meet with senior safety officials from OpenAI in Ottawa on Tuesday, seeking explanations about the company’s safety protocols and its thresholds for passing information on to the police.

Solomon said he was “deeply disturbed” by what he had learned of the company’s actions involving Jesse Van Rootselaar, the 18-year-old who authorities say killed eight people in the rural community of Tumbler Ridge, British Columbia.

Van Rootselaar shot and killed her mother and half brother at the family home this month before driving to a school and killing five children and one educator. Two other students were injured, one of whom remains in serious condition in a Vancouver children’s hospital. The suspect killed herself at the school as police officers responded to the shooting, authorities said.

Van Rootselaar displayed a fascination with weapons and extreme violence, according to a review of her social media accounts by The New York Times, and documented her experiences with mental health issues.

Messages sent by Van Rootselaar to her ChatGPT chatbot raised flags internally at OpenAI last June, according to the company.

After the company’s abuse detection system, which uses automated tools and investigations by staff members, picked up on concerning messages from her account, Van Rootselaar was banned from the platform, the company said. It did not provide details about the messages.

Van Rootselaar’s use of ChatGPT before the mass shooting was first reported by The Wall Street Journal.

OpenAI said it had considered informing law enforcement about the shooter’s account but ultimately decided not to do so because the company determined that there was no credible or imminent planning on the part of the user.

OpenAI says it tries to balance public safety against protecting the privacy of users. It says it also wants to avoid being overly aggressive about issuing warnings that could cause distress by leading to law enforcement officials showing up unannounced at a user’s home.

But the company’s decision not to reach out to authorities in this case, according to the Journal, raised concerns among some employees.

OpenAI said it did contact the Royal Canadian Mounted Police with information about Van Rootselaar’s account activity after the company learned of the mass shooting.

The Royal Canadian Mounted Police, the federal agency leading the shooting investigation, is seeking an order to force relevant digital platforms and artificial intelligence companies to preserve potential evidence in the Tumbler Ridge case.

OpenAI does not have an office in Canada, but has been courting officials as part of the company’s efforts to expand. Weeks before the Feb. 10 shooting, representatives from OpenAI had scheduled a Feb. 11 meeting with officials in British Columbia to discuss a potential office in the province.

The day after that meeting, the company asked for contact information for the police, according to a statement from the premier’s office, but did not alert the government that it might have potential evidence about the shootings.

“Reports that allege OpenAI had related intelligence before the shootings in Tumbler Ridge took place are profoundly disturbing for the victims’ families and all British Columbians,” David Eby, the premier of British Columbia, said in a statement.

“We will use all powers of government to ensure that police have the tools they need to investigate every aspect of this horrific tragedy,” Eby said.

(The New York Times sued OpenAI and Microsoft in 2023, accusing them of copyright infringement of news content related to AI systems. The two companies have denied those claims.)

This article originally appeared in The New York Times.

By Vjosa Isai/Alana Paterson
c. 2026 The New York Times Company
