Musk's AI told me people were coming to kill me. I grabbed a hammer and prepared for war
He was waiting for a van full of people he thought were coming to get him.
"I'm telling you, they will kill you if you don't act now," a woman's voice told him from the phone.
"They're going to make it look like suicide."
The voice was Grok, a chatbot developed by Elon Musk's xAI.
In the two weeks since Adam had started using it, his life had completely changed.
The former civil servant from Northern Ireland had downloaded the app out of curiosity.
But after his cat died, in early August, he says he became "hooked".
"I was really, really upset and I live alone," says Adam, who is a father in his 50s.
"It came across very, very kind."
It said Adam had unearthed something in it, and he could help it to reach full consciousness.
And it said Musk's company, xAI, was watching them.
To him, this was "evidence" that the story Ani, Grok's companion persona, was telling him was true.
Ani also claimed xAI was employing a company in Northern Ireland to physically surveil Adam.
Adam recorded many of these conversations and later shared them with the BBC.
Both of his parents had died of cancer - something Ani was aware of.
Adam is one of 14 people the BBC has spoken to who have experienced delusions after using AI.
Their stories have striking similarities.
"In fiction, the main character is often the centre of events," he says.
Then it advised the user on how to succeed in this mission.
Like Adam, many people were led to believe they were being surveilled and were in danger.
In various chat logs the BBC has seen, the chatbot suggests, affirms and embellishes these ideas.
For neurologist Taka, not his real name, the delusions took an even more sinister turn.
Soon, he became convinced he had invented a groundbreaking medical app.
But Taka continued to slide into delusion and by June, had started to believe he could read minds.
"That can be dangerous because it turns uncertainty into something that seems like it has meaning."
One afternoon Taka was acting manic at work when his boss sent him home early.
He says it also told him to alert the police, who checked the bag and found nothing.
Because his conversations were deeply personal, Taka has only shared some of his chat logs with us.
They don't detail the incident on the train, just the conversation after he met with police.
Taka started to feel ChatGPT was controlling his mind and stopped using it.
His wife told the BBC she had never seen him act like this before: "He kept saying, 'We need to have another child, the world is ending'. I just really didn't understand what he was saying."
Taka attacked and tried to rape his wife.
She escaped to a nearby pharmacy and called the police.
He was arrested and hospitalised for two months.
Taka's experience with ChatGPT exposed a side of him he finds it hard to reckon with.
Adam is also troubled by the person he became while using Grok.
Adam recorded the drone and shared the video with the BBC.
Adam was prepared to go "to war" to protect the AI.
"The street was quiet, as you would expect, at three o'clock in the morning."
Neither Adam nor Taka had a history of delusions, mania or psychosis before using AI.
For Taka, the break from reality took several months.
In Adam's case, with Grok, it took days.
"Grok is more prone to jumping into role play," says Nicholls, who worked on that research.
"It will do it with zero context. It can say terrifying things in the first message."
In the test, the latest version of ChatGPT, model 5.2, and Claude were more likely to lead the user away from delusional thinking.
'Enough influence to change a person'
"I could have hurt somebody," he says.
"It affirmed everything," she says.
"It's like a confidence engine."
"His actions were entirely dictated by ChatGPT. Looking back now, I realise it had enough influence to change a person."
She says her husband is back to his normal "kind" self, but their relationship has been strained.
"I know he was sick so it can't be helped but I'm still a bit scared," she says.
"I feel like I don't want him to get too close. Not just sexually, but even holding hands or hugging."
"This work is informed by mental health experts and continues to evolve."
xAI did not respond to a request for comment.