News

Stark warning against using artificial intelligence for health advice after a man was hospitalised following its medical advice ...
A young influencer couple were barred from boarding their flight to Puerto Rico after ChatGPT gave them the wrong visa information to enter the Caribbean island.
A 60-year-old man was hospitalised with bromide poisoning after replacing salt with sodium bromide following ChatGPT advice.
A 60-year-old man developed bromism after replacing table salt with sodium bromide, reportedly following ChatGPT advice.
A 60-year-old man ended up in the ER after becoming convinced his neighbor was trying to poison him. In reality, the culprit ...
A 60-year-old man developed a rare medical condition after ChatGPT advised him on alternatives to table salt, according to a ...
A US medical journal has issued a warning against using ChatGPT for health information after a 60-year-old man who relied on it for dietary advice developed bromism and wound up in the hospital.
A 60-year-old man was diagnosed with bromism after he relied on ChatGPT for diet tips. According to a US medical journal, the ...
Earlier this month, a medical journal published an article exploring a case of bromism that came about after a person used AI ...
Man suffers hallucinations, paranoia and other bromism symptoms after following ChatGPT’s advice to use sodium bromide instead of ...
A 60-year-old man was diagnosed with a rare condition after he consulted ChatGPT about removing table salt from his diet. As per the paper published in the medical journal ‘Annals of Internal Medicine ...
A 60-year-old thought his neighbour was trying to poison him after he became ill with psychosis. He had been taking a ...