Man consults ChatGPT for diet advice, three months later diagnosed with bromide intoxication: Know what it is and how serious it can get

A man seeking a healthier diet asked ChatGPT for a salt substitute and was advised to use sodium bromide. After three months of taking it, he arrived at the emergency department with severe psychiatric symptoms and was diagnosed with bromism, or bromide intoxication. The case highlights the potential dangers of AI-generated health advice and the importance of evaluating it critically.