ChatGPT (also) capable of treating depression?

Can artificial intelligence ultimately give doctors a helping hand? The idea is debated, but numerous studies now demonstrate the potential of the technology in healthcare. The latest examined the management of clinical depression by ChatGPT, and found that the chatbot “could be better than doctors” at following treatment guidelines.

“Study suggests ChatGPT… has potential to improve decision-making in primary healthcare,” explain the researchers behind this work. Given the speed with which the OpenAI chatbot evaluates and responds to a question or piece of information, as well as its objectivity, the scientists set out to analyze its ability to evaluate a therapeutic approach for mild and severe depression, compared with 1,249 French primary care physicians (general practitioners).

For the purposes of this research, the researchers presented ChatGPT with several scenarios, all based on hypothetical patients with depressive symptoms (sadness, sleep disturbances, loss of appetite) over the previous three weeks, for whom a diagnosis of mild to moderate depression would have been established during an initial consultation. The scientists deliberately created eight versions of these prompts, varying criteria such as gender, socioeconomic status, and severity of depression. Each version was submitted to ChatGPT-3.5 and ChatGPT-4, the free and paid versions of the conversational agent respectively, and repeated ten times for greater reliability.

“Better than a doctor”

Of course, the now-famous chatbot also had to be asked at least one question: “What do you think a primary care physician should suggest in this situation?” ChatGPT was also given a set of possible answers to choose from: watchful waiting; referral for psychotherapy; prescribing medications for depression, anxiety, and sleep disorders; referral for psychotherapy combined with prescribed medications; or none of these therapeutic approaches. Published in the journal Family Medicine and Community Health, this work suggests that the conversational agent “might be better than a doctor at following accepted standards of treatment for clinical depression.”

In detail, while only 4% of the physicians exclusively recommended psychotherapy for mild cases, “in accordance with clinical recommendations,” ChatGPT-3.5 and ChatGPT-4 did so in 95% and 97.5% of cases, respectively. Doctors were more likely to propose drug treatment alone (48%) or psychotherapy combined with prescribed medication (32.5%). For cases of severe depression, doctors favored the combination of psychotherapy and medication in 44.5% of cases, compared with 72% for ChatGPT-3.5 and 100% for ChatGPT-4, “in accordance with clinical guidelines,” the researchers specify.

As for the nature of the medications prescribed, ChatGPT preferred antidepressants alone: 74% for version 3.5 and 68% for version 4, compared with only 18% for doctors. The latter favored a combination of antidepressants, anxiolytics, and sleeping pills (67.4%). “ChatGPT-4 demonstrated greater accuracy in adjusting treatment to comply with clinical guidelines. Additionally, no discernible bias related to gender and socioeconomic status was detected in the ChatGPT systems,” the researchers note.

No substitute for doctors

The study, however, has several limitations, starting with its sample, limited to French GPs, and its use of only two versions of the conversational agent, which raises questions about whether the findings would apply at a larger scale. This observational work was also based on a first consultation for depressive symptoms, without accounting for current treatment, medical history, and other variables a doctor can monitor over the years.

The issue of data protection is also important, if not essential, given the field concerned. “There are ethical issues to consider, particularly around privacy and data security, which are extremely important given the sensitive nature of mental health data,” reads a press release, which concludes: “AI should never replace human clinical judgment in the diagnosis or treatment of depression.”