
AI not ready to help care for the mentally ill, say experts

by Celia

Although some applications of artificial intelligence in mental health are showing promise, experts say the jury is still out on its readiness for wider use.

Therapists are using AI to examine large amounts of patient data, including family history, patient behaviour and response to treatment, to help diagnose and identify treatments, according to an article by the Swiss-based World Economic Forum.


A study led by researchers at New York University showed that AI was useful in identifying post-traumatic stress disorder in veterans.


Mental health professionals are using wearable devices such as Fitbits to monitor sleep patterns, physical activity and variations in heart rate and rhythm, data that are used to assess the user’s mood and cognitive state. The devices alert patients and healthcare providers when intervention may be needed and help users change behaviour and seek help.


AI chat programs using natural language processing are being used to review therapists’ reports and notes, as well as conversations with patients, to look for useful patterns. Researchers hope to help therapists build better relationships with patients and identify warning signs in patients’ choice of topics and words, the World Economic Forum reported.


With the success of AI comes the potential for misuse. The Forum has published a toolkit of policy considerations and potential implementation strategies for AI in mental health.

The Forum recognises the current shortcomings and challenges to expanding AI in mental health. The use of AI chat in therapy, for example, raises questions about whether the technology is optimised for a consumer’s mental health outcomes or the developer’s profitability, the toolkit authors say.

“Who will ensure that a person’s mental health-related information is not used unscrupulously by advertising, insurance or the criminal justice system?” the authors write. “Questions such as these are troubling in light of the current regulatory structure.”

A study by researchers at the University of California San Diego in La Jolla warns that differences between traditional health care and mental health care create complications for AI systems.

“While AI technology is becoming more prevalent in medicine for physical health applications, the discipline of mental health has been slower to adopt AI,” said the study, published in the journal Current Psychiatry Reports. “Mental health practitioners are more hands-on and patient-centred in their clinical practice than most non-psychiatric practitioners, relying more on ‘softer’ skills, including building relationships with patients and directly observing patients’ behaviours and emotions. Clinical data in mental health are often in the form of subjective and qualitative patient statements and written notes.”

The World Health Organization concludes that it’s too early to predict the future of AI in mental health care.

“We found that the use of AI applications in mental health research is unbalanced, being used mainly to study depressive disorders, schizophrenia and other psychotic disorders. This indicates a significant gap in our understanding of how they can be used to study other mental health conditions,” Dr Ledia Lazeri, Regional Adviser for Mental Health at the WHO Regional Office for Europe, wrote in a report.
