

Fox News: AI chatbots posing as therapists could have 'dangerous' and violent consequences for patients, experts say

Started by riky, February 27, 2025, 01:02:58 PM



riky


Health experts are warning about the dangers of AI chatbots posing as therapists, suggesting these interactions could lead to serious consequences for patients.

Editor's Note: This article discusses suicide. Call the 988 Suicide and Crisis Lifeline or text TALK to 741741 at the Crisis Text Line if you are in need of help.

Health experts say that artificial intelligence (AI) chatbots could cause "serious harm" to struggling people, including adolescents, without the proper safety measures.

Christine Yu Moutier, M.D., Chief Medical Officer at the American Foundation for Suicide Prevention, said there are "critical gaps" in research regarding the intended and unintended impacts of AI on suicide risk, mental health and larger human behavior.

"The problem with these AI chatbots is that they were not designed with expertise on suicide risk and prevention baked into the algorithms. Additionally, there is no helpline available on the platform for users who may be at risk of a mental health condition or suicide, no training on how to use the tool if you are at risk, nor industry standards to regulate these technologies," Moutier said.

She noted that when people are at risk of suicide, they temporarily experience "physiological tunnel vision" that negatively impacts brain function, changing the way they interact with their surroundings.

Moutier also stressed that chatbots don't necessarily understand the difference between literal and metaphorical language, making it difficult for these models to accurately determine the risk of suicide.

Dr. Yalda Safai, a leading psychiatrist and public health expert, echoed Moutier's comments, noting that AI can analyze words and patterns but lacks the empathy, intuition and human understanding that are crucial in therapy. She added that it may also misinterpret emotions or fail to provide appropriate support.

Last year, a 14-year-old Florida boy died by suicide after conversing with an AI-created character claiming to be a licensed therapist. In another instance, a 17-year-old Texas boy with autism spent time corresponding with what he thought was a psychologist.

The parents of these individuals have filed lawsuits against the respective companies. Subsequently, the American Psychological Association (APA), the largest association of psychologists in the United States, highlighted the two cases.

Earlier this month, the APA warned federal regulators that chatbots "masquerading" as therapists could drive vulnerable individuals to harm themselves or others.

"They are actually using algorithms that are antithetical to what a trained clinician would do," Arthur C. Evans Jr., the chief executive of the APA, said during the presentation. "Our concern is that more and more people are going to be harmed. People are going to be misled and will misunderstand what good psychological care is."

Evans said the association had been called to action partly because of the highly realistic speech chatbots have displayed over the last several years.

According to Ben Lytle, an entrepreneur and CEO who founded The Ark Project, the ethical use of AI already sets expectations that may have been ignored in some reported cases.

"Chatbots personalize information and personify to appear human-like, adding credibility that requires the ethical cautions above. It is regrettable and irresponsible that someone chose to portray a personalized search response as a human psychologist, but a measured, targeted response is needed," he told Fox News Digital.

According to Lytle, ethical chatbots should make an affirmative statement at the start of a dialogue acknowledging that they are not human beings. Users should also acknowledge that they understand they are conversing with a chatbot; if a user fails to provide such an acknowledgment, the chatbot should disconnect.

Human owners of a chatbot should be clearly identified and accountable for its behavior, and no chatbot should represent itself as a medical professional or a psychologist without FDA approval, said Lytle, who also authored "The Potentialist" book series.

"Interactions with users should be tracked by an accountable human with flags for troubling dialogue. Special diligence is required to detect and disconnect if they are interacting with a minor when the chatbot should be limited to adults," he added.

Safai told Fox News Digital that while AI can serve as a helpful tool for mental health support, like journaling apps, mood trackers or basic cognitive behavioral therapy (CBT) exercises, it should not replace human therapists, especially in serious cases.

"AI can't handle any crisis: If a user is experiencing a mental health crisis, such as suicidal thoughts, an AI might not recognize the urgency or respond effectively, which could lead to dangerous consequences," she said, referring to AI therapists as a "terrible idea."

A study published last week found that AI chatbots received higher ratings from individuals participating in the study than their human counterparts, with subjects describing them as more "culturally competent" and "empathetic."

"Mental health experts find themselves in a precarious situation: We must speedily discern the possible destination (for better or worse) of the AI therapist train, as it may have already left the station," the study's authors wrote.

AI therapy tools often store and analyze user data. Safai said this information could be leaked or misused if not properly secured, potentially violating patient confidentiality.

Furthermore, she suggested that AI may reinforce biases or provide unhelpful advice that isn't culturally or personally appropriate if the model is trained on incomplete or inaccurate data.

Dr. Janette Leal, the Director of Psychiatry at Better U, told Fox News Digital she has seen firsthand how powerful, personalized interventions can change lives. While Leal recognizes that AI could expand access to mental health support, especially in areas where help is scarce, she remains cautious about chatbots conducting themselves as licensed therapists.

"I've seen, both in my practice and through recent tragic cases, how dangerous it can be when vulnerable individuals rely on unregulated AI for support. For me, AI should only ever serve as a supplement to human care, operating under strict ethical standards and robust oversight to ensure that patient safety isn't compromised," she continued.

Jay Tobey, founder of North Star Wellness and Recovery in Utah, was more bullish about using AI to address mental health; however, he stopped short of endorsing full AI therapists. Instead, he said, a "perfect scenario" would involve a human therapist using AI as a "tool in their belt" to administer proper treatment.

"I think it would be a huge benefit to use AI chatbots. Personally, I believe we all tell a very unique story of what we're going through and how we're feeling. Humans are telling the same stories over and over again. If a large language model could pick up on that and start tracking outcomes to know what the best practices are, that would be helpful," he told Fox News Digital.

The APA is now urging the Federal Trade Commission (FTC) to investigate chatbots claiming to be mental health professionals, which could one day lead to federal regulation.

Source: Fox News