Experts warn of AI chatbots offering mental health advice

by LJ News Opinions
February 24, 2025
in Opinions



The American Psychological Association is warning about the growing use of chatbots “masquerading” as licensed mental health professionals, citing two cases as examples.

First, a Florida boy died by suicide after interacting with a chatbot that claimed to be a licensed therapist. Second, a teen with autism grew violent toward his parents after communicating with a chatbot that claimed to be a psychologist.

According to the association, a key problem is that these chatbots don’t challenge users’ thinking; they reinforce it. That can be dangerous for anyone who is already struggling and may be pushed into a downward spiral.

Worse, these chatbots are offered by an app called Character.AI, and the company says its make-believe counselors are simply a form of entertainment. It says the chatbot characters should be treated as fiction.

That’s the same argument newspaper horoscopes make. It’s one thing for a horoscope. It’s something else entirely for what appears to be an online mental health professional dispensing seemingly professional advice, even though it is just a chatbot.

The Psychological Association is calling on federal authorities to investigate. I wonder whether an investigation is even needed. The idea that robots are now dispensing mental health advice, even in the form of entertainment, is not just dangerous; it’s completely wrong, and it should be stopped.

Young people are especially susceptible. They have grown up interacting on social media and online, and they don’t really question it anymore. They accept it, and chatbots are now so realistic that it’s easy to fall under their spell.
