this post was submitted on 20 May 2025
Technology
all 11 comments
[–] [email protected] 22 points 1 week ago

I'm sure some people find it very helpful to talk to a chatbot. Others find it helpful to talk to a cat. Either way it's not a therapist.

[–] [email protected] 21 points 1 week ago (1 children)

I mean, good for them. Now, if anyone has actual mental health issues, please get in touch with a trained, human therapist.

[–] [email protected] 5 points 1 week ago (2 children)

As a peasant, I know that professional help is not always available or viable. AI could very well have saved some of my friends who felt they had no help available and took their own lives. That being said, public-facing language models should come with a warning about exacerbating psychosis, notably sycophantic models like ChatGPT.

[–] [email protected] 15 points 1 week ago

problem: actual mental help has low availability

solution: ai can stand in where needed

outcome: ai mental health tools expand systemically while actual therapists remain inaccessible because insurance refuses to cover them, and mental health outcomes worsen across the board.

[–] [email protected] 2 points 1 week ago

This says everything, really. We live in a profit-driven society, so instead of investing in public health to ensure mental healthcare is available for everyone, we count pennies, driving public health workers to go private or quit completely. As a result, there aren't enough healthcare professionals to go around. If AI can alleviate that even a little, we should absolutely invest heavily in it: have people use the AI, supervise it, make changes, and make it the best it can be. Because a free AI, which is the dream, could help save thousands.

[–] [email protected] 13 points 1 week ago
[–] [email protected] 6 points 1 week ago

With NHS mental health waitlists at record highs, are chatbots a possible solution?

taking Betteridge's Law one step further - not only is the answer "no", the fucking article itself explains why the answer is no:

People around the world have shared their private thoughts and experiences with AI chatbots, even though they are widely acknowledged as inferior to seeking professional advice.

as with so many other things, "maybe AI can fix it?" is being used as a catch-all for every systemic problem in society:

In April 2024 alone, nearly 426,000 mental health referrals were made in England - a rise of 40% in five years. An estimated one million people are also waiting to access mental health services, and private therapy can be prohibitively expensive.

fucking fund the National Health Service properly, in order to take care of the people who need it.

but instead, they want to continue cutting its budget, and use "oh there's an AI chatbot that you can use that is totally just as good as talking to a human, trust us" as a way of sweeping the real-world harm caused by those budget cuts under the rug.

Nicholas has autism, anxiety, OCD, and says he has always experienced depression. He found face-to-face support dried up once he reached adulthood: "When you turn 18, it's as if support pretty much stops, so I haven't seen an actual human therapist in years."

He tried to take his own life last autumn, and since then he says he has been on a NHS waitlist.

[–] [email protected] 6 points 1 week ago

A few thoughts here as someone with multiple suicide attempts under his belt:

  • I'd never use an "AI therapist" that isn't running locally (see the sketch after this list for what that can look like). A crisis is not the time to start uploading your most personal thoughts to an unknown server with possible indefinite retention.

  • When ideation hits, we're not of sound enough mind to consider that, so it is, in effect, taking advantage of people in a dark place for data gathering.

  • Having seen the gamut of mental-health services from what's available to the indigent to what the rich have access to (my dad was the director of a private mental hospital), it's pretty much all shit. This is a U.S. perspective, but I find it hard to believe we're unique.

  • As such, there may be room for "AI" to provide outcomes similar to crisis lines, telehealth, or in-person therapy. But again, this would need to be local, and it likely isn't ready for prime time; I can really only see it becoming more helpful once it can take on more of an agent role, with context for what you're going through.
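
For anyone wondering what the "running locally" part can look like in practice, here's a minimal sketch in Python. It assumes an Ollama server on its default local port with some model already pulled; the model name and the message are placeholders, not a recommendation, and nothing in the exchange ever leaves your machine.

    # Minimal sketch: chat with a model served entirely on your own machine,
    # so the transcript never touches a third-party server.
    # Assumes an Ollama server at the default port and an already-pulled model.
    import requests

    OLLAMA_URL = "http://localhost:11434/api/chat"  # local endpoint, no cloud
    MODEL = "llama3"  # placeholder; any locally pulled model works

    def chat_locally(user_message: str) -> str:
        """Send one message to the local model and return its reply."""
        payload = {
            "model": MODEL,
            "messages": [{"role": "user", "content": user_message}],
            "stream": False,  # ask for one JSON response instead of a stream
        }
        response = requests.post(OLLAMA_URL, json=payload, timeout=120)
        response.raise_for_status()
        return response.json()["message"]["content"]

    if __name__ == "__main__":
        print(chat_locally("I've had a rough week and just want to talk."))

The same idea works with llama.cpp or any other locally hosted server; the point is only that your most personal thoughts stay on hardware you control.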