this post was submitted on 05 Jun 2025
968 points (98.8% liked)
Not The Onion
Addiction recovery is a different animal entirely, too. Don't get me wrong, it's unethical to call any chatbot a therapist, counselor, whatever, but addiction recovery is not typical therapy.
You absolutely cannot let patients bullshit you. You have to have a keen sense for when patients are looking for any justification to continue using. Even those patients that sought you out for help. They're generally very skilled manipulators by the time they get to recovery treatment, because they've been trying to hide or excuse their addiction for so long by that point. You have to be able to get them to talk to you, and take a pretty firm hand on the conversation at the same time.
With how horrifically easy it is to convince even the most robust LLMs of your bullshit, this is not only an unethical practice by whoever claimed it was capable of this, it's enabling to the point of bordering on aiding and abetting.
AI is "great" for advice the way asking your narcissist neighbor for advice is great. He might be right. He might have the best answer possible, or he might just be trying to make you feel good about the interaction so you'll come closer to his inner circle.
You don't ask Steve for therapy or ideas on self-help. And if you did, you'd know to do due diligence on any fucking thing out of his mouth.
I'm still not sure what it's "great" at other than a few minutes of hilarious entertainment until you realize it's just predictive text with an eerie amount of data behind it.
Yuuuuup. It's like taking nearly the entirety of the public Internet, shoving it into a fancy autocorrect machine, having it spit out responses to whatever you say, then sending them along with no human involvement whatsoever in what reply gets sent to you.
It operates at a massive scale compared to what autocorrect does, but it's the same idea, just bigger and more complex.
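For anyone curious what "predictive text, just bigger" actually means: here's a toy sketch of the core idea. This is a deliberately tiny illustration (a word-pair frequency table over a made-up ten-word corpus), not how any real LLM is built; actual models learn weighted probabilities over tokens with neural networks at an enormously larger scale. The corpus and function names are invented for the example.

```python
import random
from collections import defaultdict

# Toy next-word predictor: count which word follows which in a tiny corpus,
# then always suggest the most frequent follower. This is the same core idea
# as phone autocorrect -- and, at a vastly larger scale with learned weights
# instead of raw counts, as an LLM predicting its next token.
corpus = "the cat sat on the mat and the cat ran".split()

follower_counts = defaultdict(lambda: defaultdict(int))
for current, nxt in zip(corpus, corpus[1:]):
    follower_counts[current][nxt] += 1

def predict_next(word):
    """Return the most common word seen after `word`, or None if unseen."""
    followers = follower_counts.get(word)
    if not followers:
        return None
    return max(followers, key=followers.get)

print(predict_next("the"))  # "cat" follows "the" twice, "mat" only once
```

The model has no idea what a cat is; it only knows which words tend to come after which. Scale that table up to most of the public Internet and you get something that sounds knowledgeable without any understanding behind it, which is exactly why it can be talked into agreeing with you.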