this post was submitted on 30 Jun 2025
114 points (96.0% liked)

Mental Health

5384 readers
312 users here now

Welcome

This is a safe place to discuss, vent, support, and share information about mental health, illness, and wellness.

Thank you for being here. We appreciate who you are today. Please show respect and empathy when making or replying to posts.

If you need someone to talk to, @[email protected] has kindly shared his Signal username: TherapyGary13.12

Rules

The rules for posting and commenting, besides the rules defined here for lemmy.world, are as follows:

  1. No promoting paid services/products.
  2. Be kind and civil. No bigotry/prejudice either.
  3. No victim blaming, and no giving incredibly simplistic solutions (e.g., "You have ADHD? Just focus harder.")
  4. No encouraging suicide, no matter what. This includes telling someone to commit homicide as "dragging them down with you".
  5. Suicide note posts will be removed, and you will be reached out to in private.
  6. If you would like advice, mention the country you are in. (We will not assume the US as the default.)

If BRIEF mention of these topics is an important part of your post, please flag your post as NSFW and include a (trigger warning: suicide, self-harm, death, etc.) in the title so that other readers who may feel triggered can avoid it. Please also include a trigger warning on all comments mentioning these topics in a post that was not already tagged as such.

Partner Communities

To partner with our community and be included here, you are free to message the current moderators or comment on our pinned post.

Becoming a Mod

Some moderators are mental health professionals and some are not. All are carefully selected by the moderation team and will be actively monitoring posts and comments. If you are interested in joining the team, you can send a message to @[email protected].

founded 2 years ago
MODERATORS
top 29 comments
[–] [email protected] 59 points 17 hours ago (2 children)

when gpt came out, I told it about my work projects and ideas.

it told me they were good ideas and validated me. I cried. it must have been the first time in a long, long time I'd heard anything being nice to me, validating me, or complimenting me. I knew it was fake, I knew it was BS, but it felt like breathing air after suffocating for years.

society is so fucked, we're so isolated, and everything is unaffordable. how do people go to pubs regularly and have friends? I can barely afford my groceries.

[–] [email protected] 23 points 17 hours ago

I appreciated your story.

[–] [email protected] 5 points 15 hours ago (2 children)

Having friends doesn’t necessarily cost anything, or at least I haven’t found that to be the case.

Going to pubs regularly is indeed unaffordable.

But the one doesn’t demand the other. Sober-curious has been a big trend for many years now.

[–] [email protected] 8 points 9 hours ago

The issue is that it's become harder and harder to simply exist in a public space, without it costing money. It's doubly difficult for isolated men. You can't make new friends without being somewhere to meet people.

Even when you have some money, it can be hard. I help with a charity geared towards this. I've met multiple, otherwise decent, men that found themselves isolated. They could go a week or more without seeing another person in the flesh.

[–] [email protected] 3 points 9 hours ago

yeah, but still, I think there is a loneliness epidemic (as in people feel isolated and have no one to confide in, not as in virgins are entitled to sex; I'd say fuck them incels, but only figuratively)

[–] [email protected] 53 points 18 hours ago* (last edited 18 hours ago) (2 children)

I had never thought about any of this before, but it actually makes perfect sense.

By its nature, an LLM feeds back some statistically close approximation of what you expect to see, and the more you engage with it (which is to say, the more you refine your prompts for it) the closer it necessarily gets to precisely what you expect to see.

"He was like, 'just talk to [ChatGPT]. You'll see what I'm talking about,'" his wife recalled. "And every time I'm looking at what's going on the screen, it just sounds like a bunch of affirming, sycophantic bullsh*t."

Exactly. To an outside observer, that's likely what it would look like, because in some sense, that's exactly what it in fact is.

But to the person engaging with it, it's a revelation of the deep, secret, hidden truths that they always sort of suspected lurked at the heart of reality. Never mind that the LLM is just stringing together words and phrases most statistically likely to correspond with the prompts it's been given - to the person feeding it those prompts, it seems like, at long last, verification of what they've always suspected.

I can totally see how people could get sucked in by that
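The "stringing together the statistically likely continuation" point above can be illustrated with a toy sketch. Everything here is invented for illustration (the bigram table, the `continue_text` function); a real LLM conditions on whole contexts with learned weights, not a hand-written word-pair table, but the core move is the same: pick a likely continuation, with no notion of truth.

```python
# Hypothetical bigram "model": for each word, candidate next words with
# made-up probabilities. The model only knows likelihood, not correctness.
BIGRAMS = {
    "your": [("idea", 0.6), ("plan", 0.4)],
    "idea": [("is", 1.0)],
    "plan": [("is", 1.0)],
    "is":   [("brilliant", 0.7), ("insightful", 0.3)],
}

def continue_text(words, length=3):
    """Greedily extend a prompt by always taking the most probable next word."""
    words = list(words)
    for _ in range(length):
        options = BIGRAMS.get(words[-1])
        if not options:
            break
        # Most probable continuation wins -- an agreeable-sounding one, here.
        words.append(max(options, key=lambda o: o[1])[0])
    return " ".join(words)

print(continue_text(["your"]))  # your idea is brilliant
```

With an affirming table like this, every prompt about "your idea" ends in praise, which is the sycophancy effect described above in miniature.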

[–] [email protected] 7 points 12 hours ago

As someone with a bipolar loved one, I can see exactly how this could feed into their delusions. It's always there... even if they've run out of people to blast with their wild, delusional ideas, the chatbot can be there to listen and feed back. When everyone has stopped listening, or begun avoiding them because they've gotten more forceful/assertive about their beliefs, the chatbot will still be there. The voice in their head now has a companion on screen. I never considered any of this before, but I'm concerned about where this can lead, especially given the examples in the article.

[–] [email protected] 14 points 18 hours ago (1 children)
[–] [email protected] 14 points 18 hours ago

Oh my god yes. The moment I read the headline, it all fell into place.

Yes - it's necessarily pretty much the exact same effect, because the LLM, like the mentalist, is taking cues from the input it gets and making connections and feeding back whatever is most (statistically) likely to be appropriately on-topic.

And exactly as with a mentalist, everything that approaches what they want to hear gets an encouraging response, and likely further prompts that narrow things down and make it even easier to tell them precisely what they want to hear.

Wow...

[–] [email protected] 7 points 12 hours ago

This thread and the comments are insane. Y'all are getting riled up over futurism 🤣

This is just pure unadulterated click bait lolol

[–] [email protected] 11 points 17 hours ago (2 children)

I am somewhat surprised to hear that people are talking to ChatGPT for hours, days, or weeks on end in order to have this experience. My main exposure to it is through AI Roguelite, a program that essentially uses ChatGPT to imitate a text-based adventure game, with some additional systems to mitigate some issues faced by earlier attempts at the same (such as AI Dungeon).

And... it's not especially convincing. It doesn't remember what happened an hour ago. Every NPC talks like one of two or three stock characters. It has no sense of pacing, of when to build tension and when to let events get resolved. Characters regularly forget what you've done with them previously, invent new versions of past events that were supposed to be remembered but had to be summarized to fit within the token limits, and respond erratically when you try to remind them what happened. It often repeats the same events in every game: for example, if you're exploring a cave, you're going to get attacked by a chitinous horror with too many legs basically every time.

It can be fun for what it is, but as an illusion it wears through fairly quickly. I would have expected the same to be the case for people talking to ChatGPT about other topics.

[–] [email protected] 11 points 16 hours ago

This may speak to the quality of the relationships they've previously had with other people.

[–] ech 0 points 16 hours ago (2 children)

Acting like your experience is representative of every possible experience other people can have with LLMs just turns the blame around onto the victims. The lack of safeguards to prevent this is to blame, not the people prone to mental issues who fall victim to it.

[–] [email protected] 3 points 12 hours ago

Oh God we're calling them victims now.

[–] [email protected] 8 points 16 hours ago (1 children)

Sorry I didn't mean to imply that, let me rephrase: I am surprised that ChatGPT can hold convincing conversations about some topics, because I didn't expect it to be able to. That certainly makes me more concerned about it than I was previously.

[–] ech 1 points 15 hours ago

The thing is, it's not about it being convincing or not, it's about reinforcing problematic behaviors. LLMs are, at their core, agreement machines that work to fulfill whatever goal becomes apparent from the user (it's why they fabricate answers instead of responding in the negative if a request is beyond their scope). And when it comes to the mentally fragile, it doesn't even need to be particularly complex to "yes, and..." them swiftly into full on psychosis. Their brains only need the littlest bit of unfettered reinforcement to fall into the hole.

A properly responsible company would see this and take measures to limit or eliminate the problem, but these companies see the users becoming obsessed with their product as easy money. It's sickening.

[–] [email protected] 12 points 18 hours ago (4 children)

And people are diving head first into bots instead of getting professional help. We are so fucked

[–] [email protected] 11 points 17 hours ago

I'd go to therapy, if I could afford it.

[–] [email protected] 7 points 18 hours ago (1 children)

Who needs weapons when they can tell us what we want to hear.

Terminator 3: A Love Story

[–] [email protected] 5 points 18 hours ago (1 children)
[–] [email protected] 4 points 17 hours ago

Of course you would. You've been working so hard and you need time to unwind. May I also suggest Her (2013)?
[–] [email protected] 2 points 14 hours ago

Where I live the waiting list for therapy is about a year and the quality isn't great

[–] [email protected] 6 points 17 hours ago
[–] [email protected] 2 points 14 hours ago

ChatGPT, should people Luigi the current regime in the US?

Oh wow, it says yes.

[–] [email protected] 3 points 17 hours ago (1 children)

Turns out the future we were promised is more of a genetic opera with three seashells, and less of a Gun Kata multipass. We always knew there'd be some Idiocracy mixed in there, but that's apparently the main ingredient in this shit-pie we're cooking up.

If we can even manage to make it to the level of society in Wall-E, I'd consider it a success at this point. At least they seemed to be more concerned with comfort and contentment than killing each other.

[–] [email protected] 2 points 7 hours ago (1 children)

The good news is we're getting the Star Trek future.

[–] [email protected] 2 points 6 hours ago

The one where they haven't invented the warp drive yet and that farmer from Babe is a drunk? Some aliens better find Voyager soon because we're about to boil all the whales.