this post was submitted on 29 May 2025
215 points (96.9% liked)

Technology
Technology
[–] [email protected] 3 points 3 days ago (1 children)

In my opinion, absolutely. Whatever happens in your head stays in your head and doesn't affect the other person unless you take active steps to make it happen. Images or videos, on the other hand, can not only be distributed far more easily, even accidentally, but also have a far higher chance of affecting people's lives (how can you prove you didn't take nude photos of yourself, for example, let alone convince people of that?). They can lead to loss of reputation, harassment, bullying, and serious mental health issues for the victims (trust issues, anxiety, depression, self-harm) - imagination can't really do that on its own.

Perhaps distribution is the real problem, but easy access to tools that can create convincing results quickly and without effort makes that distribution far more likely.

[–] [email protected] 1 points 3 days ago (1 children)

Thanks for sharing your perspective. It sounds like it's the potential for harm rather than the act itself that makes it an issue for you?

[–] [email protected] 4 points 3 days ago

I still think the act itself is pretty gross, but yeah, the harm is the important part for me - and I don't mean that just in the case of sexual images. It's also a problem with content created to damage people's reputations in other ways or to influence the sociopolitical situation (something that's already happening around the world).

The harmful potential of generative AI is on a completely different scale from the photoshopped images others have already mentioned in this thread. That doesn't mean genAI can't be used in fun and interesting ways, but stuff like what's described in the linked article is a big no-no for me.