this post was submitted on 26 Apr 2025
-32 points (26.5% liked)

Article mentions, briefly or more substantially:

  • Lemmy
  • Mastodon
  • Retroshare
  • Nostr
  • Bluesky
  • ZeroNet
  • Secure Scuttlebutt
  • Tor onion sites
  • etc

Not my article, just one I found.

[–] [email protected] -3 points 12 hours ago* (last edited 11 hours ago) (1 children)

The only real moderation that needs to happen is self-moderation. If you see someone saying stuff you don't like, block them. That person's opinions are now gone for all that matters to you. There's no need for their opinions to be removed for everyone. Everyone has the capability to moderate their own experience.

If someone keeps being racist and it bothers you, block them. If someone keeps name-calling and it bothers you, block them. Those of us who aren't bothered by opinions we don't agree with, or by people saying things we don't like, can still engage with those people and perhaps even teach (or learn!) something.

There should be very few restrictions on speech, especially in an online forum/community, imo; basically the only things that should be restricted are incitement of, or threats of, actual physical harm.

Moderation/censorship of speech inevitably leads to a "safe space" echo chamber, especially when the power to decide what gets removed, who gets banned, and for how long is just handed to random people on the internet, usually because they're friends with the mods/admins and share their ideologies and opinions. In that echo chamber, dissenting views are not allowed, while the approved views can be presented in whatever manner their holders want, including calls for violence, abuse, etc. See Reddit as the absolute biggest and most current example, and Twitter before that.

[–] [email protected] 5 points 8 hours ago

Dude, apparently unlike you, I remember Usenet in its heyday, and it used precisely the sort of system you're describing. That means I've also seen discussion groups implode because they couldn't get rid of a single bad actor. Killfiles alone aren't enough, even when combined with community naming-and-shaming. Someone always lacks self-restraint and engages. That encourages the bad actor(s). They post more, often using multiple sockpuppets to get around people's killfiles and flood out legitimate discussion. Newcomers to the group see masses of bad-actor spam and don't stick around. The lack of new blood kills the group.
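
To put the killfile mechanism in concrete terms, here is a minimal sketch of per-reader, client-side blocking, the kind of filtering killfiles (and Lemmy's user blocks) provide. The `Post` class, addresses, and thread data are invented for illustration; this is not actual Usenet client code. It shows why filtering keyed to an identity stops working the moment a bad actor can mint new identities faster than readers can block them, and why the filtering only ever applies to the one reader who did the blocking.

```python
# Minimal sketch of a killfile: a per-reader block list applied client-side.
# Post objects, names, and addresses are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    body: str

def visible(posts, killfile):
    """Return only the posts whose author this particular reader has not blocked."""
    return [p for p in posts if p.author not in killfile]

thread = [
    Post("alice@example.invalid", "On-topic reply"),
    Post("troll@example.invalid", "Flamebait"),
    Post("troll2@example.invalid", "Same flamebait from a fresh sockpuppet"),
]

killfile = {"troll@example.invalid"}  # what one reader has blocked so far

# The blocked identity disappears for this reader, but the new sockpuppet
# sails straight through, and every other reader still sees everything.
for post in visible(thread, killfile):
    print(post.author, "->", post.body)
```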

Self-moderation simply doesn't work. Yes, bad moderation happens and I've seen plenty of examples. But no overarching moderation is also the kiss of death.