this post was submitted on 30 Jun 2025
24 points (96.2% liked)

[–] [email protected] 3 points 2 days ago

AI is fine, though a bit damaging purely from how people are treating it. We don't have the systems to support the effect it's having on society, though we can't blame the entire downward economic trend on AI. It seems likely to crash; we know it's eating up a shitzillion dollars and earning next to nothing in return.

I think people being essentially tricked into thinking LLMs are magic reasoning devices is going to be an issue for a while. LLMs might be a good start to a behaviour interface between humans and a real logic system, but as far as I'm aware we don't have anything like that, and it's a long way off. Ethically, I'm not a fan of companies trawling the internet for data they can use to build hallucination machines. I'm also not a huge fan of people's creative, language, and logic skills being influenced by something like ChatGPT.

Where I hope it'll go is a big collapse. Then it might fade into the background for a while, and return as something actually useful. The tech won't ever go away, but neither has blockchain.