this post was submitted on 21 Mar 2025
1333 points (99.4% liked)

[–] CileTheSane 4 points 1 day ago (2 children)

Because it's not AI, it's LLMs, and all an LLM does is guess which word most likely comes next in a sentence. That's why they're terrible at answering questions and do things like suggest adding glue to the cheese on your pizza: somewhere in the training data, some idiot said that.

The training data for LLMs comes from the internet, and the internet is full of idiots.
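For anyone curious what "guessing the next word" actually looks like, here's a minimal sketch that peeks at a model's next-token probabilities, using the small open GPT-2 model via Hugging Face's transformers library (the model choice and prompt are just illustrative, not any particular chatbot):

```python
# Minimal sketch: inspect an LLM's next-word guesses.
# Assumes `pip install torch transformers`; GPT-2 is used only
# because it's a small, freely downloadable example model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "The cheese kept sliding off the pizza, so I added"
ids = tok(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(ids).logits  # shape: [batch, seq_len, vocab_size]

# Probability distribution over the *next* token only.
probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(probs, k=5)
for p, i in zip(top.values, top.indices):
    print(f"{tok.decode(int(i))!r}: {p.item():.3f}")
```

Whatever token scores highest is what gets emitted; at no point does the model check whether the continuation is true, which is exactly the point above.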

[–] [email protected] 3 points 20 hours ago

LLMs are a subset of AI.

[–] [email protected] 0 points 19 hours ago (1 children)

That's what I do too, just with less accuracy and knowledge. I don't get why I have to hate this. Feels like a bunch of cavemen telling me to hate fire because it might burn the food.

[–] CileTheSane 1 points 8 minutes ago

Because we have better methods that are easier, cheaper, and less damaging to the environment. LLMs solve nothing new and waste a fuckton of resources doing it.

It's like telling cavemen they don't need fire because you can mount an expedition to the nearest volcano to cook their food without needing fuel, then bring it back to them.

The best-case scenario is that the LLM tells you information that's already available on the internet, but 50% of the time it just makes shit up.