[email protected] 3 points 14 hours ago (1 child)

A hallucination is a false perception of sensory experiences (sights, sounds, etc.).

LLMs don't have any senses; they have input, algorithms, and output. They also have desired and undesired output.

So, no, 'hallucination' fits far worse than 'failure', 'error', or 'bad output'. However, assigning the term 'hallucination' does serve the billionaires in marketing their LLMs as actually sentient.

[email protected] 1 point 13 hours ago (last edited 13 hours ago)

You might prefer 'confabulation' or 'bullshitting'.