[–] [email protected] 9 points 16 hours ago (2 children)

Is that it?

One of the things I like most about AI is that it explains each command it outputs in detail. Granted, I'm aware it can hallucinate, so if I have the slightest doubt about it I usually look it up on the web too (I use it a lot for basic Linux stuff and Docker).

Would some people not give a fuck about what it says and just copy & paste unknowingly? Sure, but that happened in my teenage days too, when all the info was scattered across blogs and wikis...

As usual, it is not the AI tool that could fuck up our critical thinking, but ourselves.

[–] [email protected] 4 points 15 hours ago* (last edited 15 hours ago) (1 children)

I love how they chose the term "hallucinate" instead of saying it fails or screws up.

[–] [email protected] -1 points 14 hours ago (1 children)

Because the term fits way better…

[–] [email protected] 3 points 6 hours ago (1 children)

A hallucination is a false sensory perception (sights, sounds, etc.).

LLMs don't have any senses; they have input, algorithms, and output. They also have desired output and undesired output.

So, no, 'hallucination' fits far worse than 'failure', 'error', or 'bad output'. However, assigning the term 'hallucination' does serve the billionaires in marketing their LLMs as actually sentient.

[–] [email protected] 1 points 5 hours ago* (last edited 5 hours ago)

You might prefer confabulation, or bullshitting.

[–] [email protected] 2 points 14 hours ago

I see it exactly the same way; I bet you can find similar articles about calculators, PCs, the internet, smartphones, smartwatches, etc.

Society will handle it sooner or later.