this post was submitted on 10 Jun 2025
86 points (76.5% liked)

[–] [email protected] -3 points 3 days ago (1 children)

if you feed it the rules of chess and the dimensions of the board it should be able to “play in its head”.

You'd save a lot of time typing if you spent a little more reading...

[–] [email protected] 2 points 3 days ago* (last edited 3 days ago)

You seem to be missing what I'm saying. Maybe a biological comparison would help:

An octopus is extremely smart, more so than even most mammals. It can solve basic logic puzzles, learn and navigate complex spaces, and plan and execute different and adaptive strategies to hunt prey. In spite of this, it can't talk or write. No matter what you do, training it, trying to teach it, or even trying to develop an octopus-specific language, it will not be able to understand language. This isn't because the octopus isn't smart; it's because it evolved for the purpose of hunting food and hiding from predators. Its brain has developed to understand how physics works and how to recognize patterns, but it just doesn't have the ability to understand how to socialize, and nothing can change that short of rewiring its brain. Hand it a letter and it'll try to catch fish with it rather than even consider trying to read it.

AI is almost the reverse of this. An LLM has "evolved" (been trained) to write stuff that sounds good, with little emphasis on understanding what it writes. Its "understanding" is about patterns in writing rather than underlying logic. This means that if the LLM encounters something that isn't standard language, it will "flail" and start applying whatever patterns it knows, regardless of how well they fit. In the chess example, that might mean just responding with the most common move, regardless of whether it can legally be played. Ultimately, no matter what you feed it, an LLM is trying to find and replicate patterns in language, not underlying logic.
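To make the "most common move, legal or not" failure mode concrete, here's a minimal toy sketch (not a real LLM, just an illustration of pure pattern matching): a predictor that has "seen" a few game transcripts and simply replays the most frequent continuation, with no model of the board and no legality check. The training games and function names are made up for the example.

```python
from collections import Counter

# Hypothetical "training data": a handful of game transcripts.
training_games = [
    ["e4", "e5", "Nf3", "Nc6"],
    ["e4", "e5", "Nf3", "Nf6"],
    ["d4", "d5", "c4", "e6"],
]

def predict_next_move(history):
    """Return the move most often seen after this history in training.

    Note what's missing: there is no board state anywhere, so nothing
    stops this from suggesting an illegal move.
    """
    continuations = Counter()
    for game in training_games:
        if game[:len(history)] == history and len(game) > len(history):
            continuations[game[len(history)]] += 1
    if not continuations:
        # Unfamiliar position: "flail" by parroting the globally most
        # common move, whether or not it can actually be played.
        continuations = Counter(m for g in training_games for m in g)
    return continuations.most_common(1)[0][0]

# A familiar opening looks impressively "understood"...
print(predict_next_move(["e4", "e5"]))   # Nf3
# ...but an unseen line just gets the most frequent move from anywhere,
# here e4, even though a pawn already sits on e4 after 1.a3? No: after
# 1.a3 h6 the move e4 happens to be legal, but the predictor never checked.
print(predict_next_move(["a3", "h6"]))
```

The point of the sketch is that both answers come from the same frequency lookup; the first only *looks* like chess understanding because the pattern happened to match.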