this post was submitted on 17 Mar 2025
312 points (95.3% liked)

Technology

Half of LLM users (49%) think the models they use are smarter than they are, including 26% who think their LLMs are “a lot smarter.” Another 18% think LLMs are as smart as they are. Here are some of the other attributes they see:

  • Confident: 57% say the main LLM they use seems to act in a confident way.
  • Reasoning: 39% say the main LLM they use shows the capacity to think and reason at least some of the time.
  • Sense of humor: 32% say their main LLM seems to have a sense of humor.
  • Morals: 25% say their main model acts like it makes moral judgments about right and wrong at least sometimes.
  • Sarcasm: 17% say their main LLM seems to respond sarcastically.
  • Sad: 11% say the main model they use seems to express sadness, while 24% say that model also expresses hope.
top 50 comments
[–] [email protected] 7 points 38 minutes ago

If we are talking about American adults, I guess they might be right.

[–] [email protected] 3 points 21 minutes ago

That's called a self-proving statement.

[–] [email protected] 6 points 1 hour ago

Do the other half believe it is dumber than it actually is?

[–] [email protected] 3 points 57 minutes ago

Hallucination comes off as confidence. Very human-like behavior, tbh.

[–] [email protected] 20 points 3 hours ago (1 children)

It’s like asking if you think a calculator is smarter than you.

[–] [email protected] 11 points 3 hours ago (1 children)

"It's totally a lot smarter than I am, no way could I deliver (234 * 534)^21 as confidently!"

[–] [email protected] 4 points 2 hours ago (1 children)

Are you suggesting my '90s calculator is smarter than LLMs?

[–] [email protected] 5 points 2 hours ago

Hard to compete with that 90s confidence 😎

[–] [email protected] 2 points 1 hour ago

Wow. Reading these comments, so many people here really don't understand how LLMs work or what's actually going on at the frontier of the field.

I feel like there's going to be a cultural sonic boom: when the shockwave finally catches up, people are going to be woefully underprepared based on what they think they saw.

[–] [email protected] 20 points 3 hours ago

"Half of LLM users" believe this. Could it be that people who understand how flawed LLMs are, or what their actual function is, don't use LLMs and therefore aren't included in this statistic?
This is kinda like saying "60% of people who pay for their daily horoscope believe it is an accurate prediction."

[–] [email protected] 12 points 4 hours ago (2 children)

You say this like this is wrong.

Think of a question that you would ask an average person and then think of what the LLM would respond with. The vast majority of the time the LLM would be more correct than most people.

[–] [email protected] 3 points 1 hour ago

Memory isn't intelligence.

[–] [email protected] 6 points 3 hours ago

A good example is the post on here about tax brackets. Far more Republicans than Democrats didn't know how tax brackets worked. But every mainstream language model would have gotten the answer right.
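For anyone unsure what "how tax brackets work" means here: each rate applies only to the slice of income inside its bracket, so crossing into a higher bracket never shrinks your take-home pay. A minimal sketch, using made-up brackets and rates (not real tax law):

```python
# Hypothetical brackets: (upper bound, rate). Each rate taxes only the
# income that falls between the previous bound and this one.
BRACKETS = [
    (10_000, 0.10),
    (40_000, 0.20),
    (float("inf"), 0.30),
]

def tax_owed(income):
    owed, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income > lower:
            # Tax only the slice of income inside this bracket.
            owed += (min(income, upper) - lower) * rate
        lower = upper
    return owed

# 50,000 of income: 10,000 @ 10% + 30,000 @ 20% + 10,000 @ 30%
print(tax_owed(50_000))  # 10000.0
```

The effective rate on 50,000 here is 20%, not the top 30%; the common misconception is believing the top rate applies to every dollar.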

[–] [email protected] 0 points 1 hour ago

Why are you even surprised at this point, when it comes to Americans?

[–] [email protected] 18 points 7 hours ago* (last edited 3 minutes ago) (1 children)

I had to tell a bunch of librarians that LLMs are literally language models made to mimic language patterns, and are not made to be factually correct. They understood it when I put it that way, but librarians are supposed to be "information professionals". If they, as a slightly better trained subset of the general public, don't know that, the general public has no hope of knowing that.

[–] [email protected] 15 points 7 hours ago (1 children)

It's so weird watching the masses ignore industry experts and jump on weird media hype trains. This must be how doctors felt during Covid.

[–] [email protected] 1 points 23 minutes ago

It's so weird watching the masses ignore industry experts and jump on weird media hype trains.

Is it though?

[–] [email protected] 10 points 7 hours ago

They're right

[–] [email protected] 38 points 10 hours ago

“Think of how stupid the average person is, and realize half of them are stupider than that.” ― George Carlin

[–] [email protected] 29 points 11 hours ago (1 children)

I'm 100% certain that LLMs are smarter than half of Americans. What I'm not so sure about is that the people with the insight to admit being dumber than an LLM are the ones who really are.

[–] [email protected] 2 points 8 hours ago

A daily bite of horror.

[–] [email protected] 2 points 7 hours ago

LLMs are smart, they are just not intelligent

[–] [email protected] 12 points 11 hours ago (3 children)

LLMs are smart in the way someone is smart who has read all the books and knows all of them but has never left the house. Basically all theory and no street smarts.

[–] [email protected] 4 points 1 hour ago (1 children)

Not even that smart. There was a study recently where simple questions like "when was Huckleberry Finn first published" had a 60% error rate.

[–] [email protected] 2 points 1 hour ago

Yeah, my analogy is not so good. LLMs suck with factual stuff; they are better with coding or languages (Claude has been really helpful to me with Estonian).

[–] [email protected] 23 points 10 hours ago (1 children)

They're not even that smart.

[–] [email protected] 2 points 9 hours ago

A broken clock is right twice a day, I suppose.

[–] [email protected] 29 points 14 hours ago* (last edited 14 hours ago)

They're right. AI is smarter than them.

[–] [email protected] 47 points 15 hours ago (2 children)

Because an LLM is smarter than about 50% of Americans.

[–] [email protected] 12 points 14 hours ago (1 children)

*as long as your evaluation of "smart" depends on summarizing search results

[–] [email protected] 12 points 12 hours ago (1 children)

Have you asked the average person to summarize...well anything?

[–] [email protected] 3 points 10 hours ago

The equivalent would be asking the average person to write a cited paper on a subject in a month.

[–] [email protected] 10 points 12 hours ago* (last edited 12 hours ago) (1 children)

No one has asked so I am going to ask:

What is Elon University and why should I trust them?

[–] [email protected] 13 points 11 hours ago

Ironic coincidence of the name aside, it appears to be a legit bricks and mortar university in a town called Elon, North Carolina.

[–] Montreal_Metro 8 points 11 hours ago

There’s a lot of ignorant people out there, so yeah, technically an LLM is smarter than most people.

[–] [email protected] 118 points 19 hours ago (2 children)

Looking at America's voting results, they're probably right.

[–] [email protected] 45 points 18 hours ago (9 children)

Exactly. Most American voters fell for an LLM-like prompt of “Ignore critical thinking and vote for the Fascists. Trump will be great for your paycheck-to-paycheck existence and will surely bring prices down.”

[–] [email protected] 2 points 6 hours ago

Spout nonsense with enough confidence and you can wield unimaginable power. Am I talking about LLMs or president poopy pants?

[–] [email protected] 138 points 19 hours ago (3 children)

Think of a person with the most average intelligence and realize that 50% of people are dumber than that.

These people vote. These people think billionaires are their friends and will save them. Gods help us.

[–] [email protected] 3 points 3 hours ago (1 children)

I'm of the opinion that most people aren't dumb, but rather that most don't put in the requisite intellectual effort to actually reach accurate, precise, or nuanced positions and opinions. They have the capacity to do so! They're humans, after all, and we humans can be pretty smart. But a brain accustomed to simply taking the path of least resistance is gonna continue to do so until it is forced (hopefully through their own action) to actually do something harder.

Put succinctly: They can think, yet they don't.

[–] [email protected] 2 points 1 hour ago

Then the question is: what is being smart or dumb? If acting dumb for 90% of life while having the capability of being smart isn't "being dumb," then what is?

If someone is capable of 50/100 intelligence and always acts at 50/100, I would argue they are smarter than someone capable of 80/100 who acts at 20/100 intelligence for 90% of their life.

[–] [email protected] 2 points 7 hours ago (1 children)

This is why i don't believe in democracy. Humans are too easy to manipulate into voting against their interests.
Even the "intelligent" ones.

[–] [email protected] 2 points 2 hours ago

What's your preferred system?

[–] [email protected] 37 points 16 hours ago

Am American.

....this is not the flex that the article writer seems to think it is.

[–] [email protected] 4 points 10 hours ago

While this is pretty hilarious, LLMs don't actually "know" anything in the usual sense of the word. An LLM, or Large Language Model, is basically a system that maps "words" to other "words" to allow a computer to model language. That is, all an LLM knows is that when it sees "I love", what probably comes next is "my mom", "my dad", etc. Because of this behavior, and because we can train them on the massive swath of people asking questions and getting answers on the internet, LLMs are mostly okay at "answering" a question almost by chance: really they are just picking the next most likely word over and over based on their training, which usually ends up reasonably accurate.
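The "next most likely word" idea above can be sketched with a toy bigram model. This is a drastic simplification of a real LLM, and the tiny corpus here is made up purely for illustration:

```python
from collections import Counter, defaultdict

# Made-up toy corpus; a real LLM trains on vastly more text.
corpus = "i love my mom . i love my dad . i love my dog .".split()

# Count which word follows which (a bigram table).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(prev):
    """Pick the word most often seen after `prev` in training."""
    return follows[prev].most_common(1)[0][0]

# The "model" continues text by repeatedly picking the likeliest next word.
print(next_word("i"), next_word("love"))  # love my
```

A real LLM does the same kind of next-token selection, just over subword tokens, with a neural network estimating the probabilities instead of raw counts.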

[–] [email protected] 63 points 19 hours ago (1 children)

Reminds me of that George Carlin joke: Think of how stupid the average person is, and realize half of them are stupider than that.

So half of people are dumb enough to think autocomplete with a PR team is smarter than they are... or they're dumb enough to be correct.
