[–] [email protected] 7 points 3 days ago (1 children)

It's a very different process. Having worked on search engines before, I can tell you that the word "generate" means something different in this context. It means, in simple terms, to match your search query against a bunch of results, gather links to said results, and then send them to the user to be displayed.
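
To make that concrete, here's a toy sketch (purely illustrative, not any real engine's code) of what "generating" results means in this sense: matching a query against an index, gathering the links, and returning them for display.

```python
# Toy sketch (purely illustrative, not any real engine's code): in this sense,
# "generating" results means matching a query against an index and gathering
# links, not writing new text.
from dataclasses import dataclass

@dataclass
class Result:
    title: str
    url: str

# A tiny hypothetical index mapping document keywords to links.
INDEX = {
    "python tutorial": Result("Python tutorial", "https://example.com/python-tutorial"),
    "solar panels explained": Result("Solar panels", "https://example.com/solar"),
}

def generate_results(query: str) -> list[Result]:
    """Match the query against indexed documents and gather the links."""
    words = query.lower().split()
    return [doc for key, doc in INDEX.items() if any(w in key for w in words)]

# The matched links are then sent back to the user to be displayed.
print(generate_results("python tutorial"))
```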

[–] [email protected] 2 points 3 days ago (1 children)

then send them to the user to be displayed

This is where my understanding breaks down. Why would displaying it as a summary mean the backend process is no longer a search engine?

[–] [email protected] 4 points 3 days ago (1 children)

The LLM is going over the search results, taking them as a prompt and then generating a summary of the results as an output.

The search results are generated by the good old search engine, the "AI summary" option at the top is just doing the reading for you.

And of course, if the answer isn't trivial, it's very likely generating an inaccurate or incorrect output from those inputs.

But none of that changes how the underlying search engine works. It's just doing additional work on the same results the same search engine generates.
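
A minimal sketch of that flow, with stand-in functions (not DuckDuckGo's actual code or API), might look like this:

```python
# Hypothetical sketch of the flow described above; the search and model
# functions are stand-ins, not DuckDuckGo's actual code or API.

def search(query: str) -> list[str]:
    """Stand-in for the ordinary search engine: match the query, gather links."""
    return [
        "https://example.com/a - snippet about " + query,
        "https://example.com/b - another snippet about " + query,
    ]

def llm_summarize(results: list[str]) -> str:
    """Stand-in for the LLM: the search results become the prompt, a summary comes out.
    A real model can easily produce an inaccurate summary when the answer isn't trivial."""
    prompt = "Summarize these search results:\n" + "\n".join(results)
    return "(pretend summary of: " + prompt[:60] + "...)"

def results_page(query: str) -> dict:
    results = search(query)           # the search engine works exactly as before
    summary = llm_summarize(results)  # the extra "AI summary" layered on top
    return {"results": results, "ai_summary": summary}

print(results_page("how do solar panels work"))
```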

EDIT: Just to clarify, DDG also has a "chat" service that, as far as I can tell, is just a UI overlay over whatever model you select. That works the same way as all the AI chatbots you can use online or host locally, and I presume it's not what we're talking about.

[–] [email protected] 3 points 3 days ago (1 children)

I see: you're splitting the UI and the backend into two different things, and I'm seeing them as parts of a whole.

[–] [email protected] 4 points 3 days ago (1 children)

Well, yeah, there are multiple things feeding into the results page they generate for you. Not just two. There are the search results, there's an algorithmic widget that shows different things (a calculator if you input some math, a translation box if you input a translation request, a summary of Wikipedia or IMDB if you search for a movie or a performer, that type of thing). And now there's a pop-up window with an LLM-generated summary of the search results.

Those are all different pieces. Your search results for "3 divided by 7" aren't different just because they also pop up a calculator for you at the top of the page.
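
As a hypothetical sketch (not anyone's real code), the page assembly is just independent pieces bolted together around the same search call:

```python
# Hypothetical sketch: the results page is assembled from independent pieces.
# Adding a widget or an AI summary doesn't change how the search results are made.
import re

def search(query: str) -> list[str]:
    """Stand-in for the underlying search engine."""
    return ["https://example.com/result-for-" + query.replace(" ", "-")]

def maybe_calculator(query: str):
    """Widget: only appears when the query looks like simple arithmetic (toy: division only)."""
    m = re.fullmatch(r"\s*(\d+)\s*/\s*(\d+)\s*", query)
    return int(m.group(1)) / int(m.group(2)) if m else None

def ai_summary(results: list[str]) -> str:
    """Pop-up: an LLM-style summary of the same results the engine already produced."""
    return "summary of: " + ", ".join(results)

def build_page(query: str) -> dict:
    results = search(query)  # the same underlying search, regardless of widgets
    return {
        "results": results,
        "calculator": maybe_calculator(query),
        "ai_summary": ai_summary(results),
    }

print(build_page("3 / 7"))
```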

[–] [email protected] 4 points 3 days ago

Yeah, for some reason I was thinking you were trying to say that bolting on widgets made it no longer a search engine.