this post was submitted on 11 Feb 2025
601 points (99.0% liked)
Technology
Maybe I just don't know what "generating results" means. You query a search engine, and it generates results as a page of links. I don't understand how generating a page of links is fundamentally different from generating a summary of those results?
It's a very different process. Having worked on search engines before, I can tell you that the word "generate" means something different in this context. It means, in simple terms, matching your search query against a bunch of results, gathering links to those results, and then sending them to the user to be displayed.
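To make that concrete, here's a toy sketch of what "generate" means in the search-engine sense. The index and URLs are made up for illustration; real engines use inverted indexes, ranking, and so on, but the shape is the same: match query terms, gather links, return them.

```python
# Toy illustration only: a hypothetical index mapping indexed phrases to links.
INDEX = {
    "python tutorial": ["https://example.com/py-intro", "https://example.com/py-course"],
    "rust tutorial": ["https://example.com/rust-book"],
}

def generate_results(query: str) -> list[str]:
    """'Generate' results: match query terms against the index, gather links."""
    terms = set(query.lower().split())
    links = []
    for phrase, urls in INDEX.items():
        if terms & set(phrase.split()):  # any term overlap counts as a match
            links.extend(urls)
    return links

print(generate_results("python tutorial"))
```

No text is being synthesized anywhere in that loop; "generating" here just means assembling a list of pre-existing links.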
This is where my understanding breaks. Why would displaying it as a summary mean the backend process is no longer a search engine?
The LLM is going over the search results, taking them as a prompt and then generating a summary of the results as an output.
The search results are generated by the good old search engine, the "AI summary" option at the top is just doing the reading for you.
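In pseudocode-ish Python, the relationship looks roughly like this (the `summarize` function is a stand-in for the LLM call; everything here is a hypothetical sketch, not DDG's actual code):

```python
def summarize(results: list[str]) -> str:
    # Stand-in for the LLM step: in the real system the result snippets
    # are fed to a model as a prompt; here we just join them.
    return "Summary of: " + "; ".join(results)

def results_page(results: list[str]) -> dict:
    """The same search results feed both the link list and the summary box."""
    return {
        "links": results,               # the classic search engine output
        "summary": summarize(results),  # extra step layered on top of it
    }

print(results_page(["page A", "page B"]))
```

The point: the summary is a second stage consuming the first stage's output, not a replacement for it.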
And of course if the answer isn't trivial, very likely generating an inaccurate or incorrect output from the inputs.
But none of that changes how the underlying search engine works. It's just doing additional work on the same results the same search engine generates.
EDIT: Just to clarify, DDG also has a "chat" service that, as far as I can tell, is just a UI overlay over whatever model you select. That works the same way as all the AI chatbots you can use online or host locally, and I presume it's not what we're talking about.
I see, you're splitting the UI from the backend as two different things, and I'm seeing them as parts of a whole.
Well, yeah, there are multiple things feeding into the results page they generate for you. Not just two. There's the search results, there's an algorithmic widget that shows different things (so a calculator if you input some math, a translation box if you input a translation request, a summary of Wikipedia or IMDB if you search for a movie or a performer, that type of thing). And there is a pop-up window with an LLM-generated summary of the search results now.
Those are all different pieces. Your search results for "3 divided by 7" aren't different because they also pop up a calculator for you at the top of the page.
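A rough sketch of that routing, with made-up names (`fake_search`, `page_for`) standing in for the real components: the widget gets attached next to the results, it doesn't change them.

```python
import re

def fake_search(query: str) -> list[str]:
    # Hypothetical stand-in for the real result fetcher.
    return [f"https://example.com/result-for-{query.replace(' ', '-')}"]

def page_for(query: str) -> dict:
    """Assemble a results page: results always, widgets only when they apply."""
    page = {"results": fake_search(query)}
    m = re.fullmatch(r"(\d+)\s*divided by\s*(\d+)", query)
    if m:  # query looks like arithmetic -> bolt a calculator widget on top
        page["calculator"] = int(m.group(1)) / int(m.group(2))
    return page

print(page_for("3 divided by 7"))
```

Either way, `page["results"]` is computed the same way; the calculator (or the LLM summary) is just another optional entry in the same dict.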
Yeah, for some reason I was thinking you were trying to say that bolting on widgets made it no longer a search engine.