Exactly. The big problem with LLMs is that they're so good at mimicking understanding that people forget they don't actually understand anything beyond language itself.
The thing they excel at, and should be used for, is exactly what you say - a natural language interface between humans and software.
Like in your example: an LLM doesn't know what a cat is, but it knows, from its training data, which words describe a cat - and for a search engine, that's all you need.
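
For a sense of what that looks like in practice, here's a minimal sketch of embedding-based search in Python - assuming the sentence-transformers library and the all-MiniLM-L6-v2 model, neither of which comes from this thread. It ranks documents by how close their wording is to the query, without any notion of what a cat actually is:

    # Minimal sketch of embedding-based search. The library and model
    # name are assumptions for illustration, not from the thread above.
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")

    # The "documents" a search engine might index.
    docs = [
        "A small furry animal that purrs and chases mice",
        "A guide to fixing a leaky kitchen tap",
        "Large dogs need daily exercise and training",
    ]
    doc_vecs = model.encode(docs, normalize_embeddings=True)

    # The model never needs to know what a cat *is* - it just maps
    # the word "cat" near the words that typically describe one.
    query_vec = model.encode(["cat"], normalize_embeddings=True)[0]

    # With normalized vectors, the dot product is cosine similarity.
    scores = doc_vecs @ query_vec
    for doc, score in sorted(zip(docs, scores), key=lambda p: -p[1]):
        print(f"{score:.3f}  {doc}")

Run it and the purring-animal sentence should come out on top, purely because its words sit near "cat" in the embedding space - which is the whole trick.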
It always sucks to know you paid more than the seller did - but that just means Oxfam undervalued the book.
Having worked in one, I can say charity shops have a habit of either badly undervaluing or overvaluing their donated goods, because the people who actually set the prices mostly just guess based on looks and nothing more. Only if an item looks expensive will they do any research, and even then rarely enough.