I use LLMs all day, and they've replaced a lot of what I'd previously have searched for. Even so, I don't think I'm ready for an LLM as my default search engine. LLMs are useful for generating output from what they already know, for example, useful commands for tools like `ffmpeg` or `jq`. They're not better than a search engine for finding factual information, like "What is the population of Canada?" LLM hallucination is still a thing.
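To be concrete, asking for "convert this .mov to an mp4" or "pull every name out of this JSON array" reliably gets you something like the following (the filenames are placeholders, not from any real project):

```sh
# Re-encode a video to H.264/AAC at a reasonable quality (CRF 23).
ffmpeg -i input.mov -c:v libx264 -crf 23 -c:a aac output.mp4

# Extract the .name field from every element of a JSON array.
jq '.[].name' data.json
```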
LLMs replace some of the work we'd previously had to rely on search engines for. They're not a replacement for everything, and they shouldn't be used as such.
I prefer Kagi's approach. If I add a "?" to the end of my search query, I get an LLM response at the top of my search results. Every LLM response includes source links so I can quickly verify the information presented. So if I'm using a search engine to quickly check a fact, say "What year did Tremors come out?", I get the following at the top of the results:
Quick Answer
The movie Tremors was released in 1990 [1][2]. It is an American monster horror film starring Kevin Bacon and Fred Ward [1][2].
References
1. r/movies on Reddit: Tremors (for those who saw it when it came out) - www.reddit.com
2. Tremors (1990 film) - Wikipedia - en.wikipedia.org
This feature is one where ChatGPT will actually run a Bing search (sometimes rewriting your query) and then show you a summary of the content of the links it found.
So "hallucinations" are rarer, although the model can be quite gullible about the content of the search results, which amounts to much the same problem.
It's just going to regurgitate the first few things it sees without applying critical reasoning. These models aren't quite there yet for search reliability.
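For anyone curious, the mechanism being described is just "search, then summarize with citations." Here is a minimal sketch of that loop, assuming DuckDuckGo's Instant Answer API for the retrieval step and OpenAI's chat completions endpoint for the summary. This is not what ChatGPT actually runs; a Bing API call would slot into step 1 the same way:

```sh
QUERY="What year did Tremors come out?"

# Step 1: retrieve. Any search API works; DuckDuckGo's Instant Answer API
# is used here only because it needs no API key.
RESULTS=$(curl -s "https://api.duckduckgo.com/?q=$(printf %s "$QUERY" | jq -sRr @uri)&format=json")

# Step 2: summarize. Tell the model to answer only from the retrieved text,
# which trades hallucination risk for gullibility about the sources.
curl -s https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d "$(jq -n --arg q "$QUERY" --arg ctx "$RESULTS" '{
        model: "gpt-4o-mini",
        messages: [
          {role: "system", content: "Answer only from the provided search results and cite them."},
          {role: "user", content: ($q + "\n\nSearch results:\n" + $ctx)}
        ]}')"
```

The system prompt is doing the heavy lifting: constraining the model to the retrieved text is what makes outright hallucination rarer, and also exactly why a junk top result becomes a junk answer.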
Will this still send your information to Google, since the redirect sometimes happens after the search has already been sent? I’m curious because I use a similar Kagi extension and I feel like it’s definitely sending my information to DuckDuckGo, which sucks. I wish Apple would allow more search options.
It probably depends on how the extension works. I had that issue with some extensions, but the Qwant extension, for example, doesn't leak. You can check for yourself by capturing the HTTP(S) traffic with something like Proxyman.
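If you don't have Proxyman, here's a minimal version of the same spot-check, assuming mitmproxy (free) as the capture tool:

```sh
# mitmdump is mitmproxy's non-interactive CLI. Run it as a local proxy and
# watch for hosts the extension should NOT be contacting.
# (Inspecting HTTPS bodies requires mitmproxy's CA cert to be installed,
# but even without it the proxy logs reveal which hostnames get contacted.)
mitmdump --listen-port 8080 | grep -iE "google\.com|duckduckgo\.com"
```

Point the browser at 127.0.0.1:8080 as its HTTP/HTTPS proxy, type a query in the address bar, and see whether anything shows up before the redirect fires.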