The difference is that there is intent in the case of Google; LLMs will just hallucinate anything. Always double-check dates and facts LLMs give you (or just check them somewhere else in the first place)
Actually, that depends on the LLM product, as it’s very rare to interact with a raw LLM these days. Most use programmed toolsets now to perform certain tasks; ChatGPT, for example, uses a Bing API to retrieve and link URLs, so in that case it’s only as good as Bing for validation. Which still is not very good. Most search engines actually suck at providing a secure experience for the end user. Google has gotten better, if not downright brilliant, but they don’t vet their ads (cashflow) the way they vet their linked URLs, so it’s a moot point for the end user imo.