• Ptsf@lemmy.world
    4 days ago

    People use Google for URLs and it continues to deliver malware to them, often even sponsored as “ads”. Not sure why ChatGPT would be any different.

    • lugal@lemmy.dbzer0.com
      3 days ago

      The difference is that there is intent in the case of Google; LLMs will just hallucinate anything. Always double-check dates and facts LLMs give you (or just check them somewhere else in the first place).

      • Ptsf@lemmy.world
        3 days ago

        Actually, that depends on the LLM product, as it’s very rare to interact with a raw LLM these days. Most use programmed toolsets now to perform certain tasks; ChatGPT, for example, uses a Bing API to retrieve and link URLs, so it’s only as good as Bing for validation in that case. Which still is not very good. Most search engines actually suck at providing a secure experience for the end user. Google has gotten better, if not downright brilliant, but they don’t vet their ads (cash flow) like their linked URLs, so it’s a moot point to the end user in that case, imo.