• Ptsf@lemmy.world · 2 points · 1 day ago

      People use Google for URLs and it continues to deliver malware to them, often even sponsored as “ads”. Not sure why ChatGPT would be any different.

      • lugal@lemmy.dbzer0.com · 1 point · 24 hours ago

        The difference is that there is intent in the case of Google; LLMs will just hallucinate anything. Always double-check dates and facts LLMs give you (or just check them somewhere else in the first place).

        • Ptsf@lemmy.world · 2 points · 24 hours ago

          Actually, that depends on the LLM product, as it’s very rare to interact with a raw LLM these days. Most now use programmed toolsets to perform certain tasks; ChatGPT, for example, uses a Bing API to retrieve and link URLs, so in that case it’s only as good as Bing for validation. Which still is not very good. Most search engines suck at providing a secure experience for the end user. Google has gotten better, if not downright brilliant, but they don’t vet their ads (cash flow) like their linked URLs, so it’s a moot point to the end user in that case, imo.

      • XLE@piefed.social · 4 points · 4 days ago

        Considering Google has put effort into intentionally worsening its own product, it makes sense that their chatbot alternative would be something people just use.

      • BananaIsABerry@lemmy.zip · 5 points · 4 days ago

        Since Bing and Google have both integrated LLMs into their search engines, it’s a valid use case, according to the people who made them.

        Copilot honestly doesn’t suck for finding obscure support contact information for companies. Obviously you still have to verify after.

        • XLE@piefed.social · 1 point · edited · 4 days ago

          I hesitantly wonder if something like Perplexity might actually be the future of search engines. It seems relatively capable of correctly turning search queries full of half-remembered thoughts and potentially inaccurate text into salient results. I disregard the guesstimates it makes about the links it provides (of course), but the couple of times I tried it out this way, it seemed to work better than Google.

          I also wonder how much energy it requires compared to whatever trash Google returns.

  • FarraigePlaisteaċ@lemmy.world · 3 points · 5 days ago

    It does it all the time when I ask it a question involving anything with a web presence. The bullet-pointed information will be partially incorrect, but even worse, the links attached as references lead to something entirely unrelated and seemingly random.