• Jeena@piefed.jeena.net · 1 month ago

    Sadly I have to say this seems right. It’s just easier to ask an AI to find the right Wikipedia article and summarize the part you’re actually interested in. And most of the time it’s not a life-or-death situation, so even if the AI lies it often doesn’t matter.

    • jjjalljs@ttrpg.network · 1 month ago

      If the answer’s accuracy doesn’t matter, why even look it up? Just guess. That’s faster and cheaper.

      • warm@kbin.earth · 1 month ago

        It makes no sense, because it’s already incredibly easy to find something on Wikipedia, and the first paragraph is always a summary. Humans are only going to get lazier and less intelligent thanks to AI.

        • Jeena@piefed.jeena.net · 1 month ago

          My last question to it was:

          “What are those berries called 개멀구”

          When I googled it, Bing didn’t even find the Wikipedia article, probably because it’s spelled slightly wrong, even though a native Korean speaker wrote it down like this. But the LLM understood what was meant, found the Korean Wikipedia article https://ko.wikipedia.org/wiki/까마중, took the summary, and translated it to English for me.
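
          For what it’s worth, once the LLM has resolved the misspelled name to the real article title, the lookup itself is the easy part. Here’s a minimal sketch (assuming the resolved title 까마중, using the MediaWiki REST API’s page-summary endpoint; translation would be a separate step):

          ```python
          import urllib.parse

          import requests  # third-party: pip install requests


          def wikipedia_summary(title: str, lang: str = "ko") -> str:
              """Fetch the lead-section summary of a Wikipedia article
              via the MediaWiki REST API."""
              url = (f"https://{lang}.wikipedia.org/api/rest_v1/"
                     f"page/summary/{urllib.parse.quote(title)}")
              resp = requests.get(url, timeout=10)
              resp.raise_for_status()
              return resp.json()["extract"]


          # The hard part was mapping the misspelling "개멀구" to the real
          # title "까마중" -- that resolution came from the LLM, not this code.
          print(wikipedia_summary("까마중"))
          ```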

    • teft@piefed.social · 1 month ago

      That’s a hell of a gamble. What happens when it lies to you about something important (like a life-or-death situation) that you have no knowledge of? You’d never know, since these LLMs are bullshit artists.