• @lloram239

    > But human sensory inputs aren’t special

    It’s not about sensory inputs, it’s about having a model of the world, with objects in it, and the ability to make predictions.

    > The important part is that the AI can figure out the pattern in the data it does get and so far AI systems are doing very well.

    GPT cannot “figure” anything out. That’s the point. It only probabilistically generates text; that is all it does. There is no model of the world behind it, no predictions, no “figuring out”.
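    For what “probabilistically generates text” means mechanically, here is a toy sketch: a made-up bigram table (not GPT’s actual architecture or data) where each step samples the next word from a probability distribution conditioned on the previous one.

    ```python
    import random

    # Hypothetical toy "language model": for each word, a probability
    # distribution over possible next words. Real models condition on
    # far more context, but the generation loop has the same shape.
    model = {
        "the": {"cat": 1.0},
        "cat": {"sat": 0.6, "ran": 0.4},
        "sat": {"on": 1.0},
        "ran": {"away": 1.0},
    }

    def next_token(context, rng):
        # Look up the distribution for the last word, then sample from it.
        dist = model[context[-1]]
        words, probs = zip(*dist.items())
        return rng.choices(words, weights=probs, k=1)[0]

    rng = random.Random(0)  # seeded for reproducibility
    tokens = ["the"]
    for _ in range(3):
        tokens.append(next_token(tokens, rng))
    print(" ".join(tokens))  # e.g. "the cat sat on" or "the cat ran away"
    ```

    Whether sampling from learned distributions like this can or cannot amount to a “model of the world” is exactly what the two commenters are disputing.
    
    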

    • lloram239@feddit.de · 1 year ago

      > It’s not about sensory inputs, it’s about having a model of the world and objects in it and ability to make predictions.

      And how do you think that model gets built? From processing sensory inputs. And yes, language models do build internal models of the world from that.

      > GPT cannot “figure” anything out.

      That nonsensical claim doesn’t get any truer with repetition. Seriously, it’s profoundly idiotic given everything ChatGPT can do.

      > It only probabilistically generates text.

      So what? In what way does that limit its ability to reason about the world? Predictions about the world are probabilistic by nature, since the future hasn’t happened yet.