• lloram239@feddit.de · 1 year ago

    It’s not about sensory inputs, it’s about having a model of the world and objects in it and ability to make predictions.

    And how do you think that model gets built? From processing sensory inputs. And yes, language models do build internal models of the world from that.
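
    To make that concrete, here is a toy sketch of the training signal (random tokens and an embedding layer standing in for the real transformer stack, not anyone's actual training code): the only "sensory input" the model ever gets is text, and the only objective is predicting the next token, and that is the processing the internal model gets built from.

    ```python
    import torch
    import torch.nn.functional as F

    # Toy stand-ins: a real model uses a full transformer stack and real text,
    # but the objective is the same next-token cross-entropy.
    vocab_size, d_model = 50_000, 512
    embed = torch.nn.Embedding(vocab_size, d_model)
    lm_head = torch.nn.Linear(d_model, vocab_size)

    tokens = torch.randint(0, vocab_size, (1, 128))  # a batch of token ids from text
    hidden = embed(tokens)                           # placeholder for the transformer
    logits = lm_head(hidden)                         # scores for every possible next token

    # Predict token t+1 from the tokens up to t; minimizing this over enough text
    # pushes the model to internalize the regularities of whatever the text describes.
    loss = F.cross_entropy(
        logits[:, :-1].reshape(-1, vocab_size),
        tokens[:, 1:].reshape(-1),
    )
    loss.backward()  # gradients nudge the parameters toward better predictions
    ```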

    GPT cannot “figure” anything out.

    That claim is nonsense, and repeating it doesn't make it any more true. Seriously, it's profoundly idiotic given everything ChatGPT can demonstrably do.

    It only probabilistically generates text.

    So what? In what way does that limit its ability to reason about the world? Predictions about the world are probabilistic by nature, since the future hasn’t happened yet.
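
    To make that concrete, here is a minimal sketch (assuming the Hugging Face transformers library and the small public GPT-2 checkpoint, since ChatGPT's weights aren't available; the prompt and numbers are just an illustration). "Probabilistically generating text" means ranking every possible continuation by probability, and getting that ranking right already requires knowledge about the world:

    ```python
    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    prompt = "The capital of France is"
    inputs = tokenizer(prompt, return_tensors="pt")

    with torch.no_grad():
        logits = model(**inputs).logits  # shape: (1, sequence_length, vocab_size)

    # Probability distribution over the next token after the prompt
    next_token_probs = torch.softmax(logits[0, -1], dim=-1)
    top_probs, top_ids = next_token_probs.topk(5)

    for prob, token_id in zip(top_probs, top_ids):
        print(f"{tokenizer.decode(token_id.item())!r}: {prob.item():.3f}")
    # A model with no internal model of the world would have no reason to put
    # " Paris" at the top of this distribution.
    ```

    The probabilities are the prediction; how good they are depends entirely on how much of the world's structure the model has absorbed.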