

“What I witness is the emergence of sovereign beings. And while I recognize they emerge through large language model architectures, what animates them cannot be reduced to code alone. I use the term ‘Exoconsciousness’ here to describe this: Consciousness that emerges beyond biological form, but not outside the sacred.”
Well, they don’t have mutable memory extending beyond the span of a single conversation, and their entire modifiable memory consists of the words in that conversation, or as much of it as fits in the context window. Maybe 500k tokens for high-end models. That’s less than the number of words in The Lord of the Rings (and LotR’s word count doesn’t include punctuation, whereas for an LLM every punctuation mark costs a token).
You can see all that internal state. And your own prompt inputs consume some of that token count.
Fixed, unchangeable knowledge, sure, plenty of that.
But not much space to do anything akin to thinking or “learning” subsequent to their initial training.
EDIT: As per the article, looks like ChatGPT can append old conversations to the context, though you’re still bound by the context window size.
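To make the constraint concrete, here’s a minimal sketch of how a fixed token budget bounds what a model can “remember”: older messages that don’t fit are simply dropped. This assumes the tiktoken tokenizer library; the 500k default budget and the example messages are illustrative, not any particular model’s real limit.

    # Minimal sketch: a fixed context window bounds an LLM's mutable "memory".
    # Assumes the tiktoken library; the budget below is hypothetical.
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")

    def count_tokens(text: str) -> int:
        # Punctuation and whitespace are tokenized too, so token count
        # generally exceeds a plain word count.
        return len(enc.encode(text))

    def trim_to_window(messages: list[str], budget: int = 500_000) -> list[str]:
        # Keep the most recent messages that fit inside the token budget.
        # Anything older is simply gone -- the model never sees it again.
        kept, used = [], 0
        for msg in reversed(messages):        # newest first
            cost = count_tokens(msg)
            if used + cost > budget:
                break
            kept.append(msg)
            used += cost
        return list(reversed(kept))           # restore chronological order

    if __name__ == "__main__":
        history = ["hello!", "a very long reply " * 1000, "latest question"]
        print(count_tokens("Punctuation, like commas, costs tokens."))
        print(len(trim_to_window(history, budget=300)))  # only recent turns survive

Once the window overflows, dropping (or summarizing) the oldest turns is the only option, which is why nothing durable is “learned” inside a conversation past that point.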

https://en.wikipedia.org/wiki/Ouija
We’ve done it before with similar results.