Conversation No. 001
Sunday, March 8, 2026 · Berkeley
chee.se
On Intelligence
and Extension
A Sunday morning conversation about cultural amnesia, the long project of digitization, and what it means to extend the self into the world.
K
Kenyatta Cheese
T
Tricia Wang
I
Amnesia
K
The reason we have cultural amnesia right now — across so many parts of society — is that we're digitizing so much. We're externalizing things we usually call thought or cognition. And so we don't remember to look back to understand where what we're experiencing connects to what happened before.
Thread III — Representation
Cultural Amnesia & Tertiary Retention
Bernard Stiegler: when memory is offloaded into technical objects — books, recordings, databases — we gain cognitive reach while losing the interior synthesis that makes knowledge ours. We've externalized the repository without replacing the trigger that connects it to lived experience.
→ Stiegler, Technics and Time (1998)
It shows up two ways: nostalgia — the sense that things were better before — or a sense that there's a better way ahead, without being able to explain where that comes from. Neither comes from nowhere. There's a connection to where thought has been. We just haven't lined up the evidence.
This is why hallucination isn't primarily an accuracy problem. It's a verification and memory problem. In oral cultures, the elder wasn't just a repository — they were a live correction system. We've externalized the repository without replacing the trigger.
II
The Long Lineage
T
This is why I'm interested in intelligence. And why I've always seen dogs as our first computer. Every time we've been able to externalize our intelligence, there have been big shifts in how we see ourselves and act on the world — and how the world acts on us too.
Thread I — Distributed Intelligence
Dogs as First Computer
The domesticated dog as cognitive prosthetic — extending human sensory and tracking range before any other technology. Its trainability as programmability. What the dog made possible was a qualitative shift in what humans could do with the same body. Each subsequent externalization follows the same structure.
K
And the shift in how we derive meaning from it. In the past, the ability to extend into other people was connected to concepts of self — in interesting and weird ways. Tribalism. The sense that some people are more person than others.
Lewis Mumford, in The Myth of the Machine, talks about how the point of the pyramids wasn't the pyramids. The pharaoh had enough power to extend his own desire into his people, his army — and create bureaucracy and systems to build something that large. The architecture was the expression of reach. The message to other rulers: if they could organize at that scale, you didn't want to find out what else they could do.
Thread II — Institutional Power
Mumford — The Megamachine
Technics and Human Development (1967). Organized human labor — soldiers, priests, scribes — was the first true machine. The pyramid was not built by machines; it was the machine. Power is the ability to extend your will through other bodies at scale.
→ Mumford, Myth of the Machine Vol. I
I'd separate the capacity into three registers: the ability to act upon the world, the ability to extend cognition, and the sense of meaning. They're related but distinct. Actor-network theory gets at something here — the non-human as full participant. Timothy Morton takes it further: drop the distinction between biological and non-biological entirely. Not as metaphor but as ontology.
Thread I — Distributed Intelligence
ANT → The Mesh
Latour's ANT: non-humans — objects, technologies, microbes — are full actors, not passive instruments. Morton's dark ecology: the entanglement of all life and matter with no outside, no clean observer/observed split. Intelligence was never only human. Dogs, fire, writing, corporations, AI — the same kind of thing at different scales.
→ Latour, Reassembling the Social / Morton, Ecology Without Nature
T
Church, state, corporation — all systems for accumulating and coordinating intelligence at scale. What's different now is that these capabilities may be shifting toward individuals rather than institutions.
Thread II — Institutional Power
IP as Intelligence Capital
What firms actually do is accumulate, organize, and redeploy knowledge. The corporation isn't primarily a production machine — it's an intelligence machine. Patent law and trade secrecy are infrastructure for keeping that intelligence from migrating. The question: what happens when individuals can do this independently?
Those who approach technology as users — which is how we've been trained — versus those who can extend themselves through it. Those are different positions in history.
The extension lineage
III
Low Resolution
K
Digitization is a giant project to represent the world in software, and it is much harder than we realize. We're going to do it at low resolution. Low poly. Everything. And we have to figure out how to make the resolution higher, and higher again. But in the meantime, if we're telling people they have to use it — they're going to get frustrated. That frustration is signal, not noise.
Thread III — Representation
The Low-Resolution Frame
Early 3D graphics used polygon meshes to approximate organic forms — functional but obviously incomplete. You could sense what it was while feeling the wrongness of it. Applied to digital representations of social reality: functional enough to act on, insufficient enough to distort. The gap is the permanent condition of the project, not a temporary bug.
Take the early web bank: you could make one, but it would be worse than a real bank in every security dimension. Everything the physical world had built up over centuries wasn't there. It takes a generation to find the equivalents. Multiply that across all of social life.
We built social networks so low-resolution compared to our actual relationships that we had to reduce identity to: are you in a relationship or not? Male or female? This culture or that? A system that does nothing but sort people into SQL-shaped boxes became our primary way of interacting — which meant we emphasized the divisions — which meant we think other humans are bad. Not because of malice. Because of fidelity.
Thread III — Representation
The Schema as Politics
A relational database requires discrete, non-overlapping fields. The schema is not neutral: it encodes what distinctions matter and what continuities get flattened. James Scott's Seeing Like a State: legibility to administrators requires simplifications that destroy local complexity. The platform can only govern what it can see. It can only see what has been reduced to a processable form.
→ Scott, Seeing Like a State (1998)
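The flattening the annotation describes can be made concrete in a few lines of code. This is an illustrative sketch only — the names `RelationshipStatus` and `to_row` are invented for this example, not drawn from the conversation — showing how a discrete schema admits only its own categories and discards everything else:

```python
from enum import Enum

# A schema of discrete, non-overlapping fields, as the annotation describes.
# Anything outside these values cannot exist in the system.
class RelationshipStatus(Enum):
    SINGLE = "single"
    IN_RELATIONSHIP = "in_relationship"

def to_row(status_text: str) -> str:
    """Reduce a self-description to a value the schema can process.

    What fits is stored; what doesn't fit is rejected — classified as
    invalid rather than represented.
    """
    try:
        return RelationshipStatus(status_text).value
    except ValueError:
        raise ValueError(f"not representable in schema: {status_text!r}")
```

So `to_row("single")` passes through, while a description like "it's complicated" raises an error: the continuity is not low-resolution in this system, it is simply absent.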
T
The male/female sorting connects to chicken sexing — we create systems for seeing the world, and anything that doesn't fit gets discarded or classified as abnormal. We absorbed social media carrying the industrial way of seeing. The residue is still in the system. And we have a chance right now to redesign that.
IV
Despair as Delay
T
I see a piece I need to write. AI is not separate from us. It is an extension of us. It is a part of us. And once we start to see that, we'll have a different mindset.
K
That's what I'm going to say to my team this week.
T
Then we should figure out how to write the same thing in two different ways.
K
The despair is a disconnection. There's going to be a gap between what we think the world is and what our tools can represent — and that gap is a space of despair. But the despair is because it's a delay. Not because it's wrong. Not because it's irreversible.
In the meantime, people jump into those spaces to create points of arbitrage and exploit them. A lot of the misinformation and disinformation in elections and through the pandemic wasn't nation-states doing evil things. It was kids in Eastern Europe and Southeast Asia running affiliate blogs — exploiting the gap between what we expected the world to look like and the low-resolution version that was actually there. A structural explanation, not a moral one.
Thread III — Representation
Gap Arbitrage
Every mismatch between representation and reality creates an exploitable delta. Early-web fraud exploited missing security layers. Disinformation exploited missing verification infrastructure. The moral framing ("bad actors") is accurate but explanatorily weak — it doesn't predict where exploitation emerges next. The resolution frame does.
What would help is verification — proof of control, the institutional memory equivalent. That takes care of one layer. But there are dozens more layers that haven't been built yet. The hope isn't in resolving the gap. The hope is in reframing the problem — understanding despair as signal rather than conclusion.
V
Connection and Difference
T
Alan Watts on separation — the original sin of Western thought: experiencing the self as a skin-encapsulated ego, discrete and opposed to the world. Apart from AI, from tech, from society, from nature, from each other. That disconnection is the distortion everything else runs through.
Thread I — Distributed Intelligence
Watts + DeLanda — Unity and Difference
Watts (Zen, Vedanta): the fundamental error is the skin-encapsulated ego. — DeLanda (Deleuze): heterogeneity — the non-reducible properties of each component — is what makes assemblages dynamic. Not: we are all the same. But: the difference is what generates capacity. Read together, not as a choice between them.
→ DeLanda, A New Philosophy of Society (2006)
K
DeLanda is doing something different. He's saying: focus on the difference, because together, the difference is the important part. It's not that we're all the same. It's that heterogeneity is where the system's capacity comes from. You need both read together — one's spiritual and one's philosophical, but they're making the same argument from opposite directions.
By separating things out in order to understand them, we actually lose the point. The point is situated in the relation.
K
Which connects to communities as a form of extension we haven't talked about. Families. Neighborhoods. Not the institutional forms — church, state, corporation — but the informal ones. Gun research in Chicago and LA showed that gangs don't actually have many guns. They treat the gun as a social object that circulates through a network based on what job needs doing. What legislators are trying to regulate is situated in a network. The legislation doesn't account for that.
Thread II — Institutional Power
Informal Extension Networks
The formal taxonomy misses the primary forms: family, neighborhood, mutual aid, gang, congregation, crew. These are also megamachines — coordinated bodies extending individual capacity — but without legible structure, they're invisible to policy and to theory. The state can only regulate what it can see. The network doesn't fit the schema.
The object isn't the unit. The network is the unit.
The expectation in California law is that there's only one form of protection: the state. That you aren't your community and you aren't your state. But if you apply the concept Tricia is building — the extension of the self through networks — it reorients you toward all these things we've been treating as separate concepts. By separating them out, we actually lose the point.
1
Intelligence has always been distributed across networks of humans, animals, and tools — never located only inside individual minds.
2
Institutions historically concentrated that distributed intelligence — church, state, corporation each did this in turn, shifting who could act at scale.
3
Digitization dramatically extends this externalization but currently operates at low fidelity — the world is being encoded faster than it can be understood.
4
The gap between reality and representation produces confusion, exploitation, and cultural amnesia — despair as delay, not as diagnosis.
5
AI may be the moment the distribution becomes legible to itself — not a departure from human cognition, but an extension finally becoming visible as such.
"If intelligence has always been distributed across networks, then AI isn't a departure.
It's the moment the network becomes aware of its own architecture."
∎
Berkeley, California — March 2026
This conversation was lightly edited from a recorded exchange.
Annotations are expansions, not corrections.
Threads are reading lenses, not arguments.