The Rush

Startups in the process of shutting down are selling their Slack archives, emails, and Jira tickets to artificial intelligence labs. The CEO of the company brokering these deals puts it bluntly: "There's a feeling of a gold rush." But the rush has a long history — the history of who decides what deserves to be recorded, and of what happens when the map coincides with the territory.

April 29, 2026

A few days ago, Forbes reported that startups in the process of shutting down are selling their Slack archives, emails, Jira tickets, and Google Drive documents to artificial intelligence labs. SimpleClosure, the company facilitating these transactions, has processed nearly one hundred deals in the past year, with payouts ranging from $10,000 to $100,000 per company. Its CEO, Dori Yona, puts it bluntly: "There's a feeling of a gold rush."

The reason is specific. The public data of the internet — Reddit, Wikipedia, digitized books — was largely exhausted as training material around 2024. But the new agentic AI systems need something different: data that reflects how real decisions get made inside real organizations. Not clean, published text, but the ten-o'clock-at-night conversation in a Slack channel where someone says "this doesn't work" and someone else replies "I know, but the client needs it by Thursday."

What is happening is an act of cartography. And cartography has a long history.

The cartographer's privilege

The history of maps is, to a great extent, the history of who decides what deserves to be recorded.

In the second century, Ptolemy used the term terra incognita to name territories no one had charted. The blank spaces on his maps were not neutral: they were declarations of ignorance that functioned as invitations. During the Renaissance, those blanks acquired a political meaning: European powers filled them with information — sometimes invented — that demonstrated possession. To map was to claim.

J.B. Harley, in his essay "Deconstructing the Map" (1989), turned this intuition into a thesis: maps are not neutral representations of space. They are social constructions where power — military, political, economic — is inscribed onto territory. Whoever draws the map decides what exists and what doesn't.

That same logic recurs in the social sciences, but turned inward: not mapping territories, but behaviors. In 1924, researchers began systematically observing workers at Western Electric's Hawthorne plant near Chicago; Elton Mayo's Harvard team would later join the studies. They wanted to understand which conditions improved productivity. They recorded shifts, conversations, group dynamics. What they discovered surprised them: the mere act of observing changed the behavior. Workers performed better not because of the lighting, but because they knew someone was watching. The act of mapping distorted the territory.

Four decades later, Henry Mintzberg attempted something similar with executives. For his doctoral thesis in 1968, he physically followed five managers for a week each, recording everything they did: every call, every meeting, every interruption. He discovered that real management was fragmented, reactive, chaotic — nothing like the orderly theory in the textbooks. But the method had a cost: weeks of work for data on five people.

In 2001, Gordon Bell, a researcher at Microsoft, began MyLifeBits: the attempt to record absolutely everything about his own life. He wore a SenseCam around his neck that took photos every thirty seconds. He captured calls, emails, messages, body temperature. The dream of total recording. He lasted six years and gave up.

That same year, Larry Page strapped a camera to his car and began photographing the streets of San Francisco. He didn't have a sophisticated plan. He had an intuition: that the world's streets contained information no one had thought to capture. Luc Vincent, the engineer who would lead the project, would later recall it as "a Frankenstein car": a borrowed security van with GPS, lasers, and cameras bolted to the roof. Six years later, Google Street View would cover more than five million miles of roads across 83 countries. The streets had always been there. What hadn't been there was the gaze that turned them into data.

The knowledge of the circumstances of which we must make use never exists in concentrated or integrated form, but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess.

— Friedrich Hayek, The Use of Knowledge in Society (1945)

The pattern is constant: understanding how people actually work — how they decide, coordinate, make mistakes — requires observing them. But observation is expensive, intrusive, and distorting. Mintzberg needed weeks for five executives. Hawthorne showed that watching changes what's being watched. Bell gave up on recording himself.

And then, without anyone planning it, organizations began recording everything about themselves.

The accidental map

Anthropic has discussed investing more than one billion dollars in reinforcement learning gyms — simulated work environments where they train their agents with data from real organizations. The concept is telling: a learning gym. A space where AI practices being a knowledge worker, using as training material the conversations, decisions, and mistakes of real knowledge workers.

What Mintzberg could only do with five executives over a week, AI companies are doing with thousands of entire organizations over years. And without the Hawthorne effect: the workers didn't know they were being observed. They wrote those messages in a context of assumed ephemerality — the "operational noise," the complaints, the doubts, the half-baked decisions. That noise turns out to be the most valuable signal. The result is the most complete organizational ethnography ever produced, and no one intended it to be one.

Conway in reverse

In The River I described Conway's Law: an organization's communication structure is imprinted on the systems it builds. If two teams don't talk, their components won't integrate well. The organization copies itself into the product like a watermark.

Now we're seeing the inverse operation. To build an agent that works like a human inside an organization — answering emails, managing tickets, coordinating with others — you first need to ingest how humans organize themselves to work. The Slack threads where someone disagrees and someone yields and someone makes a decision that no one formally documented. It's Conway's Law read backwards: the product needs to absorb the organization in order to function like one.

The map of the Empire

Borges wrote in 1946 a one-paragraph story titled "On Exactitude in Science." In it, the cartographers of an empire create a map so detailed it coincides with the territory point by point. Succeeding generations find it useless and abandon it to the elements.

Reinforcement learning gyms aspire to be one-to-one maps of organizations: simulations built from real data about how real people worked. The map of organizational behavior that coincides with the territory.

But there's something Borges didn't anticipate, because his story is about consented maps: his empire's cartographers worked on commission. The workers whose conversations feed these gyms commissioned nothing. They didn't sign up for their ten-o'clock-at-night frustration to train a system. Marc Rotenberg, founder of the Center for AI and Digital Policy, puts it plainly: "It's not generic data. It's identifiable people."

The streets of San Francisco existed before Google photographed them. But once photographed, they stopped being just streets. They became nodes in a system of navigation, surveillance, and commercial value. Mapping streets generated protests in several countries — blurred faces, captured license plates. Mapping the interior of organizations touches something more intimate: how you think, how you doubt, how you yield.

The territory

In 1999, Carl Shapiro and Hal Varian published Information Rules, the first serious attempt to describe the economics of information goods. Their rules were counterintuitive for anyone coming from industrial economics: an information good is costly to produce but almost free to reproduce — a high first-copy cost and a marginal cost close to zero. It is an experience good: you don't know what it's worth until you consume it. And it is non-rival: using it doesn't deplete it.

Slack messages were always information goods. They always had these properties. But no one had treated them as such, because the demand context didn't exist. A ten-o'clock-at-night conversation about a bug is worth nothing to the person who wrote it. It's worth thousands of dollars to someone who needs to train an agent to fix bugs at ten o'clock at night. The information didn't change. The system that gives it meaning did.

Every act of cartography transforms the relationship of the mapped to their own territory. Streets feel different since they're in Google Maps. Oceans, since a satellite is watching. And the interior of organizations — that terra incognita that social scientists had spent a century trying to map with artisanal means — will feel different once we understand that every message we write is, potentially, a training data point.

Shapiro and Varian warned that in the information economy, value doesn't reside in the object but in the connection. Streets, conversations, operational noise — everything that seemed valueless acquires value the instant someone discovers how to connect it to a new system. And unlike streets, which can only be photographed, information goods can be copied, combined, and reused infinitely. That is the rule that changes everything.

2026 © Íñigo Medina