The Bolt

In Modern Times (1936) Chaplin's character tightens bolts on an assembly line. Beside him, a foreman watches. He does nothing else: he makes sure the bolt is tightened properly, at the right pace, in the right order. Above them, the factory owner watches everyone from a giant screen. The line speeds up. The character goes mad.
The scene is a parody, but the logic it parodies is still alive. Organizations want to make bolts. They want work to be predictable, measurable, replicable. They want to know how long it takes, how much it costs, and for the result to be practically identical to the plan. With bolts, it works. With software, it doesn't.
Software is not a bolt. It's not a material that behaves deterministically. It wasn't even before the arrival of AI. What you can't see in software is more than what you can: a seemingly simple change can trigger unexpected effects in parts of the system nobody was watching. Digital product organizations live in a tension they cannot resolve: between the focus demanded by software's fractal complexity and the constant urgency generated by its opacity. Our brains, trained on tidy assembly lines where the parts fed in at one end produce the expected output at the other — a car, a hamburger — experience their first crisis here.
The ox
Frederick Winslow Taylor published The Principles of Scientific Management in 1911. His method became the gospel of industrial production: measure every movement, eliminate the useless ones, standardize the rest. He called it "the one best way" — the single correct way to do each task.
To demonstrate his method, Taylor chose a worker at Bethlehem Steel whom he called Schmidt. His job was to load pig iron ingots onto a railcar. The average was 12.5 tons per day. Taylor calculated it should be 47.5. It's a story you can still find today in business management courses.
"Schmidt, are you a high-priced man?" / "Well, I don't know what you mean."
"A high-priced man does exactly as he's told from morning till night. When this man tells you to walk, you walk. When he tells you to sit down, you sit down. And you don't talk back."
Schmidt loaded 47.5 tons. He earned $1.85 instead of $1.15. Taylor considered it a success. But the most revealing thing is not the experiment. It's what Taylor thought of Schmidt:
"One of the very first requirements for a man who is fit to handle pig iron as a regular occupation is that he shall be so stupid and so phlegmatic that he more nearly resembles in his mental make-up the ox than any other type."
That sentence contains in miniature everything that came after. If you assume people are oxen, you design systems for oxen. Assembly lines, foremen, stopwatches. The bolt Chaplin tightens is exactly that: work reduced to a repetitive gesture that neither requires — nor allows — thought.
The knowledge worker
In 1959, Peter Drucker coined a term that should have changed everything: knowledge worker. His argument was that the type of work that was emerging could not be managed like an assembly line. It couldn't be measured in units per hour. It couldn't be standardized. It couldn't be supervised by looking over someone's shoulder.
"No one can motivate him. He has to motivate himself. No one can direct him. Above all, no one can supervise him."
"Knowledge workers must know more about their job than their boss does — or else they are no good at all."
Drucker called the increase in manual worker productivity "the great management contribution of the twentieth century — a fifty-fold increase." And he said the challenge of the twenty-first century would be to do the same with the knowledge worker. But he warned that the methods would be completely different. What motivates the knowledge worker, he wrote, "is what motivates volunteers: they need challenge, they need to know the organization's mission and to believe in it, they need to see results."
Most organizations listened to Drucker, nodded, and kept managing like Taylor.
The Japanese countermodel
While the West was perfecting the assembly line, something different was happening in Japan. Toyota developed a production system based on premises opposite to Taylor's. Instead of a "one best way" designed from above, kaizen proposed that continuous improvement should come from all levels of the organization. Instead of eliminating human variability, the system leveraged it. Workers could stop the production line if they detected a problem — unthinkable in a Taylorist factory.
Takeuchi and Nonaka described in 1986 how the most innovative Japanese teams worked in an integrated fashion, overlapping phases and learning in short cycles. They called it "the new product development game." They didn't know it, but they were describing the direct precursor of what a decade later would be called Scrum.
Jeffrey K. Liker systematized these principles in The Toyota Way (2004): Toyota's competitive advantage came not from technology or capital, but from culture and method. Respect for people. Elimination of waste. Decisions made as close as possible to where the work happens.
"The Toyota style is not to create results by working hard. It is a system that says there is no limit to people's creativity. People don't go to Toyota to 'work', they go there to 'think'."
— Taiichi Ohno
It was, in essence, the antithesis of Taylor. But translating it to Western software organizations turned out to be much harder than it seemed.
Seventeen people in the snow
In February 2001, seventeen developers gathered at a ski resort in the Wasatch Mountains of Utah. They came from different schools — Extreme Programming, Scrum, Crystal, Adaptive Software Development — but shared a frustration: heavy processes, endless documentation, plans that didn't survive contact with reality.
They rejected calling themselves "lightweight methods." Alistair Cockburn explained: "I don't mind the methodology being called light in weight, but I'm not sure I want to be referred to as a lightweight attending a lightweight methodologists meeting. It somehow sounds like a bunch of skinny, feebleminded lightweight people trying to remember what day it is."
What they produced in two days — four values and, weeks later by email, twelve principles — became the Agile Manifesto. Individuals and interactions over processes and tools. Working software over comprehensive documentation. Customer collaboration over contract negotiation. Responding to change over following a plan.
Andrew Grove, who in the mid-1980s piloted Intel's strategic pivot from memory chips to microprocessors, put it in a metaphor the signatories would have endorsed:
"We must recognize that no amount of formal planning can anticipate changes such as globalization and the information revolution. Does that mean that you shouldn't plan? Not at all. You need to plan the way a fire department plans. It cannot anticipate where the next fire will be, so it has to shape an energetic and efficient team that is capable of responding to the unanticipated as well as to any ordinary event."
— Andrew Grove, High Output Management
The irony is that Agile became exactly what it was meant to replace. Martin Fowler, one of the signatories, said bitterly in 2018: "The Agile Industrial Complex imposing methods upon people is an absolute travesty." And he added something that sounds like an involuntary echo of Taylor: "We must fight against the imposition of one best way of doing things."
Dave Thomas, another signatory: "The word 'agile' has been subverted to the point where it is effectively meaningless. What passes for an agile community seems to be largely an arena for consultants and vendors."
A philosophy born to free teams from bureaucracy was industrialized through rigid frameworks, commercial certifications, and compliance checklists.
Taylor would have recognized the pattern.
The self-fulfilling prophecy
In 1960, Douglas McGregor published The Human Side of Enterprise and described two lenses for looking at people. Theory X assumes people avoid work, need constant supervision, and only respond to incentives and punishments. Theory Y assumes people seek responsibility, are intrinsically motivated, and work better when given autonomy.
McGregor never said Y was "the good one." What he did say is more powerful:
"The limits of human collaboration are not limits of human nature but of management's ingenuity in discovering how to realize the potential represented by its human resources."
What he observed is that a manager's assumptions become self-fulfilling prophecies. If you assume people are lazy and need control, you design surveillance systems that produce exactly the behaviour you expected: people who do the minimum, take no initiative, and wait for instructions. If you assume people want to contribute, you design conditions that allow them to, and people respond.
Frederick Herzberg added a crucial nuance: the factors that prevent demotivation are not the same as those that generate motivation. Salary, working conditions, stability — they work like oxygen. If they're missing, everything stops. But more oxygen doesn't make you run faster. What truly motivates is autonomy, mastery of one's craft, purpose, recognition of work well done.
The connection is direct: a Theory X culture tends to manage only the hygiene factors — more stick or more carrot. A Theory Y culture focuses on intrinsic motivators. And most organizations invest enormous energy in optimizing hygiene factors thinking it generates motivation, when all it does is prevent demotivation.
Inertia
With over a century of accumulated evidence, you'd expect organizations to have learned.
They haven't.
Startups publicly declare that they bet on autonomy, that you already have permission, that the bias is toward action — and then design quarterly evaluation systems with traffic-light ratings, four timed sections in every one-on-one meeting, and calibration sessions between managers. The diagnosis is one paragraph long and the prescription is ten pages. Intentionally flat organizations import processes designed for companies with three layers of management, without asking whether the problem those processes solve is a problem they actually have.
It's not an execution error. It's a pattern. Organizations tend to imitate practices without understanding the context that made them work — exactly the same tendency that led to adopting waterfall as a prescription when Royce had described it as an anti-pattern, or turning Agile into a certification system when its creators conceived it as a set of values.
Modern Times, Charlie Chaplin (1936)
The companies that have found alternatives share one trait: they invest their energy in hiring well and creating context, not in measuring performance afterwards. 37signals eliminated 360 reviews after trying them for two years — in hundreds of evaluations, only once was there a significant follow-up. Netflix dropped formal annual reviews for being ritualistic and bureaucratic — Reed Hastings ended up titling the book about their culture No Rules Rules. Shopify banned KPIs in the classic Silicon Valley sense; its CEO invokes Goodhart's law: the moment a metric becomes a target, it ceases to be a useful metric. Haier, with 80,000 employees, eliminated 12,000 middle managers and reorganized into 4,000 autonomous micro-enterprises of 10-15 people.
The common denominator is not the absence of structure. It's that structure serves the work, not the other way around.
There's a thread connecting all these experiences to a deeper transformation in how we understand leadership. David Marquet, commander of a US nuclear submarine, articulated it with a simple distinction: moving from a leader-follower model to a leader-leader model. Instead of a captain giving orders to a crew that obeys, a system where every person has enough authority and context to make decisions. Marquet didn't do it out of philosophy. He did it because on a submarine, if the captain is the only one thinking, people die.
That same logic runs through the digital product organizations that work today. Marty Cagan has spent years insisting on the difference between teams that receive features to build — feature teams — and empowered teams that receive problems to solve. The difference is not semantic. A team that receives an assignment executes. A team that receives a problem thinks. It's the distance between Schmidt loading pig iron and the Toyota worker who can stop the line.
It's no coincidence that the organizations that work best with software are also the ones that have invested most in distributing authority. Software is a material that demands thinking at every layer — and if the capacity to think is concentrated at the top, decisions arrive late, arrive wrong, or don't arrive at all.
Breaking the chain
Taylor wanted to find "the one best way." Chaplin showed where that led. Drucker said it wouldn't work with knowledge workers. The signatories of the Agile Manifesto tried to build something different and watched it become a new version of the same thing. McGregor explained why: the systems you design reveal what you think of your people, and what you think of your people determines how they behave.
Software dissolves our aspirations of total control over the final result. It's not wood, nor metal, nor any material whose behaviour we can anticipate with precision. And organizations that work with it need to accept that nature instead of fighting it. The ones that manage it don't do so by finding the right method — they do it by stopping the search for one.
Chaplin's bolt is still there. The question is whether we keep tightening it or begin to understand that the material we have in our hands demands something else.