Alchemy

Part II · Capital as Regime Artifact

II.B — The Key Input

7 min read · 1,395 words

If capital migrates toward whatever relaxes the binding constraint, then the question for any era is: what input, once scarce or expensive, has become abundant enough to reorganize production around it? Each major technological revolution has produced such an input—a resource or capability whose cost dropped so dramatically that it became economical to redesign entire systems around its availability.

Carlota Perez, a Venezuelan-British economist who has spent decades studying the relationship between technological change and financial cycles, calls this the “key input” of a techno-economic paradigm.¹ The concept is precise: not merely a useful commodity, but one whose cost structure has shifted enough to create a new design logic.

Consider two cases in detail: the first industrial revolution and the information age.


Arkwright’s water-frame, patented in 1769, made it possible to spin cotton thread by machine rather than by hand. The machine was not a marginal improvement; it was a discontinuity. A single water-powered mill could do the work of dozens of cottage spinners, and it could run day and night as long as the river flowed. The key input was not the machine itself but what the machine unlocked: cheap cotton thread in volumes that had never existed before. The thread created a market for power looms; the looms created a market for bleaching and dyeing at industrial scale; the dyes created a market for chemical synthesis. The factory system emerged not because someone had a theory about organizing labor, but because the economics of machine production required workers, machines, and power in one place, under one discipline. The organizational innovation followed the physical constraint.

Two centuries later, Intel’s 4004 microprocessor made it possible to put computation on a chip small enough to embed in devices of all kinds. The discontinuity was not processing power per se—mainframes had been powerful for decades—but cost per operation. What had required a room-sized machine and a team of operators could now be done by a component costing a few dollars. The key input was cheap, embeddable computation. The design consequences cascaded: personal computers, then networked computers, then phones and cars and appliances with processors inside, then the internet as a coordination layer, then platforms built on the assumption that everyone could compute and communicate at near-zero marginal cost. The organizational innovations—modular production, outsourcing, just-in-time logistics, platform business models—followed the physical constraint.

The remaining paradigms fit the same pattern. Steam and railways reorganized geography itself: before the Liverpool-Manchester line opened in 1830, overland transport was slower than it had been under the Roman Empire; within a generation, goods and people moved at speeds that would have seemed miraculous to their grandparents. The key input was coal and mechanical power; the organizational innovation was the hierarchical corporation capable of coordinating complex operations across hundreds of miles of track, with standardized schedules and signals.

Steel and electricity enabled a different kind of scale: vertical integration, where a single firm controlled everything from raw ore to finished product. Carnegie’s Bessemer plants and Edison’s power stations were the exemplars—cheap structural metal and distributed power made possible factories, bridges, and buildings that the iron age could not have supported.

Oil and mass production reorganized how people lived. The Ford Model T and the moving assembly line created the twentieth-century landscape: suburbs, highways, shopping centers, logistics networks, consumer credit, and the assumption that personal mobility was a default condition rather than a luxury.

Each transition required decades to unfold, and each produced its own financial bubbles, crashes, and institutional crises as the old common sense gave way to the new. The question is whether we are now in the early phase of another such transition—and if so, what the key input is.


The assets decisive in one paradigm become necessary but not sufficient in the next. Factories did not disappear when software became the key input; they became commoditized, insufficient on their own to confer advantage. Software and network effects did not disappear when something else became the key input; they became table stakes, necessary but no longer differentiating. The question is always: what is the current binding constraint, and who has permission to relax it?

The candidates for the emerging paradigm are visible enough. Since 2017—and especially since 2022—the development of foundation models has produced capabilities that were not anticipated even a few years earlier. These models are not simply better software; they perform cognitive tasks that previously required human labor: drafting, summarizing, coding, analyzing, translating, reasoning through problems. They require enormous computation to train—frontier models now cost hundreds of millions to billions of dollars, most of which is spent on hardware and electricity—and significant computation to run at inference scale.

The physical specifics matter, and they divide into three constraint categories.

Compute. A single training run for a frontier model may consume tens of gigawatt-hours of electricity over several months. The chips that perform this computation are supply-constrained at multiple points: TSMC’s advanced packaging capacity, Nvidia’s allocation of H100 and successor GPUs, the availability of high-bandwidth memory. These bottlenecks create queues that money alone cannot clear; lead times for frontier-grade hardware now extend twelve to eighteen months even for well-capitalized buyers.
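To make the electricity figure concrete, a back-of-envelope conversion turns total training energy into sustained power draw. The numbers below are illustrative round figures chosen to match the ranges in the text (“tens of gigawatt-hours,” “several months”), not measurements from any specific model:

```python
# Back-of-envelope: what sustained power draw does a frontier
# training run imply? All inputs are illustrative assumptions.

energy_gwh = 30        # assumed total training energy ("tens of GWh")
duration_days = 90     # assumed training duration ("several months")

hours = duration_days * 24                  # 2,160 hours
avg_power_mw = energy_gwh * 1_000 / hours   # GWh -> MWh, spread over the run

print(f"{avg_power_mw:.1f} MW sustained")   # ≈ 13.9 MW
```

Under these assumptions a single run draws roughly 14 MW continuously for three months—on the order of a small factory, before accounting for cooling overhead or the many runs a frontier lab conducts in parallel.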

Power. The data centers that host these workloads are industrial facilities with power demands measured in hundreds of megawatts—comparable to aluminum smelters or steel mills. Interconnection queues in major grid regions (ERCOT, PJM, CAISO) now exceed 1,500 GW of requested capacity, with processing timelines stretching to four years or more for utility-scale projects. The constraint is not generation alone but transmission, substations, and the physical infrastructure required to move electrons from source to load.

Supply chain. Cooling systems, water permits, transformer lead times, and the availability of skilled labor for construction and operations all create additional bottlenecks. A hyperscaler announcing a new 500 MW campus is not announcing capacity; it is announcing a multi-year construction project subject to regulatory, logistical, and procurement risk at every stage.

If the pattern holds, the key input of the emerging paradigm is not “artificial intelligence” in the abstract, but something more specific: energy structured into computation, and computation structured into economically useful cognition. The three are not separable. The cognition cannot exist without the computation, and the computation cannot exist without the energy. The cost of a token—the marginal unit of model output—is falling rapidly, but the cost of the infrastructure required to produce tokens at scale is rising. The constraint is shifting from software to physical plant.
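The claim that the marginal cost of a token is falling can be sketched the same way. The hourly hardware cost and throughput below are illustrative assumptions, not measured figures for any particular accelerator or model:

```python
# Back-of-envelope: amortized serving cost per million tokens.
# All inputs are illustrative assumptions.

gpu_cost_per_hour = 2.00   # assumed all-in hourly cost of one accelerator ($)
tokens_per_second = 400    # assumed sustained generation throughput

tokens_per_hour = tokens_per_second * 3600
cost_per_million = gpu_cost_per_hour / tokens_per_hour * 1_000_000

print(f"${cost_per_million:.2f} per million tokens")   # ≈ $1.39
```

The exercise is directional rather than precise: throughput per dollar of hardware keeps improving, so this marginal figure falls even as the capital outlay for the fleet that produces it rises—the divergence the paragraph above describes.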


The hinge moment is 2022: the year foundation models crossed the threshold from research curiosity to deployed capability. ChatGPT’s launch in November of that year was not itself the discontinuity—the underlying models had been developing for years—but it was the moment when the new common sense became visible to the wider economy. Within months, enterprises were experimenting with integration; within a year, inference infrastructure had become a bottleneck. The installation phase, in Perez’s terms, has begun. What remains uncertain is how long the installation phase will last, how severe its characteristic financial crises will be, and which assets will prove decisive when the deployment phase arrives.


Paradigm                      | Marker Event                  | Key Input
Industrial Revolution (1771)  | Arkwright’s mill              | Cotton, water power
Steam & Railways (1829)       | Liverpool-Manchester railway  | Coal, steam
Steel & Electricity (1875)    | Carnegie Bessemer plant       | Steel, electricity
Oil & Mass Production (1908)  | Ford Model T                  | Oil, automobiles
Information (1971)            | Intel 4004                    | Chips, software
Factor Prime (2022–)          | Foundation models deployed    | Energy structured into computation

Table II.1: Six Techno-Economic Paradigms

The table is a simplification—each transition unfolded over decades, and the marker event is a symbol rather than a cause. But the pattern is consistent: a key input whose cost structure shifts dramatically, a cascade of applications that redesign systems around that input, and a reorganization of what counts as capital. The next section examines how institutional boundaries are shifting in the current transition—and what it means that the new key input is, for the first time, capable of performing cognitive work.