Part III · The Physics of Thought
III.A — The Cost of Forgetting
Information is physical.
— Rolf Landauer, IBM Journal of Research and Development (1961)
The previous part traced capital’s migration toward whoever controls the binding constraint. In the emerging regime, that constraint is energy-structured-into-computation. But computation is not an abstraction; it is a physical process, subject to physical limits. Those limits determine what the new regime can and cannot do—and understanding them is prerequisite to understanding who will control it.
The marriage of information theory and thermodynamics was consummated in a single inequality, published by Rolf Landauer in 1961. The result is deceptively modest: erasing one bit of information requires dissipating at least $k_B T \ln 2$ joules of energy, where $k_B$ is Boltzmann's constant and $T$ is the absolute temperature.[1] At room temperature this works out to about $2.9 \times 10^{-21}$ joules—a number so small it seems academic, almost decorative. But its implications are structural, not numerical. It establishes that computation has a thermodynamic floor, and that floor cannot be negotiated away by cleverness alone.
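A back-of-the-envelope check makes the scale concrete. The sketch below (plain Python, using the exact SI value of Boltzmann's constant) evaluates $k_B T \ln 2$ at a few operating temperatures:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact since the 2019 SI redefinition)

def landauer_bound(temperature_k: float) -> float:
    """Minimum dissipation per erased bit, in joules."""
    return K_B * temperature_k * math.log(2)

# Room temperature, liquid nitrogen, liquid helium
for t in (300.0, 77.0, 4.2):
    print(f"T = {t:6.1f} K  ->  {landauer_bound(t):.3e} J per erased bit")
# T = 300 K gives ~2.87e-21 J; cooling lowers the floor but never removes it.
```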
The bound applies not to “thinking” in general, but to the moments when a physical system is forced to forget. Computation, in its logical form, often involves operations that are logically irreversible—gates that discard information about which inputs produced a given output.[2] This logical irreversibility must be physically compensated. The Second Law of Thermodynamics forbids the total entropy of a closed system from decreasing. When a computational system erases information—when it reduces the number of distinguishable microstates in the memory register—that entropy must go somewhere. It goes into the environment as heat.
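To see the irreversibility concretely, consider the AND gate: four input states, two output states. A minimal enumeration (illustrative Python, not drawn from any source cited here) shows the collision:

```python
from itertools import product

# AND maps four input states onto two outputs; output 0 has three preimages.
# The discarded distinguishability is exactly what Landauer's bound prices.
preimages = {}
for a, b in product((0, 1), repeat=2):
    preimages.setdefault(a & b, []).append((a, b))

for out, inputs in sorted(preimages.items()):
    print(f"output {out}: produced by {inputs}")
# output 0: produced by [(0, 0), (0, 1), (1, 0)]
# output 1: produced by [(1, 1)]
```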
Landauer’s limit is not a technological barrier. It is a thermodynamic law—as fundamental as the prohibition on perpetual motion.
Modern transistors operate roughly six orders of magnitude above the Landauer limit. A state-of-the-art processor in 2024 dissipates on the order of $10^{-15}$ joules per logical operation, while the Landauer minimum at room temperature is about $2.9 \times 10^{-21}$ joules.[3] This gap represents the engineering overhead: the energy lost to resistance, capacitance, imperfect switching, clock distribution, and all the other realities of silicon at scale.
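Restating the gap numerically (the $10^{-15}$ J/op figure is the order of magnitude implied by “six orders above the limit,” not a measured datum):

```python
import math

LANDAUER_300K = 1.380649e-23 * 300 * math.log(2)  # ~2.87e-21 J per bit
CMOS_2024 = 1e-15  # J per logical operation; order-of-magnitude assumption

gap = CMOS_2024 / LANDAUER_300K
print(f"gap: {gap:.2e}x, i.e. ~{math.log10(gap):.1f} orders of magnitude")
# gap: 3.48e+05x, i.e. ~5.5 orders of magnitude of engineering overhead
```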
The gap is both a ceiling and an opportunity. A millionfold improvement is theoretically available before physics imposes an absolute stop. But the room is shrinking. Dennard scaling—the principle that as transistors shrink, power density remains constant—held efficiency gains on a stable trajectory until approximately 2006, after which it broke down.[4] Since then, the approach toward the Landauer floor has slowed, but it has not stopped.
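The logic of Dennard scaling fits in one line. Under the constant-field recipe of the 1974 paper, every linear dimension and the supply voltage shrink by a factor $\kappa > 1$ per generation, so capacitance $C$, voltage $V$, and gate delay all scale as $1/\kappa$:

$$P_{\text{transistor}} = C V^2 f \;\propto\; \frac{1}{\kappa}\cdot\frac{1}{\kappa^2}\cdot\kappa \;=\; \frac{1}{\kappa^2}, \qquad A \propto \frac{1}{\kappa^2} \;\Rightarrow\; \frac{P}{A} = \text{const.}$$

Power per transistor falls exactly as fast as transistor area, so power density stays constant. The recipe failed when supply voltage hit a floor set by threshold voltage and leakage: with $V$ no longer scaling, shrinking transistors began raising power density instead of holding it flat.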
Charles Bennett showed in 1973 that logically reversible computations can in principle be performed with arbitrarily low energy dissipation; entropy increases only when information is discarded.[2] The tradeoff—storing all intermediate states, accepting slower circuits—is unfavorable for current applications, but the theoretical point stands: Landauer’s limit applies to erasure, not to computation as such.
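Bennett’s construction can be illustrated in a few lines. The sketch below (illustrative Python; the Toffoli gate is the standard textbook example, though Bennett’s argument is more general) computes AND without discarding anything, by keeping the inputs and writing the result into a spare bit:

```python
from itertools import product

def toffoli(a: int, b: int, c: int) -> tuple:
    """(a, b, c) -> (a, b, c XOR (a AND b)): AND computed into an ancilla bit."""
    return (a, b, c ^ (a & b))

for state in product((0, 1), repeat=3):
    once = toffoli(*state)
    assert toffoli(*once) == state  # the gate is its own inverse
    print(f"{state} -> {once}")
# The map is a bijection on all 8 states: no two inputs collide, nothing is
# erased, so no Landauer cost is mandatory. The price is the retained inputs
# and ancilla, i.e. the stored intermediate state the paragraph above mentions.
```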
Seth Lloyd extended the analysis to ask what maximum computational rate a physical system of fixed energy and size could achieve.[3] His answer—drawing on both quantum mechanics and thermodynamics—suggests approximately thirty orders of magnitude of theoretical headroom between current systems and the ultimate physical limit. But this headline number is misleading. The Lloyd limit assumes perfect utilization of all matter as computational substrate, a capability that remains entirely speculative. Physics does not cap computation at human-relevant scales; engineering and economics do. The thermodynamic limits are far away. The engineering limits are much closer. And the economic limits—the cost of energy, the scarcity of fabrication capacity, the time required to build infrastructure—are binding now.
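The headline number can be reproduced from the Margolus–Levitin bound Lloyd uses, which caps the rate of distinguishable state changes at $2E/\pi\hbar$ per second. For his canonical one-kilogram “ultimate laptop” (the cluster figure below is an order-of-magnitude assumption added for comparison):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

energy = 1.0 * C**2                      # 1 kg converted entirely: ~9.0e16 J
max_ops = 2 * energy / (math.pi * HBAR)  # Margolus-Levitin rate limit
print(f"{max_ops:.2e} ops/s")            # ~5.4e50 operations per second

cluster_2024 = 1e19  # op/s for a large GPU cluster; rough assumption
print(f"headroom: ~{math.log10(max_ops / cluster_2024):.0f} orders of magnitude")
# ~32 orders: the "approximately thirty" cited above, under these assumptions.
```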
The Landauer and Lloyd limits matter for economics because they establish boundary conditions on a quantity that is becoming a primary factor of production. Computational work is bounded: $N \le E / (k_B T \ln 2)$, where $E$ is the energy budget allocated to computation and $N$ is the number of irreversible bit operations that budget can fund. This is a hard constraint, not a soft one. No amount of algorithmic ingenuity can compute past it.
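As arithmetic, the bound converts an energy budget directly into an operations budget. A sketch with an illustrative facility (100 MW for one year is an assumption, not a datum from the text):

```python
import math

LANDAUER_300K = 1.380649e-23 * 300 * math.log(2)  # ~2.87e-21 J per bit

power_w = 100e6                          # assumed 100 MW facility
energy_j = power_w * 365.25 * 24 * 3600  # one year of operation: ~3.16e15 J

print(f"{energy_j / LANDAUER_300K:.1e} irreversible ops at the Landauer floor")
print(f"{energy_j / 1e-15:.1e} ops on ~1e-15 J/op hardware")
# ~1.1e36 vs ~3.2e30: the six-orders gap restated in operations, and the
# ceiling that no algorithmic ingenuity can move for a fixed energy budget.
```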
The constraint becomes binding under two conditions. First, energy scarcity: if total energy throughput is limited by resource constraints, climate policy, or cost, then computation faces a ceiling. Second, thermodynamic maturity: if devices approach Landauer efficiency, further improvement requires either more energy or lower temperatures. Neither condition is imminent, but both are visible on the trajectory. AI training runs in 2024 already draw megawatts of continuous power. If model capabilities continue to scale with compute, and compute continues to require energy, then the energy-computation-output linkage becomes a structural feature of the economy, not a footnote.
Where compute is the bottleneck, advantage accrues to whoever controls the cheapest reliable joules delivered to silicon at scale. This is the economic translation of Landauer’s principle. The physics sets the floor; the engineering determines how far above the floor we operate; and the economics determines who can afford to operate at scale. A company with access to cheap, reliable power and efficient hardware has a structural advantage over a company without—not because of cleverness, but because of thermodynamics.
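The translation into prices is direct. The sketch below turns an electricity price into operations per dollar; both the prices and the efficiency figure are illustrative assumptions:

```python
import math

LANDAUER_300K = 1.380649e-23 * 300 * math.log(2)  # ~2.87e-21 J per bit
J_PER_KWH = 3.6e6

def ops_per_dollar(price_per_kwh: float, joules_per_op: float) -> float:
    return (J_PER_KWH / price_per_kwh) / joules_per_op

for price in (0.03, 0.10):  # $/kWh: cheap hydro vs. average grid (assumed)
    print(f"${price:.2f}/kWh: {ops_per_dollar(price, 1e-15):.1e} ops/$ today, "
          f"{ops_per_dollar(price, LANDAUER_300K):.1e} ops/$ at the floor")
# At any fixed efficiency, a 3x cheaper joule buys 3x more computation per
# dollar: the power contract is a structural term in the cost of thought.
```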
One way to understand the Landauer limit is to distinguish between exponential and constant factors in the cost of computation. The exponential factors—Moore’s Law, algorithmic improvements, architectural innovations—have driven the millionfold decrease in cost per operation over the past half-century. These factors are engineering achievements, not physical constants, and there is no guarantee they continue indefinitely.
The constant factor—$k_B T \ln 2$—is physics. It sets a floor that no amount of engineering can breach without changing the rules: lowering the temperature, or discovering new physics that the current framework does not anticipate.
The historical trajectory of computing has been dominated by the exponential factors. We have moved from vacuum tubes (~$10^{-3}$ J/op) to discrete transistors (~$10^{-9}$ J/op) to modern CMOS (~$10^{-15}$ J/op), a twelve-order-of-magnitude improvement. But the Landauer floor is at roughly $3 \times 10^{-21}$ J/op, meaning only six orders of magnitude remain before the constant factor dominates.
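Put differently, six orders of magnitude is about twenty doublings ($10^6 \approx 2^{20}$). The sketch below runs that out at two illustrative doubling periods (assumed rates in the spirit of historical efficiency trends, not a forecast):

```python
import math

doublings = 6 * math.log2(10)  # six orders of magnitude: ~19.9 doublings
for years_per_doubling in (1.6, 2.6):  # assumed efficiency-doubling periods
    print(f"{years_per_doubling} yr/doubling -> "
          f"~{doublings * years_per_doubling:.0f} years of runway")
# ~32 and ~52 years: the exact date is not the point; the decade scale is.
```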
This is why the thermodynamic perspective matters for understanding the emerging regime. The era of costless scaling has a physical endpoint. After that endpoint, more computation requires more energy, full stop. The constraint may not bind today, but it will bind eventually—and the infrastructure decisions being made now will determine who can operate at scale when it does. Landauer’s floor is not a curiosity for physicists; it is the foundation on which the economics of Factor Prime will be built. The next section examines how much room remains, and what closing the gap would require.

Footnotes

1. Rolf Landauer, “Irreversibility and Heat Generation in the Computing Process,” IBM Journal of Research and Development 5, no. 3 (1961): 183–191.
2. Charles H. Bennett, “The Thermodynamics of Computation—a Review,” International Journal of Theoretical Physics 21, no. 12 (1982): 905–940.
3. Seth Lloyd, “Ultimate Physical Limits to Computation,” Nature 406 (2000): 1047–1054.
4. Robert H. Dennard, Fritz H. Gaensslen, Hwa-Nien Yu, V. Leo Rideout, Ernest Bassous, and Andre R. LeBlanc, “Design of Ion-Implanted MOSFET’s with Very Small Physical Dimensions,” IEEE Journal of Solid-State Circuits 9, no. 5 (1974): 256–268.