Alchemy

Part VI · The Investment Thesis

VI.A — The Investment Thesis

12 min read · 2,210 words

The preceding parts established a production function: energy structured through computation and disciplined by selection. They established a hurdle rate: the floor created by Bitcoin’s direct conversion of electricity to globally liquid value. They established actuation bottlenecks: the physical, institutional, and regulatory constraints that gate deployment. They established market participation requirements: identity, settlement, verification, liability, and the term structure necessary for multi-period coordination.

This section asks what these dynamics imply for capital allocation. The answer is not a set of predictions about which firms will win. It is a structure for understanding where value accrues during paradigm transitions, and where skepticism is warranted toward advantages that rest on commoditizing foundations.


Every techno-economic paradigm creates a characteristic pattern of value capture. Understanding the pattern matters more than predicting the winner.

David Teece formalized this observation.1 When a core technology diffuses faster than any single firm can appropriate it, returns migrate to owners of complementary assets: the capabilities, relationships, and institutional positions that the technology requires but cannot replicate. The core technology becomes necessary infrastructure; the complementary assets become the bottleneck. The bottleneck captures the margin.

The pattern recurs across paradigm transitions. During the railroad era, the technology of steam locomotion commoditized rapidly. Multiple firms could build engines to similar specifications. The durable positions were the land grants that determined which routes could be built, the terminal facilities that controlled access to urban centers, the regulatory franchises that excluded competitors. During the electrification era, the technology of generation commoditized. Turbines and dynamos became standardized equipment. The durable positions were the transmission networks that moved power from generation to load, the regulatory territories that granted exclusive service rights, the customer relationships that created switching costs.

In both cases, returns migrated from the core technology to the complementary assets as the core technology diffused. The pattern is structural, not accidental.


The resulting structure admits a schematic representation. Below lies a sea of cognitive capability, expanding month by month and accessible at diminishing cost to anyone with electricity and a network connection. Models multiply; training costs per unit of capability fall along curves engineers project with increasing confidence; inference becomes infrastructure rather than competitive advantage. Work that constituted professional differentiation a decade ago is now a service accessible through an API, and the work that differentiates today will follow the same trajectory in its turn.

At the surface lies an authorization membrane: the credentials, licenses, and legal standing that convert cognitive output into binding consequence. A model can draft the contract; a licensed attorney signs it. A model can recommend the prescription; a credentialed physician authorizes it. A model can generate the trade; a registered broker executes it. This membrane does not expand with computation. It expands through the slow processes by which professional schools admit candidates, licensing boards confer authority, and legislatures define the scope of regulated activity. The membrane is thin by design—accountability constrains supply—and its thinness is not inefficiency but a feature, the institutional mechanism that ensures consequential actions remain traceable to parties capable of bearing responsibility.

Above lies infrastructure ownership: the generators and transmission systems that supply electrical power, the fabrication facilities that produce semiconductors, the data centers where computation occurs, the physical plant that converts capital into productive capacity over timelines measured in years. These assets are durable, physical, and constrained by factors software cannot circumvent. Their owners set terms for everyone operating below.

Returns distribute according to this topology. Value drains from the cognitive sea as capability commoditizes, margins compressing toward the cost of electricity and silicon. Value accumulates at the membrane, where authorization is scarce and converts potential into permission, and at the apex, where infrastructure ownership determines who participates. The gradient of returns follows the gradient of constraint.


Cognitive capability has the characteristics of a commoditizing core technology.

Foundation models improve through scale, data, and architectural innovation. Each improvement diffuses. Training techniques that produce frontier capability become public knowledge within months of publication; open-weight models compress the gap between frontier and commodity; inference costs fall as hardware improves and competition intensifies. The pattern resembles semiconductor manufacturing: the frontier advances, but the distance between frontier and commodity shrinks rather than expands.

This does not mean cognitive capability is worthless. It means durable advantage does not reside in the capability itself. Frontier general-purpose capability erodes as a moat. What persists is the integration of capability with distribution, data, workflow, and institutional relationships. A model embedded in an enterprise workflow with proprietary data and switching costs has defensibility that raw capability lacks. The moat is not the model but what surrounds the model.

The electricity analogy clarifies the layer economics. Base inference—the raw provision of tokens per dollar—may come to resemble utility economics: necessary, competitive, modest margins, with advantage accruing to scale and cost structure rather than differentiation. But layers above base inference retain software economics: integrated workflows that accumulate switching costs, vertical applications with domain-specific data, feedback loops where usage improves the product. The error is treating cognitive capability as a single category when different layers exhibit different dynamics. The commodity layer and the integrated layer will have different margin structures, different competitive dynamics, and different investment implications.


The complementary assets are the actuation constraints established in Part V: physical throughput, trusted interfaces, verification infrastructure, liability capacity. These are institutional, physical, and regulatory assets that cannot be created by training a larger model. They capture returns as the core technology diffuses.

The term structure, if it emerges, occupies a distinctive position among these assets. The overcollateralized bonding that substitutes for legal enforcement, the two-layer monetary topology that separates settlement from transaction numeraire, and the benchmark rate for multi-period commitments: these constitute infrastructure that cognitive capability requires but cannot replicate. The firms that establish and publish the benchmark rate will occupy structural positions analogous to the creators of dominant reference rates in traditional finance. Liquidity concentrates around the reference; early positions compound as adoption scales.

Whether this infrastructure emerges is not certain. The coordination required is substantial; the regulatory environment is unclear; the existing financial system may adapt faster than new infrastructure can establish itself. The position is high-conviction if the infrastructure emerges, and worthless if it does not. This asymmetry—binary outcome, structural position if successful—characterizes the highest-variance opportunities in the transition.


The hurdle rate from IV.E operates as a filter on investment viability.

A kilowatt-hour routed to inference must generate more value than the same kilowatt-hour routed to Bitcoin mining, or the capacity routes to mining. This creates a floor beneath which cognitive deployments are uneconomic. The floor rises and falls with mining difficulty and Bitcoin price, but the mechanism is structural: energy has an alternative use with deterministic yield.

The investment implication is that cognitive layer investments face continuous margin pressure. As inference costs fall, the floor rises relative to the margin available in any particular application. Applications that cannot maintain spread above the floor become uneconomic regardless of their technical capability. This is not a one-time shakeout but a continuous discipline that operates as long as the energy-to-value conversion remains available.
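The mechanics of the floor can be made concrete with a back-of-envelope comparison. The sketch below is illustrative only: the ASIC efficiency, hashprice, serving throughput, and token pricing are assumed round numbers, not measurements, and the real floor moves with difficulty and Bitcoin price as noted above.

```python
# Illustrative back-of-envelope comparison of the energy hurdle rate.
# All parameter values are assumptions chosen for the sketch, not data.

JOULES_PER_KWH = 3.6e6

# -- Mining side: the deterministic yield per kilowatt-hour --
asic_efficiency_j_per_th = 20.0        # assumed ASIC efficiency, joules per terahash
hashprice_usd_per_ths_day = 0.05       # assumed revenue for 1 TH/s sustained one day
terahash_per_kwh = JOULES_PER_KWH / asic_efficiency_j_per_th
usd_per_terahash = hashprice_usd_per_ths_day / 86_400   # one TH/s-day = 86,400 TH
mining_floor_usd_per_kwh = terahash_per_kwh * usd_per_terahash

# -- Inference side: value generated by the same kilowatt-hour --
tokens_per_kwh = 300_000               # assumed all-in serving throughput per kWh
revenue_usd_per_million_tokens = 1.00  # assumed price captured per million tokens
inference_value_usd_per_kwh = tokens_per_kwh / 1e6 * revenue_usd_per_million_tokens

# The deployment clears the hurdle only if a kWh routed to inference
# out-earns the same kWh routed to mining.
clears_hurdle = inference_value_usd_per_kwh > mining_floor_usd_per_kwh
print(f"mining floor:    ${mining_floor_usd_per_kwh:.3f}/kWh")
print(f"inference value: ${inference_value_usd_per_kwh:.3f}/kWh")
print(f"clears hurdle:   {clears_hurdle}")
```

Under these assumptions the floor sits near $0.10 per kilowatt-hour; an application capturing $0.30 of value per kilowatt-hour clears it, but falling token prices compress that spread toward the floor, which is the continuous discipline the text describes.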

Actuation layer investments face a different calculus. Physical throughput, regulatory licenses, and liability capacity do not compete directly with mining for electrical capacity. Their returns are governed by scarcity of the complementary asset, not by the hurdle rate on energy conversion. The spread between actuation margins and cognitive margins determines which vertical integrations are profitable, and that spread widens as cognitive costs fall while actuation costs remain sticky.


The Coasean logic from V.D predicts firm structure.

When cognitive transaction costs fall but enforcement costs remain high, firm boundaries shift toward the enforcement bottleneck. The discriminator is verification cost. If verification is cheap and standard, markets thicken and activities move outside firms. If verification is expensive or bespoke, firms thicken and internalize the bottleneck.
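The discriminator can be stated as a stylized decision rule. The function below is a sketch, not an operational model: the threshold and the inputs are assumptions introduced here to make the logic explicit.

```python
# Stylized sketch of the make-or-buy discriminator described in the text:
# cheap, standardized verification pushes an activity out to the market;
# expensive or bespoke verification pushes it inside the firm.
# The threshold value and example inputs are illustrative assumptions.

def firm_boundary(verification_cost: float, value_at_stake: float,
                  standardized: bool, threshold: float = 0.1) -> str:
    """Return 'market' when verification is cheap relative to the value
    at stake and standardized; otherwise 'internalize'."""
    cheap = verification_cost / value_at_stake < threshold
    return "market" if (cheap and standardized) else "internalize"

# Trade execution: verification is near-free and standard -> market.
print(firm_boundary(verification_cost=1, value_at_stake=1_000, standardized=True))
# Malpractice exposure: verification is bespoke and costly -> internalize.
print(firm_boundary(verification_cost=400, value_at_stake=1_000, standardized=False))
```

The healthcare and financial-services cases that follow trace exactly this branch: low-cost, standardized verification moves activities to thick markets, while costly or bespoke verification pulls the bottleneck inside the firm.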

Healthcare illustrates the integration case. Cognitive systems can diagnose, recommend treatment, generate documentation. They cannot prescribe, bill, or bear malpractice liability. A vertically integrated health system that employs physicians, owns facilities, manages insurance relationships, and bears liability can deploy cognitive capability at scale while competitors face integration friction at every interface. The liability backstop—the institutional capacity to absorb malpractice exposure—is the bottleneck that determines deployment velocity. A standalone cognitive provider, however capable, faces the question: who signs the prescription?

Financial services illustrate the mixed case. Trading, analysis, and customer service automate readily because verification costs are low: the trade executed or it did not; the analysis can be checked against outcomes; the customer’s issue was resolved or it was not. But custody, settlement, and regulatory compliance require licensed entities with capital requirements and audit obligations. The firms that combine cognitive capability with regulatory infrastructure occupy structural advantage. The broker-dealer license, the banking charter, the insurance underwriting authority—these are the permissions that gate deployment regardless of capability.

The pattern generalizes: wherever liability concentrates, wherever regulation requires licensed entities, wherever physical assets must be controlled, vertical integration provides structural advantage over pure-play cognitive providers. The boundary of the firm traces the boundary of the permission stack.


Geography reasserts itself through the production function.

Computation generates heat; heat must be dissipated; dissipation occurs at physical locations. Energy must be generated, transmitted, and converted. These are physical processes with geographic signatures that algorithmic improvement cannot overcome.

The most valuable locations for computational infrastructure cluster around specific endowments: abundant, cheap, reliable power; favorable climate for cooling; proximity to fiber networks; permissive regulatory environments. These endowments are not evenly distributed. West Texas, with its wind and solar resources and permissive grid interconnection, attracts different infrastructure than Northern Virginia, with its fiber density and proximity to government customers, or Iceland, with its geothermal power and natural cooling. Each location offers a different combination of endowments; each attracts a different type of workload.

The hurdle rate reinforces this concentration. Locations with the cheapest reliable power face the lowest floor, widening the margin available for cognitive deployment and attracting capital that higher-cost locations cannot justify. As the transition proceeds, geographic concentration will increase rather than decrease. The cloud may be virtual; its substrate is not.


Carlota Perez’s framework identifies the phases of paradigm transitions.2

The installation period is characterized by financial excitement, speculative investment, and narrative overshoot. Capital floods into the new technology; valuations detach from near-term fundamentals; the gap between promise and institutional readiness widens. The installation period is when fortunes are made by those who identify the transition early—and lost by those who enter at valuations that assume the deployment period has already arrived.

The deployment period is characterized by institutional adaptation, productivity realization, and the emergence of organizational forms that the technology enables. The complementary assets develop; the actuation bottlenecks loosen; the technology becomes embedded in economic life. The deployment period is when the structural positions compound—when the infrastructure built during installation generates the returns that installation-phase valuations anticipated.

The transition between phases is marked by a turning point, often a financial crisis that resets valuations and forces institutional adaptation. The dot-com crash of 2000-2002 was such a turning point: the internet thesis was correct; the timing of installation-phase capital commitment was catastrophic for those who entered at peak valuations.

The structure does not predict when the turning point arrives. It predicts that the pattern of value capture shifts from core technology to complementary assets as the transition moves from installation to deployment. The shift is structural. The timing is contingent. Investors who understand the structure can position for the shift; investors who mistake installation for deployment will find their capital trapped in positions that require patience they do not possess.


Cognitive capability is necessary infrastructure. It is not, by itself, a durable source of competitive advantage. The advantage accrues to those who control what cognitive capability requires but cannot replicate: physical throughput, trusted interfaces, verification infrastructure, liability capacity, and the settlement rails that enable multi-period coordination.

If cognitive capability does not commoditize—if frontier models maintain durable moats through data, distribution, or regulatory capture—then returns will not migrate to complementary assets, and the framework mispredicts. If actuation bottlenecks loosen faster than cognition commoditizes, the complementary assets do not bind as tightly, and the premium accruing to bottleneck owners dissipates. If the term structure fails to emerge, agent-mediated markets remain shallow, and the settlement infrastructure positions capture nothing. The pattern of which activities move to markets versus remain inside firms, which sectors transform rapidly versus slowly, and whether the term structure emerges—these are the observable tests.

The framework does not predict which firms will win. It predicts where the winnings will be. The bottlenecks are not cognitive. They are physical, institutional, regulatory, and infrastructural. Capital that positions for the bottlenecks—while maintaining the patience to hold through the installation period and the discipline to avoid overpaying for the thesis—will capture the transition. Capital that chases cognitive capability will find itself holding commoditizing assets in a market that has moved downstream.

The following sections examine how different capital structures can access these positions, what can go wrong, and what the transition implies for the broader distribution of economic outcomes.