The Inference Standard
Let's talk about the actual bottleneck of the modern economy.
We spent the last decade arguing over fiat versus Bitcoin. We argued about gold, inflation, and federal interest rates. Podcasters screamed about the money printer. Reddit went to war over whether the dollar was collapsing. Everyone had a very strong opinion about Jerome Powell.
While all of that noise was happening, the real transition took place quietly. In data centers. Behind security fences in rural Virginia and the outskirts of Dallas. Inside concrete buildings with no windows and very serious cooling systems.
The future is not backed by gold. It is not backed by the full faith and credit of the United States government.
It is backed by GPUs.
What Backed What
There was a time when currency meant something physical. The gold standard was simple: every dollar in circulation was backed by a fixed amount of gold sitting in a vault. The currency was just the paper. The gold was the value. You could walk into a bank and trade one for the other.
Then we dropped the gold standard. Nixon ended it in 1971. The dollar became fiat, backed by nothing but trust in the US government and the collective agreement that these green rectangles were worth something. That system ran for fifty years on pure institutional momentum. It still runs today.
But a new standard is forming underneath it. One that most people have not recognized yet because they are too busy watching inflation charts and interest rate announcements.
The Inference Standard.
Here is how it works. Raw electricity gets converted into compute. Compute gets converted into tokens. Tokens get converted into output: code, analysis, content, automation, entire products. That is the value chain now. Every link in it is measurable, tradeable, and already being priced by the market.
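That conversion chain can be sketched as a few lines of arithmetic. Every constant below is an illustrative assumption (the power rate, the GPU draw, the throughput), not a measured figure:

```python
# A toy model of the chain described above: electricity -> compute -> tokens.
# Every constant is an illustrative assumption, not a measured figure.

ELECTRICITY_USD_PER_KWH = 0.08   # assumed industrial power rate
GPU_POWER_KW = 0.7               # assumed draw of one accelerator
TOKENS_PER_GPU_SECOND = 2_000    # assumed inference throughput

def power_cost_per_million_tokens() -> float:
    """Raw electricity cost behind one million output tokens."""
    gpu_seconds = 1_000_000 / TOKENS_PER_GPU_SECOND
    kwh = GPU_POWER_KW * gpu_seconds / 3600
    return kwh * ELECTRICITY_USD_PER_KWH

print(f"${power_cost_per_million_tokens():.4f} of electricity per 1M tokens")
```

Whatever the real constants turn out to be, the point holds: every link in the chain reduces to a number you can price, and the gap between the electricity floor and the market token price is the margin the infrastructure owners capture.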
Gold backed dollars. Compute backs inference. The structure is identical. The asset class is different.
The New Gold Rush
In the gold standard era, economic power meant having the biggest reserves. Whoever had the most gold had the most credible currency. Wars were fought over it. Entire colonial empires were built on top of it.
Now look at what is happening with compute.
In 2026, the five largest US tech companies (Microsoft, Alphabet, Amazon, Meta, and Oracle) collectively committed to spending between $660 billion and $690 billion on capital expenditure. In a single year. Nearly $700 billion. Amazon alone is projecting $200 billion. Alphabet is between $175 and $185 billion. Meta is at $115 to $135 billion.
For context, that is more than the GDP of most countries on Earth. These companies are individually outspending entire nation-states on infrastructure. And they are not doing it to build consumer products. They are doing it to control inference capacity.
They are filling the vaults.
And they are not just buying GPUs. They are buying the energy to run them. Microsoft signed a 20-year deal to restart a nuclear reactor at Three Mile Island. Yes, that Three Mile Island. 835 megawatts of dedicated power for their data centers. Google signed the first US corporate deal to build a fleet of small modular reactors. Amazon is investing over $20 billion to convert a facility into an AI compute campus. Meta signed nuclear power deals for up to 6.6 gigawatts of clean energy capacity by 2035.
A single gigawatt can power roughly 750,000 homes. Meta is locking down enough nuclear capacity to power nearly five million homes. Not for air conditioning. For inference.
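The arithmetic behind that claim, using the figures from the text:

```python
# Checking the homes figure from the text: 1 GW powers roughly 750,000 homes.
GW_CONTRACTED = 6.6        # Meta's stated nuclear capacity target by 2035
HOMES_PER_GW = 750_000     # rule of thumb from the text

homes = GW_CONTRACTED * HOMES_PER_GW
print(f"{homes:,.0f} homes")   # 4,950,000 -- "nearly five million"
```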
These are not R&D budgets. This is not speculative investment. This is the modern equivalent of nations racing to fill their gold reserves. Except instead of digging metal out of the ground, they are building reactors to power the machines that produce intelligence.
Whoever controls the compute controls the reserve currency. Everyone else is just spending it.
The Exchange Rate
Every currency has an exchange rate. The dollar has one against the euro. The euro has one against the yen. These rates shift based on supply, demand, and how much faith the world has in the economy behind them.
Inference has an exchange rate too. It is measured in tokens per dollar. And unlike fiat currencies that inflate, this rate is deflating at a speed that has no historical parallel.
The cost of GPT-3.5-level performance fell more than 280-fold between late 2022 and late 2024. Andreessen Horowitz coined the term "LLMflation" to describe it: for an equivalent level of AI intelligence, cost is dropping by roughly 10x every single year. GPT-4-level output that cost $20 per million tokens in 2023 was running at about $0.40 per million by late 2025.
That is a 50x cost reduction in two years. No other resource in economic history has deflated this fast. Not semiconductors during Moore's Law. Not bandwidth during the dotcom boom. Not storage. Nothing.
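The quoted price points can be checked as rough compounding math. The prices are the ones cited above; the 10x-per-year figure is the a16z rule of thumb:

```python
# Price points quoted above, USD per million tokens at GPT-4-level quality.
price_2023 = 20.00
price_2025 = 0.40

observed = price_2023 / price_2025   # cost multiple over two years
predicted = 10 ** 2                  # ~10x per year, compounded over two years

print(f"observed {observed:.0f}x vs rule-of-thumb {predicted}x")
```

The observed 50x sits within the same order of magnitude as the 100x the rule of thumb predicts; the exact multiple depends on which models you treat as equivalent.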
And here is what makes this different from fiat. The exchange rate is not set by a central bank or a policy decision. It is set by physics, hardware efficiency, and model architecture. No one can inflate the token supply by printing more of them. You have to actually burn the electricity and run the compute.
The exchange rate is a function of engineering, not politics.
Tokens Are Already Money
This is not a thought experiment. Look at how the economy is already being denominated.
Your Claude Pro subscription? Priced in token access. Your company's AI budget? Measured in tokens per month. When an engineering manager decides whether to hire a contractor or use an LLM, they are doing the conversion math: human hours versus token cost for equivalent output.
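That conversion math can be made concrete. Every figure below is hypothetical, chosen only to show the shape of the comparison:

```python
# The hire-versus-inference comparison, with entirely hypothetical numbers.
CONTRACTOR_USD_PER_HOUR = 95.0   # assumed contractor rate
TASK_HOURS = 6                   # assumed human time for the task
API_USD_PER_M_TOKENS = 15.0      # assumed API price per million tokens
TOKENS_FOR_TASK = 400_000        # assumed tokens for equivalent output

human_cost = CONTRACTOR_USD_PER_HOUR * TASK_HOURS
token_cost = API_USD_PER_M_TOKENS * TOKENS_FOR_TASK / 1_000_000

print(f"human: ${human_cost:.2f}   tokens: ${token_cost:.2f}")
```

The real inputs vary wildly by task, but the structure of the decision is always this: two costs, one denominated in hours, one in tokens, converted into the same currency.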
A currency has three jobs. Unit of account (the thing you measure value in). Medium of exchange (the thing you trade for goods and services). Store of value (the thing you hold onto).
Tokens already handle two of those. They are a unit of account. Budgets, API pricing, and infrastructure costs are all denominated in tokens. They are a medium of exchange. You trade tokens directly for productive output. Code, analysis, content, automation. The transaction is literal.
Store of value is the one they do not handle. Tokens are consumed the instant they are created. You cannot save them. You spend them and they are gone. They are pure economic energy, created and burned in the same motion.
That is fine. That is the design. Not every layer of a financial system needs to do all three jobs. The dollar did not replace gold because it was a better store of value. It replaced gold because it was a more efficient medium of exchange. The token is doing the same thing to the dollar. Not killing it. Just making it less relevant at the layer where value actually gets created.
The Layer Underneath
If you work in technology, this shift is already affecting you whether you have language for it or not.
When companies evaluate a SaaS product, they are increasingly asking: how many tokens does this consume, and what do we get back? When a startup pitches investors, the question is shifting from "what is your headcount?" to "what is your inference leverage?" How much output do you produce per employee, amplified by compute?
The companies pouring $700 billion into infrastructure are not doing it because they think AI is a cool trend. They are doing it because they understand that controlling the production of tokens is the single most important economic position of the next century. They are the new central banks. They set the supply. They set the price. They control the reserve.
If you own the compute, you own the backing. If you are buying tokens at market rate from someone else's API, you are just a consumer spending a currency you have no control over. You are economically dependent on someone else's infrastructure in the same way a small country is dependent on the dollar.
This is the Inference Standard. It is not a prediction. It is a description of the transition that is already underway.
And it is only the first layer.
Bitcoin proved a decade ago that raw compute, powered by electricity, can produce something the market recognizes as value. That was the proof of concept.
But it was just the beginning.