Unsurprisingly, the news that China’s DeepSeek AI had leapfrogged competitors triggered an investor sell-off. It dragged down Nvidia, the American chipmaker powering the AI revolution, as well as related tech stocks, from Micron and Advanced Micro Devices to manufacturing juggernaut TSMC. There is, after all, some wisdom in worrying that we don’t know who the ultimate winners in the AI race will be.

Markets don’t always exhibit wisdom, however. The DeepSeek news also created a collateral sell-off of all manner of power-related companies, including GE Vernova, Vistra, Siemens Energy, and Schneider Electric. The sell-off even spread to utilities like Constellation Energy, to upstream energy suppliers like natural-gas driller EQT and pipeline companies Williams and Energy Transfer, and to small modular reactor darlings like Oklo and NuScale Power.

Some analysts were quick to assert that DeepSeek’s leap in AI efficiency—using far less computing horsepower and thus far less electricity—blows a hole in the widely accepted narrative that AI is power-hungry. Some in the climate community are already signaling relief that AI’s magic could be available with a lighter energy footprint.

Regulators and policymakers might think this resolves recent fears about supplying enough electricity to fuel growth, especially for powering data centers. But the idea that a leap in energy efficiency solves the AI power-demand challenge is a misread of reality.

By that logic, investors would have dumped shares in turbine manufacturers, airframe suppliers, and oil companies circa 1958, when Pan Am began commercial service with the then-revolutionary Boeing 707. That aircraft technology leap brought not only far greater fuel efficiency than anything previously seen in commercial air service but also more than triple the passenger capacity. The result wasn’t fewer airplanes and less fuel used, but more of both. And that, in turn, drove the expansion of the associated aviation infrastructure.

We know the same logic applies to computers. The arrival of desktop PCs, at a time when mainframes ruled the land, was made possible by staggering gains in computer energy efficiency. If not for those gains, we wouldn’t have today’s ubiquitous computing, along with its massive energy use. One smartphone, if it had to operate at the energy efficiency of a 1980 computer, would use as much electricity as a football stadium. Similarly, a single (non-AI) data center today, operating at 1980 efficiency, would consume the entire nation’s electricity production.

Thanks specifically to the astonishing gains in computing efficiency, the global digital ecosystem’s energy appetite rose from being negligible circa 1980 to nearly matching global aviation’s energy use by 2020 (again, pre-AI). The energy cognoscenti know this natural “law” as the Jevons Paradox, named for the nineteenth-century economist who observed the same phenomenon back when experts worried there wouldn’t be enough coal to power steam engines unless the engines became far more efficient. The engines got more efficient; coal use boomed.

Add to this a feature of AI that’s quite different. AI requires the complementary and massive use of conventional chips for data collection, processing, storage, and transport. In energy terms, it is as if the rising use of aircraft had also required building more passenger ships. Faster adoption of AI will accelerate the already-massive demand for building, and powering, conventional computer chips.

Just how quickly markets will adopt AI-centric services remains a key question. Analysts and promoters point to case studies, conduct surveys, and offer theories of what businesses and consumers will do with AI. But the underlying economic trend is the best indicator. Progress in twenty-first-century cloud technology is driving down the cost of computing-as-a-service some fiftyfold faster than twentieth-century progress drove down the cost of transportation.

Nvidia CEO Jensen Huang got it right when he said, “DeepSeek is an excellent AI advancement.” But the invisible hand of the market will determine how and where AI gets used, and which AI chip and hardware companies dominate the still-emerging boom. Meantime, the physics and economics of information hardware make it easy to predict that such progress isn’t a “sell” signal but a massive “buy” signal for companies that will power the AI-infused future. And that means the challenge of expanding reliable, affordable electric infrastructure will arrive even faster.
