Nuclear Learning Rates

In the late 1970s, there was a sense that renewables were just around the corner. I grew up in Colorado, and it was a time when tract homes were being offered with solar hot water heating. Carter's White House installed solar hot water heating. Photovoltaic cells were available in hobby stores, letting you watch a motor spin on energy from the sun. It was an exciting time.

Concurrently, there was a rising tide against nuclear. Movies from the previous decades showed what happened to people and creatures exposed to radiation. From Godzilla to The Hulk, radiation made scary things happen. Serious films such as The China Syndrome showed corporations driven by greed making very bad decisions, suggesting that greed combined with nuclear power could all but burn a hole through the entire earth.

These two forces, the belief that renewable energy was imminent and the belief that nuclear was dangerous, pushed polite society towards…more fossil fuel. Nobody was willing to pay for true renewable energy in the '80s (very expensive, spotty availability), nor were they willing to accept nuclear (too dangerous). And so the powers-that-be continued the status quo: fossil fuel.

Since the late 1970s, the US has emitted about 5B tons of CO2 annually, and about 1/3 of that comes from electricity generation. Over the 40 years from 1980 to 2020, that works out to perhaps 70B T of CO2 that could have been avoided if the US had, optimistically, switched to nuclear on a dime in 1980.
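A quick back-of-the-envelope check of that figure, as a minimal Python sketch. The inputs are the rough estimates above, not precise inventory data:

```python
# Rough avoided-emissions estimate using the figures cited above.
annual_us_co2_tons = 5e9     # ~5B tons of CO2 emitted per year
electricity_share = 1 / 3    # ~1/3 of emissions from electricity generation
years = 2020 - 1980          # the 40-year window

avoided_tons = annual_us_co2_tons * electricity_share * years
print(f"Avoided CO2: ~{avoided_tons / 1e9:.0f}B tons")  # ~67B, i.e. "perhaps 70B"
```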

More importantly, the innovation associated with the move would have driven generation costs down. It's hard to know precisely how far costs could have fallen given continued innovation, but Peter Lang has taken a look at this and made a compelling case in a paper titled "Nuclear Power Learning and Deployment Rates; Disruption and Global Benefits Forgone". In it, he argues that had nuclear innovation continued and enjoyed the same learning curves as computers, TVs, airplanes, etc., the cost of nuclear energy today would be just 10% that of coal, and we'd avoid 11B T of CO2 annually.
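The mechanism behind that claim is the classic learning curve (Wright's law): each doubling of cumulative deployment cuts unit cost by a roughly fixed fraction. Here is a minimal sketch of how many doublings it takes to reach 10% of the starting cost, assuming an illustrative 20% learning rate; the specific rates Lang estimates in his paper vary by era and country:

```python
import math

# Wright's law: cost after d doublings = initial_cost * (1 - LR)^d,
# where LR is the learning rate. The 20% figure below is an
# illustrative assumption, not the rate Lang estimates.
def unit_cost(initial_cost, learning_rate, doublings):
    """Unit cost after some number of doublings of cumulative capacity."""
    return initial_cost * (1 - learning_rate) ** doublings

learning_rate = 0.20    # assumed: 20% cost drop per doubling
target_fraction = 0.10  # cost falling to 10% of the starting point

# Solve (1 - LR)^d = target  =>  d = ln(target) / ln(1 - LR)
doublings = math.log(target_fraction) / math.log(1 - learning_rate)
print(f"~{doublings:.1f} doublings of cumulative capacity")  # ~10.3
```

At a 20% learning rate, roughly ten doublings are needed, which is the point of the paper's counterfactual: halting deployment halts the doublings, and with them the cost declines.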

The bitterest pill to swallow today is that while wealthy economies such as LA and Seattle can afford to overpay for cleaner electricity, poorer economies cannot. If you are just entering the global middle class and getting your first taste of routine travel by automobile, you'd much rather buy an ICE car, which delivers far more bang for the buck. It's the same story with electricity generation. Some find it hard to believe that China and India will emit more CO2 from coal in 2040 than the US and EU did combined in 2000, but that is the economic reality of the situation.

What should have happened, ideally, was that wealthy, innovative nations developed small, inherently safe reactors that could be buried in the ground and burn their nuclear fuel to exhaustion, leaving no waste. That dream is still being chased by Bill Gates with his investments in TerraPower, but it has been delayed by 10 to 30 years, due in large part to the 1980s belief that renewables were imminent. They were not.

Coal in the Decades Ahead