Tech giants and other major energy users, including Amazon, Google, Meta, Dow, Occidental, Allseas, and OSGE, have signed a pledge supporting the goal of at least tripling global nuclear capacity by 2050.
Isn’t the much simpler and more likely explanation that they know these models will become far more efficient soon and won’t need that much compute power?
This efficiency-focused perspective makes more business sense than the pessimistic view that “AI is a nothingburger.” If Microsoft truly believed AI had no future, they wouldn’t have invested so heavily in OpenAI in the first place.
The decision to cancel 1000GW of future builds could simply reflect Microsoft’s confidence that they can achieve their AI goals with less infrastructure due to coming efficiency improvements, rather than a fundamental doubt about AI’s potential.
If AI were actually going to replace every human worker on Earth, then even with huge efficiency gains you’d still want to build out absolutely ungodly amounts of compute capacity.
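To make that concrete, here is a rough back-of-envelope sketch in Python. Every number in it (queries per day, energy per query, PUE, the 10x efficiency gain, the 100x usage growth) is a hypothetical assumption for illustration, not a figure from any of the companies mentioned; the only point is that usage growth can easily outrun per-query efficiency.

    # Back-of-envelope sketch with made-up numbers: even a 10x
    # per-query efficiency gain can be swamped by growth in usage.

    def datacenter_power_gw(queries_per_day: float,
                            joules_per_query: float,
                            pue: float = 1.2) -> float:
        """Average power draw in gigawatts for a given query load."""
        seconds_per_day = 86_400
        watts = queries_per_day * joules_per_query / seconds_per_day * pue
        return watts / 1e9

    # Baseline scenario: 1 billion queries/day at an assumed 3 kJ per query.
    baseline = datacenter_power_gw(1e9, 3_000)

    # "AI replaces a lot of human work": 100x more queries, but models
    # become 10x more efficient per query.
    scaled = datacenter_power_gw(1e11, 300)

    print(f"baseline ~{baseline:.2f} GW, scaled ~{scaled:.2f} GW")
    # baseline ~0.04 GW, scaled ~0.42 GW -> net demand still grows ~10x

Under those assumptions, total power demand still rises an order of magnitude despite the efficiency gain, which is why "we'll need less compute" and "AI will do everything" are hard positions to hold at the same time.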
Also, it should be noted that while DeepSeek has demonstrated that it’s possible to substantially reduce the compute requirements for transformer-based models, doing so relies heavily on a “good enough” approach that moves the results further away from being enterprise-capable. It’s not a cut-and-dried solution to the backend costs of running these models at the scale that investors want to see them running.