A report from Morgan Stanley suggests the datacenter industry is on track to emit 2.5 billion tons of CO2 by 2030, roughly three times what was predicted before generative AI came into play.
The extra demand from GenAI will reportedly push emissions from 200 million tons this year to 600 million tons by 2030, largely due to the construction of more data centers to keep up with demand for cloud services.
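As a rough sanity check on those figures (with one assumption the quote doesn't state: that "this year" means 2024, giving a six-year span), the jump from 200 to 600 million tons works out to about 20% compound growth per year:

```python
# Rough sanity check on the Morgan Stanley figures quoted above.
# Assumption (not in the source): "this year" = 2024, so the rise
# from 200 to 600 million tons spans six years.
start_mt = 200  # million tons CO2, current
end_mt = 600    # million tons CO2, projected for 2030
years = 6       # assumed span, 2024-2030

cagr = (end_mt / start_mt) ** (1 / years) - 1
print(f"{end_mt / start_mt:.0f}x total, roughly {cagr:.0%} per year")
```

A shorter assumed span would push the implied annual growth rate even higher.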
Oh, never mind: bitcoin mining alone is consuming 112.31 TWh annually (it's a guess), while AI is using 29.2 TWh annually (also a guess).
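Putting those two guessed figures side by side (both are the rough estimates quoted above, not measurements), bitcoin mining comes out at nearly 4x the energy attributed to AI:

```python
# Comparing the two estimates quoted above. Both numbers are rough
# guesses from the thread, not measured values.
bitcoin_twh = 112.31  # estimated annual bitcoin mining consumption
ai_twh = 29.2         # estimated annual AI consumption

ratio = bitcoin_twh / ai_twh
print(f"bitcoin mining uses about {ratio:.1f}x the energy attributed to AI")
```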
And AI's usage will eventually go down (it will go up a lot before that, though) as hardware becomes fast enough.
Crypto by design will never decrease.
I’m afraid the power needs for AI will also not decrease. Even if individual models become more efficient and the hardware becomes more AI-optimized, the next logical step is to run even more of those AIs as agents, creating huge chains of thought… So no, AI power usage will increase.
I can guess too! With my guess, AI is already using 420 TWh annually!
What if we didn’t guess at all? This is not just meaningless, but outright misleading.
Well… the numbers were from Wired: https://wired.me/science/energy/ai-vs-bitcoin-mining-energy/
Only they called them “estimates”.