U.S. data centers use more than 90 billion kilowatt-hours of electricity a year, the output of roughly 34 giant (500-megawatt) coal-fired power plants. Global data centers used roughly 416 terawatt-hours (4.16 x 10^14 watt-hours) last year, about 3% of total global electricity and nearly 40% more than the entire United Kingdom consumed. And this consumption is on track to double every four years.
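The "roughly 34 plants" figure can be sanity-checked with back-of-envelope arithmetic. One assumption is needed that the text does not state: real coal plants do not run flat out year-round, so the sketch below assumes a typical utilization (capacity factor) of about 60%.

```python
# Back-of-envelope check of the "34 coal plants" figure.
US_ANNUAL_KWH = 90e9       # U.S. data center usage: 90 billion kWh/year
PLANT_MW = 500             # one "giant" coal plant
HOURS_PER_YEAR = 8760
CAPACITY_FACTOR = 0.6      # assumed typical utilization; not stated in the article

# Annual output of one plant, in kWh (MW -> kW is the factor of 1000).
plant_annual_kwh = PLANT_MW * 1000 * HOURS_PER_YEAR * CAPACITY_FACTOR

plants_needed = US_ANNUAL_KWH / plant_annual_kwh
print(round(plants_needed))  # -> 34
```

At 100% utilization the answer would be closer to 21 plants, so the article's figure is consistent with a realistic, partially loaded fleet rather than nameplate capacity.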
Streaming video has already changed the game, but the explosion of artificial intelligence and internet-connected devices will reshape the entire landscape. AI is the future, and AI is hungry for processing power. The Internet of Things is projected to exceed 20 billion devices by 2020 (some analysts believe we will hit that number this year). Given that there are currently about 10 billion internet-connected devices, doubling that to 20 billion will require an enormous expansion of our data center infrastructure, and with it a correspondingly enormous increase in electricity consumption.
How on earth can we possibly build all the power plants required to supply electricity to twice as many data centers in the next four years? The simple answer is that we can’t.
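To see what "doubling every four years" means in practice, the implied growth rate and trajectory can be worked out from the 416 terawatt-hour baseline (the starting year is the article's "last year"; the 12-year horizon is an illustration, not a claim from the text):

```python
# Doubling every 4 years implies a compound annual growth rate of 2^(1/4) - 1.
BASE_TWH = 416                       # global data center usage, per the article

annual_growth = 2 ** (1 / 4) - 1
print(f"{annual_growth:.1%}")        # -> 18.9% per year

# Projected consumption if the doubling trend holds.
for years in (4, 8, 12):
    print(years, int(BASE_TWH * 2 ** (years / 4)))  # 832, 1664, 3328 TWh
```

Under that trend, global data centers alone would pass 3,000 terawatt-hours within twelve years, several times the United Kingdom's entire current consumption, which is the scale of the build-out the question above is pointing at.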