Resources

Why Energy Is A Big And Rapidly Growing Problem For Data Centers

U.S. data centers use more than 90 billion kilowatt-hours of electricity a year, requiring roughly 34 giant (500-megawatt) coal-fired plants. Global data centers used roughly 416 terawatt-hours (4.16 × 10^14 watt-hours) of electricity last year, about 3% of total global consumption and nearly 40% more than the entire United Kingdom. And this consumption is projected to double every four years.
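Those two U.S. figures are consistent, which a quick back-of-the-envelope check shows. The capacity factor below is my own illustrative assumption (coal plants don't run at full output year-round), not a number from the article:

```python
# Sanity-check: does 90 billion kWh/year really equal ~34 plants of 500 MW?
US_DC_KWH_PER_YEAR = 90e9      # 90 billion kWh/year (from the article)
PLANT_MW = 500                 # one "giant" coal plant (from the article)
HOURS_PER_YEAR = 8760
CAPACITY_FACTOR = 0.6          # assumed: fraction of nameplate output actually delivered

# Average continuous demand: kWh/year -> kW -> MW
avg_demand_mw = US_DC_KWH_PER_YEAR / HOURS_PER_YEAR / 1000
plants_needed = avg_demand_mw / (PLANT_MW * CAPACITY_FACTOR)

print(f"Average demand: {avg_demand_mw:,.0f} MW")   # ~10,274 MW
print(f"Plants needed:  {plants_needed:.0f}")        # ~34
```

At a 60% capacity factor the arithmetic lands almost exactly on the article's 34-plant figure.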

Streaming video has already changed the game, but the explosion of artificial intelligence and internet-connected devices will change the entire landscape. AI is the future, and AI is hungry for processing power. IoT is projected to exceed 20 billion devices by 2020 (some analysts believe we will reach that number this year alone). With roughly 10 billion internet-connected devices today, doubling that to 20 billion will require a massive build-out of data center infrastructure, and with it a massive increase in electricity consumption.

How on earth can we possibly build all the power plants required to supply electricity to twice as many data centers in the next four years? The simple answer is that we can’t.

Bitcoin Mining Now Accounts For Almost One Percent of the World’s Energy Consumption

It is well established that Bitcoin mining — aka donating one’s computing power to keep a cryptocurrency network up and running in exchange for a chance to win some free crypto — uses a lot of electricity. Companies involved in large-scale mining operations know that this is a problem, and they’ve tried to employ various solutions for making the process more energy efficient.

But, according to testimony provided by Princeton computer scientist Arvind Narayanan to the Senate Committee on Energy and Natural Resources, no matter what you do to make cryptocurrency mining hardware greener, it’s a drop in the bucket compared to the overall network’s flabbergasting energy consumption. Instead, Narayanan told the committee, the only thing that really determines how much energy Bitcoin uses is its price. “If the price of a cryptocurrency goes up, more energy will be used in mining it; if it goes down, less energy will be used,” he told the committee. “Little else matters. In particular, the increasing energy efficiency of mining hardware has essentially no impact on energy consumption.”
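The logic behind that claim can be sketched in a few lines: in equilibrium, miners collectively spend up to their revenue on inputs, so the coin's price caps the energy bill. All the numbers below are my own illustrative assumptions (rough 2018-era figures), not values from the testimony:

```python
# Price-sets-the-ceiling model of Bitcoin mining energy use.
# Note that hardware efficiency appears nowhere in this formula,
# which is exactly Narayanan's point.
BTC_PER_DAY = 1800              # assumed issuance: 12.5 BTC reward * ~144 blocks/day
BTC_PRICE_USD = 6500            # assumed spot price
ELECTRICITY_USD_PER_KWH = 0.05  # assumed cheap industrial rate

revenue_per_day = BTC_PER_DAY * BTC_PRICE_USD                 # $/day flowing to miners
max_kwh_per_day = revenue_per_day / ELECTRICITY_USD_PER_KWH   # if every dollar bought power
max_avg_gw = max_kwh_per_day / 24 / 1e6                       # kWh/day -> average GW

print(f"Upper bound on average draw: {max_avg_gw:.2f} GW")    # 9.75 GW
```

Actual estimates come in below this ceiling because miners also have to pay for hardware, but the bound moves with price alone, which is why efficiency gains don't shrink the total.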

In his testimony, Narayanan estimates that Bitcoin mining now draws about five gigawatts of power (a continuous rate, not a per-day total); in May, estimates of Bitcoin power consumption were about half of that. He adds that when you’ve got a computer racing with all its might to earn a free Bitcoin, it’s going to be running hot as hell, which means you’re probably using even more electricity to keep the computer cool so it doesn’t die and/or burn down your entire mining center, driving the overall cost of mining even higher.
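That cooling overhead is commonly expressed as PUE (power usage effectiveness): total facility power divided by the power the computing hardware itself draws. The PUE values below are assumed for illustration, not figures from the testimony:

```python
# Total facility draw = IT load * PUE (PUE of 1.0 would mean zero overhead).
def total_power_mw(it_load_mw: float, pue: float) -> float:
    """Facility power including cooling and other overhead, in MW."""
    return it_load_mw * pue

it_load = 5000.0  # MW of mining hardware (the ~5 GW network estimate above)
for pue in (1.1, 1.5, 2.0):
    print(f"PUE {pue}: {total_power_mw(it_load, pue):,.0f} MW total")
```

Even a modest PUE of 1.5 turns a 5 GW hardware estimate into 7.5 GW at the wall, which is why cooling pushes the true cost well past the raw mining figure.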

Bitcoin driving huge electricity demand, environmental impact

In a normal year, demand for electric power in Chelan County grows by perhaps 4 megawatts — enough for around 2,250 homes — as new residents arrive and as businesses start or expand. But since January 2017, as Bitcoin enthusiasts bid up the price of the currency, eager miners have requested a staggering 210 megawatts for mines they want to build in Chelan County. That’s nearly as much as the county and its 73,000 residents were already using. And because it is a public utility, the PUD (public utility district) staff is obligated to consider every request.

The scale of some new requests is mind-boggling. Until recently, the largest mines in Chelan County used five megawatts or less. In the past six months, by contrast, miners have requested loads of 50 megawatts and, in several cases, 100 megawatts. By comparison, a fruit warehouse uses around 2.5 megawatts.
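The figures above can be put side by side to show just how out of scale the requests are. Everything here comes from the passage itself; the homes-per-megawatt ratio is derived from the "4 megawatts ~ 2,250 homes" equivalence:

```python
# Chelan County mining requests in context (all inputs from the article).
NORMAL_GROWTH_MW = 4        # typical annual demand growth
HOMES_PER_4_MW = 2250       # homes served by that growth
REQUESTED_MW = 210          # total mining load requested since Jan 2017
FRUIT_WAREHOUSE_MW = 2.5    # a typical large local load

homes_equivalent = REQUESTED_MW / NORMAL_GROWTH_MW * HOMES_PER_4_MW
years_of_normal_growth = REQUESTED_MW / NORMAL_GROWTH_MW
warehouses_per_big_mine = 100 / FRUIT_WAREHOUSE_MW  # one 100 MW request

print(f"210 MW ~ {homes_equivalent:,.0f} homes")                       # ~118,000
print(f"...or {years_of_normal_growth:.1f} years of normal growth")    # 52.5
print(f"One 100 MW mine ~ {warehouses_per_big_mine:.0f} warehouses")   # 40
```

In other words, the pending requests amount to more than half a century of ordinary load growth arriving in about eighteen months.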