Resources

What Happens When Big Tech’s Datacenters Come to Small Towns?

Few of the big tech companies building and hiring across America bring their wealth with them when they set up in new communities. Instead, they hire armies of low-paid contractors, many of whom are not guaranteed a job from one month to the next; some of the contracting companies have a history of alleged mistreatment of workers. Nor do local governments share in the companies’ wealth; instead, the tech giants negotiate deals — the details protected by non-disclosure agreements — that exempt them from paying taxes that would fund schools, roads and fire departments….

Globally, by the end of 2020, there were nearly 600 “hyperscale” data centers, where a single company runs thousands of servers and rents out cloud space to customers. That’s more than double the number from 2015. Amazon, Google and Microsoft account for more than half of those hyperscale centers, making data centers one more field dominated by America’s richest and biggest companies… Google in March said it was “investing in America” with a plan to spend $7 billion across 19 states to build more data centers and offices. Microsoft said in April that it plans to build 50 to 100 data centers each year for the foreseeable future. Amazon recently got approval to build 1.75 million square feet of data-center space in Northern Virginia, beyond the 50 data centers it already operates there. Facebook said this year it would spend billions to expand data centers in Iowa, Georgia and Utah; in March it said it was adding an 11th building to its largest data-center facility in rural Prineville, Oregon…

Facebook has spent more than $2 billion expanding its operations in Prineville, but because of the tax incentives it negotiated with local officials, the company paid a total of just $119,403.42 in taxes to Crook County last year, according to the County Assessor’s list of top taxpayers. That’s less than half the taxes paid by Brasada Ranch, a local resort. And according to the Oregon Bureau of Labor and Industries, the data center has been the subject of numerous labor complaints… “I’ve spent way too much of my life watching city councils say, ‘We need a big tech company to show that we’re future-focused,'” says Sebastian Moss, the editor of Data Center Dynamics, which tracks the industry. Towns will give away tax breaks worth hundreds of millions of dollars, his reporting has found, and then express gratitude toward tech companies that have donated a few thousand computers — worth a fraction of the tax breaks — to their cash-strapped school systems. “I sometimes wonder if they’re preying on desperation, going to places that are struggling.”

Communities give up more than tax breaks when they welcome tech companies. Data centers use huge amounts of water to cool computer equipment, yet they’re being built in the drought-stricken American West.

The article cites Bureau of Labor Statistics data showing that 373,300 Americans were working in data processing, hosting, and related services in June — up 52% from 10 years ago.

Data Centres Exacerbate Droughts

A data center can easily use up to 1.25 million gallons of water each day — and “More data centers are being built every day by some of America’s largest technology companies,” reports NBC News, “including Amazon, Microsoft and Google and used by millions of customers.”

Almost 40 percent of the world’s hyperscale data centers are in the United States, and Amazon, Google and Microsoft account for more than half of the total. The U.S. also has at least 1,800 “colocation” data centers, warehouses filled with a variety of smaller companies’ server hardware that shares the same cooling system, electricity and security, according to Data Center Map. They are typically smaller than hyperscale data centers but, research has shown, more resource-intensive, as they maintain a variety of computer systems operating at different levels of efficiency.

Many data center operators are drawn to water-starved regions in the West, in part due to the availability of solar and wind energy. Researchers at Virginia Tech estimate that one-fifth of data centers draw water from moderately to highly stressed watersheds, mostly in the Western United States, according to a paper published in April…

The growth in the industry shows no signs of slowing. The research company Gartner predicts that spending on global data center infrastructure will reach $200 billion this year, an increase of 6 percent from 2020, followed by 3-4 percent annually over the next three years. This growth comes at a time of record temperatures and drought in the United States, particularly in the West. “The typical data center uses about 3-5 million gallons of water per day — the same amount of water as a city of 30,000-50,000 people,” said Venkatesh Uddameri, professor and director of the Water Resources Center at Texas Tech University. Although these data centers have become much more energy and water efficient over the last decade, and don’t use as much water as other industries such as agriculture, this level of water use can still create potential competition with local communities over the water supply in areas where water is scarce, he added…
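A quick arithmetic check of Uddameri’s comparison (a minimal sketch; the ~100 gallons of residential water use per person per day is a common U.S. estimate assumed here, not a figure from the article):

```python
# Check the quoted comparison: 3-5 million gallons/day vs. a city of 30,000-50,000.
# Assumes ~100 gallons of water per person per day, a common U.S. estimate
# (this per-capita figure is an assumption, not from the article).
GALLONS_PER_PERSON_PER_DAY = 100

for center_gpd in (3_000_000, 5_000_000):  # data-center range quoted above
    people = center_gpd / GALLONS_PER_PERSON_PER_DAY
    print(f"{center_gpd:,} gal/day = a city of {people:,.0f} people")
```

At that per-capita rate, the quoted range maps directly onto a city of 30,000 to 50,000 people, so the comparison is internally consistent.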

Sergio Loureiro, vice president of core operations for Microsoft, said that the company has pledged to be “water positive” by 2030, which means it plans to replenish more water than it consumes globally. This includes reducing the company’s water use and investing in community replenishment and conservation projects near where it builds facilities.

Amazon did not respond to requests for comment.

Samsung Chip Output at South Korea Plant Partly Halted Due To 1-Minute Electricity Glitch

A 1-minute power glitch on Tuesday, December 31, partially shut down Samsung chip production at its Hwaseong chip complex in South Korea for “two or three days”. DRAM and NAND lines were affected. Preliminary inspections show “no major damage” but losses are still expected to be in the millions.

Why Energy Is A Big And Rapidly Growing Problem For Data Centers

U.S. data centers use more than 90 billion kilowatt-hours of electricity a year, requiring roughly 34 giant (500-megawatt) coal-powered plants. Global data centers used roughly 416 terawatt-hours (4.16 × 10¹⁴ watt-hours) of electricity last year, about 3% of the global total and nearly 40% more than the entire United Kingdom consumes. And this consumption will double every four years.
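As a rough sanity check on those figures (a back-of-envelope sketch; the 60% coal-plant capacity factor is an assumed typical utilization, not a number from the text):

```python
# Back-of-envelope check on the figures above. The 60% capacity factor for a
# coal plant is an assumption (typical utilization), not from the article.
US_DATA_CENTER_KWH = 90e9      # > 90 billion kWh per year (quoted)
PLANT_MW = 500                 # one "giant" coal-powered plant
CAPACITY_FACTOR = 0.6          # assumed average utilization

kwh_per_plant_per_year = PLANT_MW * 1_000 * 24 * 365 * CAPACITY_FACTOR
print(f"Plants needed: {US_DATA_CENTER_KWH / kwh_per_plant_per_year:.0f}")  # ~34

# The global figure, restated in watt-hours:
print(f"416 TWh = {416e12 / 1e14:.2f} x 10^14 Wh")  # 4.16 x 10^14
```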

Streaming video has already changed the game, but the explosion of artificial intelligence and internet-connected devices will change the entire landscape. AI is the future, and AI is hungry for processing power. IoT is projected to exceed 20 billion devices by 2020 (some analysts believe we will reach that number this year alone). Given there are currently 10 billion internet-connected devices, doubling that to 20 billion will require massive increases to our data center infrastructure, which will massively increase our electricity consumption.

How on earth can we possibly build all the power plants required to supply electricity to twice as many data centers in the next four years? The simple answer is that we can’t.

Bitcoin Mining Now Accounts For Almost One Percent of the World’s Energy Consumption

It is well established that Bitcoin mining — aka, donating one’s computing power to keep a cryptocurrency network up and running in exchange for a chance to win some free crypto — uses a lot of electricity. Companies involved in large-scale mining operations know that this is a problem, and they’ve tried to employ various solutions for making the process more energy efficient.

But, according to testimony provided by Princeton computer scientist Arvind Narayanan to the Senate Committee on Energy and Natural Resources, no matter what you do to make cryptocurrency mining hardware greener, it’s a drop in the bucket compared to the overall network’s flabbergasting energy consumption. Instead, Narayanan told the committee, the only thing that really determines how much energy Bitcoin uses is its price. “If the price of a cryptocurrency goes up, more energy will be used in mining it; if it goes down, less energy will be used,” he told the committee. “Little else matters. In particular, the increasing energy efficiency of mining hardware has essentially no impact on energy consumption.”

In his testimony, Narayanan estimates that Bitcoin mining now draws about five gigawatts of electricity (in May, estimates of Bitcoin power consumption were about half of that). He adds that a computer racing with all its might to earn a free Bitcoin runs hot as hell, which means you’re probably using even more electricity to keep it cool so it doesn’t die and/or burn down your entire mining center, making the overall cost associated with mining even higher.
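A gigawatt is a rate of consumption rather than a quantity of energy; converting a continuous five-gigawatt draw into energy terms puts the figure in context (a minimal sketch using standard unit arithmetic):

```python
# A gigawatt is a rate (power), not an amount of energy, so a continuous
# 5 GW draw is converted to energy by multiplying by time.
DRAW_GW = 5
per_day_gwh = DRAW_GW * 24            # 120 GWh consumed each day
per_year_twh = DRAW_GW * 8760 / 1000  # ~43.8 TWh consumed each year
print(f"{per_day_gwh} GWh/day, {per_year_twh:.1f} TWh/year")
```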

Bitcoin driving huge electricity demand, environmental impact

In a normal year, demand for electric power in Chelan County grows by perhaps 4 megawatts — enough for around 2,250 homes — as new residents arrive and as businesses start or expand. But since January 2017, as Bitcoin enthusiasts bid up the price of the currency, eager miners have requested a staggering 210 megawatts for mines they want to build in Chelan County. That’s nearly as much as the county and its 73,000 residents were already using. And because it is a public utility, the staff of the county’s public utility district (PUD) is obligated to consider every request.

The scale of some new requests is mind-boggling. Until recently, the largest mines in Chelan County used five megawatts or less. In the past six months, by contrast, miners have requested loads of 50 megawatts and, in several cases, 100 megawatts. By comparison, a fruit warehouse uses around 2.5 megawatts.
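The passage’s own numbers make the scale concrete; a quick sketch (all figures taken directly from the two paragraphs above):

```python
# Putting the Chelan County figures side by side (all numbers from the passage).
watts_per_home = 4_000_000 / 2_250            # 4 MW serves ~2,250 homes
homes_for_requests = 210_000_000 / watts_per_home
print(f"~{watts_per_home:.0f} W per home; 210 MW = ~{homes_for_requests:,.0f} homes")
print(f"A 100 MW mine draws {100 / 2.5:.0f}x a 2.5 MW fruit warehouse")
```

By this arithmetic, the 210 MW of pending requests alone would power well over 100,000 average homes, in a county of 73,000 residents.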