IT managers may be too cautious about managing power, and businesses are unwilling to invest in efficiency, study finds
U.S. data centers are using more electricity than they need. It takes 34 power plants, each capable of generating 500 megawatts (MW) of electricity, to power all the data centers in operation today. By 2020, the nation will need another 17 similarly sized power plants to meet projected data center energy demands as economic activity becomes increasingly digital.
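The "34 power plants" figure can be roughly reproduced from the report's 91 billion kWh annual consumption. The ~60% capacity factor below is our assumption, not the report's; it is what makes the arithmetic land near 34 plants.

```python
# Back-of-the-envelope check of the "34 power plants" figure, using
# the report's 91 billion kWh annual consumption for 2013. The ~60%
# capacity factor is an assumption; the report does not state one.
ANNUAL_KWH = 91e9        # U.S. data center electricity use, 2013
PLANT_MW = 500           # nameplate capacity of one plant
CAPACITY_FACTOR = 0.6    # assumed fraction of nameplate actually delivered
HOURS_PER_YEAR = 8760

avg_demand_mw = ANNUAL_KWH / HOURS_PER_YEAR / 1000   # kWh/yr -> average MW
plants_needed = avg_demand_mw / (PLANT_MW * CAPACITY_FACTOR)
print(f"{avg_demand_mw:,.0f} MW average demand, about {plants_needed:.0f} plants")
```

At a 100% capacity factor the same demand would need only about 21 plants, which is why the assumed utilization of each plant matters so much to the headline number.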
Increased electrical generation from fossil fuels means more carbon emissions. But this added pollution isn't inevitable, according to a new report on data center energy efficiency from the Natural Resources Defense Council (NRDC), an environmental action organization.
Whatever happened to Green IT?
In terms of national energy use, data centers in total consumed 91 billion kilowatt-hours (kWh) in 2013 and, by 2020, will be using 139 billion kWh, a 53% increase.
This chart shows the estimated cost and power usage of U.S. data centers in 2013 and 2020, and the number of power plants needed to support this increased demand. The last column shows carbon dioxide (CO2) emissions in millions of metric tons. (Source: NRDC)
The report argues that improved energy efficiency practices by data centers could cut energy waste by at least 40%. The problems hindering efficiency include comatose or ghost servers, which use power but don’t run any workloads; overprovisioned IT resources; lack of virtualization; and procurement models that don’t address energy efficiency. The typical computer server operates at no more than 12% to 18% of capacity, and as many as 30% of servers are comatose, the report states.
The paper tallies up the consequences of inattention and neglect on a national scale. It was assembled and reviewed with help from organizations including Microsoft, Google, Dell, Intel, The Green Grid, Uptime Institute and Facebook, which made “technical and substantial contributions.”
The NRDC draws a sharp distinction between the large data centers run by major cloud providers, which account for only about 5% of total data center energy usage, and the rest of the industry. Throughout the data center industry there are "numerous shining examples of ultra-efficient data centers," the study notes. These aren't the problem. It's the thousands of other mainstream business and government data centers, and small corporate or multi-tenant operations, that are the problem, the paper argues.
The efficiency accomplishments of the big cloud providers "could lead to the perception that the problem is largely solved," said Pierre Delforge, director of high-tech sector energy efficiency at the NRDC, but that perception doesn't fit the reality of most data centers.
Data centers are “one of the few large industrial electricity uses which are growing,” Delforge said, and they are a key factor in creating demand for new power plants in some regions.
Businesses that move to co-location, multi-tenant data center facilities don't necessarily make efficiency gains. Customers are often billed on space, paying by the rack or by square footage, with a cap on how much power they can use before additional charges kick in. This model offers little incentive to operate equipment as efficiently as possible.
In total, the report says U.S. data centers used 91 billion kilowatt-hours of electricity last year, "enough to power all of New York City's households twice over and growing." By 2020, annual data center energy consumption is expected to reach 140 billion kilowatt-hours.
If companies used data center best practices, the report states, the economic benefits would be substantial. A 40% reduction in energy use, which the report says is only half of the technically possible reduction, would equal $3.8 billion in savings for businesses.
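The $3.8 billion figure can be sanity-checked by pricing the avoided electricity. The roughly $0.07/kWh average commercial rate used below is our assumption; the report does not state the rate behind its estimate.

```python
# Rough consistency check on the report's $3.8 billion savings claim.
# The ~$0.07/kWh average commercial electricity rate is an assumption.
PROJECTED_2020_KWH = 139e9   # projected annual data center use (from the report)
REDUCTION = 0.40             # the efficiency gain the report calls achievable
RATE_USD_PER_KWH = 0.07      # assumed average commercial electricity rate

saved_kwh = PROJECTED_2020_KWH * REDUCTION
savings_usd = saved_kwh * RATE_USD_PER_KWH
print(f"~${savings_usd / 1e9:.1f} billion per year")
```

Under these assumptions the avoided cost comes out within a few percent of the report's figure, which suggests the $3.8 billion is straightforward rate-times-volume arithmetic rather than a more elaborate model.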
The report also finds that energy efficiency progress is slowing. Once the obvious efficiency projects, such as isolating hot and cold aisles, are accomplished, additional investment in energy efficiency becomes harder to justify because of cost or a perception that it may increase risk. IT managers are "extremely cautious" about implementing aggressive energy management because it could introduce more risk to uptime, the report notes.
There are a number of measurements used to determine the efficiency of data centers, and the report recommends development of tools for determining CPU utilization, average server utilization, and average data center utilization. It says that "broad adoption of these simple utilization metrics across the data center industry would provide visibility on the IT efficiency of data centers, thereby creating market incentives for operators to optimize the utilization of their IT assets."
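As a minimal sketch, the recommended metrics could be computed from periodic CPU samples along these lines. The function names, sampling scheme, and example numbers are illustrative assumptions; the report defines the metrics, not an implementation.

```python
# Illustrative sketch of the utilization metrics the NRDC recommends.
# Sampling scheme and names are assumptions, not from the report.
from statistics import mean

def avg_server_utilization(cpu_samples):
    """Average CPU utilization (0-1) from periodic samples for one server."""
    return mean(cpu_samples)

def avg_datacenter_utilization(per_server_samples):
    """Fleet-wide average across each server's list of samples."""
    return mean(avg_server_utilization(s) for s in per_server_samples)

# Example fleet: two servers in the report's typical 12%-18% range,
# plus one "comatose" server that draws power but does almost no work.
fleet = [
    [0.15, 0.18, 0.12],
    [0.10, 0.14, 0.12],
    [0.01, 0.00, 0.01],
]
print(round(avg_datacenter_utilization(fleet), 2))  # well under 15%
```

Even this toy fleet illustrates the report's point: a single comatose server drags the data-center-wide average far below what the active servers achieve on their own.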
The NRDC isn’t the first to look at this issue. In 2007, the U.S. Environmental Protection Agency, working with a broad range of data center operators and industry groups, released a report on data center power usage that found that the energy use of the nation’s servers and data centers in 2006 “is estimated to be more than double the electricity that was consumed for this purpose in 2000.” It called for energy efficiency improvements.