Data centres have been using less electricity than you might think... or at least, their consumption has grown more slowly than it did in the past.
According to a study by Jonathan Koomey, a consulting professor at Stanford and a climate and energy researcher, global data centre energy use rose only about 56% between 2005 and 2010, compared with a doubling between 2000 and 2005. In the US, it rose only 36% over the same period.
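Annualising those five-year figures makes the slowdown easier to see. The short sketch below is illustrative arithmetic based on the percentages reported above, not calculations from the study itself:

```python
# Convert the five-year growth figures quoted in the article into
# rough compound annual rates (illustrative arithmetic only).

def annual_rate(five_year_growth):
    """Total growth over five years -> compound annual growth rate."""
    return (1 + five_year_growth) ** (1 / 5) - 1

doubling_2000_2005 = annual_rate(1.00)  # worldwide, 2000-2005: doubled
global_2005_2010 = annual_rate(0.56)    # worldwide, 2005-2010: +56%
us_2005_2010 = annual_rate(0.36)        # US, 2005-2010: +36%

print(f"2000-2005 worldwide: ~{doubling_2000_2005:.1%} per year")
print(f"2005-2010 worldwide: ~{global_2005_2010:.1%} per year")
print(f"2005-2010 US:        ~{us_2005_2010:.1%} per year")
```

Doubling over five years works out to roughly 15% annual growth, against roughly 9% worldwide and 6% in the US for 2005 to 2010.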
Electricity used in global data centres in 2010 accounted for between 1.1% and 1.5% of total electricity use, the Koomey study found. And less than 1% of electricity used by data centres worldwide was consumed by Google, perhaps the highest-profile user of data centre servers.
The Koomey report attributes the lower usage to a lower-than-predicted installed base of servers rather than to energy efficiency improvements.
Growth in the installed base of servers had already begun to slow by early 2007 because of virtualisation and technological factors. The 2008 recession, combined with further improvements in virtualisation, led to a significant reduction in actual installed base by 2010 when compared with a forecast published in 2007 by market research firm IDC.
Servers are still the largest and most important electricity consumer in data centres, the Koomey report states. And Google is estimated to have about 900,000 of them, compared with 25,000 in 2000.
Growth in electricity used per server may have accounted for a larger share of demand growth from 2005 to 2010 than it did in 2000 to 2005, the report suggests. Meanwhile, the dominant driver of electricity demand from 2000 to 2005 was growth in the installed base of volume servers, which doubled over that five-year period, according to the Koomey report.
Indeed, Koomey notes that the main reason for the lower estimates in this study is IDC's lower installed base figures, not the operational improvements and installed base reductions that come from virtualisation.
And going forward, IDC forecasts virtually no growth in the installed base from 2010 to 2013, which presages continued slower growth in electricity use, the Koomey study states. The IDC forecast assumes virtualisation will become increasingly prevalent, reducing the need to install more physical servers.
This trend should improve energy efficiency while driving up utilisation of remaining servers, which spreads energy use and other costs over more computations per server.
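A toy calculation shows why higher utilisation lowers the energy cost of each computation: much of a server's power draw is fixed overhead that is paid whether the machine is busy or idle. The figures below are invented for the sketch and do not come from the report:

```python
# Hypothetical server that draws 200 W idle and 300 W at full load
# (numbers invented to illustrate the utilisation argument).

IDLE_W, MAX_W = 200.0, 300.0

def power_at(utilisation):
    """Simple linear power model between idle and peak draw."""
    return IDLE_W + (MAX_W - IDLE_W) * utilisation

def watts_per_work_unit(utilisation):
    """Power consumed per unit of useful work delivered."""
    return power_at(utilisation) / utilisation

# A lightly loaded standalone server vs a consolidated, virtualised one.
print(watts_per_work_unit(0.10))  # 10% utilised
print(watts_per_work_unit(0.60))  # 60% utilised
```

In this model the 10%-utilised machine spends 2,100 W per unit of work, while the 60%-utilised machine spends about 433 W: the same fixed overhead is spread over six times as much computation.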
And the impact of cloud computing on energy usage?
Cloud computing installations typically have much higher server utilisation levels and infrastructure efficiencies than do in-house facilities, the Koomey report states. So increased adoption of cloud should result in lower electricity use than if the same computing services were delivered using more conventional approaches.