Energy efficiency - that's the name of the game.
Every company I meet these days drops energy efficiency (or power efficiency - the terms seem to be used interchangeably, suggesting that rather a large number of people in the IT business skipped their physics classes) into the conversation. It has become almost as ubiquitous as 'leverage' in the pantheon of IT phrases.
I've just sat through a presentation from data centre power company Schneider Electric which was almost an object lesson in the way these things work. It all started promisingly enough: Schneider's Philippe Arsonneau outlined the issues facing data centre managers, pointed out that global electricity consumption was set to double by 2030, and argued that all businesses should be taking steps to reduce energy consumption.
That's fair enough - no-one can disagree with that statement. Most people have now bought into the argument that the climate is changing and that human activity plays a big part in it - and even if they haven't, there's still the imperative to save costs. However, some of the claims verge on the grandiose. Arsonneau talked about four strands of the company's energy efficiency strategy: the first would save up to 30 percent of energy costs and the second up to 20 percent. He was silent about the savings his other technologies provided, except that one was "highly efficient". This, of course, is against a background of ever more energy-efficient servers.
Listening to this sort of talk - and Schneider's not alone in this; it was the company I saw today, but I'm willing to bet that many other firms make similar claims - you'd think the savings on offer were such that we must be approaching the time when electricity companies will be paying data centre managers.
Of course, it doesn't happen like that. It reminds me of the time I went through my house making energy efficiency changes: loft insulation? Check. Energy-efficient boiler? Check. Cavity wall insulation? Check. Double glazing? Check... and I still face bills that are twice as high as they were a few years ago. Or perhaps, to put it into another domestic context, it's like the provision of broadband to the home - a service that never seems to reach the heights intimated by the service provider.
It's an indication of the many factors that can influence energy costs. There's the fact that the savings vendors talk about are achieved in test conditions. I don't doubt for one minute that Schneider ran trials that suggested energy savings of around 30 percent, but I bet those savings aren't replicated in real data centres. There's also the price of electricity itself - a factor that no data centre manager or vendor has any control over - and a 20 percent saving in electricity usage could easily be offset by a price rise. And, of course, hardware doesn't stay dormant: there could be more users that a data centre has to support, or more machines.
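The arithmetic behind that offset is worth spelling out. A minimal sketch, using purely illustrative figures (a hypothetical 1 GWh annual consumption at 10p per kWh - not Schneider's numbers), shows how a 20 percent cut in usage is exactly cancelled by a 25 percent rise in the unit price:

```python
def annual_bill(usage_kwh: float, price_per_kwh: float) -> float:
    """Annual electricity cost: usage multiplied by unit price."""
    return usage_kwh * price_per_kwh

# Baseline: 1 GWh a year at 10p/kWh (illustrative figures only)
baseline = annual_bill(1_000_000, 0.10)

# After "efficiency": 20% less usage, but a 25% price rise
after = annual_bill(1_000_000 * 0.80, 0.10 * 1.25)

print(baseline, after)  # 100000.0 100000.0 - the "saving" vanishes
```

Because cost is a simple product of usage and price, any fractional saving in usage is wiped out by a price rise of the reciprocal proportion - which is why a data centre manager's bill can stand still even when the efficiency projects genuinely work.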
But perhaps the most pressing difficulty is one that was set out by Burton last week: the difficulty of measuring energy savings in a world where data centre managers have little idea what their energy costs actually are - what Burton called 'faith-based economics'. When IT managers are genuinely aware of their total electricity costs, then we can start talking about the real savings that can be made. It's certainly true that vendors are taking steps to reduce costs, but not at the levels being spoken about.
At the moment it's often a guessing game - great for buzzword bingo and headline games, but not figures to be taken seriously. And the subject certainly deserves to be taken seriously: making data centres more efficient is one of the most pressing concerns facing the industry. Let's start by measuring accurately.