The power consumed by IT systems is set to become a major issue, analyst Gartner has predicted.
Steve Prentice, research VP at Gartner, said IT was uniquely positioned to confront increasing energy costs as the technology is one of the largest consumers of power within organisations.
"You cannot keep it a secret," Prentice said. "Performance-per-watt will become critical over the next decade."
During his keynote address at this year's Gartner Data Center Summit in Sydney, Prentice said that as hardware prices continued to fall and power costs continued to rise, heat generation would become a problem when organisations took advantage of commodity hardware.
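Performance-per-watt, as Prentice uses the term, is just useful work delivered divided by power drawn. A minimal sketch, with made-up figures that are purely illustrative and not benchmarks from the keynote:

```python
def perf_per_watt(ops_per_second, watts):
    """Efficiency metric: operations per second delivered per watt drawn."""
    return ops_per_second / watts

# Hypothetical comparison of an older server against a newer low-power one.
old_server = perf_per_watt(2.0e9, 400)   # 5 million ops/sec per watt
new_server = perf_per_watt(3.0e9, 250)   # 12 million ops/sec per watt
print(new_server / old_server)           # the newer box is 2.4x more efficient
```

By this metric a machine can be "faster" yet less attractive if its power draw grows faster than its throughput, which is the shift Prentice predicts buyers will weigh.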
"The requirement for modern hardware is going to be huge [and] for every watt that goes into a processor you will use another watt to cool it," he said, adding that vendors' promotion of low-power-consumption products was indicative of this trend.
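The "watt to cool every watt" rule of thumb quoted above can be sketched as simple arithmetic; the cooling ratio of 1.0 comes from Prentice's remark, not from any measured facility:

```python
def total_facility_power(it_watts, cooling_ratio=1.0):
    """Total draw = IT load plus cooling overhead at the given ratio."""
    return it_watts * (1 + cooling_ratio)

# A hypothetical 500 kW server room would then draw about 1 MW in total.
print(total_facility_power(500_000))  # 1000000.0
```

In other words, at a 1:1 cooling ratio every dollar spent powering processors is matched by another spent removing their heat, which is why falling hardware prices alone do not lower the bill.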
Prentice cited the Barcelona Supercomputing Center's 4800-processor cluster as an example of the scale of power consumption by computers. Reports suggest that 75 percent of the host university's electricity is consumed by the supercomputer.
Seeking low-power alternatives is not the only way to reduce energy consumption: virtualisation technology is emerging to provide better utilisation.
With typical utilisation rates of 10 percent, Prentice was blunt in describing the amount of capacity being wasted.
"Of the computing power you buy, you are throwing away 90 percent," he said, adding that with increased virtualisation, utilisation levels will climb to 30 percent.
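The consolidation implied by those utilisation figures can be sketched with back-of-the-envelope arithmetic; the workload and per-server capacity below are hypothetical, only the 10 and 30 percent utilisation rates come from the article:

```python
import math

def servers_needed(total_workload, capacity_per_server, utilisation):
    """Servers required when each runs at the given average utilisation."""
    return math.ceil(total_workload / (capacity_per_server * utilisation))

# A notional workload of 120 capacity-units on servers of capacity 10 each:
before = servers_needed(120, 10, 0.10)  # 120 servers at 10% utilisation
after = servers_needed(120, 10, 0.30)   # 40 servers at 30% utilisation
print(before, after)                    # 120 40
```

Tripling average utilisation cuts the server count, and hence the power and cooling load, to a third for the same work, which is the link Prentice draws between virtualisation and the energy problem.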
"Virtualisation is nothing new [but] it's a critical tech that opens a path to automation."
Virtualisation will also allow virtual servers to be reconfigured to suit individual applications, delivering significant performance improvements.