Power consumption at data centres is once again in the spotlight, after analyst house Gartner published a list of best practices for the data centre, designed to save electricity and improve cooling.

Gartner claims that if companies follow all of its best practices, they could typically expect to save one million kilowatt hours of electricity. It says that in a conventional data centre, between 35 and 50 percent of the electricity consumed goes on cooling, compared with only 15 percent in best-practice or 'green' data centres.

"Virtually all data centres waste enormous amounts of electricity using inefficient cooling designs and systems," said Paul McGuckin, research vice president at Gartner in a statement. "Even in a small data centre, this wasted electricity amounts to more than 1 million kilowatt hours annually that could be saved with the implementation of some best practices."

The main reason for the waste in conventional data centre cooling is the "unconstrained mixing of cold supply air with hot exhaust air."

"This mixing increases the load on the cooling system and energy used to provide that cooling, and reduces the efficiency of the cooling system by reducing the delta-T (the difference between the hot return temperatures and the cold supply temperature). A high delta-T is a principle in cooling," McGuckin said.

Gartner's eleven top tips for reducing power consumption are:

  • Plug holes in the raised floor. This point was also raised by SunGard Availability Services last month; holes in the floor allow cold air to escape and mix with hot air. This single low-tech retrofit can save as much as 10 percent of the energy used for data centre cooling, says Gartner.
  • Install blanking panels. Data centres are full of racks, and unused rack space needs to be covered with blanking panels so that airflow can be properly managed, for example by preventing hot air leaving equipment in one section of the rack from entering the cold-air intake of other equipment elsewhere in the rack. Gartner says that when these panels are used effectively, supply air temperatures can be lowered by as much as 22 degrees Fahrenheit (around 12 degrees Celsius).
  • Co-ordinate CRAC units. CRAC stands for computer room air-conditioning unit, and older CRAC units operate independently of each other when cooling and dehumidifying the air. Gartner suggests that if these units are tied together with newer technologies and their efforts co-ordinated, the result is a much more efficient cooling system.
  • Improve underfloor airflow. This typically affects older data centres, where the space under the raised flooring is far more constrained than in newer builds. Many older data centres also use the underfloor space to run data and power cables, further restricting airflow. A clear-out of these spaces is advised.
  • Implement hot and cold aisles. This is one of the most obvious best practices. Gartner says traditional data centres position their racks "classroom style", with all the intakes facing in one direction. The problem with this setup is that hot air exhausted from one row mixes with cold air being drawn into the adjacent row, raising the cold-air supply temperature in uneven ways. Rack layouts of the last ten years instead organise rows into hot and cold aisles, which offer much better control of airflow.
  • Install sensors. It seems obvious, but how do you tell whether you have a temperature problem in a certain area of your data centre? Gartner says a minimal investment in this technology can deliver big insights into data centre operations and also provides a way to measure the results of improvements made to the cooling systems (a minimal monitoring sketch appears after this list).

  • Implement cold or hot aisle containment. Once a data centre uses hot and cold aisles, containment becomes a viable option and dramatically improves the separation of cold supply air from hot exhaust air. Gartner reckons that for most users, effective containment of the hot or cold aisles will offer the single largest payback of any of these best practices.
  • Raise the temperature in the data centre. Most data centres are run too cold, and raising the temperature by a few degrees will often have no effect on the equipment.
  • Install variable speed fans and pumps. Traditional CRAC units run at a single speed, but variable speed fans should be used instead. A reduction of 10 percent in fan speed yields a reduction in fan electrical consumption of roughly 27 percent, while reducing fan speed by 20 percent yields electrical savings of approximately 49 percent (the arithmetic behind these figures is sketched after this list).
  • Exploit free cooling. This depends a lot on the local climate, but in a UK winter, cold air is readily available outside the data centre.
  • Design new data centres using modular cooling. Traditional data centres have been cooled by raised-floor perimeter air distribution. Mounting evidence points to modular cooling (in-row or in-rack) as a more energy-efficient cooling strategy.
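
To put the sensor tip above into practice, a minimal hot-spot check could look something like the sketch below; the rack names, readings and 27 C threshold are purely illustrative assumptions, not part of Gartner's guidance.

    # Minimal hot-spot check over per-rack inlet temperature readings.
    # Rack names, readings and the 27 C threshold are illustrative only;
    # real limits depend on the equipment and the operator's own policy.

    INLET_LIMIT_C = 27.0

    inlet_readings_c = {
        "rack-a1": 22.5,
        "rack-a2": 24.0,
        "rack-b1": 28.3,  # probably sitting in mixed or recirculated air
        "rack-b2": 21.8,
    }

    hot_spots = {rack: t for rack, t in inlet_readings_c.items() if t > INLET_LIMIT_C}

    for rack, temp in sorted(hot_spots.items()):
        print(f"WARNING: {rack} inlet at {temp:.1f} C exceeds {INLET_LIMIT_C:.1f} C")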

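The variable speed fan figures quoted above follow from the fan affinity laws, under which fan power falls roughly with the cube of fan speed; the sketch below simply works through that cube-law arithmetic and is an illustration, not Gartner's own model.

    # Fan affinity law: electrical power scales roughly with the cube of fan speed,
    # which reproduces the approximate savings quoted in the list above.

    def power_saving(speed_reduction: float) -> float:
        """Fractional power saving for a given fractional reduction in fan speed."""
        remaining_speed = 1.0 - speed_reduction
        return 1.0 - remaining_speed ** 3

    for reduction in (0.10, 0.20):
        # 10% slower -> ~27% saving; 20% slower -> ~49% saving
        print(f"{reduction:.0%} slower fan -> ~{power_saving(reduction):.0%} less power")
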
Additional information can be found in a Gartner report, "How to Save a Million Kilowatt Hours in your Data Center."