In 50 years' time, engineers and economists will wonder why we spent vast amounts of energy using computers to heat data centres and server rooms, and then threw as much if not more energy at the problem of cooling them down again. One would hope that, by then, a more elegant solution will have been devised.

For now, we're stuck with the problem. And unless you've been on Pluto for the last couple of years, you will also know that it has been exacerbated by the growing trend towards consolidation. As servers become more capable and reliable, and as the cost of floor space grows, economics dictates that we cram them together as tightly as possible.

The result is that a densely packed server rack can generate as much as 10kW of heat, and removing that heat can consume a comparable amount of energy. With costs at a premium, the objective of any data centre or server room manager must be to make the most of the resources available, which means ensuring that the cooling systems are working at maximum efficiency.

So here are some essential items for your checklist.

Basic checks
Servers and racks need to be physically arranged to ensure that there's a clear path between the source of cool air and device inlets, and for exhaust air to escape back to the air-conditioning system.

Your first and simplest job is to ensure that the capacity of the air-conditioning is up to the task, working on the basis that every watt of heat generated needs at least a watt of cooling.
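
As a rough illustration of that rule of thumb, here is a minimal sketch in Python that totals the heat load and compares it with the installed cooling capacity; the rack names and power figures are hypothetical.

```python
# Rough cooling-capacity sanity check, working on the basis that 1W of heat
# needs at least 1W of cooling. Rack names and figures are hypothetical.

rack_heat_load_watts = {
    "rack-A1": 6_000,
    "rack-A2": 8_500,
    "rack-B1": 10_000,  # a densely packed rack near the 10kW mark
}

installed_cooling_watts = 22_000  # total rated a/c capacity for the room

total_heat = sum(rack_heat_load_watts.values())
headroom = installed_cooling_watts - total_heat

print(f"Total heat load:  {total_heat / 1000:.1f}kW")
print(f"Cooling capacity: {installed_cooling_watts / 1000:.1f}kW")
if headroom < 0:
    print("Cooling capacity falls short -- add capacity or spread the load.")
else:
    print(f"Headroom: {headroom / 1000:.1f}kW")
```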

Check that temperatures at key monitoring points are what you'd expect. For example, if air returning to the a/c unit is cooler than the average air temperature -- or even the cooled temperature -- then you've probably got an airflow problem that results in expensively cooled air uselessly bypassing the racks.
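
As a minimal sketch of that check, assuming your monitoring system can already report supply, room-average and return temperatures (the readings below are hypothetical):

```python
# Flag likely bypass airflow: if return air is cooler than the room average,
# cooled air is probably short-circuiting back to the a/c unit without
# passing through the racks. All readings are hypothetical examples.

supply_temp_c = 14.0        # air leaving the a/c unit
room_average_temp_c = 24.0  # average of the aisle sensors
return_temp_c = 21.0        # air arriving back at the a/c unit

if return_temp_c <= supply_temp_c:
    print("Return air is no warmer than the supply -- check sensor placement.")
elif return_temp_c < room_average_temp_c:
    print("Return air is cooler than the room average: "
          "cooled air is probably bypassing the racks.")
else:
    print("Return temperature looks consistent with air passing through the racks.")
```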

Temperature and humidity sensors should be installed in the aisles between racks, centred between rows, at a density of about one per four racks. You should also be able to monitor the temperature at the top, middle and bottom of each rack, so you can check that air is flowing as and where you expect.
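
One simple use of those per-rack readings is to flag racks whose top-to-bottom temperature spread suggests hot air is pooling at the top or cool air isn't reaching the upper inlets. A minimal sketch, with hypothetical readings and an arbitrary 5C threshold:

```python
# Flag racks where the top-to-bottom temperature spread suggests an airflow
# problem. Rack names, readings and the 5C threshold are hypothetical.

rack_temps_c = {
    "rack-A1": {"top": 27.5, "middle": 23.5, "bottom": 21.0},
    "rack-A2": {"top": 24.0, "middle": 23.0, "bottom": 22.5},
}

MAX_SPREAD_C = 5.0

for rack, readings in rack_temps_c.items():
    spread = readings["top"] - readings["bottom"]
    if spread > MAX_SPREAD_C:
        print(f"{rack}: {spread:.1f}C top-to-bottom spread -- "
              "check blanking plates and tile placement.")
```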

Underfloor areas and cables need to be kept clean, or particles will be drawn up into the equipment and, over time, block vents and slats, reducing cooling capability.

Cables should be tidily stacked. Unstructured cabling can restrict airflow if it rambles around the back of the racks like untamed ivy. Like power cords, data cables should be tied back, which offers the side benefit of allowing you to locate the right cable more quickly.

Similarly, underfloor cabling can obstruct airflow. Much of it may have been left over from previously installed racks and, now redundant, should be removed. APC recommends that, where possible, cabling should run overhead rather than underfoot.

Examine floor tiles regularly, to ensure that they're delivering cool air where they're supposed to. For instance, perforated tiles where racks used to stand but no longer do should be replaced with blanks.

Maintenance
The a/c equipment itself needs to be working at top efficiency, so regular scheduled maintenance is as essential as it is for the computer equipment.

It sounds basic, but when the Uptime Institute visited data centres it found that well over half of them showed the same general set of operational deficiencies. These all resulted in poor cooling, although the causes varied: they included inadequate maintenance, poor humidity management, and hot spots or uneven temperatures across raised floors.

The Institute found that evidence of inadequate maintenance included a lack of preventative maintenance schedules or, where the work was done, a failure to perform it well or, in the case of contractors, to police it with sufficient rigour.

Installation issues
Power and data centre infrastructure vendor APC reckons that something as simple as the proper installation of blanking plates can play a large part in ensuring that hot air from one side of a rack tower doesn't flow back to the cooler inlet side.

For example, if there are empty slots within a rack, blanking them off keeps the two airflows separate, so heated exhaust air cannot return round the side of the unit to the inlets.

Try to avoid hotspots. These can occur where several densely packed, hot racks are located together. Although the total cooling capacity may remain adequate, in the hot-rack area the local supply could fall short of demand. If, instead, hot racks are distributed around the centre, demand for cooling is averaged out, and the hot racks take advantage of unused cooling capacity from the cooler systems around them.
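
A back-of-the-envelope sketch of why this works, assuming each a/c zone can only deliver a fixed amount of cooling locally; all zone capacities and rack loads are hypothetical.

```python
# Why spreading hot racks helps: the total cooling may be sufficient, but a
# cluster of hot racks can exceed what any one zone can deliver locally.
# Zone capacities and rack heat loads (in kW) are hypothetical.

ZONE_COOLING_KW = 15

clustered = {"zone-1": [10, 8], "zone-2": [4, 3, 3, 2]}    # hot racks together
distributed = {"zone-1": [10, 3, 2], "zone-2": [8, 4, 3]}  # hot racks spread out

def check(layout):
    for zone, loads in layout.items():
        total = sum(loads)
        status = "OK" if total <= ZONE_COOLING_KW else "OVERLOADED"
        print(f"  {zone}: {total}kW of heat vs {ZONE_COOLING_KW}kW of cooling -> {status}")

print("Clustered layout:")
check(clustered)
print("Distributed layout:")
check(distributed)
```

Both layouts carry the same 30kW of heat against 30kW of total cooling, but only the distributed one fits zone by zone.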

Airflow management is key to efficient use of the a/c system. Ideally, a data centre will have alternating cold aisles and hot aisles between racks. If hot exhausts point at the inlets of adjacent racks, the latter will be inadequately cooled. Similarly, a/c outlets should be as close to the racks' air inlets as possible, giving warm exhaust air as little opportunity as possible to mix with the cooled supply. By the same token, a/c return inlets should be positioned above or below hot aisles to extract heat quickly.

Conclusions
There's no doubt that money can be saved through consolidation, but it's a waste to throw it away through either inadequate cooling, which shortens equipment life, or inefficient use of costly conditioned air.

These checklist items, gleaned from a number of best practice documents, should help you keep your data centre in good health.