They call it HECToR and it’s the latest and greatest of the UK’s supercomputers, a breed of machines that have evolved from their distant ancestors in the mainframe world of 1960s cleanrooms to become one of the most important phenomena of modern computing.

Installed in a nondescript building in a wooded facility south of Edinburgh, it is run as an open-access computing service by the Edinburgh Parallel Computing Centre (EPCC), a wing of the University of Edinburgh. Made up of a series of wardrobe-like Cray XT4 towers that sit beside one another in rows, HECToR (or High-End Computing Terascale Resource, to use its longer and less cuddly name) is, its creators will try to convince visitors, as beautiful as its £113 million ($220 million) project price tag would imply.

Come face to face with it for the first time, and what hits you first is not its looks but its noise – a loud, steady boom of rushing air – the consequence of the beast’s immense thirst for cooling. Then you are swamped by the warm, dry air. It is imposing in an alien sort of way, like a massive fist that is no longer attached to a real body, but which continues to flex. Around it stands a small team of technicians in lab coats, who fuss over it like gardeners tending a morbid rose.

Beyond the eerie charisma of a calculating furnace, there’s remarkably little to say about HECToR as a physical object. It’s a bunch of cabinets packed with nests of 2.8GHz AMD Opteron processor cores (11,328 in total), strung together with fancy interconnects at the rear of each tower, and cooled like crazy to stop the whole thing melting.

Scheduled to be replaced in two years, its matter-of-fact design hints comically at its obsolescence. One day they’ll unplug HECToR’s cabinets, wheel them out of the room, and replace them with something that looks exactly the same but has around four times as much processing power. Then they’ll repeat the exercise at roughly two-year intervals from now until the ability to resolve the world through digital muscle hits some kind of law of diminishing returns. Will that happen one day? Nobody can say.

It’s HECToR’s vital figures that make its case persuasive. Four times the rating of its predecessor, it can manage up to 63 trillion calculations per second – equivalent, its makers say, to every human being on earth doing 10,000 calculations each in the same second. According to the respected supercomputing league table run by Top500.org, this makes it the 17th most powerful such system on earth, impressive if you consider that a number of the systems ahead of it on the list are run by the US military and are not as open to the civilian scientific community as HECToR is.
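That comparison is easy to check on the back of an envelope. The sketch below uses a 2008 world population of roughly 6.6 billion, which is an assumption for illustration rather than a figure from the article.

```python
# Rough sanity check of the makers' per-person comparison (illustrative only).
world_population = 6.6e9            # assumed 2008 world population
calcs_per_person_per_sec = 10_000   # figure quoted by HECToR's makers

total = world_population * calcs_per_person_per_sec
print(f"{total:.1e} calculations per second")  # ~6.6e13, i.e. tens of trillions
```

That lands in the same ballpark as the machine’s quoted peak of roughly 63 trillion calculations per second.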

Then there’s the electricity bill for all this: a staggering £8.2 million ($16.4 million) a year to keep it lit up, roughly a quarter of which goes on cooling alone.

In a wonderful congruence of metaphor and reality, supercomputers are hot. Once they were a curiosity, a tool that speeded up scientific modelling a bit but sounded even more impressive than it was. Now, arguably, they have become far more important than even that decent raison d’être: the third great dimension of scientific endeavour, calculation, alongside experiment and theory.

What supercomputers do is allow science to simulate experiments, or model theories, in ways that are not practical in the physical world, gaining what scientists call greater ‘resolution.’

This can mean resolving weather patterns down to smaller and smaller map grids (weather is notoriously local), or modelling the interaction of complex molecules to a higher level of refinement. In modelling, or simulating, problems of fluid flow, airflow and turbulence, fuel combustion or weather patterns – sometimes in three dimensions plus time – the basic technique is to divide the problem into a grid made up of millions of boxes. The supercomputer runs a set of equations on each of these boxes, then combines the results to build up a larger picture that is hugely more accurate than would have been possible before their invention.
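As a crude illustration of that divide-into-boxes technique, here is a toy two-dimensional heat-diffusion model in Python. It is a minimal sketch, not anything HECToR actually runs; the grid size, time step and diffusion constant are all invented for the example.

```python
import numpy as np

NX, NY = 200, 200                # grid of boxes covering the simulated region
ALPHA, DT, DX = 0.1, 0.1, 1.0    # assumed diffusion constant, time step, box size
STEPS = 500

# Initial condition: a hot patch in the middle of a cold field.
temp = np.zeros((NX, NY))
temp[NX // 2 - 5 : NX // 2 + 5, NY // 2 - 5 : NY // 2 + 5] = 100.0

for _ in range(STEPS):
    # Run the same equation (a finite-difference Laplacian) on every interior box,
    # using only each box's four neighbours -- the part that parallelises well.
    lap = (
        temp[:-2, 1:-1] + temp[2:, 1:-1] +
        temp[1:-1, :-2] + temp[1:-1, 2:] -
        4.0 * temp[1:-1, 1:-1]
    ) / DX**2
    temp[1:-1, 1:-1] += ALPHA * DT * lap

# Combine the per-box results into a larger picture: here, just summary numbers.
print(f"peak temperature: {temp.max():.2f}, mean: {temp.mean():.4f}")
```

On a real supercomputer the grid would be split across thousands of processors, each handling its own block of boxes and exchanging only the boxes along its edges with its neighbours.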

This number crunching is becoming critical to science. Making generalisations based on the untested or intuitive assumptions that form the bedrock of so much science and engineering is no longer good enough because it locks up secrets by hiding chaos and complexity, leaving our knowledge ‘smudged’ in ways that can turn out to be limiting or even dangerous.

The importance of supercomputing is sometimes wrapped up in economic rhetoric. “In order to compete, countries must compute,” said the EPCC’s Professor Arthur Trew during HECToR’s official launch on January 14, with some justification, but also with one eye on funding – HECToR was built thanks to research councils, funded by the taxpayer, putting their hands deep into their pockets.

Predictably, there is now a supercomputer race of sorts, with countries, or alliances of countries, competing to fund and build the most powerful machines as a badge of honour. Like all races of any significance, this is a marathon, not a sprint. The big supercomputer of today, even HECToR, is out-performed within months of opening by another machine somewhere in the world. In the end, the scientists see the competitive aspect of supercomputing in prosaic terms: it interests people and helps to get the things built.

Supercomputers are useless without the scientific thinking to use them appropriately. They advance knowledge by speeding up the rate at which it can be fathomed, but they don’t create knowledge without the brains to feed them the right problems and understand the output. The field’s biggest long-term contribution to knowledge could turn out to be the generation of talent that spends long hours using them.