Pharmaceutical modelling company Schrödinger underscored the massive potential of cloud computing in April by commissioning a 50,000-core utility supercomputer in the Amazon Web Services cloud.

Although few business applications could exploit computing on such a massive scale, the message is clear: how much processing power do you have on tap? With cloud computing the sky is the limit, and it is not just cloud computing that opens new horizons.

Whether or not Moore’s Law, which observes that transistor counts, and with them computing power, double roughly every two years, holds as a universal truth, next-generation processors and servers are raising the prospect of what is effectively high-performance computing in your own data centre.

What could your organisation do with all this power? The simple answer could be to shift cost and complexity away from your business. The more complex answer is that it offers the opportunity to delve deep into your business, analyse data and processes and create new business and services opportunities.

Whether you are going for public cloud, or building a private cloud on highly virtualised new servers, you have to think differently about resource allocation, capacity planning, security and just what IT can offer the business.

Despite the relentless barrage of advice, to say nothing of the hype, around cloud and virtualisation, too many organisations oscillate between rigid capacity planning and ad hoc cloud purchasing and resource allocation.

Too many IT departments still size their resources for the worst-case scenario of unmet demand: what would happen if the peak batch-processing load exceeded capacity? IT directors consider the cost to the business of failing to support this maximum demand, then work backwards to the resources they would need.

On the other hand, too many business units still demand ‘their own’ fixed resources to run their own applications, while at the same time bypassing the IT department to buy public cloud services whenever they feel the need.

Organisations with a clear overview of their business can change this approach. IT teams can persuade business units to plan for average consumption of computing resources, rather than peak demand, with the IT team planning overall demand and using external service providers for extra capacity when needed.
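The economics behind this argument can be sketched in a few lines of Python. This is an illustration only: the demand figures and unit prices below are invented, and real cloud pricing is far more nuanced, but the shape of the trade-off holds whenever demand is spiky.

```python
# Illustrative-only comparison: fixed capacity sized for peak demand
# versus capacity sized for average demand plus pay-per-use burst.
# All demand figures and unit costs are hypothetical.

def peak_provision_cost(demand, unit_cost):
    """Cost of owning enough fixed capacity to cover the highest demand."""
    return max(demand) * unit_cost * len(demand)

def average_plus_burst_cost(demand, unit_cost, burst_unit_cost):
    """Cost of fixed capacity sized to average demand, plus externally
    bought burst capacity for any period that exceeds that average."""
    base = sum(demand) / len(demand)
    fixed = base * unit_cost * len(demand)
    burst = sum(max(d - base, 0) for d in demand) * burst_unit_cost
    return fixed + burst

# Hypothetical monthly demand in compute units: mostly steady,
# with one large batch-processing peak.
monthly_demand = [100, 110, 90, 100, 400, 100, 95, 105, 100, 110, 90, 100]

print(peak_provision_cost(monthly_demand, unit_cost=1.0))
print(average_plus_burst_cost(monthly_demand, unit_cost=1.0,
                              burst_unit_cost=3.0))
```

Even with burst capacity priced at three times the in-house unit cost, the average-plus-burst approach comes out far cheaper here, because the fixed estate no longer sits idle eleven months of the year waiting for one peak.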

Part of winning this argument is to push the business itself to identify what it could do with a massive increase in raw computing power, and then to make it happen. Where high-performance computing was once the preserve of quantum physics, weather forecasting, climate research, oil and gas exploration and molecular modelling, it now has plenty to offer finance, engineering, manufacturing and the public sector.