A newly formed consortium is looking to formulate open standards for the cloud, so that clouds are no longer treated as separate islands but are instead able to interoperate with one another.

The Open Cloud Consortium (OCC) is made up of a group of universities trying both to improve the performance of storage and computing clouds spread across geographically disparate data centres and to promote open frameworks that will let clouds operated by different entities work together seamlessly.

Cloud is certainly one of the most used buzzwords in IT today, and marketing hype from vendors can at times obscure the real technical issues being addressed by researchers such as those in the Open Cloud Consortium.

"There's so much noise in the space that it's hard to have technical discussions sometimes," says Robert Grossman, chairman of the Open Cloud Consortium and director of the Laboratory for Advanced Computing (LAC) and the National Center for Data Mining (NCDM) at the University of Illinois at Chicago.

Say you're running an application with one cloud provider, such as Amazon's EC2 service, and want to switch to another one. "Our goal would be that you would not have to rewrite that application if you shifted the provider of cloud services," Grossman says.
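
To make that concrete, here is a minimal sketch (in Python, with purely hypothetical names that are not part of any OCC specification) of the kind of provider-neutral interface an open cloud standard could define, so that an application codes against the interface rather than against any one vendor's service:

```python
# Hypothetical sketch: a provider-neutral storage interface of the kind an
# open cloud standard might define. Class and method names are illustrative.

from abc import ABC, abstractmethod


class CloudStorage(ABC):
    """Minimal storage contract the application codes against."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryStorage(CloudStorage):
    """Stand-in backend; a real adapter would wrap EC2/S3 or another cloud."""

    def __init__(self) -> None:
        self._store: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._store[key] = data

    def get(self, key: str) -> bytes:
        return self._store[key]


def archive_report(storage: CloudStorage, name: str, body: bytes) -> None:
    # The application only sees the interface, so switching providers means
    # swapping in a different adapter, not rewriting this code.
    storage.put(f"reports/{name}", body)


archive_report(InMemoryStorage(), "q3.txt", b"quarterly numbers")
```

In such a scheme, moving from one cloud to another would mean writing a new adapter behind the shared interface rather than rewriting the application itself, which is the kind of portability the consortium has in mind.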

The OCC wants to support the development of open source software for cloud-based computing and to develop standards and interfaces for the interoperation of the various types of software that support cloud computing.

OCC members include the University of Illinois, Northwestern University, Johns Hopkins, the University of Chicago, and the California Institute for Telecommunications and Information Technology (Calit2). Cisco is the first major IT vendor to publicly join the OCC, though more could be on the way.

The consortium's key piece of infrastructure is the Open Cloud Testbed, which consists of two racks in Chicago, one at Johns Hopkins in Baltimore and one at Calit2 in La Jolla, all joined by 10 Gigabit Ethernet connections.

Grossman and colleagues recently used the testbed to measure the performance penalty of doing computation over wide areas. Grossman says that by using Sector and Sphere, open source software developed by the National Center for Data Mining for use in storage and compute clouds, they were able to transport data about twice as fast as with Hadoop, an Apache Software Foundation project.

One of several reasons for the speed improvement is the use of the UDT protocol, which is designed for extremely high-speed networks and large data sets; most cloud services use TCP, Grossman says.
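
To see why the transport protocol matters at this scale, here is a rough back-of-the-envelope sketch in Python, using assumed figures rather than OCC measurements: on a long-distance 10 Gigabit link, a TCP sender limited to a modest window cannot come close to filling the pipe.

```python
# Illustrative numbers only (not OCC measurements): why stock TCP struggles on
# wide-area 10 Gbps links, the case a protocol like UDT is designed for.
# A TCP sender keeps at most one window of unacknowledged data in flight per
# round trip, so sustained throughput is roughly window_size / RTT.

LINK_GBPS = 10        # assumed wide-area link speed
RTT_SECONDS = 0.050   # assumed cross-country round-trip time

# Bandwidth-delay product: the window needed to keep the link full.
bdp_bytes = (LINK_GBPS * 1e9 / 8) * RTT_SECONDS
print(f"Window needed to fill the link: {bdp_bytes / 1e6:.1f} MB")  # 62.5 MB

# Throughput if the effective window is a typical 64 KB.
window_bytes = 64 * 1024
throughput_mbps = window_bytes * 8 / RTT_SECONDS / 1e6
print(f"Throughput with a 64 KB window: {throughput_mbps:.1f} Mbit/s")  # ~10.5 Mbit/s
```

The gap between those two figures is the kind of problem UDT is built to address on high-speed, high-latency paths like the testbed's wide-area links.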

The Open Cloud Consortium is just getting started, having formed in mid-2008. Grossman says the group is looking at the same technical issues as companies like VMware, which is developing a broad operating system that can manage the entire data centre.

The main idea is to gather universities and IT companies in a non-competitive way to exchange technical information, hopefully leading toward cloud computing that is faster, more secure and based on open standards and open source software.

"I'm not a marketing guy," Grossman says. "This is really trying to understand interoperability issues that I still don't think are clearly understood, and issues about how you operate clouds over wide areas."

Grossman is hoping more major IT vendors will sign on too.

"At this point we're just trying to get a critical mass of vendors to exchange information," he says.