Virtualisation has been around for quite some time, but it's only within the past two years that the technology has really taken off and gained in popularity. Likewise, utility computing is only now starting to receive the notice its proponents believe it deserves. Commercial utility computing solutions built on virtualisation, such as 3Tera's AppLogic and Amazon's EC2, are beginning to attract more attention.

To better understand the utility computing market, I spoke to 3Tera's Bert Armijo, senior vice president of sales, marketing and product management, and Peter Nickolov, president and CTO.

Q: Virtualisation is the latest buzzword in the technology industry, and every media outlet is talking about it in some form or fashion. Can you tell us why more people aren't talking about utility computing?

A: New technologies, developments that aren't linear extensions of existing systems, always take time to catch on. Virtualisation came out in 1999, but it wasn't until five years later, in 2004, that it became a hot topic. Utility computing really started just last year with the introduction of 3Tera's AppLogic in February 2006, and six months later Amazon launched EC2.

Now that we have users who have been in production for more than a year and new releases of code are coming out, we're seeing more and more interest and coverage.

Q: Do you believe that utility computing is the next step for people once they get into virtualisation? Is the technology inevitable?

A: There are a lot of people adopting AppLogic who have never used virtualisation, so I don't see it as a required stepping stone to utility computing. The value propositions are different.

Utility computing is a business enabler. Most Web 2.0 users start using AppLogic because they want to be sure they can scale when demand arrives. SaaS vendors want to be able to replicate applications for users at will. Enterprise users are interested in making infrastructure responsive to business requirements. Virtualisation, on the other hand, is most often used for server consolidation. Its adoption has been driven by cost savings, and that's clearly reflected in the coverage it has received.

Is utility computing inevitable? I'm biased, of course, but I believe so. Building data centres, racking servers and plugging in network cables no longer adds value to most businesses. We've proven not only that you can tap readily available computing resources as easily as plugging in a toaster, but that the result is more resilient and flexible. Technology transitions don't happen overnight, though, so we're also working with many customers who want to build their own utility.

Q: Where do you see server virtualisation technology lacking at the moment? Are there any missing features?

A: Virtualisation isn't lacking; it was simply built for a different purpose. Virtualisation is designed to carve a resource into smaller pieces for efficient usage. It was also, as it turns out, a necessary technological stepping stone to utility computing.

Utility computing is really about aggregating resources and making them consumable in a new way. That's why, when folks simply try to apply virtualisation to utility computing, the lack of certain services becomes acute, resulting in compromised storage and networking features. We've written about that in a previous article, "The 7 services virtualisation lacks for utility computing."

As an example of the difference in scope between virtualisation and utility computing, consider an actual debugging case. We have a customer running a search engine on AppLogic who was troubleshooting lost page requests: about 1 in 1,000 requests was being dropped. After an hour on WebEx with our engineers, it became clear it would be easier if we could run our own tests. The customer simply exported a copy of the app to us, and two hours later we had our own running copy. Yes, I really mean they copied and exported an entire search engine: load balancers, firewalls, web servers, databases and more. And when we got it, all we had to do was hit run. That kind of power to manipulate a huge application is what I mean when I say utility computing is an enabler.
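To make the idea concrete, here is a purely illustrative Python sketch; none of these names come from 3Tera, and AppLogic's actual interface is not shown here. It models what such an export implies: the whole application, components, wiring and data volumes alike, is captured in one self-contained descriptor that a receiving grid can simply run.

```python
# Illustrative sketch only: NOT 3Tera's actual API. It models capturing an
# entire application (load balancers, firewalls, web servers, databases)
# as one portable, self-contained unit.
import json
from dataclasses import asdict, dataclass, field

@dataclass
class Component:
    name: str                                      # e.g. "lb1", "web1", "db1"
    kind: str                                      # e.g. "load_balancer"
    image: str                                     # appliance image it boots from
    volumes: list = field(default_factory=list)    # data volumes bundled with it

@dataclass
class Application:
    name: str
    components: list = field(default_factory=list)
    connections: list = field(default_factory=list)  # (from, to) wiring

def export_app(app: Application, path: str) -> None:
    """Serialise the complete application topology into one archive descriptor."""
    with open(path, "w") as f:
        json.dump(asdict(app), f, indent=2)

# "Hit run" on the receiving grid then just means instantiating every
# component and re-establishing the wiring between them.
search_engine = Application(
    name="search",
    components=[
        Component("lb1", "load_balancer", "lb-appliance"),
        Component("web1", "web_server", "web-appliance"),
        Component("db1", "database", "db-appliance", volumes=["db-data"]),
    ],
    connections=[("lb1", "web1"), ("web1", "db1")],
)
export_app(search_engine, "search.app.json")
```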

Q: In your opinion, are there any current virtualisation vendors approaching the notion of utility computing?

A: I think they'd like to. Certainly, the OVF specification shows that they're thinking about it. However, OVF also shows that they lack an understanding of the fundamental issues that need to be solved to truly enable utility computing.

Perhaps more importantly, Xen and VMware have become so successful that their markets are forcing them in different directions. For instance, VMware has initiatives for virtualisation on the desktop, and Xen is now being used in cell phones. These are huge, exciting markets that will require major technological breakthroughs to fully exploit, and they demand quite different solutions from utility computing.

We do find other vendors are noticing as well, though, as the series of cloud announcements over the past year shows. So far, however, most of these appear to be reworkings of existing technologies or new specialised programming environments rather than general-purpose utility systems.

Q: Where do 3Tera's AppLogic and Amazon's EC2 fit into the equation?

A: AppLogic and EC2 are the first demonstrable utility computing systems on the market.

The systems share some similarities. For instance, both use a grid architecture. Both are also built on virtualisation as a foundation layer; in fact, both are based on Xen. The reason, as I noted earlier, is that existing virtualisation lacks certain required services, and Xen, being open source, allowed for easy extension.

However, the two systems take different approaches to key issues like storage and networking. EC2 has no permanent storage, and if you try building a utility system you'll quickly discover why: storage in a system like this is extremely complex. AppLogic, on the other hand, incorporates the servers' direct-attached storage into the grid itself, which allows storage volume and performance to increase as the grid grows.
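A minimal sketch of why that design scales, again purely illustrative rather than AppLogic's implementation: when each server joining the grid contributes its local disks to a shared pool, both total capacity and aggregate throughput grow with the node count.

```python
# Illustrative model only, not AppLogic's storage layer: pooling each
# server's direct-attached storage makes capacity and aggregate
# throughput grow as nodes join the grid.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    disk_gb: int     # local direct-attached capacity
    disk_mbps: int   # local disk throughput

class StoragePool:
    def __init__(self):
        self.nodes = []

    def add_node(self, node: Node) -> None:
        # Joining the grid contributes the node's local disks to the pool.
        self.nodes.append(node)

    @property
    def capacity_gb(self) -> int:
        return sum(n.disk_gb for n in self.nodes)

    @property
    def aggregate_mbps(self) -> int:
        # Volumes spread across nodes can be read in parallel, so peak
        # throughput scales with the number of contributing disks.
        return sum(n.disk_mbps for n in self.nodes)

pool = StoragePool()
for i in range(4):
    pool.add_node(Node(f"node{i}", disk_gb=500, disk_mbps=80))
print(pool.capacity_gb, pool.aggregate_mbps)  # 2000 GB capacity, 320 MB/s
```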

Q: To finish things off, what other information about 3Tera's AppLogic can you leave InfoWorld Virtualisation Report readers with?

A: In the end, utility computing isn't simply a service, but rather an ecosystem. We currently work with half a dozen partners who run AppLogic in more than 12 data centres in the US and Europe. This week AppLogic was demonstrated in Japan for the first time during the Web 2.0 conference in Tokyo. Plus, as I mentioned earlier, we'll license the system to enterprises looking to build their own in-house utility.

We're also working on an exchange for virtual appliances and complete application infrastructures. You can build applications completely devoid of hardware and select where, and at what scale, they'll run only when they're actually executed. If business needs change, you can increase or reduce resources almost at will. You can even move an app to a new data centre with a single command. This is true utility computing.
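As a final illustrative sketch, with all names hypothetical rather than drawn from any 3Tera product, the late-binding idea described above looks roughly like this: the application definition carries no hardware binding, and placement and scale are chosen only when it actually runs.

```python
# Hypothetical pseudocode, not a real 3Tera exchange API: the application
# definition is hardware-free; data centre and scale are bound at launch,
# and "moving" an app is just launching it somewhere else.
def run(app_descriptor: dict, datacenter: str, scale: int) -> str:
    """Bind a hardware-free application definition to a grid at launch time."""
    deployment = {
        "app": app_descriptor["name"],
        "datacenter": datacenter,   # chosen at run time, not design time
        "web_servers": scale,       # resources dialled up or down at will
    }
    print(f"launching {deployment}")
    return f"{datacenter}/{app_descriptor['name']}"

app = {"name": "search"}                # built with no reference to hardware
handle = run(app, "us-east", scale=4)   # first launch
handle = run(app, "eu-west", scale=8)   # relaunch elsewhere, at a new scale
```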