It would be safe to say that columnists Frank Hayes and Mark Hall don't always see eye to eye on IT issues. Today's topic of contention? Virtualisation. Hayes says 2008 is the year to wrap our brains around the whole virtualisation idea. Hall begs to differ.

Virtual Stride - by Frank Hayes

OK, Mark, I'll say it: 2008 is the Year of Virtualisation.

True, readers say that virtualisation is one of the most overhyped, under-delivering technologies in IT today. And no wonder - it's almost impossible to figure out what it is. Are we talking about virtualising servers? Storage? Networks? Desktops? Databases? Applications? Operating systems?

Every one of those pieces of IT can be virtualised. They all should be. But they're all virtualised in different ways. As a result, "virtualisation" has the kind of meaningless-buzzword appeal that used to belong to "object-oriented" and "downsizing" (which can still mean either moving applications off mainframes and onto smaller servers, or consolidating applications from many servers onto a few mainframes).

And it doesn't help that some IT vendors love slapping a hot buzzword on whatever technology they're selling. With a term as fuzzy as virtualisation, it's no wonder it seems to be everywhere - and mean nothing.

But the reason virtualisation is so fuzzy is that it's not a technology. It's really just the idea that we don't want anything - users or hardware or software - connecting directly with anything else.

We only want them dealing with an abstraction - the "virtual" version of a server or memory architecture or database. That way, we can change what's on the other side of the virtualisation curtain, adding computing resources or reconfiguring server farms or redirecting storage, without having to rebuild from scratch.

We're already hip-deep in virtualisation. The concept has been around for decades, and IT products that use it are nothing new. Even in the most conservative of IT shops, we deploy VPNs, operating systems with virtual memory, and relational databases with virtual rows and columns.

Then why do we call it overhyped new tech? Because it was vendors that got on the virtual bandwagon years ago, not us. They're the ones that have been moving things around behind that virtual curtain. And not all vendors - just the ones it made business sense for, like CPU and database vendors.

Now it's our turn to go virtual. Not because it's easy or convenient - when was anything in IT ever easy or convenient? - but because it's the only way we can move fast enough to do what users need.

When they need more server power for applications, we have to be able to deliver it immediately. Otherwise, they lose business.

When they need more storage, we have to re-architect our disk farms. By the time we're done, the opportunity is gone.

When they need more flexibility or security or capability, being able to move fast is a real advantage.

That's what virtualisation can deliver.

If we can figure it out.

And we'll only figure it out one piece at a time.

That's why 2008 needs to be the Year of Virtualisation. We have to start somewhere, and this is the year to choose where - and wrap our brains around the whole virtualisation idea.

Maybe that means virtualising servers, identifying how to set them up so that instead of just putting one application per server, we can parcel out processor power by the pound.

Or maybe it will mean virtualising storage - figuring out how we can ramp up disk space as quickly as we can link new hardware.

More likely it'll be something as simple as redirecting "My Documents" on each Windows PC to store files on a server instead of the local disk. That makes backups easier, improves security and gets users back up and running faster when their machines crash. It's also the kind of sweet spot that makes virtualisation pay off right away, even as we're still mastering the concept.
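A minimal sketch of that redirection, assuming a hypothetical file server (`fileserver01`) and share (`home`): on Windows, "My Documents" is the "Personal" value under the per-user shell-folders registry key, so pointing it at a UNC path is a one-value change. This is only a sketch of the mechanism, not a substitute for doing it properly via Group Policy.

```python
# Sketch: redirect "My Documents" to a network share by updating the
# per-user shell-folders registry value. Server and share names below
# are hypothetical; run on Windows only.
import sys

SHELL_FOLDERS_KEY = r"Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders"

def redirected_documents_path(server, share, username):
    """Build the UNC path the 'Personal' value should point at."""
    return r"\\{}\{}\{}\Documents".format(server, share, username)

def redirect_my_documents(server, share, username):
    if sys.platform != "win32":
        raise OSError("folder redirection only applies on Windows")
    import winreg  # Windows-only stdlib module
    target = redirected_documents_path(server, share, username)
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, SHELL_FOLDERS_KEY,
                        0, winreg.KEY_SET_VALUE) as key:
        # "Personal" is the registry name for the My Documents folder.
        winreg.SetValueEx(key, "Personal", 0, winreg.REG_EXPAND_SZ, target)
```

In practice an admin would push the same change through a Folder Redirection policy rather than per-machine scripts, but the sketch shows how small the moving part is.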

That may not sound like much progress. But it's a first step - and 2008 is the year to take it.

Virtual Sprawl - by Mark Hall

Those readers who, like the two of us, are a bit long in the tooth undoubtedly recall the annual breathless announcements in the 1980s that local-area networks were about to become ubiquitous. The years came and went - 1982, 1983, 1984 ... - with no discernible triumph of the LAN over, say, point-to-point networks. Then, one day, without any commotion, LANs were everywhere. They had conquered networking, though no one could say exactly when history shifted.

That, too, will be the fate of virtualisation.

I can't argue with you, Frank, that virtualisation solves many a problem today. And you cogently describe its value for companies using it now. But before our readers take your words of wisdom and join the march to virtualise their datacentres, they need to take a moment to reflect on exactly where their IT pain hurts the most.

If, as you rightly point out, a CIO's biggest issue is responding fast to business users' needs, virtualisation can be an excellent tool to quickly provision and deploy server or storage systems. However, if datacentre managers are under different pressures, VMs might just make things worse.

I'm thinking about a story we published late last year about a Web hosting service that attempted to consolidate servers into a virtual environment. It turned out to be a pretty complex process that took 165,000 websites offline for nearly a week.

That news underscored a survey commissioned by Symantec last September about problems faced by datacentre managers. Two of the top four issues revealed in that study are interrelated - datacentre complexity and staff skills. Two-thirds of the respondents said that IT operations were becoming too complex and that there were too many applications (a trend VMs will fuel), making datacentre management increasingly difficult. In fact, 68 percent said their current staffs don't have the skills to manage the complexity of their datacentres today.

And make no mistake about it, virtualisation adds another layer of complexity in a datacentre. It may be seen as a solution to server sprawl, but VM sprawl can happen just as fast - even faster - because IT doesn't need to get a separate purchase order for hardware when adding VMs.

You'll be interested to hear that about half of the respondents to the Symantec poll said that they're trying server and storage virtualisation technology. Wisely, though, precious few are using VMs on mission-critical applications. In part, that's because virtualisation requires datacentre workers with new skills - workers who are hard to find and expensive to keep when you do find them.

Still, those who are moving down the VM road will definitely need people who understand the nuances of virtualisation tools. For example, while it's simple to create a VM guest on a server to run a business application, mixing and matching VMs on a single hardware server can create a service-level agreement mess when they all start contending for I/O, sending server performance into a tailspin. Therefore, hands-on VM experience is crucial.
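The contention problem can be made concrete with a toy model (all figures hypothetical, and far cruder than any real hypervisor's I/O scheduler): a host with a fixed I/O budget, split evenly among its guests, drops every VM below its promised rate the moment one guest too many is added.

```python
# Toy model of I/O contention on a consolidated host. All numbers are
# hypothetical; real hypervisors schedule I/O far less uniformly.

def per_vm_iops(host_iops, vm_count):
    """Naive even split of the host's I/O capacity among guests."""
    return host_iops / vm_count

def sla_violations(host_iops, vm_count, sla_iops):
    """How many guests fall below their promised I/O rate."""
    share = per_vm_iops(host_iops, vm_count)
    return vm_count if share < sla_iops else 0

# A host rated for 10,000 IOPS with guests promised 1,500 IOPS each:
# six guests still fit (10,000 / 6 ~ 1,667), but adding a seventh
# drags every guest below its SLA (10,000 / 7 ~ 1,429).
```

The point of the sketch is the cliff: consolidation looks free right up until the guest that tips the whole server into breach, which is why hands-on experience with the real tools matters.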

Virtualisation is great, but each CIO needs to assess their own pain. If more of it comes from, say, managing complexity and finding good staff than from dealing with server sprawl or quickly provisioning storage for an application, the Year of Virtualisation may need to be pushed out for another year or two.

Ultimately, though, you're right, Frank. Just as with LANs a quarter-century ago, virtual machines will creep into our lives at a steady pace until one day we look up and see VMs everywhere, doing everything for everybody.