Mike Williams considered his virtualisation project a success after consolidating 17 US data centres into three. But then the traffic jams started.
As CIO of the US Defense Contract Management Agency (DCMA), which monitors work on military contracts, Williams had a problem on his hands. Consolidating all those data centres without reconfiguring the WAN was like consolidating 17 cities into three without widening the freeways.
“It is important to make sure to optimise the WAN. We actually didn’t,” Williams says. “All of a sudden the speed of light is not so fast anymore.”
Williams’ story is one of many cautionary tales surrounding early virtualisation efforts. Although virtualisation promises cost-saving optimisation of data centre resources, the path to that pay-off is littered with hazards.
Not just network configuration, but software licensing, security, and systems management are all potential pitfalls, say industry experts and enterprises that have gone virtual. And people issues can be more troublesome than technical ones if the corporate culture resists virtualisation.
When the North Carolina city of Charlotte began virtualisation, some departments hesitated, says Philip Borneman, assistant director of information technology. “You’ll always have some early adopters and others who want to wait and see.”
Elsewhere, some IT fiefdoms simply won’t share. “I have heard of companies that have gotten a lot of pushback from departments who don’t want to give up their own hardware or applications,” says Charles King, principal analyst with Pund-IT, a technology analysis firm.
The organisation chart can complicate other projects, adds Nick van der Zweep, vice president for virtualisation at Hewlett-Packard. One unidentified insurance company, notes van der Zweep, maintained separate IT resources for group insurance, individual insurance, financial investments, and other departments. “When you decide to bring them together, you get turf wars,” van der Zweep says. “You’ve got to convince a lot of people.”
Virtualisation also upends the software licensing model. Typically, software is licensed to run on a single server, but having to buy a licence for each of 50 virtual servers erodes the potential cost savings, van der Zweep explains.
HP faced that problem when deploying BEA WebLogic software on 400 virtual servers. Its answer was a shared application server utility: a cluster of five server nodes. HP paid for just five licences, even though each cluster feeds up to 60 virtual machines.
Some software companies are revising their licences to accommodate virtualisation; others withhold support altogether if their software is run in a virtual environment.
Virtualisation presents security issues, too, says Michelle Bailey, an IDC research director.
If security software runs on a physical server but one of the virtual servers is moved to another physical server without it, “that could be a problem,” says Bailey. “The security policy has to live somewhere else, such as on the network layer.”
Choosing the right management tool is critical to making virtualisation work, she says, and maintaining security is just one of its functions.
Companies assigning virtual workloads to physical servers need to make sure they are properly configured, have up-to-date patches and don’t still contain rogue software that could cause problems, says Erik Josowitz, vice president of product strategy at Surgient, a provider of virtualisation for software development and testing.
The rush to virtualisation poses a danger that some companies will do the equivalent of “finding a server by the side of the road and plugging it in,” says Josowitz. “There will be some breach this year that will be a lesson to all,” he predicts.