Complexity in the data centre has a number of unwelcome effects on the enterprise, from increased costs to reduced agility and even downtime. For the past five years, organisations have been virtualising their data centres in an effort to reduce complexity and increase efficiency. But while virtualisation offers significant benefits, many such projects have shifted rather than eliminated complexity in the data centre. To truly mitigate data centre complexity, organisations need training, standardisation and information governance.

"So many people think that virtualisation is the penicillin of the data centre, but in reality, what we've seen is that while people are investing heavily in virtualisation, they didn't necessarily have the foresight to see the ramifications of virtualising so quickly," says Danny Milrad, director of product marketing at Symantec, which just released the results of its 2012 State of the Data Center Survey. "One of the benefits of virtualisation is spinning up an application so quickly, but they don't think about how big the footprint of that application can become."

Business-critical apps drive data centre complexity

The increasing number of business-critical apps is the primary driver of complexity in the data centre: 65 percent of respondents in Symantec's study cited it as a source of complexity in their data centres. For the survey, Symantec contacted 2,453 IT professionals from 32 countries, including senior IT staff focused on operations and tactical functions as well as staff members focused on planning and IT management.

"Show me an app that isn't a business-critical application outside of file and print these days," Milrad says. "Now you've got to replicate it, and your storage footprint goes up. With all these new applications coming online, they're being virtualised, and you've got a ton more data than you ever expected."

When that happens, organisations hit a wall. "As they virtualise more and more, the cost of storage and the cost of virtualisation licenses and everything that falls out of that grows faster than expected," he says. "Storage is cheap, but it's still very expensive when you have to buy 10 times more than you expected."

Other key drivers of data centre complexity include the growth of strategic IT trends such as mobile computing (cited by 44 percent of respondents), server virtualisation (43 percent) and public cloud (41 percent). The most commonly cited result of data centre complexity is increased costs (47 percent). But other effects include reduced agility (39 percent), longer lead times for storage migration (39 percent) and provisioning storage (38 percent), security breaches (35 percent) and downtime (35 percent).

Complexity a key contributor to data centre outages

The survey found that the typical organisation experienced an average of 16 data centre outages in the past 12 months, at a total cost of $5.1 million. On average, one of those outages was caused by a natural disaster (costing $1.5 million), four were caused by human error (costing $1.7 million) and 11 were caused by system failure resulting from complexity (costing $1.9 million).
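The survey's per-category figures add up to its headline numbers, which a quick sanity check confirms (the counts and dollar amounts below are the ones reported above; the category labels are shorthand):

```python
# Per-category outage counts and total costs from Symantec's survey figures.
outages = {
    "natural disaster": {"count": 1, "cost_millions": 1.5},
    "human error": {"count": 4, "cost_millions": 1.7},
    "complexity-driven system failure": {"count": 11, "cost_millions": 1.9},
}

total_count = sum(c["count"] for c in outages.values())
total_cost = sum(c["cost_millions"] for c in outages.values())

print(total_count)              # 16 outages in 12 months
print(round(total_cost, 1))     # 5.1 ($ million total)
```

Notably, complexity-driven system failure accounts for more outages (11 of 16) than natural disasters and human error combined.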

That's not to say virtualisation is a bad thing, Milrad is careful to note, but it does mean IT needs to pay attention and prepare for the potential side effects.

"It's much like what happened with the introduction of SharePoint," Milrad says. "SharePoint created a power and cooling nightmare. It wasn't expensive for marketing or sales to spin them up, but power, cooling and storage costs went up as a result. It's the same thing with virtualisation. IT needs to get [its] arms around it and manage it as part of the infrastructure. It's just a matter of slowing down and looking at what you're doing."

The survey found that 90 percent of organisations are implementing or actively discussing information governance in an effort to get their data centre complexity under control. Among the benefits they seek are enhanced security, the ability to find the right information in a timely manner, reduced information management and storage costs, reduced legal and compliance risks, and an easier move to the cloud.

Best practices for mitigating data centre complexity

Trevor Daughney, also a director of product marketing at Symantec, recommends adopting the following best practices to help reduce data centre complexity:

  • Get visibility beyond platforms. Understand the business services that IT is providing, and all of their dependencies, to reduce downtime and miscommunication.

  • Understand what IT assets you have, how they are being consumed, and by whom. This will help cut costs and risk: the organisation won't buy servers and storage it doesn't need, teams can be held accountable for what they use, and the company can be sure it isn't running out of capacity.

  • Reduce the number of backup applications to meet recovery SLAs and reduce capital expenses, operating expenses and training costs. The typical organisation has seven backup applications, generally point products for particular databases.

  • Deploy deduplication everywhere to help address the information explosion and reduce the rising costs of backing up data. The goal is not simply to deduplicate the backup: consider placing an archive with deduplication capabilities next to applications such as Exchange or SharePoint, which tend to be the biggest data offenders.

  • Use appliances to simplify backup and recovery operations.

  • Establish C-level ownership of information governance. Building an information-responsible culture and creating an umbrella of information governance can help organisations capture synergies across focused projects.
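To make the deduplication recommendation above concrete, here is a minimal sketch of block-level deduplication: each block of data is hashed, and only one copy per hash is stored. This is an illustrative simplification, not how any particular backup product works; commercial products typically use variable-size (content-defined) chunking and persistent indexes, but the core idea is the same.

```python
import hashlib

def dedupe_blocks(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks, storing each unique block once.

    Returns a store of unique blocks keyed by hash, plus an ordered
    "recipe" of hashes from which the original data can be rebuilt.
    """
    store = {}   # hash -> block contents (unique blocks only)
    recipe = []  # ordered hashes needed to reconstruct the data
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep only the first copy
        recipe.append(digest)
    return store, recipe

# Highly repetitive data (like successive backups) dedupes well:
data = b"A" * 4096 * 100 + b"B" * 4096 * 100
store, recipe = dedupe_blocks(data)
print(len(recipe))  # 200 logical blocks referenced
print(len(store))   # 2 unique blocks actually stored
```

The 100-to-1 reduction here is an artificial best case, but it illustrates why deduplicating near data-heavy applications, rather than only at backup time, pays off.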