2012 marked the beginning of a fundamental shift in the responsibilities of the CIO. As Gartner analyst Mark McDonald recently wrote on his blog, after years of delivering ‘more for less’, CIOs began to refocus on delivering growth through innovation - building the business rather than putting it on a diet.  

However, most CIOs know they won’t suddenly see their treasure chest filled with new cash, and will have to look for other ways to deliver innovation. 

One of the most effective ways to free up time and money is to remove the complexity and duplication that has built up over the years and puts a strain on resources. At Delphix, we call these resource gobblers 'innovation bottlenecks', and one of the biggest is the database.

Why the database? 

We are living in a world where the amount of data generated and stored is increasing by 35-40 per cent a year. That would be a lot if we only had one copy of all that data, but the average company makes 8-10 copies of each database. The problem isn't the amount of data, it's the IT and human resources involved in making and storing copies, masking the data in each of those copies and refreshing databases. 

The need for these copies is legitimate, from backup and disaster recovery to testing and training, but it no longer makes sense to make and move physical copies of databases, or for masses of engineers, developers and analysts to wait days on end for fresh data or a new database environment. All of these copies sit on physical hardware, and for each 1TB of original content created, 8TB of duplicate data - aka database drag - is produced. 

This is the fattest part of the IT budget, yet the slowest point in the datacentre. This is the big opportunity for virtualisation. 

Stop waiting, virtualise your database

To remove the database bottleneck that limits companies' ability to move fast and support multiple projects at the same time, some companies have begun to virtualise their databases. This doesn't mean running a database in a virtualised environment, which has been done for some time now, but virtualising the database itself. 

Database virtualisation works at the level of the data files within a database: a single, highly compressed copy of the original data blocks is created, and that data is then served to multiple database management system (DBMS) servers. 

Each DBMS receives its own fully functional read/write database, and the result is completely transparent to both applications and users. Unlike before, however, each new copy does not create new storage demands. 
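
Delphix doesn't publish the details of its implementation, but the idea behind it is essentially copy-on-write thin cloning. The Python sketch below is a deliberately simplified illustration of that general technique - the class names and block model are hypothetical, not Delphix's API - showing how several read/write clones can share one set of source blocks and consume new storage only when they diverge:

    class SourceImage:
        """A single, compressed copy of the original database's data blocks."""
        def __init__(self, blocks):
            self.blocks = dict(enumerate(blocks))  # block number -> block data

    class VirtualDatabase:
        """A read/write clone that shares unchanged blocks with the source.

        Reads fall through to the shared image; writes land in a private
        overlay, so each clone only consumes storage for blocks it changes.
        """
        def __init__(self, source):
            self.source = source
            self.overlay = {}                      # block number -> modified data

        def read(self, block_no):
            return self.overlay.get(block_no, self.source.blocks[block_no])

        def write(self, block_no, data):
            self.overlay[block_no] = data          # copy-on-write

    # Three "copies" for dev, test and training share one set of source
    # blocks; provisioning is instant because nothing is copied up front.
    image = SourceImage([b"block-0", b"block-1", b"block-2"])
    dev, test, training = (VirtualDatabase(image) for _ in range(3))
    test.write(1, b"patched")
    assert dev.read(1) == b"block-1"    # dev still sees the original block
    assert test.read(1) == b"patched"   # test sees only its own change

Because nothing is copied up front, 'provisioning' a new clone amounts to creating an empty overlay, which is why a fresh environment can appear in seconds rather than days.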

The impact is profound. Resources that were previously dedicated to managing the multiple copies of a database are freed up and can support new requests from developers, analysts and departments that previously couldn't access the database. 

The limits on how many projects can run concurrently, or how frequently a database can be refreshed, disappear, because provisioning takes seconds, not days. And as an added bonus, companies can reclaim the 8TB of database drag, along with all the manpower and cost it takes to manage it. 

Early adopters of database virtualisation

Whilst still a new IT concept, database virtualisation is already finding strong proponents. By virtualising their databases, companies including Facebook, Informatica and StubHub have accelerated application projects by as much as 500 per cent, reducing complex data refreshes and provisioning tasks from 10 days to 10 seconds and removing the database drag. 

Informatica has been able to complete projects in half the time; Facebook has driven its monthly financial close down from 21 days to two; and StubHub has removed one of the key bottlenecks in agile development by giving engineers their own virtual databases, which would have been cost-prohibitive on physical hardware. 

Mark McDonald was absolutely right - the goal of the CIO is no longer to do 'more for less' but to use the technologies that exist to create the type of innovation required for growth. 

By dramatically reducing redundant infrastructure, slow processes, and the time and resources needed to approve, create, refresh and recover databases, organisations can spend more time on testing, development and running numerous projects at once - in short, they can spend more time innovating. 

Posted by Iain Chidgey, VP and General Manager EMEA, Delphix 