The overall concept
When we consider the terms utility storage or utility computing, various concepts come to mind, such as:

- Storage and computing resources that can be called on freely to meet the growing or changing requirements of users or applications;
- A flexible architecture that enables users to scale their systems, add new services and applications as business requirements dictate, and implement these quickly. Such an architecture must be reliable and trusted, and able to balance the requirements of many customers.

When we look at the utility concept in a pragmatic fashion, the underlying message is to find a mechanism, architecture or management regime that enables users to flex the computing infrastructure as business requirements dictate, and to do so without disrupting continuous business operations.

The challenge to deliver utility computing
Companies such as IBM and HP have been extolling the virtues of utility computing for some time. IBM's On Demand promotes the idea of using what you need when you need it. HP's adaptive computing approach encompasses scaling system resources down as well as up as business volumes require. In both cases you pay for what is used or needed on the day. Is this just a services play, whereby the contract is for a scalable service, or is it a set of deployable technologies?

Utility storage - or utility computing - goes beyond this. The goal is to deliver a resilient managed environment that uses system and storage resources effectively and efficiently. VERITAS, for example, embodies in its concept of utility computing the ability to adjust the infrastructure to meet changing demands according to existing policies and practices.

Delivering the managed environment must take account of the types of systems installed today. For example, data protection services are frequently deployed to support remote sites, and so are an additional service that may need to be implemented as part of the overall environment. The new utility infrastructure must therefore respond to central policies to ensure that all information is properly backed up and can be quickly restored. Equally importantly, this must be done without adding agents on every device, which would burden system maintenance for each and every user.
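
To make this concrete, here is a minimal sketch of central, policy-driven data protection: one policy table drives a compliance check across datasets on any site, with no per-device agent. The policy names, fields and thresholds are illustrative assumptions, not any vendor's API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Policy:
    name: str
    backup_interval: timedelta   # how often a copy must be taken
    retention: timedelta         # how long copies must be kept

@dataclass
class Dataset:
    name: str
    policy: Policy
    last_backup: datetime

def out_of_compliance(datasets, now=None):
    """Return datasets whose last backup is older than their policy allows."""
    now = now or datetime.utcnow()
    return [d for d in datasets
            if now - d.last_backup > d.policy.backup_interval]

# One central policy, applied to datasets wherever they live; the check
# runs where the catalogue lives, not on each device.
gold = Policy("gold", backup_interval=timedelta(hours=4),
              retention=timedelta(days=90))
for d in out_of_compliance([
        Dataset("finance-db", gold, datetime.utcnow() - timedelta(hours=6)),
        Dataset("web-logs", gold, datetime.utcnow() - timedelta(hours=1))]):
    print(f"{d.name}: backup overdue under policy '{d.policy.name}'")
```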

In such a case, the data protection services are being extended to enhance business practices and to make the process transparent across the organisation. These additional services, available from companies such as InTechnology and Iron Mountain, help to build the trusted solution and develop the utility storage environment.

NetApp offers another example of how additional managed features can be added to develop the utility storage environment. Tools such as FlexVol and FlexClone enable users to build flexibility into their storage management: resources are better utilised, and systems can be reconfigured on the fly in response to changing demands.
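
The idea behind cloning tools of this kind can be sketched with a toy copy-on-write model: a clone shares its parent's blocks at creation and consumes new space only for blocks that are subsequently written. This is an illustrative model under those assumptions, not NetApp's implementation or API.

```python
class Volume:
    def __init__(self, blocks=None, parent=None):
        self.parent = parent
        self.blocks = dict(blocks or {})   # block number -> data owned here

    def clone(self):
        # Creating a clone is instant and consumes no data space.
        return Volume(parent=self)

    def read(self, n):
        if n in self.blocks:
            return self.blocks[n]
        return self.parent.read(n) if self.parent else None

    def write(self, n, data):
        # Only writes allocate space in the clone (copy-on-write).
        self.blocks[n] = data

    def space_used(self):
        return len(self.blocks)

base = Volume({0: "boot", 1: "data"})
test = base.clone()
print(test.read(1), test.space_used())   # "data" 0 - all blocks shared
test.write(1, "data-v2")                 # diverge a single block
print(test.read(1), test.space_used())   # "data-v2" 1
```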

Virtualisation of resources is a key step in the process
As we consider the necessary steps to deliver a well-utilised system infrastructure, it is evident that this must encompass every system component, whether hardware, system management tools or applications. Examples of how some of these components are coming together can be found in EMC's approach to server virtualisation with VMware, and in the way 3PAR has built virtualisation into the storage array, so that users appear to have as much storage as they need while only consuming, and accounting for, what is actually used.
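
That last property - presenting as much storage as needed while accounting only for what is used - is essentially thin provisioning, and a minimal sketch shows the mechanism: volumes expose a large virtual size but draw physical blocks from a shared pool only on first write. The class names and sizes below are assumptions for illustration, not 3PAR's interfaces.

```python
class Pool:
    def __init__(self, physical_blocks):
        self.free = physical_blocks

    def allocate(self, count=1):
        if self.free < count:
            raise RuntimeError("pool exhausted - add physical capacity")
        self.free -= count

class ThinVolume:
    def __init__(self, pool, virtual_blocks):
        self.pool = pool
        self.virtual_blocks = virtual_blocks   # what the host sees
        self.mapped = set()                    # blocks actually backed

    def write(self, block):
        if block >= self.virtual_blocks:
            raise IndexError("past end of volume")
        if block not in self.mapped:           # allocate on first write only
            self.pool.allocate()
            self.mapped.add(block)

pool = Pool(physical_blocks=1_000)
vol = ThinVolume(pool, virtual_blocks=10_000)  # host sees 10x the pool
vol.write(42)
print(len(vol.mapped), pool.free)              # 1 block consumed, 999 free
```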

These new products, amongst others, are setting new standards for system resource utilisation and management in demanding and changing business environments. It is by virtualising these resources that such gains can be made, delivering better value from new as well as existing investments.

Additionally, active management of system performance and resource utilisation will be necessary - one stage further in the development of these tools. Combining such system management tools with the flexibility derived from the virtualisation of servers, storage and networks will establish a new basis on which the utility value of systems can be realised.
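
As a hedged sketch, active management can be pictured as a control loop that compares measured utilisation against policy thresholds and asks the virtualisation layer to rebalance. The get_utilisation and rebalance hooks below stand in for whatever monitoring and provisioning interfaces an installation actually has; they are assumptions, not a real API.

```python
HIGH_WATER = 0.80   # rebalance towards growth above this utilisation
LOW_WATER = 0.30    # reclaim resources below this utilisation

def manage(resources, get_utilisation, rebalance):
    """One pass of the management loop over virtualised resources."""
    for res in resources:
        u = get_utilisation(res)
        if u > HIGH_WATER:
            rebalance(res, direction="grow")
        elif u < LOW_WATER:
            rebalance(res, direction="shrink")

# Example with stubbed measurements in place of live monitoring:
readings = {"array-1": 0.91, "array-2": 0.22, "vm-pool": 0.55}
manage(readings,
       get_utilisation=readings.get,
       rebalance=lambda r, direction: print(f"{r}: {direction}"))
```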

Utility is a step towards a managed environment
The concept of utility is a step in the direction of managed environments, where users obtain better utilisation of their computing resources. The delivery of this concept will be a mixture of new storage, server and networking technologies that offer an assured delivery of better system utilisation. Because businesses expect round-the-clock computing services and access to data, the systems must be highly resilient. Grid storage is one of the key architectural developments for the delivery of utility storage. It will enable planned or unplanned system management functions to be completed transparently and, as necessary, system resources to be reconfigured to meet the service demands of the day.

These demands can range from full system testing when new features or system upgrades are required, to meeting peak business workloads at the end of a month or during high-season trading. Utility storage will provide a more secure system environment built on a secure infrastructure. This can be harnessed to deliver continuous system and data protection, building a framework within which storage resources can be managed and new services such as security, network management or archiving can be introduced transparently.