We hear a lot of talk from analysts and big-name companies about the need to automate. A typical question goes like this:
"As someone who manages a large data centre with thousands of servers, I'm concerned that in the rush to automate, I'll lose control of my server environment. Can you tell me what the benefits are of automating the IT server management process, and how can I guarantee my CIO that by automating the process there won't be unwanted disruptions in service?"

First, your concerns are shared by a lot of other people, including many CIOs and data-centre managers I've talked with in recent years. And you're right; the hype around data centre automation gets pretty thick. I believe that automation in the data centre is inevitable: just as automation came to manufacturing facilities and to processes as complex as flying jets, automation is making its way into the world of software and hardware.

Why? Because of the advantages it brings. Automation - when it's done right - can mean lower costs, lower error rates, and greater reliability, stability and availability.

All that being said, be assured that any automated process that makes you feel a loss of control is ugly automation. Good, successful automation provides selective control and transparency.

As a comparison, a car's automatic transmission takes care of a lot of manual processes. You put the car in gear and watch the tachometer and the speedometer, but normally you don't need to see gear ratios and engine speed to operate the vehicle comfortably. Intelligence built into the vehicle takes care of those things; you still maintain control, but only over certain aspects, not all of them.

Cruise control allows us to set a speed, and the car maintains it automatically. We don't see all of the systems involved in maintaining that speed, and perhaps most important of all, control is immediately returned to us when we communicate that we need it, by braking.

In our cars, we trust that automation is doing what we need, and we can verify it through selective transparency (like the speedometer). The same applies in the data centre. Well-designed automation allows you to verify that applications, processes and systems are doing what you need them to do. You've got costs and budgets, SLAs and so on, so you must have the ability to view what's happening. You can't afford unplanned disruptions, and good automation will ensure that you don't have them.

It's important to understand that well-designed automation does not force you to do everything at once: it's a step-by-step process of automating selected processes based on your comfort level and business priorities.

Being concerned about control is as natural as a fighter pilot's unease the first time the jet goes on autopilot. But having no autopilot at all can be dangerous.

On the subject of danger, let me warn you about first-generation automation. It tends to be pretty clunky. There are products out there claiming to be automation that are really just miles and miles of script attempting to duplicate choices and conditions that a human could make in any situation.

The first automated machines used rods and pulleys to duplicate the motions a human being used to accomplish a physical task. These scripts are like those rods and pulleys. OK, maybe they're a step forward - a steel rod grafted onto a lever. But the problem in software is that we're talking about hundreds of thousands of "moving parts": a server might have a certain volume of files of various types on it at one point, and the next moment - because something was written to a database, for example - it might have totally different ones. The system is dynamic, with constantly changing sets of information. It's impossible to script every event that can occur.

And worst of all, none of this so-called automation has anything to do with business objectives. IT automation must be driven by business policy, and the sophisticated, proven approaches do this.

Software automation has gone through around five generations. What works today - but more important, what will work into the future - is model-based and policy-driven automation. Bringing intelligence to the system is the only real solution to automating something that is constantly changing. And this intelligence is also the key to control: being able to trust and verify what the system is doing at any given time and what it will do in any situation in the future, based on business policy.
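The difference between script-based and model-based automation can be sketched in a few lines. Everything below is illustrative - the server names, the desired-state model and the policy function are hypothetical, not any vendor's API. The point is that instead of scripting every possible event, the system compares observed state against a desired model, derives the actions needed to close the gap, and lets a business policy decide which of those actions may run automatically and which must be handed back to a person.

```python
# Illustrative sketch of model-based, policy-driven automation.
# All names here (servers, states, the policy) are hypothetical.

# The model: the state the business wants each server to be in.
desired = {"web-01": "running", "web-02": "running", "db-01": "running"}

def reconcile(desired, observed, policy):
    """Compare observed state against the desired model and return the
    actions needed to close the gap - but only those the policy allows.
    Anything the policy forbids is escalated, returning control to a human."""
    actions = []
    for server, want in desired.items():
        have = observed.get(server, "missing")
        if have != want:
            action = ("restart", server)
            if policy(action):
                actions.append(action)
            else:
                actions.append(("escalate", server))
    return actions

def business_policy(action):
    # Business rule: databases are too critical to restart automatically.
    verb, server = action
    return not server.startswith("db-")

observed = {"web-01": "running", "web-02": "stopped", "db-01": "stopped"}
print(reconcile(desired, observed, business_policy))
# [('restart', 'web-02'), ('escalate', 'db-01')]
```

Note that no event was scripted: the web server is fixed automatically, the database is escalated, and both decisions fall out of the model and the policy rather than from an enumerated list of conditions.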

Fitzgerald is CTO and director of product development of HP Change & Configuration Software Operations (formerly Novadigm).