Vendors would have you believe that utility computing is on the horizon. In the pretty pictures they paint of this next big shift in powering IT, standards play a central role. There's TCP/IP, Linux or Windows, and Web services with which to build a common utility infrastructure that customers can share on demand. Why, vendors say breathlessly, it's just like those other utilities - electricity, water, the telephone. You use it when you need it and pay only for what you use.
Baloney.
Real utilities have the most important standard for their services - pricing. No matter where you live, your power company will charge you for the kilowatt-hours you consume. Water companies bill you by the gallon. And phone companies make you pay by the minute.
How do utility computing vendors charge their customers? Who knows?
Each vendor is pursuing its own path. IBM is considering a complicated pricing model based on service units (SUs). Inside HP's Project Tycoon, business managers are mulling over the possibility of creating something called a "computon" to measure usage of on-demand resources. But neither company has struck gold in determining just what constitutes a computon or an SU. And when either company does announce its pricing breakthrough, it will be just that one company's pricing model, not an industry standard.
While we know that a kilowatt-hour from Southern California Edison is identical to one from, say, Con Edison, we can also be sure that an SU from IBM will bear little or no relation to HP's computon once the two companies finally define those metrics.
Nora Denzel, a senior vice president at HP, says the vendor's goal is to create a fair pricing model so that "companies who use more get charged more; those who use less get charged less." That's exactly how your power company treats your business, except the electric utility knows precisely how many joules go into each kilowatt-hour you consume. IT vendors are a long way from knowing exactly what goes into an SU or a computon.
Little wonder, too. Computing isn't like other basic utilities. Although IT vendors can get precise about the amount of network bandwidth you consume and even the server CPU cycles you use, everything gets very fuzzy when they start factoring middleware and software into the equation. Would HP charge more per computon for an application running on top of HP-UX than it would for one running on Linux? Would IBM's mainframe SUs cost more than ones generated on its iSeries systems? Sounds logical, but if your water company charged you more for river water than for well water, you might think it was trying to rip you off.
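To see why that fuzziness matters, consider a purely hypothetical sketch of how such a usage unit might be computed. The formula and the weightings below are invented for illustration - neither IBM nor HP has disclosed what goes into an SU or a computon - but they show how the same month's workload can translate into wildly different numbers of billable units depending on whose arbitrary arithmetic you run it through.

# Hypothetical sketch only: neither IBM nor HP has published how an SU or a
# computon is actually calculated. The weights are invented to show why two
# vendors' usage units can't be compared, even for identical workloads.

def billable_units(cpu_hours, gb_transferred, middleware_tier, weights):
    """Convert raw resource metrics into one vendor's proprietary usage unit."""
    base = (weights["cpu"] * cpu_hours +
            weights["bandwidth"] * gb_transferred)
    # Middleware and software are where the fuzziness creeps in: each vendor
    # applies its own (undisclosed) multiplier per software stack.
    return base * weights["middleware"][middleware_tier]

# Invented weightings for two fictional vendors.
VENDOR_A = {"cpu": 1.0, "bandwidth": 0.2,
            "middleware": {"os_only": 1.0, "full_stack": 1.8}}
VENDOR_B = {"cpu": 2.5, "bandwidth": 0.05,
            "middleware": {"os_only": 1.2, "full_stack": 3.0}}

# The same month's workload, metered identically...
workload = dict(cpu_hours=500, gb_transferred=2000, middleware_tier="full_stack")

print(billable_units(**workload, weights=VENDOR_A))  # 1620.0 "computons"
print(billable_units(**workload, weights=VENDOR_B))  # 4050.0 "SUs"

Until the formula itself is a standard, 1,620 of one vendor's units and 4,050 of another's tell you nothing about which service is actually cheaper.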
Today, when you read stories about users of utility computing, keep in mind that there's no utility pricing involved. Each user is working out its own non-standard, customised pricing arrangement. Its deal is its deal. You're on your own.
For utility computing to fulfil its promise, there has to be a standard pricing model that all users can apply to their operations. Until then, on-demand computing will be just another complex, proprietary pricing strategy vendors use to keep you from fairly and accurately comparing one service with another. Without a pricing standard, utility computing is one more way for vendors to lock users into their technology and services.
Once we have an industry-wide utility computing pricing model (and I believe we will one day), utility computing will be real. But for now, it's only a pretty picture of a distant horizon.