Everyone loves benchmarks. You say you don't like them one bit, and you tell me they're all meaningless, misleading, or skewed to the service of the vendors that sponsor them. I tell you that the benchmarks I run aren't the least bit fun, and I'd just as soon be rid of them.
But all of us who groan about and deride benchmarks realise the truth: We need them. Without benchmarks we'd have nothing but hard numbers to guide us: clock speed, gigabytes per second, rpm, polygons, I could go on.
How excited can one be about a 2.2GHz Opteron when there is a 3.6GHz Xeon within reach? If you can afford either, is it smarter to buy five 7,500-rpm Serial ATA drives or three 15,000-rpm SCSI drives? DDR2 (DDR, second generation) is faster than DDR, right? These read like word problems from your algebra textbook, but at least those had right answers. You can know every last detail about your hardware and still be more or less winging it when it comes to buying choices and, later, allocating systems and storage to tasks.
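The spec-sheet arithmetic behind that drive question is easy enough to do, which is exactly the trap. A minimal sketch of the back-of-the-envelope maths (the drive counts and spindle speeds come from the example above; the latency-bound IOPS model is a deliberate simplification that ignores seek time and queuing):

```python
# Back-of-the-envelope spec-sheet maths: "hard numbers" that look
# decisive but don't settle the buying question.

def avg_rotational_latency_ms(rpm: int) -> float:
    """Average rotational latency: half a revolution, in milliseconds."""
    return 0.5 / (rpm / 60) * 1000

# Five 7,500-rpm Serial ATA drives vs. three 15,000-rpm SCSI drives.
sata_latency = avg_rotational_latency_ms(7_500)    # ~4.0 ms
scsi_latency = avg_rotational_latency_ms(15_000)   # ~2.0 ms

# Rough aggregate random-I/O capacity per array, assuming each drive is
# purely latency-bound (hypothetical model, not a measurement).
sata_array_iops = 5 * (1000 / sata_latency)   # ~1250
scsi_array_iops = 3 * (1000 / scsi_latency)   # ~1500

print(f"SATA array: ~{sata_array_iops:.0f} IOPS, SCSI array: ~{scsi_array_iops:.0f} IOPS")
```

The numbers come out tidy, and they still tell you nothing about how either array behaves under your actual workload, which is the gap benchmarks exist to fill.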
It's no easy thing to balance and prioritise the needs of several groups within your organisation. Some hard numbers do make these decisions for you: I didn't need to be convinced that Gigabit Ethernet was a smart move once switch prices fell below $30 per port. But you don't get many shots at simple maths like that. When you're spending serious money on systems and storage, you need to put more faith in squishy numbers. Benchmarks make good business partners if you choose them well, know where they fit in your strategy, and know when to listen and when to ignore them.
A benchmark's primary function is to provide you with a necessary measure that hard numbers can't give you: capacity. An enterprise is often more interested in how heavy a load its technology can shoulder than in how quickly it can move the load from here to there.
There really are good benchmarks out there. I know because I've been shacking up with one. I've been in touch with my lab-coated alter ego lately about climbing back into the SPEC (Standard Performance Evaluation Corporation) benchmark suites. I wrote a story about AMD's new dual-core Opteron CPU and, although I've done it so many times before, this time I couldn't let it go without including some benchmarks. In the process of doing the benchmark thing, I learned and conveyed knowledge that couldn't have been derived any other way.
It may be that the benchmarks I favour are unique. I have no use for packaged, one-click benchmarks built for something older than my hardware. I have no need for an ad hoc benchmark that lives predominantly on one platform and changes so often that its test results cannot be compared from year to year. I built 64-bit SPEC CPU2000 benchmarks using Intel 64-bit compilers on Microsoft 64-bit Windows on an AMD 64-bit Opteron system. I went into the process with lots of theoretical knowledge and left with the balance tilted toward the practical.
It's those hard, steady, take-to-the-bank numbers that are too often arbitrary, misleading, and untrustworthy. I stand behind my squishy benchmark numbers because I know exactly where they came from. There's power in that.