Back in the early 1990s, when Linux creator Linus Torvalds and open source software started to make headlines, the idea of giving software away seemed crazy. Looking at the headway the movement has made since then, you might be forgiven for wondering why Linux desktops have failed to become as ubiquitous as Linux servers are.

Oh, how they laughed when the open source movement started. But today, Linux is present in almost every enterprise and, according to market researcher IDC, it's set to keep on growing. In fact, Linux server sales have hit their second consecutive quarter of double-digit growth and constitute 12.7 percent of the overall server market, or $1.6 billion for the first quarter of 2007, reckons the research company.

What's more, server technology has moved forward in leaps and bounds, with virtualisation -- now the accepted way of combating the twin threats of enormous, expensive power consumption and a growing contribution to the global climate crisis -- only the latest technology to make it into the Linux kernel.

So there are millions of Linux boxes out there, performing tasks such as serving Web pages and hosting databases. And there's a fair number of Linux machines running as desktops, mainly for technically savvy enthusiasts and computer professionals. Exact numbers are impossible to come by for a free OS, but estimates made on the Ubuntu forum -- Ubuntu is the fastest growing and probably the most-used desktop distro today -- suggest that the numbers could be between 500,000 and one million.

Enterprise desktops: the promised land

Yet there's one area in which Linux appears to have stalled: the enterprise desktop. What's holding it back?

Critics cite a number of reasons: lack of technical support, lack of hardware drivers, lack of standard documentation, and lack of appropriate skills in the enterprise IT team. On top of that there's the cost of switching desktop users over to Linux -- in time, hardware and software -- and of training them, none of which comes cheap.

But even if the move can be cost-justified -- and there are bound to be hidden and unexpected costs -- there may be other reasons too. Linux developers have argued for some time that drivers for most hardware are now available and that hardware support shouldn't be an issue any more.

Up to a point that's true. However, once you stray off the beaten path and want support for, for example, a 3G PC Card for a laptop, things can start to get messy. Yes, you can make it work, but it's not pretty and it often requires individual tuning. And that's just one example.

Linux development in crisis?

But it's worse than that.

Coming up next week is Linux's first collaboration summit, organised by the Linux Foundation and aimed at developers. The tagline is that the event "will bring together the brightest minds in the Linux ecosystem to discuss where Linux is, where it needs to go and how we can all help get it there."

In discussions before the event, it seems that some honesty is being called for. Linux is no longer the poor relation but a heavily used enterprise-level tool. Yet there are areas where the technology falls well below what could be expected of an OS at this level -- and it may even be that the OS development model is in part to blame.

For example, Torvalds has said that the file system and power management need improvement, and that device drivers in particular often work but don't implement advanced functions.

One example is power management. "We're still pretty stupid about power management," says Linux kernel developer Andrew Morton. Many devices support intermediate low-power states -- a network card that isn't receiving traffic, for instance, can drop to a lower-power mode -- but Linux supports only on and off and, Morton adds, "we're having trouble getting off and on working."
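To make the point concrete, here is a deliberately simplified sketch in Python -- the state names and power figures are invented for illustration, not taken from any real driver -- of the difference between a device that can only be on or off and one that can drop into an intermediate low-power state when idle.

```python
# Illustrative sketch only: the state names and power figures below are
# invented for the example, not taken from any real driver or kernel code.

from enum import Enum

class PowerState(Enum):
    FULL_ON = 0    # device fully active
    IDLE_LOW = 1   # link up, receiver throttled -- an intermediate state
    SLEEP = 2      # device effectively off until woken

# Hypothetical power draw in milliwatts for each state.
POWER_DRAW_MW = {
    PowerState.FULL_ON: 900,
    PowerState.IDLE_LOW: 250,
    PowerState.SLEEP: 30,
}

def choose_state(has_traffic: bool, fine_grained: bool) -> PowerState:
    """Pick a power state for a network card.

    With only coarse on/off support (fine_grained=False), an idle card
    stays in FULL_ON, because switching it fully off would drop the link.
    With intermediate states available, it can fall back to IDLE_LOW.
    """
    if has_traffic:
        return PowerState.FULL_ON
    return PowerState.IDLE_LOW if fine_grained else PowerState.FULL_ON

if __name__ == "__main__":
    for fine in (False, True):
        state = choose_state(has_traffic=False, fine_grained=fine)
        print(f"fine_grained={fine}: idle card sits in {state.name} "
              f"({POWER_DRAW_MW[state]} mW)")
```

Run it and the idle card draws a fraction of the power once intermediate states are available -- which is precisely the capability Morton says Linux isn't yet exploiting.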

Another is file systems, which will need an overhaul if they're to stand up to hard disk capacities that are growing much faster than disk speeds. File system developer Val Henson points out that disk capacities are likely to grow by a factor of 16 by 2013, while bandwidth will grow only by a factor of five and seek time by a factor of 1.2. That means the file-system-checking utility, fsck, will take longer and longer to run -- eventually it could take days.
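A back-of-the-envelope calculation, using the growth factors Henson quotes and assuming fsck's workload grows in line with capacity, shows why:

```python
# Back-of-the-envelope sketch of how fsck run time scales, using the
# growth factors quoted above. It assumes the amount of work fsck does
# grows linearly with disk capacity -- a simplification, but enough to
# show the trend.

CAPACITY_GROWTH = 16.0   # disk capacity, factor by ~2013
BANDWIDTH_GROWTH = 5.0   # sequential transfer rate
SEEK_GROWTH = 1.2        # seek time improvement

# If fsck were purely bandwidth-bound (streaming metadata off the disk):
bandwidth_bound = CAPACITY_GROWTH / BANDWIDTH_GROWTH   # ~3.2x slower

# If fsck were purely seek-bound (chasing scattered metadata blocks):
seek_bound = CAPACITY_GROWTH / SEEK_GROWTH             # ~13.3x slower

print(f"bandwidth-bound fsck: ~{bandwidth_bound:.1f}x longer")
print(f"seek-bound fsck:      ~{seek_bound:.1f}x longer")
```

Real fsck runs fall somewhere between the two cases, which is why a check that takes hours today could stretch into days.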

Torvalds has also offered veiled criticism of the way the open source development model operates, singling out the development workflow. Much of the work done by the Linux kernel at startup and during operation is wasted because some developers aren't exploiting the capabilities of the latest code libraries. Examples include the repeated opening of the same files by different modules within the kernel, and over-zealous checking of the hardware specification.
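The kernel itself is written in C, but the principle being missed -- do the expensive work once and share the result, rather than letting every module re-open and re-parse the same file -- can be sketched in a few lines of Python. The file name and parsing function here are purely hypothetical.

```python
# Illustrative only: the file and parse function are hypothetical. The
# point is the pattern -- read an expensive resource once and share the
# result, rather than having every module re-open and re-parse it.

from functools import lru_cache

@lru_cache(maxsize=None)
def read_hardware_table(path: str) -> list[str]:
    """Read and parse a (hypothetical) hardware description file once."""
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

def module_a(path: str) -> int:
    return len(read_hardware_table(path))      # first call does the I/O

def module_b(path: str) -> bool:
    table = read_hardware_table(path)          # served from the cache
    return any("network" in entry for entry in table)
```

Both "modules" get the parsed table, but the file is opened and read only on the first call.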

Morton also reckons that developers either don't or can't profile their code -- in other words, find out what the code is doing to the hardware, how much cache and main memory it uses, and which chunks of code actually execute.
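Kernel developers have their own lower-level tools for this, but the basic idea -- measure which functions actually run and where the time goes -- can be illustrated with Python's built-in profiler:

```python
# A minimal illustration of profiling: find out which functions actually
# execute and where the time goes. Kernel developers use lower-level
# tools (hardware counters and the like), but the idea is the same.

import cProfile
import pstats

def slow_path(n: int) -> int:
    return sum(i * i for i in range(n))

def fast_path(n: int) -> int:
    return n * (n - 1) * (2 * n - 1) // 6   # closed form for the same sum

def workload() -> None:
    for _ in range(50):
        slow_path(100_000)
        fast_path(100_000)

profiler = cProfile.Profile()
profiler.runcall(workload)
pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```

The report makes it immediately obvious that slow_path dominates the run -- exactly the kind of information Morton argues too few developers gather about their own code.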

For the end user, all this can translate into hardware devices that don't run as efficiently as they did under Windows, that use more power and run down laptop batteries more quickly, and that take longer to start up and run than they should. Some developers also complain that Linux desktops don't feel as 'snappy' as they should on low-spec machines.

If Linux is to make the final leap into the average user's machine, especially in the enterprise, all these matters need to be addressed.

One solution could be greater central control, with tighter checking procedures and metrics before code is signed off. However, it remains to be seen whether the community can make the collective effort to fix these problems following the summit, where these and other issues will be aired.

If it can't, it gives critics carte blanche to argue that the open source development model is flawed. And that's an argument to be taken seriously.

Don Marti of LinuxWorld contributed to this article.