Loren Miller, director of IT research, development and engineering at tyre maker Goodyear, thinks he can easily make the case for wider corporate use of supercomputing technology.
Simulations made possible by supercomputing have enabled Goodyear to gradually reduce the amount of money it spends on building physical tyre prototypes, from 40 per cent of its total research and development budget to just 15 per cent, Miller said last week. The US-based company is using the money it saves to fund more research work.
"From our standpoint, the results have been dramatic," Miller said at the Supercomputing 2005 conference here. Other companies in the US need to realise that they can gain a competitive advantage from high-performance computing systems, he added.
Efforts are under way to broaden the corporate base of supercomputing users. For example, the Ohio Supercomputer Center (OSC) in Columbus has launched a program called Blue Collar Computing that's designed to provide businesses that lack high-performance computing expertise with tools for testing the technology.
And in a speech at last week's conference, Microsoft's Bill Gates predicted that one day some supercomputers will cost less than $10,000. He also said that "mass computing" and supercomputing share common technical challenges and could benefit from combined R&D efforts.
William Kramer, head of high-performance computing at the National Energy Research Scientific Computing Center at the Lawrence Berkeley National Laboratory, said that Gates' appearance was an indication of the growing awareness of supercomputing's importance. "The output of [high-performance computing] activities is no longer hidden behind a curtain, if you will," said Kramer, the conference's general chairman.
Supercomputing is "being scaled down so more people can make use of these very complicated tools," he added. "And I think that's one of the indications of Microsoft's interest here."
Like Gates, Stanley Ahalt, the OSC's executive director, envisions wide-scale use of high-performance systems by companies looking to run complex simulations and visualisations of products during the design and testing process. The OSC, which is beginning to talk with potential commercial users of its systems, hopes to encourage businesses to adopt the technology by offering help as well as access to its supercomputing resources.
Ahalt said he's convinced that supercomputing is critical to improving the competitiveness of US-based companies. But he thinks that many IT managers still aren't even considering the technology.
"CIOs are focused so acutely on the bottom line, they aren't ready for the next big thing," Ahalt said. IT managers, he added, have to take the message about supercomputing's potential value to corporate executives "and explain that we are about to go through one of these radical shifts on our economic systems." Shortening product-development cycles will become even more important to companies than it already is, Ahalt predicted.
There were about 9,250 attendees at the supercomputing conference -- an increase of more than 10 per cent over the prior year's total. But many appeared to be from large companies, life sciences firms, national laboratories and academic institutions -- the types of organisations that have already invested in supercomputing.
Better product designs
Procter & Gamble uses high-performance systems to run computerised visualisations of products that it's developing. Thomas Lange, director of modelling and simulation for corporate research and development at P&G, spent time at the OSC's trade show booth last week and spoke highly of the Blue Collar Computing initiative.
Some of the products that P&G buys from its suppliers, such as bottle caps, could benefit from computational design programs that run on supercomputers, Lange said.
"Oftentimes, these suppliers could be making lighter products, stronger products, better products," he noted. "It's the suppliers that supercomputing can make a difference for."
Ping isn't a typical supercomputing user. The maker of golf clubs is a medium-sized company with about 1,000 employees. But earlier this month, Cray Inc. announced that Ping is using one of its supercomputers to simulate golf club designs.
The Cray XD1 system installed at Ping is based on 12 Opteron processors from AMD and has 24GB of memory. Eric Morales, a staff engineer at Ping, said the system has enabled the company to drastically reduce product development times. Simulations of product changes that once took a full day to run can now be processed in 20 minutes or less, Morales noted.
"It takes the development [cycle] from weeks down to days, and it helps us get to market faster," he said.
Gaining ground
IDC analyst Earl Joseph said he expects the worldwide high-performance computing market to reach $7.25 billion this year, a net increase of 49 per cent since 2003. The technology should continue to gain ground with both technical and commercial users, "primarily due to the attractive pricing of clusters, combined with their growing capabilities," Joseph said.
Price/performance improvements, which are partly the result of increasing use of commodity processors, are helping to make supercomputing more accessible to businesses.
An annual listing of the world's 500 most powerful supercomputers, released last week, showed that more than three-quarters use processors from Intel or AMD. That's up from 41 per cent in the 2003 listing and just 12 per cent the year before that, according to the researchers who compile the list.
Dave Turek, vice president of IBM's Deep Computing operations, said demand for supercomputing systems is being driven in part by the emergence of new businesses that rely heavily on high-performance systems for work such as digital animation and bioinformatics.
In addition, many companies are collecting "vast amounts of data that demand rapid analysis for real-time decision-making," Turek said. In particular, he pointed to the growing use of radio frequency identification devices to track products.
But there are limiting factors as well. Although both the price and performance of supercomputing hardware have improved dramatically, the same isn't true for much of the software used on high-performance systems. Applications such as fluid dynamics codes can be costly because demand for individual products is still relatively limited. And according to a study released by IDC last summer, many software vendors aren't increasing the scalability of their code to take advantage of systems with hundreds or thousands of processors.
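One standard rule of thumb, Amdahl's law, illustrates why limited scalability caps the payoff from bigger machines: if a fraction p of an application's work can run in parallel, the best possible speedup on N processors is

$$S(N) = \frac{1}{(1-p) + \frac{p}{N}}$$

A code that is only 90 per cent parallel, for example, can never run more than 10 times faster no matter how many processors are added, which is why software written for a handful of CPUs gains little from systems with hundreds or thousands of them.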
Concerns about security are also an issue, particularly for corporate users of shared high-performance systems.
For example, WestGrid, a high-performance computing consortium involving seven Canadian universities, has made its systems available to a number of companies for research uses. But it has found that some businesses are reluctant to use the systems for competitive reasons.
"They don't want two companies working on the same problem sharing [computing] resources," said Rob Simmons, a distributed systems architect at WestGrid.