U.S. officials last week announced plans to spend $325 million on two new supercomputers, one of which may eventually be built to support 300 petaflops, faster than any supercomputer running today.
The U.S. Department of Energy, the major funder of supercomputers used for scientific research, wants to have the two systems -- each with a base speed of 150 petaflops -- possibly running by 2017. Going beyond the base speed to reach 300 petaflops will take additional government approvals.
If the world stands still, the U.S. may conceivably regain the lead in supercomputing speed from China with these new systems. But how adequate this planned investment will look three years from now is an open question.
The DOE also announced another $100 million in "extreme" supercomputing research spending.
The funding was announced at a press conference at the U.S. Capitol attended by lawmakers from both parties. But the lawmakers weren't reading from the same script as U.S. Energy Secretary Ernest Moniz when it came to assessing the U.S.'s place in the supercomputing world.
Moniz said the awards for the two systems, which will be built at the DOE's Oak Ridge and Lawrence Livermore National Laboratories, "will ensure the United States retains global leadership in supercomputing."
But U.S. Rep. Chuck Fleischmann (R-Tenn.) put U.S. leadership in the past tense. "Supercomputing is one of those things that we can step up and lead the world again," he said. The Oak Ridge lab is located in his state.
U.S. Rep. Dan Lipinski (D-Ill.), whose state is home to the Argonne National Laboratory, said the U.S. lead "is being challenged by other countries," and pointed out that the U.S. has dropped from having 291 supercomputers in the Top 500 list to 233.
"Our technology lead is not assured," said U.S. Rep. Bill Foster (D-Ill.), who lamented the movement of computer chip manufacturing offshore.
Foster, in an interview, said he believes there is good bipartisan support for supercomputing research, but that research may face a problem if GOP budget proposals in the House slash science funding by double-digit percentages.
It's "going to be very hard to defend supercomputing budgets if you're facing that sort of cut across all of science," Foster said.
The U.S. leads the world in supercomputing in terms of vendor dominance, research capability and, as Lipinski pointed out, the overall number of systems in the Top 500, but not in raw speed.
China has the top-ranked system, the Tianhe-2, at about 34 petaflops, and Japan and Europe have major investments underway in this area. (A petaflop is 1,000 teraflops, or one quadrillion floating-point operations per second. An exascale system is 1,000 petaflops.)
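The unit relationships the article cites, and the gap between the Tianhe-2 and the planned systems, can be expressed as a quick illustrative calculation (the figures are the article's; the script itself is only a sketch):

```python
# Unit definitions as given in the article.
TERAFLOP = 10**12                 # one trillion FLOPS
PETAFLOP = 1000 * TERAFLOP        # one quadrillion (10**15) FLOPS
EXAFLOP = 1000 * PETAFLOP         # an exascale system: 1,000 petaflops

tianhe2 = 34 * PETAFLOP           # China's top-ranked system, ~34 PF
planned_base = 150 * PETAFLOP     # base speed of each planned DOE system

# The planned systems' base speed versus today's fastest machine.
print(f"{planned_base / tianhe2:.1f}x Tianhe-2")  # → 4.4x Tianhe-2
```

At the optional 300-petaflop build-out, that ratio would roughly double, while exascale remains another order of magnitude beyond even that.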
One system, Summit, will be built at Oak Ridge and will draw about 10 MW of power, close to the draw of the lab's existing system, Titan, currently ranked second in the world. At roughly the same power, Summit is expected to deliver five times Titan's performance. The Lawrence Livermore lab in California will be home to the second system, Sierra.
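Five times the performance at the same power works out to a fivefold gain in flops per watt, which is the metric that matters for these machines. A minimal sketch of that arithmetic, using an assumed placeholder figure for Titan's peak performance (the article gives only the power draw and the 5x multiplier):

```python
# Assumed/illustrative baseline: Titan's exact peak varies by source,
# so treat titan_pflops as a placeholder, not a reported figure.
titan_pflops = 27.0           # assumed peak, petaflops
titan_mw = 10.0               # article: Summit's ~10 MW is close to Titan's

summit_pflops = 5 * titan_pflops  # article: five times the performance
summit_mw = titan_mw              # article: roughly the same power draw

# Efficiency gain in flops per watt; the baseline cancels out,
# so the 5x result holds regardless of the assumed peak figure.
eff_gain = (summit_pflops / summit_mw) / (titan_pflops / titan_mw)
print(f"{eff_gain:.0f}x flops-per-watt improvement")
```

Because the power draw is held constant, the efficiency gain equals the performance multiplier whatever baseline one assumes for Titan.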
These systems will use IBM Power CPUs and Nvidia's Volta GPUs, a chip still in development.
Bill Dally, chief scientist at Nvidia, said in an interview that the GPUs will provide 90% of the compute capability on the new DOE machines. The improvement in power efficiency comes from eliminating overhead, including logic operations not directly involved with computation. The company also looked at data movement and focused on architectures that improve efficiency, such as co-locating processes and minimizing the distance data has to move.
Dally said chip efficiency will have to improve by a factor of 10 to get to exascale, but he believes that's possible with this architecture. "We have enough things on our target list," he said, referring to possible changes in the chip design.
The DOE announcement was made ahead of this week's supercomputing conference in New Orleans.
Moniz said supercomputing leadership is not only about the speed of the computer but about how one matches and integrates that speed with algorithms and software, an area in which the U.S. has the deepest experience. "We will sustain that leadership," he said.