To assess global warming's impact on the environment and determine whether the world faces abrupt climate change, Zhengyu Liu, director of the Center for Climatic Research at the University of Wisconsin in Madison, is turning to supercomputing technology.
Liu is using a supercomputer made by Cray to run a continuous simulation of climate changes over the past 21,000 years, a span that reaches back to the last ice age.
To predict what will happen in the future, Liu said, he first must be certain of the past. That means running his computer simulations backward in time to see if they can accurately model past climatic events. If the models do provide an accurate mirror of climate history, then they can be used to predict the impact of increasing atmospheric greenhouse gases on the planet, he said.
The work isn't just an academic exercise for Liu. "What is most urgent - what we want to know - is: 'Can the model produce some abrupt climate change event?'" he said.
By abrupt, Liu said he means an event that could occur within someone's lifetime, such as a vegetation collapse in part of the world similar to one that took place in Africa about 5,000 years ago, when trees there suddenly died.
"Abrupt change can happen," he said. But current computer models don't treat the planet's climate as one continuously unfolding event, and that keeps researchers from linking different points in time to understand how climatic events relate to one another. "They cannot simulate abrupt change at all," Liu said.
To run his models, Liu recently was awarded time on a Cray system at the US Department of Energy's Oak Ridge National Laboratory. The Cray machine is based on dual-core versions of AMD's Opteron processor, with about 11,000 processor cores. Liu was given nearly 420,000 hours of processing time this year alone, for use as part of a multiyear project that began last year.
Liu's work underscores the importance that supercomputing has in climate change research. The models used to determine the impact of industrial gases are very complex and require enormous computing resources. And the work required to effectively utilise large supercomputers still has a long way to go, according to Liu and other researchers.
But already, climate research applications are increasing the demand for high-performance systems. By 2010, IDC expects annual worldwide spending on supercomputers used solely for climate research to reach nearly $500 million, up from $171 million in 2000 - an increase of almost 200 percent. That figure doesn't include supercomputers that may be used in a variety of research projects, including climate-related applications, said Steve Conway, an analyst at IDC.
Climate researchers have a relentless need for higher performance and faster speeds on simulations that can take weeks, months or even years to run because of their complexity. That need "is one of the big drivers for getting to peta-scale systems," Conway said, referring to supercomputers that will be able to perform a quadrillion (10^15) floating-point operations per second.
Cray is among the IT vendors that are delivering systems for use in climate and weather research. It recently manufactured machines for the Danish Meteorological Institute, the Swiss National Supercomputing centre and the University of Edinburgh in Scotland. The latter system, the UK's most powerful supercomputer to date, will be used for a variety of research issues, including weather applications.
Peter Ungaro, Cray's president and CEO, said that before researchers buy larger systems of that sort, they are forming groups "to build applications for next-generation machines."
But much work remains before applications can even begin to effectively use such systems.
At the Oak Ridge National Laboratory, James Hack wears two hats. He directs the National Center for Computational Sciences as well as a newly created Climate Change Initiative, which is tying together various climate research efforts to help take them to the next stage.
One of the reasons that next stage is needed lies in the findings of the Intergovernmental Panel on Climate Change, an international body involving some 2,500 scientists in 130 countries. The IPCC released a report in November that, it said, made a conclusive case that climate change is under way. "Today, the time for doubt has passed," chairman RK Pachauri said when the report was released.
Supercomputers have helped assemble a big-picture view of climate change, one that can show some of the things that may happen as CO2 emissions increase further. But another need in climate research, said Hack, is to understand the impact of climate change on a regional level, such as what it would mean for the manager of a water project in the south-western USA who wants to know how groundwater levels may be affected by warming temperatures.
"There are a lot of issues that have some practical consequences for society, and I think the next phase in the science is to develop the capabilities to answer those questions," Hack said. For that to happen, though, faster machines will be needed, he added.
For instance, developing the ability to look at how climate change may affect certain areas, and how to mitigate or adapt to the changes, will require more precision and specificity in the resolution of computer models. Liu said that just doubling a model's resolution may require an approximately tenfold increase in compute time.
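Liu's rough rule of thumb can be illustrated with some back-of-the-envelope arithmetic. The sketch below is not his actual model; it simply shows, under standard assumptions, why halving a model's grid spacing multiplies the work: twice as many cells in each horizontal dimension, plus a proportionally shorter time step to keep the numerics stable.

```python
# Illustrative sketch (not Liu's model): why doubling a climate model's
# horizontal resolution can mean roughly ten times the compute time.
# Halving the grid spacing gives 2x the cells in each horizontal
# dimension (4x cells overall), and stability constraints on the
# numerics typically force the time step to shrink by the same factor
# (2x more steps), for about 8x the work before any added vertical
# levels or physics.

def relative_cost(resolution_factor: float, vertical_factor: float = 1.0) -> float:
    """Rough cost multiple for refining horizontal resolution by
    `resolution_factor` (2.0 = doubled resolution), assuming the time
    step must shrink in proportion. `vertical_factor` is any extra
    multiplier from added vertical levels or physics detail."""
    horizontal_cells = resolution_factor ** 2  # finer grid in x and y
    time_steps = resolution_factor             # more, shorter steps
    return horizontal_cells * time_steps * vertical_factor

print(relative_cost(2.0))        # 8.0 - doubling resolution alone
print(relative_cost(2.0, 1.25))  # 10.0 - with modest extra vertical detail
```

With even a modest amount of added vertical or physical detail, the cost of a doubled-resolution run lands near the tenfold figure Liu cites.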
But it isn't an issue of simply adding more chips and building bigger systems, although more performance would help. The effort also has to include the development of applications that combine a multidisciplinary range of sciences, such as algorithms that can scale across many processing cores. "Your applications need to scale, and they need to scale well," Hack said.
The problem is that climatic events are non-linear: small changes in inputs can produce disproportionately large effects, which means there's no straightforward path to a solution.
Oak Ridge is expecting the arrival of peta-scale systems next year, according to Hack. And while there are researchers who feel that "you can't build fast-enough computers," he said, others think that the rate of progress has been remarkable. "On some levels, we are being paced by our ability to incorporate more realistic physics," as well as other observational data about the earth, into the systems, he said.