In a move reminiscent of SETI@home, the search for extra-terrestrial intelligence that harnessed individuals' PCs via a small programme, IBM and the University of Cape Town are modelling the effects of climate change on Africa.
The project, called "AfricanClimate@Home," will use the potentially vast computational power of World Community Grid, a virtual supercomputer made up of hundreds of thousands of individuals' donated idle computer time, which makes it as powerful as one of the world's top five supercomputers.
To donate their unused computer time to this project, individuals register at www.worldcommunitygrid.org and install a free, small software program onto their computers. When the computers are idle, for example while people are at lunch, they request data from World Community Grid's server, perform the project's climate computations on that data, and send the results back to the server, prompting it for a new piece of work. A screensaver tells individuals when their computers are being used.
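The fetch-compute-return cycle described above can be sketched in a few lines of Python. The `FakeServer` class and function names below are illustrative stand-ins, not the actual WCG client software:

```python
class FakeServer:
    """Illustrative stand-in for the grid's work server."""
    def __init__(self, work_units):
        self.work_units = list(work_units)   # queue of datasets to hand out
        self.results = []                    # results returned by volunteers

    def request_work(self):
        """Hand the next work unit to a volunteer, or None if drained."""
        return self.work_units.pop(0) if self.work_units else None

    def submit_result(self, result):
        """Accept a finished result from a volunteer."""
        self.results.append(result)


def run_while_idle(server, compute, is_idle):
    """Volunteer-client loop: fetch work, compute, return results,
    but only while the host PC is idle."""
    while True:
        unit = server.request_work()
        if unit is None:
            break                            # no more work available
        if is_idle():
            server.submit_result(compute(unit))
        else:
            # A real client would requeue the unit and sleep; we just stop.
            break
```

A real client would also checkpoint long computations and back off politely when the owner returns, but the loop above captures the basic request/compute/submit protocol.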
IBM's announcement states: "Climate change is of grave concern in all areas, but in developing regions such as Africa, the impact can be more acute because of the lack of access to healthcare and other social services. Widespread floods, for example, can lead to water borne illness and related diseases such as dengue fever or malaria, which are spread by infected mosquitoes that thrive in water. Droughts can have devastating effects as well by bringing on pervasive food shortages."
By making better predictions about how global climate change might realistically affect regions of Africa, resource managers can start to make decisions that might alleviate the adverse effects. For example, they could begin planning an irrigation infrastructure or promoting appropriate drought-resistant crops. Researchers will use the potentially huge computational power of the World Community Grid to improve the models used to predict the climate, conducting simulations over small regions of Africa and then checking them against real observations.
Large-scale global climate models provide people with a general idea of what the climate may be like over a wide area, but do not necessarily reflect what will happen in a particular region because the global models do not sufficiently take into account large lakes, mountains, or plains that can affect the local climate.
Lead researcher Dr Mark Tadross says: “Making predictions about the climate requires an enormous amount of computational power because of all of the variables, such as temperature, wind, pressure, and humidity. In order to improve the models, we need to come up with better algorithms that will more closely match what is observed in a local area. By using (the) World Community Grid, we have the computational power necessary to run the tests we need to improve our models.”
Once researchers have access to models that more accurately predict regional weather patterns, they can then begin to run forecasts about how global climate changes may affect the region. People can then use the data to make management decisions related to agriculture and water resources.
This is important in an area of the world that is still developing and often does not have adequate infrastructure. Forewarning of any potentially dramatic changes in climate, especially extreme events such as droughts and floods, can enable vulnerable communities and disaster management teams to act in advance of the climatic hazard.
World Community Grid
The World Community Grid (WCG), the largest public humanitarian grid in existence, has an impressive 315,000-plus members and links more than 700,000 computers. However, it’s estimated that there will be one billion computers worldwide by 2008, underscoring the potential for the grid and its computational power to expand significantly and make an even greater impact on a range of humanitarian issues.
IBM and members of the world’s leading science, education and charity organisations launched the programme in November 2004. Since the WCG's launch more than 500,000 devices have been registered. The computer power this volunteer community has donated equals one PC running nonstop for more than 76,000 years, growing at an average of 1,000 years a week.
Seven projects have been run on the WCG to date, including FightAIDS@Home, which completed five years of HIV/AIDS research in just six months.
CPU and data-intensive modelling
The project website says that developing regional climate models requires experimentation to see which algorithms and which parameters most realistically simulate local environments.
The WCG's "server will send each volunteer's computer a dataset representing the large-scale atmosphere over a particular region of Africa, as well as a suite of RCM formulations (each representing a different combination of parameters) that it will use to simulate the local climate. The results will be checked against observations to identify the model formulations that best simulate the observed climate."
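The parameter-sweep idea in that description can be illustrated with a toy Python sketch: run each candidate model formulation against the same driving data and keep the one whose output best matches observations. The function names, and the use of root-mean-square error as the score, are assumptions for illustration rather than the project's actual method:

```python
import math

def rmse(simulated, observed):
    """Root-mean-square error between a simulated and an observed series."""
    return math.sqrt(
        sum((s - o) ** 2 for s, o in zip(simulated, observed)) / len(observed)
    )

def best_formulation(formulations, large_scale_data, observed):
    """Run each candidate formulation on the same large-scale driving data
    and return the name of the one that best matches the observed climate,
    along with all scores.

    formulations     -- dict mapping a name to a simulate(data) callable
    large_scale_data -- the driving large-scale atmospheric dataset
    observed         -- the observed local climate record
    """
    scores = {name: rmse(sim(large_scale_data), observed)
              for name, sim in formulations.items()}
    return min(scores, key=scores.get), scores
```

On the grid, each volunteer computer would evaluate a different slice of the formulation suite, and the server would compare the returned scores.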
The WCG may be as powerful as a supercomputer in aggregate, but it certainly isn't anywhere near as fast on a single task, though it does spare IBM from having to provide a supercomputer of its own, which could do the work a good deal more quickly. The grid also relies on people registering to make CPU cycles available, so the progress of the modelling will clearly not be predictable.
Compounding this, the model's data needs are large. The WCG states that the code is both computationally and data-intensive. Like other climate simulations, AfricanClimate@Home requires three-dimensional information about temperature, pressure, wind, humidity and surface properties for the entire region (known as a domain), at the spatial resolution of the model (approximately 30km x 30km).
In addition, information arriving at the boundary of the region over the time span being studied is needed. This requires a considerable amount of input data, and as the simulation runs, a large quantity of output data is produced.
The CPU impact on volunteers' computers is anticipated to be similar to that of other projects running on the WCG. However, because AfricanClimate@Home requires such a large amount of input and output data, the downloaded work unit size is anticipated to be approximately 77MB, roughly 150 times larger than other WCG project work units. With a 756Kbit/s network connection it will take 12-15 minutes to download.
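The quoted download time can be sanity-checked with back-of-envelope arithmetic, remembering that file sizes are quoted in bytes while link speeds are quoted in bits per second:

```python
work_unit_mb = 77      # work unit size in megabytes (decimal)
link_kbit_s = 756      # connection speed in kilobits per second

bits = work_unit_mb * 8 * 1_000_000       # megabytes -> bits
minutes = bits / (link_kbit_s * 1000) / 60
# roughly 13.6 minutes, comfortably inside the quoted 12-15 minute range
```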
The result data to be returned to the servers is also quite large, a problem exacerbated by the fact that most Internet connections upload more slowly than they download.
The WCG can trade off CPU time against result data size by running work units with a higher redundancy than other projects. This redundancy enables individual computers to return a fraction of the total output data rather than one very large upload, and it also helps validate results when heterogeneous processor types are involved. The WCG will initially try a redundancy of 10, but may need to adjust this number as it gains experience with the project.
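The output-splitting side of that scheme can be sketched as follows: every replica computes the full result (that is the redundancy), but each uploads only one slice, which the server stitches back together. This is a hypothetical illustration of the idea, not WCG's actual code:

```python
def split_result(result, redundancy):
    """Split the full output into `redundancy` slices, one per replica,
    so no single volunteer faces the entire upload."""
    size = -(-len(result) // redundancy)      # ceiling division
    return [result[i * size:(i + 1) * size] for i in range(redundancy)]

def reassemble(slices):
    """Server side: stitch the uploaded slices back into the full result."""
    full = []
    for s in slices:
        full.extend(s)
    return full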
What is IBM contributing to the project? It sponsors the WCG's IT work. The company has donated hardware, software, technical services and expertise to build the WCG infrastructure, and provides free hosting, maintenance and support.
Fourteen IBM servers serve as “command central” for the WCG. When they receive a research assignment they will scour it for security bugs, parse it into data units, encrypt them, run them through a scheduler and dispatch them in triplicate to the army of volunteer PCs.
As results come in, they are scrubbed, validated and assembled into a file. When all the calculations are returned and the assignment is complete, the data is packaged and sent to a directory for retrieval (by researchers).
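The triplicate dispatch suggests a natural validation rule: accept a result only when a majority of the redundant copies agree. Majority voting here is an assumption for illustration, as the article does not spell out WCG's exact validation method:

```python
from collections import Counter

def validate_triplicate(results):
    """Accept a result only if a strict majority of the redundant copies
    returned for the same work unit agree; otherwise reject (None)."""
    value, count = Counter(results).most_common(1)[0]
    return value if count > len(results) / 2 else None
```

A rejected work unit would simply be redispatched to another set of volunteers.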
“We can all have a profound effect on this research by simply donating our unused computer cycle time,” said Stanley Litow, a man with a very long title at IBM: the company's vice president of Corporate Citizenship and Corporate Affairs, and also president of the IBM International Foundation. He added: “If you own a computer and can access the Internet, then you can be a part of the solution in an area of the world where access to computers and high speed connections are not as ubiquitous.”
However, users volunteering PCs with connectivity slower than 756Kbit/s will be rejected, because of the I/O constraints mentioned above. People with Apple Macs are also out of luck unless they run Boot Camp or Parallels with Windows or Linux installed, as only Linux and Windows computers are usable by the WCG.