IBM is set to build a petaflop-level supercomputer using 16,000 Cell processors and 16,000 x86 (reportedly Opteron) chips. It's destined for the USA's Los Alamos National Laboratory in New Mexico. The peak speed is said to be 1.6 petaflops; that's 1.6 thousand trillion calculations per second. Where is all the data going to come from?
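To put that headline figure in perspective, here's a rough back-of-envelope sketch using only the numbers quoted above (1.6 petaflops peak spread across the 32,000 chips); the per-chip figure is a crude average, not a published spec:

```python
# Back-of-envelope arithmetic from the quoted figures:
# 1.6 petaflops peak, 16,000 Cell + 16,000 x86 chips.
PEAK_FLOPS = 1.6e15          # 1.6 petaflops = 1.6 thousand trillion ops/sec
CHIPS = 16_000 + 16_000      # Cell processors plus x86 chips

per_chip = PEAK_FLOPS / CHIPS
print(f"~{per_chip / 1e9:.0f} gigaflops per chip on average")
# -> ~50 gigaflops per chip on average
```

Fifty gigaflops per chip, averaged naively, which hints at how much of the heavy lifting the Cell processors would be doing.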
Each CPU will have its own cache, and there will be a monstrous amount of RAM backed up by a humongous amount of disk. How will it all be organised? Cray and other supercomputer vendors use SANs and cluster file systems. Roadrunner looks as if it will need the mother of all SANs, with petabyte-level capacity and equally monstrous tape libraries to hold the off-line data.
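Just how monstrous? A hypothetical sketch, assuming (my numbers, not IBM's) a two-petabyte SAN built from the 500GB drives typical of the day:

```python
# Hypothetical storage sums -- both figures below are assumptions,
# not anything IBM has announced.
SAN_BYTES = 2e15             # assume a 2 PB SAN ("petabyte-level")
DRIVE_BYTES = 500e9          # assume 500 GB per drive, typical circa 2006

drives = SAN_BYTES / DRIVE_BYTES
print(f"~{drives:,.0f} drives, before any RAID or hot-spare overhead")
# -> ~4,000 drives, before any RAID or hot-spare overhead
```

Thousands of spindles before you even think about redundancy, which is why the cluster file system sitting on top matters as much as the raw capacity.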
The organisation and administration of this data will be an awesome task. I just shake my head in disbelief at the size and complexity of what is going to be involved.