
ZumGuy Publications and Network

Supercomputing!

Posted by Andrew on Wednesday, 26th September 2012 08:27
Supercomputing Centre, Lugano
CSCS (Centro Svizzero di Calcolo Scientifico / Swiss National Supercomputing Centre)

What defines and gives shape to a supercomputer facility is not the physical space for its arrays of ever-smaller components, but the space for the infrastructure required to cool them. Producing enough heat to destroy the expensive equipment in a flash, it needs the thermal capacity of a cold Alpine lake to keep it alive.

At Lugano's CSCS, a metre-wide pipe draws water at a numbing 6 degrees Celsius from thirty metres down in the lake, more than a kilometre away. Most of the enormous, airline-terminal-like building houses the pumps and distribution networks of pipes, which bring that water to the heat exchangers, which in turn blow the refrigerated air past the components. This electron-heated water is then returned to the lake some 19 degrees the warmer for its experience.
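To put that 19-degree rise in perspective, the heat carried off scales with the flow rate, a figure the article does not give. Here is a back-of-envelope sketch in Python, assuming a purely illustrative flow of 100 litres per second (my number, not one from CSCS); only the 19-degree rise comes from the article:

    # Heat removed by the lake-water loop: Q = mass_flow * c_p * delta_T.
    # The flow rate is an assumption for illustration only.

    FLOW_RATE_L_PER_S = 100.0  # assumed flow rate, litres per second
    DENSITY = 1.0              # kg per litre of water
    SPECIFIC_HEAT = 4186.0     # J/(kg*K), specific heat capacity of water
    DELTA_T = 19.0             # K, temperature rise quoted in the article

    mass_flow = FLOW_RATE_L_PER_S * DENSITY            # kg/s
    heat_watts = mass_flow * SPECIFIC_HEAT * DELTA_T   # J/s = W

    print(f"Heat carried to the lake: {heat_watts / 1e6:.1f} MW")
    # -> roughly 8 MW at the assumed flow rate

Even at this modest assumed flow, the lake is quietly absorbing megawatts around the clock.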

The CSCS is Switzerland's impressive new centre for number crunchers extraordinaire. It serves Swiss domestic needs as well as many international projects, such as CERN and the IPCC's climate change modelling. All of the Swiss meteorological data is processed here, in an array of components that would fit into a smallish suitcase, and the universities of Switzerland, Germany and Britain vie for processing time. Applications for HPC (High-Performance Computing) range from the sciences to the humanities, as well as commercial work such as pharmaceuticals (e.g. modelling molecules).

Perhaps the most impressive thing about such a centre is its ability to channel, coordinate and reassemble the endless flow of data from huge projects, which require the power of a worldwide network of computing centres. With many gigabytes per second of data needing to be processed and organised meaningfully, a surrounding team of experts is required to design, implement and monitor all the activities. A constant stream of user demands occupies the fifty full-time staff, and all in all a third of a billion CPU hours are managed annually.
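That "third of a billion CPU hours" is easier to picture as a steady load. A quick sanity check in Python (333 million is my reading of "a third of a billion"; the rest is plain arithmetic):

    # How many CPU cores must run flat out, all year, to consume
    # a third of a billion CPU hours?

    CPU_HOURS_PER_YEAR = 333e6     # "a third of a billion", from the article
    HOURS_PER_YEAR = 24 * 365.25   # wall-clock hours in a year

    avg_busy_cores = CPU_HOURS_PER_YEAR / HOURS_PER_YEAR
    print(f"Average cores busy around the clock: {avg_busy_cores:,.0f}")
    # -> roughly 38,000 cores running continuously

In other words, the equivalent of tens of thousands of processors never sleeping.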

The first supercomputer used in any Swiss supercomputing centre was installed in 1992 and had a capacity of 0.005 teraflops, or a wimpy five billion floating-point operations (techy speak for calculations) per second - a task your PC could manage with or without its morning coffee. It took ten years before a supercomputer broke the threshold of one trillion floating-point operations per second - one teraflop. Another decade along, and the current Monte Rosa CRAY XE6 at CSCS can manage over 400 of these teraflops. Eloquent proof of the veracity of Moore's Law.
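Those figures imply a doubling time pleasingly close to Moore's celebrated eighteen months to two years. A small Python check, using only the numbers above:

    import math

    # Growth from 0.005 teraflops (1992) to 400+ teraflops (2012),
    # expressed as an implied performance doubling time.

    FLOPS_1992 = 0.005   # teraflops, first Swiss supercomputer
    FLOPS_2012 = 400.0   # teraflops, Monte Rosa CRAY XE6
    YEARS = 2012 - 1992

    doublings = math.log2(FLOPS_2012 / FLOPS_1992)
    print(f"Doublings over {YEARS} years: {doublings:.1f}")
    print(f"Implied doubling time: {YEARS / doublings:.2f} years")
    # -> about 16 doublings, i.e. one every ~1.2 years

A shade faster than the canonical eighteen months, but very much in the spirit of the law.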

We might be tempted to marvel at this. At half a million billion operations per second per machine, we might prosaically conclude that modelling resolution is approaching the level of individual grains of sand. But just to give some perspective and add a dash of awe for nature - despite this overwhelming capacity to calculate, if each operation were to represent a single atom of hydrogen, a year of constant operation would get through only a few dozen milligrams of star stuff; counting the atoms in a single tonne would take tens of millions of years.
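The arithmetic behind that claim, sketched in Python (Avogadro's number and the molar mass of hydrogen are the only outside facts):

    # How long to "count" the hydrogen atoms in one tonne, at one atom
    # per floating-point operation, on a 0.5-petaflop machine?

    AVOGADRO = 6.022e23    # atoms per mole
    MOLAR_MASS_H = 1.008   # grams per mole of hydrogen atoms
    TONNE_GRAMS = 1e6      # grams in a tonne
    OPS_PER_SECOND = 5e14  # half a million billion operations per second

    atoms = TONNE_GRAMS / MOLAR_MASS_H * AVOGADRO
    seconds = atoms / OPS_PER_SECOND
    years = seconds / (3600 * 24 * 365.25)
    print(f"Atoms in a tonne of hydrogen: {atoms:.2e}")
    print(f"Time at one atom per operation: {years:.1e} years")
    # -> about 6e29 atoms, roughly 4e7 (forty million) years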

And of course, computing power is only as good as the science behind it...

- Andrew Bone, 26 Sept 2012
Posted by Andrew on Monday, 1st October 2012 13:52
The full article and photographs may be viewed on the Quantum Planet site at:
http://www.quantumplanet.ch/articles/internet/internet1.php
Posted by Andrew on Monday, 25th July 2016 17:56

Thanks.
