Green Supercomputing

Posted on 03/19/2010

I spent the beginning of this week in sunny Lugano, Switzerland, at the High Performance Computing Advisory Council workshop and the kickoff meeting for my PhD project, the “Computational Cosmology on the Petascale” collaboration under the Swiss High Performance and High Productivity Computing initiative. The HPC Advisory Council workshop, of which I unfortunately only caught the first day because the two meetings overlapped, was mainly attended by supercomputing professionals and industry representatives whose job is to provide the software, hardware, and human infrastructure to build and manage the most powerful computing systems on the planet. As a user of these systems at a somewhat higher level of abstraction, I attended to get a deeper perspective on the hardware and software architecture and better inform my algorithmic design.

One of the takeaways I wasn’t expecting was that I don’t work in a green field. I knew, of course, that supercomputers consume a lot of energy; in fact, this is one of the major limitations, both financial and infrastructural, on rolling out more powerful systems. I also knew that computers account for a significant chunk of the global energy budget. But somehow I had managed thus far to keep those hard facts abstracted away from my chosen research direction, internally imagining that it must be Evil Corporations who are responsible for those statistics.

At the workshop I got a different, on-the-ground perspective. In the U.S. alone, commercial and personal IT usage is responsible for CO2 emissions equivalent to those of 30 million cars. A single Microsoft data center requires 47 MW of power. The supercomputers demanded by the academic community share this burden in a particularly poignant way. According to the PR leaflets, these machines are intended for active public and academic use. In reality, they are at times political pawns: the strategic goal becomes having yet another system on the list of the 500 most powerful computers in the world, rather than advancing human knowledge and enabling new human-empowering technologies. The consequence is that some machines sit idle, burning power as a display of power, or are even delivered without the energy budget needed to run them properly.

“Is it worth it?” asked the system administrators. “Is your precious science worth the costs?” Is it? I would argue yes. While running climate simulations to pin down climate change on a machine drawing megawatts of power may seem a tad ironic, the reality is that these machines also power the simulations informing our nuclear fusion and alternative energy research. And astrophysics? I’ll be talking more about that, of course, but my basic belief is that pursuing a subject like theoretical physics, pushing the boundaries of human knowledge of the cosmos ever beyond our imaginations, is one of the things that make our climate and humanity so precious. In Robert Wilson’s words:

“Senator, particle physics research is not likely to aid in the defense of our nation, but it will make our nation worth defending.”

But I will take the system administrators’ pleas to heart. While the support community tackles the greening of the infrastructure, out of financial and environmental necessity, I can do my part by always being mindful of the consequences of my code design and implementation, however abstract they may seem at inception. A bad algorithm or a poor debugging strategy wastes an afternoon at the small scale, but at the petascale it can have a serious negative environmental impact.
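To make that last point concrete, here is a back-of-envelope sketch of what a wasted run can cost. All the numbers in it (machine power draw, job duration, grid carbon intensity) are illustrative assumptions I chose for the example, not figures from the workshop:

```python
# Back-of-envelope estimate of the energy and CO2 cost of a wasted
# petascale run. All inputs are illustrative assumptions, not measurements.

def wasted_energy_kwh(power_mw: float, hours: float) -> float:
    """Energy drawn by a machine at power_mw megawatts over `hours` hours."""
    return power_mw * 1000 * hours  # MW -> kW, then kW * h -> kWh

def co2_kg(energy_kwh: float, kg_co2_per_kwh: float = 0.5) -> float:
    """CO2 emitted, assuming an illustrative grid intensity of 0.5 kg/kWh."""
    return energy_kwh * kg_co2_per_kwh

# Suppose a buggy job ties up a 5 MW system for 12 hours before it is killed:
energy = wasted_energy_kwh(power_mw=5, hours=12)   # 60,000 kWh
emissions = co2_kg(energy)                         # 30,000 kg of CO2
print(f"{energy:,.0f} kWh wasted, ~{emissions:,.0f} kg of CO2")
```

Under those assumptions, one afternoon's debugging mistake burns tens of megawatt-hours, which is why algorithmic care scales from a personal virtue to an environmental one.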

Posted in: science