The construction costs of a nuclear power plant are enormous, but so are the costs of research – the painstaking hours, months and years invested in analyzing the interactions of neutronics, fluid mechanics, and structural mechanics in order to predict the behavior of the reactor throughout its lifetime.
Potential investors in next-generation reactors and the U.S. Department of Energy are counting on the synergistic efforts of reactor designers, computational scientists and applied mathematicians to find ways to analyze reactor flow through simulation – capitalizing on the power of high-performance computers.
One of the people leading the way is Paul Fischer, an applied mathematician and mechanical engineer who works in the Mathematics and Computer Science Division at DOE’s Argonne National Laboratory near Chicago.
Fischer uses millions of hours of computer processing time on the IBM Blue Gene/P and a unique code to make detailed simulations of coolant flow in a reactor. His modeled device is about the size of a long, narrow mailbox, packed with 217 fuel pins and about 1 billion data points.
Any one of a number of properties can be calculated at each data point – temperature, pressure, turbulence and velocity.
It takes 65,000 processors, crunching numbers eight hours a day for 16 days, to work out what the entire mailbox-sized device is experiencing at those pressures and temperatures.
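For a sense of scale, here is a quick back-of-envelope tally – a sketch based only on the figures quoted above, not an official accounting of the runs – which puts the total effort at roughly 8.3 million processor-hours:

```python
# Back-of-envelope tally of the computing effort quoted in the article.
processors = 65_000
hours_per_day = 8
days = 16

processor_hours = processors * hours_per_day * days
print(f"{processor_hours:,} processor-hours")        # 8,320,000

# Spread across the ~1 billion data points in the modeled pin bundle,
# that works out to roughly 30 processor-seconds attributed to each point.
grid_points = 1_000_000_000
print(f"{processor_hours * 3600 / grid_points:.0f} processor-seconds per point")
```

Each point is, of course, updated over many thousands of time steps, so the per-point figure is only a rough measure of the work involved.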
And when Argonne gets its next-generation IBM Blue Gene computer, expected to be among the world’s fastest, Fischer’s nuclear reactor flow simulation will be one of the first applications to run on it.
Fischer’s large-eddy simulations of flow in sodium-cooled reactors are 100 times more ambitious than any done before. The work is designed to run at petascale speeds – more than 1 quadrillion floating-point operations a second – and to provide detailed information about heat transfer within the core.
The aim is to demonstrate that the temperature inside a helium-cooled or sodium-cooled reactor can be reliably predicted. That’s crucial, because if plant operators have confidence in the precise temperature, they can run nuclear reactors at higher power levels without compromising safety – resulting in reduced energy costs.
In Fischer’s simulation, coolant passes through interior flow channels between the 217 pins – each of which has a single wire spiraling around it – and through corner channels between the pins and the walls.
By exploiting symmetries, and by virtue of the relatively short entrance length for the flows, Fischer can simplify the calculations so that only a single wire pitch must be simulated.
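To see how much the single-pitch simplification can save, here is a hypothetical illustration; the article gives neither the pin length nor the wire pitch, so the numbers below are assumptions made only for the example:

```python
# Hypothetical illustration only: the pitch count is an assumption.
single_pitch_points = 1_000_000_000   # point count cited for the modeled domain
pitches_per_pin = 20                  # assumed number of wire pitches along a full pin

# Treating the flow as fully developed and periodic, one wire pitch stands in
# for the whole heated length, so a full-length mesh of this size is avoided:
full_length_points = single_pitch_points * pitches_per_pin
print(f"~{full_length_points:,} points would be needed without the periodic domain")
```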
|September 20, 2010|
Supercomputing brings the climate into focus
The development of even more advanced petascale and exascale supercomputers, capable of doing quadrillions and eventually quintillions of calculations per second, has begun to change the game of climate science and modeling. The best verifiable climate models currently operate with data points that each represent an area of hundreds to thousands of square miles. Because new computers are capable of digesting and processing such vast quantities of data, scientists at Argonne and at other institutions around the world believe for the first time that they can generate models with resolutions as fine as a single square kilometer, or roughly four-tenths of a square mile.
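For a rough sense of what that jump in resolution means computationally, here is a back-of-envelope comparison; the coarse cell size is a representative value within the range quoted above, not a figure from any particular model:

```python
# Rough sketch: how many horizontal grid cells cover Earth's surface at a
# coarse resolution versus the ~1 square kilometer target.
EARTH_SURFACE_KM2 = 510_000_000        # approximate surface area of Earth

coarse_cell_km2 = 2_500                # ~1,000 square miles, within the quoted range
fine_cell_km2 = 1                      # the kilometer-scale target

coarse_cells = EARTH_SURFACE_KM2 / coarse_cell_km2   # ~204,000 cells
fine_cells = EARTH_SURFACE_KM2 / fine_cell_km2       # ~510,000,000 cells

print(f"{coarse_cells:,.0f} -> {fine_cells:,.0f} cells "
      f"({fine_cells / coarse_cells:,.0f}x more)")
```

Finer grids also force shorter time steps and additional vertical levels, so the total cost grows even faster than the cell count alone suggests.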
In addition to improving the spatial resolution of climate models, petascale and exascale supercomputers would also allow modelers to find ways to extend the runs of their simulations. Because studying changes in climatic patterns requires examining global trends over many years, model developers need to find ways of dealing with the accumulation of uncertainties, according to Rick Stevens, who leads Argonne’s work in computing, the environment and life science.
Argonne has started a search for a world-renowned computational climate scientist to build and enhance Argonne’s climate program and reputation. Argonne has also teamed with the University of Chicago to explore the possibility of creating an institute that would house interdisciplinary teams working on important problems in climate and other areas of environmental research. The institute would host computer scientists, hydrophysicists, ecologists, environmental scientists, microbiologists, chemists and other experts who would collaboratively tackle these problems.
|August 31, 2010|
DOE awards Argonne projects 200 million hours of supercomputer time
Five researchers at Argonne National Laboratory will lead projects that have been awarded almost 200 million processor-hours of computing time at Argonne’s Leadership Computing Facility.
|July 28, 2010|
DOE launches Institute for Computing in Science at Argonne
The inaugural summer program of the Institute for Computing in Science (ICiS) commenced on July 17 in Snowbird, Utah, with a series of four one-week workshops. A major goal of the DOE-funded initiative is to get key members of the scientific community thinking about the development and integration of new computational methods within their disciplines. Under the leadership of Rick Stevens, ICiS will enable the science community to come together, in a visitor-oriented institute structure, to form interdisciplinary teams that address the most challenging problems.
|July 12, 2010|
Magellan explores scientific clouds -- scientifically
“Cloud computing has become a very exciting new field with several companies making offerings that are already being used by scientists around the world,” said Pete Beckman, director of the Argonne Leadership Computing Facility and leader of the ALCF Magellan team. “The question the Department of Energy has is pretty straightforward: what kind of science can be done on clouds, and are there specializations or customizations that we can do on the software to get more science out of clouds?”
|May 19, 2010|
INCITE 2011 Call for Proposals
For the eighth consecutive year, the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program invites proposals for large-scale, computationally intensive research projects to run at America's premier leadership computing facility (LCF) centers, established and operated by the U.S. Department of Energy (DOE) Office of Science. The INCITE program awards sizeable allocations on some of the world's most powerful supercomputers to address grand challenges in science and engineering, such as developing new energy solutions and gaining a better understanding of climate change resulting from energy use.
|April 15, 2010|
Bringing LHC data to US Tier-3s
The Open Science Grid is helping physicists at more than 40 Tier-3 centers nationwide get set up on a grid to access data from the Large Hadron Collider. The new US Tier-3 centers – evenly split between the ATLAS and Compact Muon Solenoid experiments – have each received about $30,000 in funding as part of the American Recovery and Reinvestment Act. Physicists scattered around the country will be able to use them to do their own analysis of data generated by two of the LHC experiments.
|February 10, 2010|
ALCF Early Science Program to Award Cycles on Next-Generation IBM Blue Gene
The Argonne Leadership Computing Facility is now accepting proposals for time allocations on its next-generation, 10 petaflop IBM Blue Gene system. Allocations through the Early Science Program (ESP) are for preproduction hours (between system installation and full production) beginning in early 2012. More than four billion core hours are available.
To ensure the success of these early projects, ESP awardees will receive significant support from the ALCF staff of computational scientists and performance engineers, plus additional assistance from program postdocs.
Proposals are due April 29, 2010, and must include a detailed plan for the science to be accomplished plus a description of the application development that would be done. For full details or to submit your proposal, visit esp.alcf.anl.gov.
|February 1, 2010|
Argonne streaming visualization sends images across the world
Dealing with data is the specialty of a group of scientists at Argonne National Laboratory. To more easily share and analyze the mountains of data from today's scientific challenges, they are developing software that enables researchers to interact with their results in real time from across the country.
|January 4, 2010|
Metagenome Analysis Service Exceeds 100GB
Argonne's Metagenomics Rapid Annotation using Subsystem Technology server, or MG-RAST, has processed more than 100 gigabases (roughly 100 gigabytes) of sample sequence data, making MG-RAST the primary data repository and analysis resource for the metagenomics community.
MG-RAST has more than 2,500 registered data submitters representing more than 450 universities, institutes and companies.
The study of community genomes (metagenomics) is now being applied by researchers worldwide to understand the contributions of microbial organisms to processes like carbon sequestration, carbon cycling and environmental remediation.
Argonne's MG-RAST is a free, fully automated online service for annotating the metagenome of an environmental sample. It uses a variety of computing resources – Argonne's local clusters, the National Science Foundation-funded TeraGrid and cloud computing – to compare the DNA fragments of more than 3,000 samples submitted to the system against proteins contained in several publicly available databases.
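As a rough illustration of the kind of comparison involved – a minimal sketch only, not MG-RAST's actual pipeline, which relies on large-scale sequence-similarity searches against full protein databases – the snippet below translates a DNA fragment in all six reading frames and checks for matches against a toy set of reference proteins. The sequences, identifiers and the tiny codon table are invented for the example.

```python
# Minimal, illustrative sketch: six-frame translation of a DNA fragment and a
# naive exact-match lookup against a toy protein "database".
CODON_TABLE = {
    # Partial codon table, enough for the toy example below.
    "ATG": "M", "AAA": "K", "AAG": "K", "GCT": "A", "GCC": "A",
    "GCA": "A", "GCG": "A", "GAA": "E", "GAG": "E", "TGG": "W",
    "TAA": "*", "TAG": "*", "TGA": "*",
}

def reverse_complement(seq: str) -> str:
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp.get(base, "N") for base in reversed(seq))

def translate(seq: str) -> str:
    """Translate DNA codon by codon; codons missing from the table become 'X'."""
    return "".join(CODON_TABLE.get(seq[i:i + 3], "X")
                   for i in range(0, len(seq) - 2, 3))

def six_frame_peptides(read: str):
    """Yield the translation of each of the six reading frames."""
    for strand in (read, reverse_complement(read)):
        for offset in (0, 1, 2):
            yield translate(strand[offset:])

# Toy reference proteins (hypothetical identifiers, for illustration only).
reference_proteins = {"prot_001": "MKAE", "prot_002": "MWAK"}

read = "ATGAAAGCTGAA"          # encodes the peptide MKAE in frame +1
for frame, peptide in enumerate(six_frame_peptides(read), start=1):
    for name, protein in reference_proteins.items():
        if protein in peptide:
            print(f"frame {frame}: read matches {name} ({protein})")
```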
Principal investigators are the Mathematics and Computer Science Division's Robert Edwards and Folker Meyer, who are assisted by Narayan Desai, Mark d'Souza, Elizabeth Glass, Robert Olson, Tobias Paczian, Andreas Wilke and Jared Wilkening.
|December 10, 2009|