Feature Stories

DOE Research Group Makes Case for Exascale

Exascale computing promises incredible science breakthroughs, but it won't come easily, and it won't come free. That's the premise of a feature story from the DOE's Office of Advanced Scientific Computing Research, whose mission it is "to discover, develop, and deploy the computational and networking tools that enable researchers in the scientific disciplines to analyze, model, simulate, and predict complex phenomena important to the Department of Energy."

The article makes the case for exascale computing, citing some of the scientific breakthroughs that such a leap would enable, such as precise long-range weather forecasting, innovative alternative fuels and advances in disease research. The ability to represent many more variables will lead to more realistic models. For example, future researchers will be able to create a global climate model with a level of resolution that is now only possible for regional studies.

February 21, 2011
Sharpening up legacy codes

A team from Argonne National Laboratory (ANL) in the USA is developing a modern set of simulation tools, geared toward exascale computing, for the design and study of liquid-metal-cooled fast reactors. The Simulation-based High-efficiency Advanced Reactor Prototyping (SHARP) project is a multidivisional, collaborative effort whose modular approach lets users attach new physics modules to legacy reactor analysis codes, thereby avoiding costly rewriting of those codes while still incorporating new physics.
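
As a purely illustrative aside – the class and function names below are hypothetical and are not SHARP components – the "attach new modules to legacy codes" idea can be sketched as a thin adapter that couples a new physics module to an existing solver without modifying the legacy source:

    # Hypothetical sketch of coupling a new module to a legacy code.
    # Names (LegacyNeutronicsCode, ThermalHydraulicsModule) are illustrative only.

    class LegacyNeutronicsCode:
        """Stand-in for an existing reactor analysis code we do not want to rewrite."""
        def solve_power_distribution(self, coolant_temperatures):
            # In practice this would wrap a call into a long-established solver.
            return [1.0 - 0.001 * t for t in coolant_temperatures]

    class ThermalHydraulicsModule:
        """New physics module that consumes the legacy code's output."""
        def update_temperatures(self, power_distribution, inlet_temp=350.0):
            return [inlet_temp + 100.0 * p for p in power_distribution]

    def coupled_iteration(n_pins=217, n_sweeps=5):
        """Fixed-point coupling loop: legacy neutronics <-> new thermal hydraulics."""
        neutronics = LegacyNeutronicsCode()
        thermal = ThermalHydraulicsModule()
        temps = [350.0] * n_pins  # initial coolant-temperature guess, degrees C
        for _ in range(n_sweeps):
            power = neutronics.solve_power_distribution(temps)
            temps = thermal.update_temperatures(power)
        return temps

    if __name__ == "__main__":
        print(coupled_iteration()[:3])  # first few pin temperatures after coupling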

January 21, 2011
Why America Needs a 21st Century Power Grid

Argonne Director Eric Isaacs writes for the Huffington Post on the need to upgrade our nation's power grid. Isaacs discusses why we need to invest in revamping and redesigning our power grid infrastructure for long-distance transmission of renewable electricity.

December 26, 2010
Division Director, Mathematics and Computer Science Division

Argonne National Laboratory seeks applications from highly qualified candidates for the position of director of the Mathematics and Computer Science (MCS) Division. Argonne is one of the preeminent multidisciplinary research facilities in the country and is a U.S. Department of Energy laboratory managed by UChicago Argonne, LLC.

December 3, 2010
These high-core-count experimental chips from Intel could provide one path to exascale computing, in which computers would perform one quintillion operations per second. Photo by Intel Corporation.
New institute to tackle exascale computing

Renowned supercomputing expert Pete Beckman has been named director of a newly created Exascale Technology and Computing Institute at Argonne National Laboratory.

November 19, 2010
Why America must win the supercomputing race

The United States cannot afford to take a back seat in computer technology to the Chinese, or to anyone else. The nation that leads the world in high-performance computing will have an enormous competitive advantage in every sector, including national defense, medicine, energy, environment, finance, manufacturing and product development.

November 18, 2010
Victor Zavala, an Argonne computational mathematician, is helping to develop optimization models to improve building energy efficiency.
BuildingIQ makes Argonne smarter about energy management

Victor Zavala is leading a pilot project with BuildingIQ, an energy management software company, which allows him to use his training to improve energy performance at Argonne's Theory and Computing Sciences building.
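
As a toy illustration only – the cost and comfort models below are made up and do not represent BuildingIQ's or Zavala's actual formulation – an optimization model for building energy efficiency can be as simple as choosing a temperature setpoint that minimizes energy use while keeping occupants inside a comfort band:

    # Toy setpoint optimization; the cost and comfort models are invented for
    # illustration and are not BuildingIQ's or Argonne's actual formulation.

    def energy_cost(setpoint_c, outdoor_c=30.0):
        """Pretend cooling energy grows with the outdoor-to-setpoint temperature gap."""
        return max(0.0, outdoor_c - setpoint_c) ** 2

    def comfortable(setpoint_c, low=21.0, high=25.0):
        """Simple occupant-comfort band in degrees Celsius."""
        return low <= setpoint_c <= high

    # Brute-force search over candidate setpoints in 0.1 C steps.
    candidates = [round(20.0 + 0.1 * i, 1) for i in range(80)]  # 20.0 .. 27.9 C
    feasible = [s for s in candidates if comfortable(s)]
    best = min(feasible, key=energy_cost)
    print(f"best setpoint: {best:.1f} C, cost: {energy_cost(best):.1f}")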

September 29, 2010
Winding path leads to fluid career

Paul Fischer's fascination with science, mathematics and engineering has landed him in a position to work with the world's most powerful computers.

September 20, 2010
Nuclear predictive

The construction costs of a nuclear power plant are enormous, but so are the costs of research – the painstaking hours, months and years invested in analyzing the interactions of neutronics, fluid mechanics, and structural mechanics in order to predict the behavior of the reactor throughout its lifetime.

Potential investors in next-generation reactors and the U.S. Department of Energy are counting on the synergistic efforts of reactor designers, computational scientists and applied mathematicians to find ways to analyze reactor flow through simulation – capitalizing on the power of high-performance computers.

One of the people leading the way is Paul Fischer, an applied mathematician and mechanical engineer who works in the Mathematics and Computer Science Division at DOE’s Argonne National Laboratory near Chicago.

Fischer uses millions of hours of computer processing time on the IBM Blue Gene/P and a unique code to make detailed simulations of coolant flow in a reactor. His modeled device is about the size of a long, narrow mailbox, packed with 217 fuel pins and about 1 billion data points.

Any one of a number of properties can be calculated at each data point – temperature, pressure, turbulence and velocity.

It takes 65,000 processors, crunching numbers eight hours a day for 16 days, to understand what the entire mailbox-sized device is experiencing at those pressures and temperatures.
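
As a rough back-of-the-envelope restatement of those figures (nothing more than the arithmetic implied by the article):

    # Back-of-the-envelope restatement of the numbers quoted above;
    # this is not an official accounting of the actual job layout.

    processors = 65_000      # Blue Gene/P processors used for the run
    hours_per_day = 8        # hours of number-crunching per day
    days = 16                # length of the campaign

    processor_hours = processors * hours_per_day * days
    print(f"{processor_hours:,} processor-hours")  # 8,320,000 processor-hours

    # With roughly 1 billion data points and four fields of interest
    # (temperature, pressure, turbulence, velocity), a single snapshot
    # already holds on the order of 4 billion values.
    data_points = 1_000_000_000
    fields = 4
    print(f"{data_points * fields:,} values per snapshot")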

And when Argonne gets its next-generation IBM Blue Gene computer, expected to be among the world’s fastest, Fischer’s nuclear reactor flow simulation will be one of the first applications to run on it.

Fischer’s large-eddy simulations of flow in sodium-cooled reactors are 100 times more ambitious than any done before. The work is designed to run at petascale speeds – more than 1 quadrillion floating-point operations a second – and to provide detailed information about heat transfer within the core.

The aim is to demonstrate that the temperature inside a helium-cooled or sodium-cooled reactor can be reliably predicted. That’s crucial, because if plant operators have confidence in the precise temperature, they can run nuclear reactors at higher power levels without compromising safety – resulting in reduced energy costs.

In Fischer’s simulation, coolant passes through interior flow channels between the 217 pins – each of which has a single wire spiraling around it – and through corner channels between the pins and the walls.

By exploiting symmetries, and by virtue of the relatively short entrance length for the flows, Fischer can simplify the calculations so that only a single wire pitch must be simulated.
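
For readers unfamiliar with wire-wrapped bundles, the 217-pin count mentioned above follows the standard hexagonal-lattice relation N = 1 + 3r(r + 1), where r is the number of rings around the central pin; 217 pins corresponds to r = 8. A small illustrative check (not part of Fischer's simulation code):

    # Illustrative check of the hexagonal-lattice pin count; not simulation code.

    def pins_in_hex_bundle(rings: int) -> int:
        """Pins in a hexagonal bundle with `rings` rings around a central pin."""
        return 1 + 3 * rings * (rings + 1)

    assert pins_in_hex_bundle(8) == 217  # the subassembly described above

    for r in range(1, 9):
        print(r, pins_in_hex_bundle(r))  # 7, 19, 37, 61, 91, 127, 169, 217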

September 20, 2010
Supercomputing brings the climate into focus

The development of even more advanced petascale and exascale supercomputers, capable of doing quadrillions and eventually quintillions of calculations per second, has begun to change the game of climate science and modeling. The best verifiable climate models currently operate with data points that each represent an area of hundreds to thousands of square miles. Because new computers are capable of digesting and processing such vast quantities of data, scientists at Argonne and at other institutions around the world believe for the first time that they can generate models with resolutions down to possibly a single square kilometer – roughly two-fifths of a square mile.
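
To give a rough sense of that jump in resolution (round numbers only, not the grid of any particular model), shrinking the horizontal cell size from about 100 km to 1 km multiplies the number of surface grid cells by roughly 10,000:

    # Rough illustration of how cell counts grow with resolution; round numbers only,
    # not the grid of any particular climate model.

    EARTH_SURFACE_KM2 = 510_000_000  # approximate surface area of the Earth

    for cell_edge_km in (100, 10, 1):
        cells = EARTH_SURFACE_KM2 / cell_edge_km ** 2
        print(f"{cell_edge_km:>4} km cells -> ~{cells:,.0f} surface cells")

    # 100 km cells -> ~51,000 cells; 1 km cells -> ~510,000,000 cells,
    # i.e. roughly a 10,000x increase in horizontal cell count alone.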

In addition to improving the spatial resolution of climate models, petascale and exascale supercomputers would also allow modelers to extend the runs of their simulations. Because studying changes in climatic patterns requires examining global trends over many years, model developers need ways of dealing with the accumulation of uncertainties over those longer runs, according to Rick Stevens, who leads Argonne's work in computing, the environment and life science.

Argonne has started a search for a world-renowned computational climate scientist to build and enhance Argonne's climate program and reputation. Argonne has also teamed with the University of Chicago to explore the possibility of creating an institute housing interdisciplinary teams that would work on important problems in climate and other areas of environmental research. The institute would host computer scientists, hydrophysicists, ecologists, environmental scientists, microbiologists, chemists and other experts who would collaboratively tackle these problems.

August 31, 2010