Science Highlights

Using dual decomposition for solving problems involving data uncertainty

Researchers develop a new parallel formulation using dual decomposition to solve problems requiring decisions under uncertainty.

August 14, 2013
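The idea behind dual decomposition can be illustrated with a toy two-scenario problem. The sketch below is not the researchers' formulation; it is a minimal, assumed example in which two scenario subproblems, f1(x) = (x - 1)^2 and f2(x) = (x - 3)^2, are coupled by a nonanticipativity constraint x1 = x2. Relaxing that constraint with a multiplier lets each subproblem be solved independently, which is what makes the method parallel.

```python
# Minimal dual-decomposition sketch (illustrative; not the published formulation).
# Two scenario subproblems are coupled only by the constraint x1 == x2.
# Relaxing it with a multiplier `lam` decouples them: each argmin below
# could run on a separate processor.

def solve_subproblem_1(lam):
    # argmin over x of (x - 1)**2 + lam * x  ->  x = 1 - lam / 2
    return 1.0 - lam / 2.0

def solve_subproblem_2(lam):
    # argmin over x of (x - 3)**2 - lam * x  ->  x = 3 + lam / 2
    return 3.0 + lam / 2.0

def dual_decomposition(step=0.5, iters=60):
    lam = 0.0
    for _ in range(iters):
        x1 = solve_subproblem_1(lam)   # independent: parallelizable
        x2 = solve_subproblem_2(lam)
        lam += step * (x1 - x2)        # dual (subgradient) ascent on the constraint
    return x1, x2, lam

x1, x2, lam = dual_decomposition()
# x1 and x2 converge to the coupled optimum x = 2
```

The dual update penalizes disagreement between the scenarios until their decisions coincide; real stochastic programs use many scenarios and far richer subproblems, but the decomposition pattern is the same.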
Argo exascale architecture
Click on image to enlarge.
Designing a new operating system for exascale architectures

The Argo project will design and develop a platform-neutral prototype of an exascale operating system and runtime software.

August 7, 2013
MCS division researchers help develop new sequencing analysis service

The Argonne/University of Chicago Computation Institute has announced a new sequencing analysis service called Globus Genomics.

July 16, 2013
Dynamic trees can aid in performance tuning of scientific codes

Researchers have developed a data analysis tool that uses dynamic trees to rapidly determine which software and hardware tuning parameters best explain differences in code performance.

July 16, 2013
This image shows that the "cold cache effect" (whereby the first execution of a job takes longer than subsequent trials of the same job; 35 replications were used in this test) exists and that the run times stabilize after the second trial. The new analysis reported here, however, shows that runtime differences related to the cold cache effect are not statistically significant relative to runtime differences due to applying code optimization strategies.
Dynamic trees can aid in performance tuning of scientific codes

Researchers from the University of Chicago Booth School of Business, together with Stefan Wild, an assistant computational mathematician in the Mathematics and Computer Science Division at Argonne, have recently demonstrated how the dynamic tree model can support both variable selection and sensitivity analysis of inputs.

June 10, 2013
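A toy version of the variable-selection question conveys the flavor of this work. The sketch below is not the dynamic tree model itself; it is an assumed, simplified proxy that ranks each tuning parameter by how much grouping runs on that parameter's value reduces the variance of observed runtimes, which is the same intuition behind the split criterion in tree-based methods.

```python
# Toy variable-selection sketch (a crude proxy, not the dynamic tree model):
# rank each tuning parameter by the runtime-variance reduction obtained when
# runs are grouped by that parameter's value.

from collections import defaultdict
from statistics import pvariance

def variance_reduction(samples, param):
    """samples: list of (config_dict, runtime). Higher score = parameter
    explains more of the runtime differences."""
    runtimes = [t for _, t in samples]
    total = pvariance(runtimes)
    groups = defaultdict(list)
    for cfg, t in samples:
        groups[cfg[param]].append(t)
    # weighted within-group variance after splitting on `param`
    within = sum(len(g) * pvariance(g) for g in groups.values()) / len(samples)
    return total - within

# Hypothetical tuning data: loop unroll factor matters, tile size does not.
samples = [
    ({"unroll": 1, "tile": 16}, 9.8),
    ({"unroll": 1, "tile": 32}, 10.1),
    ({"unroll": 4, "tile": 16}, 5.2),
    ({"unroll": 4, "tile": 32}, 5.0),
]
scores = {p: variance_reduction(samples, p) for p in ("unroll", "tile")}
# here "unroll" scores far higher than "tile"
```

Dynamic trees extend this idea with a sequential, fully Bayesian treatment that also yields sensitivity measures, but the goal is the same: identify which tuning parameters best explain performance differences.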
Darshan has been integrated into the web portal at NERSC to provide feedback to scientists about application behavior. This screen shot shows an example of the I/O summary data that is available to users as soon as their job has completed. It indicates the amount of data accessed, the percentage of runtime consumed by I/O activity, and the access sizes used by the application.
Characterizing I/O performance on leadership-class systems

Researchers at Argonne National Laboratory have developed Darshan, a scalable I/O characterization tool that collects I/O access pattern information from HPC production applications.

May 30, 2013
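The per-job summary shown in the NERSC portal can be mimicked with a small post-processing sketch. This is not Darshan's actual log format or API; it assumes hypothetical (offset, size, seconds) records for a job's I/O operations and derives the three quantities the screenshot describes: data volume, the percentage of runtime consumed by I/O, and the distribution of access sizes.

```python
# Hypothetical post-processing sketch in the spirit of Darshan's per-job
# summary (not Darshan's real log format or API). Each record is
# (offset, size_in_bytes, seconds_spent) for one I/O operation.

from collections import Counter

def io_summary(records, job_runtime_s):
    total_bytes = sum(size for _, size, _ in records)
    io_time = sum(secs for _, _, secs in records)
    size_hist = Counter(size for _, size, _ in records)
    return {
        "total_bytes": total_bytes,                    # amount of data accessed
        "io_pct": 100.0 * io_time / job_runtime_s,     # % of runtime in I/O
        "access_sizes": dict(size_hist),               # access-size distribution
    }

records = [(0, 4096, 0.01), (4096, 4096, 0.01), (8192, 65536, 0.05)]
summary = io_summary(records, job_runtime_s=10.0)
```

In practice Darshan gathers such counters inside the application with low overhead and writes a compact log at job exit; this sketch only shows how a completed job's counters roll up into the portal-style summary.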
Argonne applied mathematicians use INCITE awards to attack energy problems

What is the best possible power grid configuration for our nation? How can we balance the increasing demands for power while minimizing costs and avoiding waste of resources?

March 27, 2013
MONJU reactor core: 715 assemblies, 99k volumes, 9.7M elements
New MOAB mesh software version released

Scientists use meshes to approximate a geometric domain for solving complex engineering analysis problems.

March 6, 2013
Fig. 1: Zoom-in visualization of the matter density field in a 1.07 trillion particle, 9.14-gigaparsec (Gpc) box-size simulation with HACC on sixteen BG/Q racks. This figure illustrates the global spatial dynamic range covered by the simulation, of approximately a part per million, although the finer details are not resolved by the visualization. (Note: 1 Gpc is about 3.26 billion light-years; 1 Mpc, or megaparsec, is about 3.26 million light-years.)
Shedding Light on Dark Matter and Dark Energy

Dark matter and dark energy — together accounting for 95% of the mass-energy of the universe — remain mysterious. With funding from the DOE Office of Science, Argonne is now studying these pillars of cosmology.

March 6, 2013
Using large eddy simulations to understand flow mixing

In nuclear power plants, turbulent flow streams of different velocity and density mix rapidly at right angles in pipes.

March 4, 2013