Seminar Details:

LANS Informal Seminar
"SASSy: Student Argonne Summer Symposium - Round 2"

DATE: August 25, 2010

TIME: 10:00 - 11:30
SPEAKER: LANS Summer Students
LOCATION: Bldg 240 Conference Center 1404-1405, Argonne National Laboratory

The event features three talks by MCS summer students.

Rajat Garg
Title: Performance Implications of Parallel I/O in Adjoint Computations
Adjoint computations offer the potential to provide insight (e.g., sensitivities with respect to billions of independent variables) at a modest increase in flops, typically only a factor of 3-10 relative to the simulation. However, these computations require intermediate states in the reverse order of their generation. The states are restored through a combination of application-level checkpointing and recomputation from the checkpointed states. Adjoint computations typically assume a cluster-like execution environment and checkpoint to local disk. We will attempt to use parallel I/O (e.g., MPI-I/O) to store this data to a global parallel file system. We will add parallel I/O to the adjoint code for MITgcm, a general circulation model used for ocean state estimation and related climate studies. We are interested in studying the performance characteristics of the resulting file access patterns, which are not typical of current simulation codes.
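The checkpoint-and-recompute scheme described above can be sketched in a few lines. This is a minimal illustration, not MITgcm code: the `step` function and the uniform checkpoint interval `k` are assumptions for the sketch. The forward sweep stores every k-th state; the reverse sweep restores each intermediate state by recomputing forward from the nearest earlier checkpoint.

```python
# Hypothetical sketch of checkpoint/recompute for a reverse (adjoint) sweep.

def step(state):
    # Placeholder forward model taking one time step (assumption, not MITgcm).
    return state + 1

def forward(state0, n, k):
    """Run n forward steps, storing a checkpoint every k steps."""
    checkpoints = {0: state0}
    state = state0
    for i in range(1, n + 1):
        state = step(state)
        if i % k == 0:
            checkpoints[i] = state
    return state, checkpoints

def restore(checkpoints, i, k):
    """Recompute the state at step i from the nearest earlier checkpoint."""
    base = (i // k) * k
    state = checkpoints[base]
    for _ in range(i - base):
        state = step(state)
    return state

final, cps = forward(0, 10, 4)
# The reverse sweep visits states 10, 9, ..., 0, each restored on demand;
# in the parallel-I/O setting, the checkpoint dictionary would instead live
# in files on a global parallel file system.
reversed_states = [restore(cps, i, 4) for i in range(10, -1, -1)]
```

The trade-off is the usual one: a smaller checkpoint interval k means more I/O but less recomputation during the reverse sweep.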

Manu Shantharam
Title: Performance Bounds Prediction of Parallel Applications
One of the main concerns in high-performance computing is estimating an application's performance bound on a given hardware platform. For a given hardware configuration and an application, we estimate the maximum performance that the application can achieve using static program analysis.

In this talk, I will first give a basic introduction to the ROSE compiler
framework and how it is used within the Pbound tool. Next, I will present
my work related to improving the efficiency of Pbound and extending its
functionality to parallel applications.

Vyacheslav Kungurtsev
Title: Augmented Lagrangian Interpolation-Based Derivative-Free Optimization
There is a class of general nonlinear programming problems (NLPs) for which derivative information is unavailable. To date, derivative-free algorithms for constrained optimization have relied largely on direct-search methods. Using the POUNDer algorithm to interpolate the objective function with a quadratic model, we attempt to solve constrained NLPs within the augmented Lagrangian framework: each subproblem is interpolated and solved, and the subproblems are then updated at each outer iteration. Numerical results show the method to be promising.
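The augmented Lagrangian outer loop described above can be sketched as follows. This is a hedged illustration, not the speaker's method: the toy objective and constraint are assumptions, and a simple compass (direct) search stands in for the model-based inner solver (POUNDer would replace it in the actual framework). Each outer iteration minimizes the augmented Lagrangian subproblem derivative-free, then updates the multiplier estimate.

```python
# Hypothetical sketch: augmented Lagrangian loop for min f(x) s.t. c(x) = 0,
# with a derivative-free (compass search) inner solver.

def f(x):
    # Toy smooth objective (assumption, for illustration only).
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

def c(x):
    # Toy equality constraint c(x) = 0 (assumption).
    return x[0] + x[1] - 3.0

def compass_search(L, x, h=0.5, tol=1e-8):
    """Direct search: try +/- h along each axis; halve h when nothing improves."""
    x = list(x)
    fx = L(x)
    while h > tol:
        improved = False
        for i in range(len(x)):
            for d in (h, -h):
                y = list(x)
                y[i] += d
                fy = L(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            h *= 0.5
    return x

lam, mu = 0.0, 10.0          # multiplier estimate and penalty parameter
x = [0.0, 0.0]
for _ in range(8):           # outer iterations
    # Augmented Lagrangian subproblem: f(x) - lam*c(x) + (mu/2)*c(x)^2
    L = lambda z: f(z) - lam * c(z) + 0.5 * mu * c(z) ** 2
    x = compass_search(L, x)
    lam = lam - mu * c(x)    # first-order multiplier update
```

For this toy problem the iterates approach the constrained minimizer (0.75, 2.25); in the framework from the abstract, the inner solver would instead build and minimize a quadratic interpolation model of each subproblem.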


Please send questions or suggestions to Jeffrey Larson: jmlarson at anl dot gov.