Argonne National Laboratory, September 8-10, 2003


Abstracts


Deterministic Global Optimization for Problems with Nonlinear Dynamics 

Claire S. Adjiman and I. Papamichail,  Imperial College London, UK

The identification of the global solution of programs with differential equations in the constraints is a problem with numerous applications in engineering and science. While there have been many developments in deterministic global optimization in recent years, much work has assumed that all the functions in the problems are available analytically. This is of course not the case when nonlinear differential equations are present in the form of an initial value problem (IVP). Thus, the key issue in designing suitable algorithms for such problems is the extraction of reliable and global information from the IVP in order to construct rigorously valid convex relaxations. 

A deterministic spatial branch and bound global optimization algorithm is presented for systems with an initial value problem for a set of first-order, typically nonlinear, differential equations in the constraints. Upper bounds on the global minimum are obtained using the sequential approach for the local solution of the dynamic optimization problem. The solution of a convex relaxation of the problem provides lower bounds. Well-known convex underestimation techniques are used for the relaxation of the algebraic functions. The concept of differential inequalities is utilized for the development of parameter independent as well as parameter dependent bounds on the dynamic system. Three convex relaxation procedures are investigated for the parameter dependent solution of the initial value problem. The global optimization algorithm is illustrated by applying it to several case studies relevant to chemical engineering.
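In generic form (the notation below is ours, chosen only to fix ideas, and is not taken from the talk), the problem class reads

    \min_{p \in [p^L, p^U]} \; J(p, x(t_f; p))
    \text{s.t.} \quad \dot{x}(t) = f(t, x(t), p), \quad x(t_0) = x_0(p),
    \qquad\;\; g(p, x(t_f; p)) \le 0.

The differential-inequality construction integrates auxiliary ODEs whose right-hand sides under- and over-estimate f over the current parameter box, yielding bounding trajectories \underline{x}(t) \le x(t; p) \le \overline{x}(t) that hold for every p in the box; it is these rigorous state bounds that make valid convex relaxations of the constraints possible.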

 

Towards Global Solution of Semi-Infinite Programs   

Paul I. Barton,  Massachusetts Institute of Technology

Optimization problems involving a finite number of decision variables and an infinite number of constraints are referred to as semi-infinite programs (SIPs). Such problems occur in a variety of engineering applications including design for operability, range identification for kinetic models, and material stress analysis. We describe progress towards a deterministic approach for solving general semi-infinite programs to guaranteed global optimality.

SIPs are typically solved by deriving a finite analog or approximation to which known optimality conditions and nonlinear programming algorithms can be applied. However, numerical methods in the literature either make strong assumptions, e.g., convexity and/or regularity, or solve a discretized approximation that only guarantees a lower bound to the true solution of the SIP. Moreover, the iterative nature of the more general methods makes them computationally expensive for higher-dimensional problems.

Here, a novel interval-constrained approach is used to derive a finite reformulation of the SIP. The infinite number of real-valued constraints in the SIP is replaced by a finite number of constrained inclusion functions. It is shown that in certain cases, a single NLP solved to global optimality yields the true solution of the SIP. In the more general case, a guaranteed feasible upper bound to the SIP solution can be obtained by solving a single local optimization problem. The quality of this upper bound can then be refined to epsilon-optimality using a nested branch-and-bound procedure. The rate of convergence and the computational cost of the refinement procedure depend on the type of inclusion function used, e.g., Taylor models versus natural interval extensions. The smoothness requirements of the method are simply those of the NLP solver used to solve the finite reformulation.
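In symbols (our notation, not the speaker's), the SIP is

    \min_{x \in X} f(x) \quad \text{s.t.} \quad g(x, p) \le 0 \;\; \forall p \in P,

and the interval-constrained reformulation replaces the infinite constraint family by the finite requirement that the upper bound of an inclusion function G of g over P be nonpositive,

    \min_{x \in X} f(x) \quad \text{s.t.} \quad \overline{G}(x, P) \le 0, \qquad G(x, P) \supseteq \{ g(x, p) : p \in P \}.

Any x feasible in the reformulation is guaranteed feasible in the original SIP, which is what makes the resulting upper bound rigorous.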

 

Topographies, Dynamics and Kinetics on the Landscapes of Multidimensional Potential Surfaces

R. Stephen Berry, University of Chicago

The behaviors of many systems, including proteins and other biopolymers, nanoscale particles and complex nonrigid molecules, lend themselves to description in terms of movement on multidimensional potential energy surfaces. We would like to understand such properties as the tendency for a liquid-like system undergoing cooling and relaxation to move either to some arbitrary amorphous structure or to one of some small set of "special" structures. In this case, a qualitative characterization of the landscape has proved useful to explain the difference between "glass-formers" and "structure seekers." Examples of these are clusters of rare gas atoms and non-folding peptides (glass-formers), vs. clusters of alkali halide molecules and foldable proteins (structure seekers). Quantification of the motion of such systems can be accomplished, in principle, by the use of master equations, kinetic equations describing rates of passage of systems from the region of one local minimum to another. However, the complexity of such systems makes it necessary to find simplifying approaches to make the analyses tractable. Some steps have been taken in this direction, but major challenges still lie ahead. The presentation will review the state of understanding today, and the problems ahead.
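For readers unfamiliar with the term, the master equation takes its standard form

    \frac{dP_i(t)}{dt} = \sum_{j \ne i} \left[ k_{ij} P_j(t) - k_{ji} P_i(t) \right],

where P_i is the probability of finding the system in the basin of local minimum i and k_{ij} is the rate coefficient for passage from basin j to basin i. For a landscape with a very large number of minima, the rate matrix becomes correspondingly large, which is the source of the tractability problem mentioned above.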

 

Predicting Protein Structure with Global Optimization 

Richard Byrd, University of Colorado, Boulder

In this talk we discuss an attack on the problem of protein structure prediction by parallel global optimization techniques. The protein structure prediction problem is one of the fundamental challenges of modern science: to predict the three-dimensional shape, or native state, of a protein given its sequence of amino acids. Optimization is one of the promising approaches to solving this problem, because it is believed that in most cases the native state corresponds to the minimum free energy of the protein. However, the energy landscape of a realistically sized protein involves thousands of variables and has an enormous number of local minimizers.

We describe a large-scale perturbation global optimization algorithm used for determining the structure of proteins. The method incorporates secondary structure predictions (which describe the more basic elements of the protein structure) into the starting structures, and thereafter minimizes using a purely physics-based energy model. We have tested our approach in the CASP competition, where many research groups compete in predicting structures that have just been experimentally determined. Results show our method to be particularly successful on protein targets for which structural information from similar proteins is unavailable, i.e., the most difficult targets for most protein structure prediction methods.
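As a purely generic sketch of the perturbation-plus-local-minimization pattern on which such methods are built (this is not the authors' algorithm, and the energy function below is a hypothetical stand-in rather than a physics-based protein potential):

    import numpy as np
    from scipy.optimize import minimize

    def energy(x):
        # Hypothetical stand-in for a molecular potential energy function.
        return np.sum((x**2 - 1.0)**2) + 0.1 * np.sum(x)

    def perturbation_search(x0, n_iter=100, step=0.5, seed=0):
        rng = np.random.default_rng(seed)
        res = minimize(energy, x0, method="L-BFGS-B")    # local minimization from the start point
        best_x, best_e = res.x, res.fun
        for _ in range(n_iter):
            trial = best_x + step * rng.standard_normal(best_x.shape)  # random perturbation
            res = minimize(energy, trial, method="L-BFGS-B")           # re-minimize locally
            if res.fun < best_e:                                       # keep only improvements
                best_x, best_e = res.x, res.fun
        return best_x, best_e

    x_best, e_best = perturbation_search(np.zeros(10))

In the protein setting, the interesting work lies in what this sketch leaves out: how the starting structures are built (here, from secondary structure predictions), how perturbations are chosen at scale, and how the search is parallelized.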

 

Challenges in Bringing Global Optimization to the Marketplace

Steven Dirkse, GAMS Corporation

Recent advances in global optimization (GO) have made possible the introduction of three solid GO solvers into the GAMS system: BARON, LGO, and OQNLP. In this talk, we discuss some of the ongoing challenges and issues involved in introducing general-purpose GO to modeling systems, including:
making GO relevant to existing applications,
creating new markets with novel applications,
requirements of a modeling system unique to GO, and
user expectations, risk-avoidance, and quality control.

 

Global Optimization Problems in Radiosurgery

Michael C. Ferris, University of Wisconsin

Radiation therapy delivers dose to a patient from a variety of delivery devices for the treatment of tumors. For each patient, an optimization seeks to produce a homogeneous dose distribution that conforms closely to the treatment volume, while avoiding overdosing nearby sensitive structures. For example, the Gamma Knife is a highly specialized treatment unit that provides an advanced stereotactic approach to the treatment of tumors, vascular malformations, and pain disorders within the head. Inside a shielded treatment unit, beams from 201 radioactive sources are focused so that they intersect at the same location in space, resulting in a spherical region of high dose referred to as a shot of radiation. The location and width of the shots can be adjusted using focusing helmets. By properly combining a set of shots, larger treatment volumes can be successfully treated with the Gamma Knife.

We outline several global optimization problems whose solution leads to optimized treatment plans. The variables in the optimization can include the number of shots of radiation along with the size, the location, and the weight assigned to each. Formulation of such problems using a variety of mathematical programming models is described, and the solution of several test and real-patient examples using nonlinear programming is demonstrated.
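One simple way to write down such a formulation (the notation and objective here are illustrative, not taken from the talk) is to model the dose at a point v as

    D(v) = \sum_{s} w_s \, \Phi(v; x_s, r_s),

where x_s, r_s and w_s are the center, width and weight of shot s and \Phi is the nonlinear dose profile of a single shot. A planning model might then require \theta \le D(v) \le 1 on the target volume while keeping D(v) below prescribed limits on sensitive structures; the nonconvexity, and hence the need for global optimization, enters through the dependence of \Phi on the shot centers and widths and through any integer decisions on the number of shots used.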

 

A New Class of Improved Convex Underestimators for Twice-Continuously Differentiable NLPs: The Generalized αBB Global Optimization Approach

Chris Floudas and I. G. Akrotirianakis, Princeton University

In this talk, we present an algorithm that locates the global optimum of nonconvex optimization problems that involve C2 functions. The algorithm is based on the principles of the deterministic global optimization algorithm αBB, and it also generates sequences of upper and lower bounds that converge to the global optimum. The novelty of the proposed approach is the way it constructs convex underestimators of arbitrarily nonconvex functions. The underestimators are derived by augmenting the original nonconvex function with a nonlinear relaxation function. The relaxation function is a separable convex function that involves the sum of univariate parametric exponential functions. An efficient procedure that finds the appropriate values for those parameters is developed. This procedure uses interval arithmetic extensively in order to verify whether the new underestimator is convex. For arbitrarily nonconvex functions it is shown that these underestimators are tighter than those generated by the αBB method. The role of the underestimating function is crucial to the performance of any deterministic algorithm: the tighter the underestimator, the larger the lower bound. Larger lower bounds help to fathom larger portions of the enumeration tree and thereby increase the performance of the overall algorithm. Extensive computational studies complemented with geometrical interpretations demonstrate the potential benefits of the proposed tight convex underestimators.
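For orientation (this is written from the published αBB literature, not from the talk itself), the classical αBB underestimator of a C2 function f over a box [x^L, x^U] is

    L(x) = f(x) + \sum_i \alpha_i (x_i^L - x_i)(x_i^U - x_i), \qquad \alpha_i \ge 0,

with the α_i chosen, typically from interval bounds on the Hessian of f, so that L is convex on the box. The generalized approach described here replaces the quadratic perturbation with a separable sum of univariate exponential terms of roughly the form (1 - e^{\gamma_i (x_i - x_i^L)})(1 - e^{\gamma_i (x_i^U - x_i)}), whose parameters \gamma_i are again certified by interval arithmetic; the resulting underestimator coincides with f at the box vertices and can be tighter than the quadratic one.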

 

Integration and Focus in Validated Global Optimization Software

R. Baker Kearfott, University of Louisiana at Lafayette

Three considerations lead to corresponding challenges in validated global optimization.

First, the general global optimization and validated global optimization fields are beginning to mature. Various researchers have produced comprehensive software packages and have had some success at solving "industrial-strength" problems. These packages implement a plethora of techniques. A number of these techniques, although essentially the same, have been developed independently, and different terminology and theoretical structures have been developed to describe them. Furthermore, in particular applications, it is still not universally clear which techniques are most valuable and which add unnecessary complication.

Second, present validated global optimization software (in which the computations automatically prove mathematically rigorous statements about the optima) usually runs more slowly, and is practical only on smaller problems, than unvalidated global optimization software (which employs heuristics and gives approximate answers that may be wrong). In turn, unvalidated global optimization software is presently applicable only to problems that are smaller than those that well-designed local optimization packages can solve. There are fundamental differences between local and global optimization, and it is unreasonable to expect global optimization in general to perform as well on higher-dimensional problems as local optimization. However, there are striking similarities between current non-validated and validated global optimization techniques: the branch-and-bound processes are similar, similar local approximations of the objective and constraints are used, etc. Furthermore, certain recent developments appear to facilitate making key techniques in non-validated global optimization software rigorous, and applying these techniques should not slow down the process significantly.
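As a toy illustration of the rigorous bounding step that validated branch-and-bound codes rely on, the Python sketch below (our example, not taken from any of the packages discussed) uses a natural interval extension to bound a univariate polynomial over a box; a production validated code would additionally use outward (directed) rounding so that the enclosure remains rigorous in floating-point arithmetic.

    from dataclasses import dataclass

    @dataclass
    class Interval:
        lo: float
        hi: float
        def __add__(self, other):
            return Interval(self.lo + other.lo, self.hi + other.hi)
        def __sub__(self, other):
            return Interval(self.lo - other.hi, self.hi - other.lo)
        def __mul__(self, other):
            p = (self.lo * other.lo, self.lo * other.hi,
                 self.hi * other.lo, self.hi * other.hi)
            return Interval(min(p), max(p))

    def f(x):
        # Natural interval extension of f(x) = x^4 - 3*x^2 + x.
        x2 = x * x
        return x2 * x2 - Interval(3.0, 3.0) * x2 + x

    box = Interval(2.5, 3.0)
    incumbent = 0.0           # an objective value already attained, e.g. f(0) = 0
    bound = f(box)            # rigorous enclosure of the range of f over the box
    print(bound)              # Interval(lo=14.5625, hi=65.25)
    if bound.lo > incumbent:  # rigorous lower bound exceeds the incumbent,
        print("fathom: [2.5, 3] cannot contain the global minimizer")

Non-validated codes perform an analogous bounding step with relaxations that are not guaranteed against rounding error, which is one reason the two families of methods look so similar.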

Third, local optimization is often made practical for particularly challenging problems by relying on the structure of the problem. We can expect problem structure to be useful in different ways in global optimization, but such structure can nonetheless be crucial in making global optimization practical. We will give some examples.

In short, we have the following challenges:

  1. Identify which of the many global optimization techniques are most effective, and when. Furthermore, simplify and unify the language used to describe the most important techniques. (We will give examples.)
  2. Bridge the performance gap between non-validated and validated global optimization codes.
  3. Make use of structure to effectively attack important special classes of problems.

 

A Triangular-Based Branch and Bound Method for Nonconvex Quadratic Programming and the Computational Grid

Jeffrey T. Linderoth, Lehigh University

In the first portion of this talk, we describe a branch-and-bound method for nonconvex quadratic programs that uses a triangular-based decomposition of the feasible region. Convex and concave envelopes of bilinear functions over the triangular regions making up the spatial decomposition are derived. Issues in building a branch-and-bound algorithm using nonlinear programming software to solve the relaxations will be discussed.
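For reference, the classical envelopes over a rectangle, which the triangular decomposition is designed to improve upon, are the McCormick (Al-Khayyal-Falk) bounds for w = xy on [x^L, x^U] x [y^L, y^U]:

    w \ge \max\{\, x^L y + y^L x - x^L y^L,\; x^U y + y^U x - x^U y^U \,\},
    w \le \min\{\, x^U y + y^L x - x^U y^L,\; x^L y + y^U x - x^L y^U \,\}.

Restricting the domain to a triangle removes part of the rectangle and therefore admits tighter envelopes; the formulas above are the standard rectangular case, not the triangular envelopes derived in the talk.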

In the second portion of this talk, we will describe the computational grid, experiences with branch-and-bound algorithms running on computational grids, and unique challenges and opportunities for global optimization algorithms on computational grids.

 

Consistency Notions for Global Optimization

Laurent D. Michel, University of Connecticut

Numerica is a modeling language for global optimization. Numerica statements use a notation close to informal presentations in textbooks or scientific papers. The implementation integrates techniques from numerical analysis and artificial intelligence and is capable of providing correctness, completeness, finiteness and safety guarantees. The combination of traditional interval arithmetic techniques with consistency notions from artificial intelligence produced various box consistency operators that constitute the basis of the system and support its semantics. The talk will briefly describe the language and its properties and explore its main contributions, such as box consistency.
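For readers unfamiliar with the term, box consistency can be sketched roughly as follows (this paraphrases the standard definition from the interval constraint literature rather than anything specific to the talk): a constraint c(x_1, ..., x_n) = 0 with interval domains X_1, ..., X_n is box consistent with respect to x_i, where X_i = [a, b], if the interval extension C of c cannot rule out either endpoint, i.e., 0 \in C(X_1, ..., [a, a^+], ..., X_n) and 0 \in C(X_1, ..., [b^-, b], ..., X_n), with [a, a^+] and [b^-, b] the one-ulp intervals at the bounds of X_i. The associated narrowing operator shrinks X_i until this condition holds, discarding only values that provably violate the constraint.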

 

Global Optimization and Constraint Satisfaction Problems

Arnold Neumaier, Universität Wien, Austria

The lecture discusses techniques, packages and challenges for the solution of global optimization and constraint satisfaction problems by complete search.

The two kinds of problems are closely related, and the solution techniques for them, developed independently for a long time, complement each other. The only way to achieve a complete search in finite time is to use branch and bound techniques together with exclusion boxes, which remove optimal points along with a complete neighborhood around them. Successful strategies require the intelligent combination of convex analysis, constraint propagation, and interval techniques, and the careful handling of rounding error issues. I'll discuss work on integrating various approaches within the European COCONUT project.

Several global optimization packages based on branch and bound are now available, some of them commercially. I'll report on the characteristics of these solvers and on our experiences with some of them. In particular, I'll mention current limitations of the available solvers and corresponding challenges for the future.

 

Some Recent Advances and Trends in Deterministic Global Optimization

Panos M. Pardalos, University of Florida

In recent years we have witnessed the rapid growth of a new field, global optimization. Many new theoretical, algorithmic, and computational contributions of global optimization have been used to solve a wide spectrum of difficult problems in science and engineering. This talk will cover the following topics:

  1. Nonconvexity and discreteness in optimization
  2. DC (difference of convex functions) and monotonic optimization
  3. Multiquadratic programming and applications
  4. Multi-variable partition (MVP) approaches with applications to distance geometry and the spherical code problem
  5. Optimization with massive datasets.

 

Convexification and Global Optimization

Nikolaos V. Sahinidis, University of Illinois at Urbana-Champaign

The construction of tight convex relaxations is a crucial step towards the global optimization of nonconvex NLPs and MINLPs. This talk presents four convexification techniques we have recently developed:

  1. A variant of McCormick's strategy for constructing separable relaxations of factorable programs.
  2. A two-step convexification approach whereby identification of the generating set of  convex/concave envelopes is followed by disjunctive programming techniques to produce the envelopes of lower semi-continuous functions.
  3. Product disaggregation whereby products are distributed over summations to tighten standard relaxations (see the small illustration after this list).
  4. A sandwich algorithm that produces a quadratically convergent polyhedral relaxation of a convex program.
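As a small worked illustration of item 3 (the numbers are ours, not from the talk): take x, y_1, y_2 \in [0, 1] and the term x(y_1 + y_2). Relaxing the product directly, with s = y_1 + y_2 \in [0, 2] treated as a single variable, the standard bilinear envelopes give \max\{0, s + 2x - 2\} \le w \le \min\{s, 2x\}; at x = 0.5, y_1 = 1, y_2 = 0 this admits any w \in [0, 1]. Relaxing after distributing the product, w = w_1 + w_2 with each w_i enveloped separately as a relaxation of x y_i, the same point forces w_1 = 0.5 and w_2 = 0, so w = 0.5, the true value; here disaggregation gives a strictly tighter relaxation, which is the effect exploited in item 3.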

Finally, we present computational results with BARON using suitable combinations of the above techniques in the context of specific applications.

 

Physics-Based Folding of Proteins

Harold A. Scheraga, Cornell University

With the current availability of the nucleotide sequences of many genomes, attention is now focused on the gene products, namely the proteins. The gene provides only information about the amino acid sequence of the protein, and the problem then becomes one of trying to understand how the physics of the interaction between the amino acid residues influences the folding pathway and the final (native) structure of the biologically active protein. According to experiments by C. B. Anfinsen, the final 3D structure is believed to be realized as the thermodynamically stable one, and we have developed a hierarchical approach to locate the thermodynamically stable state. This is a problem of global optimization of the potential energy of the system, using a simplified (united residue) model in the initial stages to locate the region of the global minimum; the united residue model is then converted to an all-atom one, and optimization of the potential energy is continued in this limited region to locate the global minimum. The basic methodology will be illustrated. Then, results of recent blind tests on prediction of 3D structures of proteins from amino acid sequence, without using knowledge-based information from the Protein Data Bank, will be presented. Finally, recent attempts to apply a physics-based method to determine folding pathways will be discussed.

 

Implementing a Global Solver in a General Purpose Callable Library

Tony Gau and Linus E. Schrage, University of Chicago

We describe our decisions and experience in implementing a global solver as part of the LINDO API callable optimization library. This library is the solver underneath the LINGO modeling system and the What'sBest solver add-in  for the Excel spreadsheet. The topics and implementation issues we discuss are:

  1. Getting a good solution quickly, multistart and related ideas;
  2. Guaranteed solutions, basic ideas: a) convex relaxation,  b) split/branch;
  3. Constraint propagation/bound tightening, and interval arithmetic;
  4. Constructing convex relaxations for a wide range of functions: 
    continuous and smooth: x+y, x-y, x*y, sin(x), cos(x), log(x), exp(x), sqrt(x),
    continuous, nonsmooth: abs(x), max(x,y), min(x,y),
    smooth but not quite continuous: x/y, x^y, tan(x), floor(x),
    logical functions: if(), and, or, not, >=, <=, ==, !=,
    application specific functions: Normal cdf and linear loss function;
  5. Using linearization + linear MIP only for functions such as: abs(), min(), if(), special cases of x*y;
  6. Choosing an algebraic representation, e.g., x*(y-x) vs. x*y-x^2 (see the small example after this list);
  7. Choosing a machine representation with some vector functions, e.g., inner product;
  8. Enhancing numerical stability through interval analysis, cut management, and reformulation;
  9. Choosing a judicious branching for a) tighter relaxation, b) balanced branching on continuous vs. integer variables (MINLP);
  10. Computational testing on various classes of problems.
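A small worked example for item 6 (the numbers are ours): with x, y \in [1, 2], interval evaluation of x*y - x^2 gives [1,4] - [1,4] = [-3, 3], while the algebraically equivalent x*(y - x) gives [1,2]*[-1,1] = [-2, 2]; the true range is [-2, 1]. The factored form loses less to the interval dependency problem, which is why the choice of algebraic representation affects the quality of relaxations and of interval-based bound tightening.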

     

Special issue of Mathematical Programming (Series B) on Global Optimization

We are editing a special issue of Mathematical Programming (Series B) dedicated to global optimization. For details and deadlines, see the call for papers. This call is not restricted to participants of the theory institute.

Last updated on September 5, 2003.