Argonne National Laboratory, Mihai Anitescu, Ph.D., Computational Mathematician
Mathematics and Computer Science (MCS) Division
 
Comp Math 310, Part B: Optimization

Stat 310: Computational Math and Optimization II: Optimization; Part 1.

Logistics

  • First Lecture: Feb 7, 2012. Last Lecture: Mar 8, 2012.

  • Class Meets: Tue-Thu 3:00-4:30 pm, Eckhart 117.

  • Office Hours: Tue-Thu 4:30-5:30 pm, Eckhart 104, or by appointment.

Lectures.

  • Lecture 1:
    • 1.1: Course logistics.
    • 1.2: Context of Optimization; Example Problems.
    • 1.3: Modeling environments; AMPL. The best reference for AMPL is the AMPL web site.
    • 1.4: Course Objectives.
  • Lecture 2:
    • 2.1 Newton’s method and implications (a minimal illustrative sketch follows this lecture outline).
    • 2.2 Computing Derivatives.
    • 2.3 Optimization Code Encapsulation.
  • Lecture 3:
    • 2.4 Linear Algebra.
    • 2.5 Sparse Linear Algebra
    • 3.1 Failure of Newton's Method
    • 3.2 Line Search Methods: Principles
  • Lecture 4 (tentative plan)
    • 3.2.2 Line Search Methods: Other Considerations.
    • 3.3 Dealing with Indefinite Matrices
    • 3.4 Quasi-Newton Methods
  • Lecture 5
  • Lecture 6
    • 5.4 Conjugate Gradient Preconditioning
    • 5.5 Nonlinear Conjugate Gradient.
  • Lecture 7
    • 6.1 Matrix-free convergence framework
    • 6.2 Krylov Methods in Optimization
    • 6.3 Lanczos Algorithms
    • 7.1 Nonlinear Least Squares
    • 7.2 Nonlinear Equations.
  • Section 8 (from here on, I refer to my own sectioning)
    • 8.1 Dealing with nonsmoothness in NLP
    • 8.2 Examples of NLP
    • 8.3 The Implicit Function Theorem
    • 8.4 Optimality Conditions for Equality-Constrained Optimization
    • 8.5 Optimality Conditions for Inequality-Constrained Optimization
  • Section 9
    • 9.1 Gradient Projection Method
    • 9.2 Augmented Lagrangian Approaches.
  • Section 10
    • 10.1 Types of Constrained Optimization Algorithms.
    • 10.2 Merit Functions and Filters
    • 10.3 Maratos Effect and Curvilinear Search
  • Section 11
    • 11.1 Interior-point algorithms for quadratic programming.
    • 11.2 Interior-point algorithms for general nonlinear programming.
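
For orientation, below is a minimal Matlab sketch of the damped Newton iteration with a backtracking (Armijo) line search discussed in Lectures 2-4. It is illustrative only, not code used in the class; the function name newton_linesearch_sketch and the [f, g, H] objective interface (the same interface fenton_wrap.m provides, see Homework #1) are conventions adopted just for this sketch.

    % newton_linesearch_sketch.m -- illustrative sketch only, not course code.
    % obj must return [f, g, H]: objective value, gradient, and Hessian at x.
    function x = newton_linesearch_sketch(obj, x, tol, maxit)
        for k = 1:maxit
            [f, g, H] = obj(x);
            if norm(g) <= tol, break; end   % first-order stationarity test
            p = -H \ g;                     % Newton direction
            if g' * p >= 0, p = -g; end     % fall back to steepest descent if p is not a descent direction
            t = 1;                          % backtracking (Armijo) line search
            while obj(x + t*p) > f + 1e-4 * t * (g' * p)
                t = t / 2;
            end
            x = x + t * p;
        end
    end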

Homeworks

  1. Homework #1, assigned as part of Prof. Lim's assignment. Some useful files:
    • The INTVAL package, which contains automatic differentiation.
    • If you install it, you can use my implementation of Fenton's function. The files are fenton.m and fenton_wrap.m; the latter returns the function, gradient, and Hessian of the Fenton function. The headers of the functions should be fairly self-explanatory. (An illustrative sketch of such a wrapper appears after this list.)
  2. Homework #2, assigned 02/21/12, due 02/28/12.
  3. Homework #3, assigned 02/28/12, due 03/05/12.
  4. Homework #4, assigned 03/06/12, due 03/13/12.
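
For reference, here is a minimal sketch of what such a function/gradient/Hessian wrapper might look like. It assumes the standard Eason-Fenton test function and approximates the derivatives by central finite differences; the actual fenton.m and fenton_wrap.m use the INTVAL automatic differentiation classes instead, so both the formula and the interface below should be treated as assumptions, not the course code. Calling [f, g, H] = fenton_sketch([3; 2]) returns the value plus approximate first and second derivatives at that point.

    % fenton_sketch.m -- illustrative only; not the course's fenton.m / fenton_wrap.m.
    % Assumes the Eason-Fenton test function
    %   f(x) = (1/10) * ( 12 + x1^2 + (1 + x2^2)/x1^2 + (x1^2*x2^2 + 100)/(x1*x2)^4 )
    % and approximates the gradient and Hessian by central finite differences.
    function [f, g, H] = fenton_sketch(x)
        x = x(:);                                        % work with a column vector
        f = fenton_value(x);
        if nargout > 1, g = fd_gradient(@fenton_value, x); end
        if nargout > 2, H = fd_hessian(@fenton_value, x); end
    end

    function f = fenton_value(x)
        x1 = x(1); x2 = x(2);
        f = 0.1 * ( 12 + x1^2 + (1 + x2^2)/x1^2 + (x1^2*x2^2 + 100)/(x1*x2)^4 );
    end

    function g = fd_gradient(fun, x)
        % Central-difference gradient approximation.
        n = numel(x); g = zeros(n, 1); h = 1e-6;
        for i = 1:n
            e = zeros(n, 1); e(i) = h;
            g(i) = (fun(x + e) - fun(x - e)) / (2*h);
        end
    end

    function H = fd_hessian(fun, x)
        % Central differences applied to the finite-difference gradient.
        n = numel(x); H = zeros(n, n); h = 1e-4;
        for i = 1:n
            e = zeros(n, 1); e(i) = h;
            H(:, i) = (fd_gradient(fun, x + e) - fd_gradient(fun, x - e)) / (2*h);
        end
        H = (H + H') / 2;   % symmetrize
    end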

Resources:

  1. Textbook: Nocedal and Wright, "Numerical Optimization", 2nd edition, Springer, 2006. Available online at no cost to members of the University of Chicago community.
  2. Excellent course page for a computational class by Maria Emelianenko of GMU.
  3. Harvey Greenberg's Myths and Counterexamples in Mathematical Programming page on the INFORMS site.
  4. The Matlab File Exchange: a large set of free Matlab programs contributed by Matlab users. I often start here when designing and developing a new project.
  5. The Matlab Documentation page (downloadable PDF).
  6. Tim Kelley's collection of Matlab programs for optimization.
  7. LSTRS, a Matlab implementation of a large-scale trust-region algorithm.
  8. The AMPL modeling language web site.

What I assume is known by the students.

  1. Multivariate calculus.
  2. First-order necessary optimality conditions for unconstrained optimization.
  3. Linear algebra. Positive definite and semidefinite symmetric matrices, eigenvalues.
  4. Beginner Matlab.


Last modified: February 21, 2012