
# Stat 310: Computational Math and Optimization II: Optimization; Part 1.

## Lectures.

• Lecture 1:
  • 1.1 Course logistics.
  • 1.2 Context of optimization; example problems.
  • 1.3 Modeling environments; AMPL. The best reference for AMPL is the AMPL web site.
• Lecture 2.
• Lecture 3 (Section 3 roughly follows Chapter 3 of Nocedal and Wright):
  • 2.3 Direct linear algebra: factorization.
  • 2.4 Sparsity.
  • 3.1 Failure of vanilla Newton.
  • 3.2 Line search methods.
  • Matlab diary from class.
• Lecture 4 (Section 4 roughly follows Chapter 4 of Nocedal and Wright):
  • 3.3 Dealing with indefinite matrices.
  • 3.4 Quasi-Newton methods.
  • 4.1 Trust-region fundamentals.
  • 4.2 Cauchy point methods; the dogleg method.
• Lecture 5:
  • 4.3 Solving the trust-region problem.
  • 5.1 Supercomputing and matrix-free methods.
    • 5.1.1 Parallel computing/supercomputing; the memory hierarchy.
    • 5.1.2 Linear algebra on massively parallel computers.
  • 5.2 Conjugate gradient method (Section 5 follows Chapter 5 of Nocedal and Wright).
    • 5.2.1 Gram-Schmidt.
    • 5.2.2 Method of conjugate directions.
    • 5.2.3 Krylov spaces.
• Lecture 6:
  • 6.1 Matrix-free methods; convergence framework (Section 7 follows Chapter 7 of Nocedal and Wright, except for details on Lanczos).
  • 6.2 Krylov-type methods for nonlinear unconstrained optimization.
  • 6.3 Lanczos algorithms for unconstrained optimization.
  • 7.1 Nonlinear least squares.
• Lecture 7:
  • 7.2 Nonlinear equations.
  • 8.1 Introduction (Section 8 follows Chapter 12 of Nocedal and Wright).
  • 8.2 Examples.
  • 8.3 Implicit function theorem review.
• Lecture 8:
  • 8.4 First-order optimality conditions.
  • 8.5 Second-order conditions.
• Lecture 9:
  • 10.2 Quadratic programs with equality constraints.
    • 10.2.1 Direct methods.
    • 10.2.2 Explicit use of the feasible set.
    • 10.2.3 Null-space method.
    • 10.2.4 Iterative methods: CG applied to the reduced system.
    • 10.2.5 Iterative methods: projected CG.
  • 10.3 Interior-point methods for quadratic programming.
• Lecture 10:
  • 9.1 Types of constrained optimization algorithms (Section 9 follows Chapter 15 of Nocedal and Wright).
  • 9.2 Merit functions and filters.
  • 9.3 Maratos effect and curvilinear search.
  • 11.1 Augmented Lagrangian methods.
  • 11.2 Interior-point algorithms.
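Several of the early topics above (3.1–3.3: failure of vanilla Newton, line searches, and indefinite Hessians) fit together in one algorithm. The course works in Matlab; what follows is only a minimal Python/NumPy sketch of a modified-Newton iteration with Armijo backtracking, not code from the class, using the Rosenbrock function as a standard test problem.

```python
import numpy as np

def newton_linesearch(f, grad, hess, x0, tol=1e-8, max_iter=100):
    """Newton's method with a Hessian shift and backtracking line search.

    If the Hessian is not positive definite, add a multiple of the identity
    until a Cholesky factorization succeeds (a simple modified-Newton idea).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        # Shift H until it is positive definite (Cholesky succeeds).
        tau = 0.0
        while True:
            try:
                L = np.linalg.cholesky(H + tau * np.eye(len(x)))
                break
            except np.linalg.LinAlgError:
                tau = max(2 * tau, 1e-3)
        # Solve (H + tau*I) p = -g via the Cholesky factor: L L^T p = -g.
        p = np.linalg.solve(L.T, np.linalg.solve(L, -g))
        # Backtracking line search with the Armijo sufficient-decrease test.
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * p) > f(x) + c1 * alpha * (g @ p):
            alpha *= 0.5
        x = x + alpha * p
    return x

# Standard test problem: the Rosenbrock function, minimized at (1, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
hess = lambda x: np.array([[2 - 400*x[1] + 1200*x[0]**2, -400*x[0]],
                           [-400*x[0], 200.0]])
x_star = newton_linesearch(f, grad, hess, np.array([-1.2, 1.0]))
```

Using a successful Cholesky factorization as the positive-definiteness test, and reusing that factor for the Newton solve, is the usual way this check is done in practice.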

## Homeworks

1. Homework 1. Due 01/13/2011.
2. Homework 2. Due 01/20/2011. You will need the files cute.m and cute_wrap.m
3. Homework 3. Due 01/27/2011.
4. Homework 4. Due 02/03/2011.

## Resources:

1. Textbook: Nocedal and Wright, "Numerical Optimization", Springer, 2006. Available online for free to members of the University of Chicago community.
2. Excellent course page for a computation class by Maria Emelianenko of GMU.
3. Harvey Greenberg's Myths and Counterexamples in Mathematical Programming page on the INFORMS site.
4. The Matlab Exchange: a large set of free Matlab programs contributed by Matlab users. I often start here when designing and developing a new project.
5. Tim Kelley's collection of Matlab programs for optimization.
6. LSTRS, a Matlab implementation of a large-scale trust-region algorithm.
7. The AMPL modeling language web site.

## What I assume students already know.

1. Multivariate calculus.
2. First-order necessary optimality conditions for unconstrained optimization.
3. Linear algebra. Positive definite and semidefinite symmetric matrices, eigenvalues.
4. Beginner Matlab.
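As a pointer to the expected background (items 2 and 3), here is a small, made-up NumPy example tying together the first-order condition ∇f(x) = 0, positive definiteness via eigenvalues, and solving a linear system; the matrix and vector are chosen purely for illustration.

```python
import numpy as np

# For the quadratic f(x) = (1/2) x' A x - b' x with A symmetric positive
# definite, the gradient is A x - b, so the unique minimizer solves A x = b.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])   # symmetric, positive definite (illustrative)
b = np.array([1.0, 2.0])

eigvals = np.linalg.eigvalsh(A)
assert np.all(eigvals > 0)   # all eigenvalues positive => A is positive definite

x_star = np.linalg.solve(A, b)       # stationary point of f
residual = np.linalg.norm(A @ x_star - b)
print(residual)                      # gradient vanishes, up to rounding error
```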