Comp Math 310, Part B: Optimization
Stat 310: Computational Math and Optimization II: Optimization; Part 1.
Logistics
- First Lecture: Jan 4, 2011. Last Lecture: Feb 3, 2011.
- Office Hours: Tue-Thu 3-4 pm, Eckhart 104, or by appointment.
Lectures
- Lecture 1:
- 1.1: Course logistics.
- 1.2: Context of Optimization; Example Problems.
- 1.3: Modeling environments; AMPL. The best reference for AMPL is the AMPL web site.
- Lecture 2:
- Lecture 3
(Section 3 roughly follows Chapter 3 of Nocedal and Wright)
- 2.3 Direct Linear Algebra Factorization
- 2.4 Sparsity
- 3.1 Failure of vanilla Newton
- 3.2 Line Search Methods (a Matlab sketch of backtracking line search follows the lecture list).
- Matlab Diary from class.
- Lecture 4. (Section 4 roughly follows Chapter 4 of Nocedal and Wright)
- 3.3 Dealing with Indefinite Matrices.
- 3.4 quasi-Newton Methods.
- 4.1 Trust region fundamentals.
- 4.2 Cauchy point methods. The dogleg method (a Matlab sketch follows the lecture list).
- Lecture 5.
- 4.3 Solving the trust-region problem.
- 5.1 Supercomputing and Matrix-free methods.
- 5.1.1 Parallel computing/supercomputing. The memory hierarchy.
- 5.1.2 Linear Algebra on Massively Parallel Computers.
- 5.2 Conjugate gradient method; a Matlab sketch follows the lecture list. (Section 5 follows Chapter 5 of Nocedal and Wright)
- 5.2.1 Gram-Schmidt.
- 5.2.2 Method of Conjugate Directions.
- 5.2.3 Krylov spaces.
- Lecture 6
- 5.4 Conjugate gradient convergence/preconditioning.
- 5.5 Nonlinear Conjugate Gradient.
- 6.1 Matrix-free methods, convergence framework. (Section 6 follows Chapter 7 of Nocedal and Wright, except for details on Lanczos)
- 6.2 Krylov-type methods for nonlinear unconstrained optimization.
- 6.3 Lanczos algorithms for unconstrained optimization.
- 7.1 Nonlinear Least Squares (a Gauss-Newton sketch follows the lecture list).
- Lecture 7
- 7.2 Nonlinear Equations.
- 8.1 Introduction (Section 8 follows Chapter 12 of Nocedal and Wright)
- 8.2 Examples
- 8.3 Implicit function theorem review.
- Lecture 8
- 8.4 First-order optimality conditions.
- 8.5 Second-order conditions.
- 10.1 Gradient projection methods for quadratic programming.
- Lecture 9
- 10.2 Quadratic Programs with Equality Constraints.
- 10.2.1 Direct Methods
- 10.2.2 Explicit use of feasible set.
- 10.2.3 Null Space Method (a Matlab sketch follows the lecture list).
- 10.2.4 Iterative Methods: CG applied to Reduced System
- 10.2.5 Iterative Methods: projected CG.
- 10.3 Interior-point methods for quadratic programming.
- Lecture 10
- 9.1 Types of constrained optimization algorithms (Section 9 follows Chapter 15 of Nocedal and Wright).
- 9.2 Merit functions and filters.
- 9.3 Maratos effect and curvilinear search.
- 11.1 Augmented Lagrangian
- 11.2 Interior-Point Algorithms
- 11.3 Sequential Quadratic Programming.
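Illustrative Matlab sketches
For 3.2 (line search methods), a minimal sketch of Newton's method safeguarded by a backtracking (Armijo) line search. The handles f, g, and H for the objective, gradient, and Hessian are placeholders I introduce only for illustration; this is not code distributed in class.

    function x = newton_linesearch(f, g, H, x, tol, maxit)
      % Minimal sketch: Newton direction + Armijo backtracking line search.
      c1 = 1e-4; rho = 0.5;                      % sufficient-decrease constant, backtracking factor
      for k = 1:maxit
        gk = g(x);
        if norm(gk) < tol, break; end            % stop when the gradient is small
        p = -H(x) \ gk;                          % Newton direction (assumes H(x) positive definite)
        if gk' * p >= 0, p = -gk; end            % fall back to steepest descent if p is not downhill
        a = 1;
        while f(x + a*p) > f(x) + c1*a*(gk'*p)   % Armijo sufficient-decrease test
          a = rho * a;
        end
        x = x + a*p;
      end
    end

One can try it, for example, with anonymous handles for the Rosenbrock function and starting point [-1.2; 1].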
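For 4.2 (the dogleg method), a sketch of the dogleg step, which combines the Cauchy (steepest-descent) point with the full model step inside the trust region. It assumes the model Hessian B is symmetric positive definite; g is the current gradient and Delta the trust-region radius. All names are illustrative.

    function p = dogleg(g, B, Delta)
      pB = -B \ g;                               % full step: unconstrained minimizer of the model
      if norm(pB) <= Delta
        p = pB; return                           % full step already fits inside the trust region
      end
      pU = -((g'*g) / (g'*B*g)) * g;             % Cauchy point: model minimizer along -g
      if norm(pU) >= Delta
        p = (Delta / norm(pU)) * pU;             % even the steepest-descent step must be truncated
        return
      end
      d = pB - pU;                               % find tau in [0,1] with ||pU + tau*d|| = Delta
      a = d'*d;  b = 2*(pU'*d);  c = pU'*pU - Delta^2;
      tau = (-b + sqrt(b^2 - 4*a*c)) / (2*a);
      p = pU + tau*d;
    end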
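For 5.2 (the conjugate gradient method), a sketch of linear CG for A*x = b with A symmetric positive definite. Only matrix-vector products with A are required, which is what makes CG attractive in the matrix-free setting of Section 6.

    function x = cg(A, b, tol, maxit)
      x = zeros(size(b));
      r = b;                                     % residual b - A*x for the starting point x = 0
      p = r;                                     % first search direction
      rs = r'*r;
      for k = 1:maxit
        Ap = A*p;
        alpha = rs / (p'*Ap);                    % exact minimizer along the direction p
        x = x + alpha*p;
        r = r - alpha*Ap;
        rs_new = r'*r;
        if sqrt(rs_new) < tol, break; end
        p = r + (rs_new/rs)*p;                   % new direction, conjugate to the previous ones
        rs = rs_new;
      end
    end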
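For 7.1 (nonlinear least squares), a sketch of a basic Gauss-Newton iteration for minimizing 0.5*||r(x)||^2. The handles r and J for the residual vector and its Jacobian are placeholders; a practical code would add a line search or a trust-region safeguard.

    function x = gauss_newton(r, J, x, tol, maxit)
      for k = 1:maxit
        rk = r(x);  Jk = J(x);
        if norm(Jk'*rk) < tol, break; end        % gradient of 0.5*||r||^2 is J'*r
        p = Jk \ (-rk);                          % least-squares solution of Jk*p = -rk
        x = x + p;                               % full step; a real code would globalize this
      end
    end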
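For 10.2.3 (the null-space method), a sketch for the equality-constrained quadratic program min 0.5*x'*G*x + c'*x subject to A*x = b, with A of full row rank. It uses Matlab's dense null(), so it is illustrative rather than large-scale.

    function [x, lam] = eqqp_nullspace(G, c, A, b)
      xp  = A \ b;                               % any particular solution of A*x = b
      Z   = null(A);                             % orthonormal basis for the null space of A
      xz  = (Z'*G*Z) \ (-Z'*(G*xp + c));         % reduced, unconstrained QP in null-space coordinates
      x   = xp + Z*xz;
      lam = (A*A') \ (A*(G*x + c));              % multipliers from stationarity: G*x + c = A'*lam
    end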
Homeworks
- Homework 1. Due 01/13/2011.
- Homework 2. Due 01/20/2011. You will need the files cute.m and cute_wrap.m.
- Homework 3. Due 01/27/2011.
- Homework 4. Due 02/03/2011.
Resources
- Textbook: Nocedal and Wright, "Numerical Optimization", Springer, 2006. Available free online to members of the University of Chicago community.
- Excellent course page for a computation class taught by Maria Emelianenko at GMU.
- Harvey Greenberg's Myths and Counterexamples in Mathematical Programming page on the INFORMS site.
- The Matlab Exchange: a large set of free Matlab programs contributed by Matlab users. I often start here when designing and developing a new project.
- The Matlab documentation page (downloadable PDF).
- Tim Kelley's collection of Matlab programs for optimization.
- LSTRS, a Matlab implementation of a large-scale trust-region algorithm.
- The AMPL modeling language web site.
What I assume the students already know
- Multivariate calculus.
- First-order necessary optimality conditions for unconstrained optimization.
- Linear algebra. Positive definite and semidefinite symmetric matrices, eigenvalues.
- Basic Matlab.