Improving Performance of the Interior Point Method by Preconditioning
Final Report
Project by: Ken Ryals
For: AMSC 663-664, Fall 2007-Spring 2008
6 May 2008

Outline: Problem, Approach, Results, Recommendations

Problem Statement
In constrained optimization…
Primal: minimize cᵀx over x, subject to Ax = b with x ≥ 0
Dual: maximize bᵀy over y, subject to Aᵀy + z = c with z ≥ 0
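As a quick illustration of the primal/dual pairing, weak duality (cᵀx ≥ bᵀy for any feasible pair) can be checked numerically. The data below are synthetic placeholders, not the report's problem; the pair is built to be feasible by construction.

```python
import numpy as np

# Weak duality check on a synthetic feasible pair:
# if Ax = b, x >= 0 and A^T y + z = c, z >= 0, then c^T x - b^T y = z^T x >= 0.
rng = np.random.default_rng(0)
m, n = 3, 6
A = rng.standard_normal((m, n))
x = rng.uniform(0.1, 1.0, n)   # pick x > 0, then define b so Ax = b holds
b = A @ x
y = rng.standard_normal(m)     # pick any y and z >= 0, then define c accordingly
z = rng.uniform(0.1, 1.0, n)
c = A.T @ y + z

gap = c @ x - b @ y            # the duality gap, equal to z^T x here
print(gap >= 0, np.isclose(gap, z @ x))
```

The gap cᵀx − bᵀy collapses to zᵀx because Ax = b and Aᵀy + z = c; the IPM drives exactly this quantity toward zero.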
The IPM solves a sequence of linearly constrained optimization problems in the primal and dual spaces simultaneously to approach the desired solution from within the feasible region.
A Δx = 0
Aᵀ Δy + Δz = 0
x is the unknown
y is the “dual” of x
z is the “slack”
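One Newton step of this scheme can be sketched in a few lines. This is a hedged illustration with synthetic data (A, x, z, and the residual r are placeholders, not the report's problem); it solves the normal equations A D² Aᵀ Δy = −A r described later and then recovers Δz and Δx consistent with the two step equations above.

```python
import numpy as np

# Synthetic stand-in data for one IPM iterate (not the report's problem).
rng = np.random.default_rng(1)
m, n = 3, 8
A = rng.standard_normal((m, n))
x = rng.uniform(0.5, 2.0, n)   # current primal iterate, x > 0
z = rng.uniform(0.5, 2.0, n)   # current dual slacks, z > 0
r = rng.standard_normal(n)     # residual vector at this iterate

D2 = np.diag(x / z)            # D^2 = diag(x/z)

# Normal equations for the dual step: (A D^2 A^T) dy = -A r
dy = np.linalg.solve(A @ D2 @ A.T, -A @ r)
dz = -A.T @ dy                 # from A^T dy + dz = 0
dx = -(r + D2 @ A.T @ dy)      # chosen so that A dx = 0

print(np.allclose(A.T @ dy + dz, 0), np.allclose(A @ dx, 0))
```

Note that A Δx = −A r − A D² Aᵀ Δy = 0 follows directly from the normal equations, so both step equations hold by construction.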
Solve a Distributed Command and Control optimization problem for the Office of the Secretary of Defense/Acquisition and Technology (OSD/A&T).
Since the problem dimensionality is small (~dozens), the key issue is stability.
e.g., 10⁸ vs. 10⁴
where D² ≈ diag(x/z)
Q: n × n and orthogonal, R: n × m upper triangular
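These dimensions match a full ("complete") QR factorization of Aᵀ when A is m × n with m < n. A minimal sketch with synthetic data:

```python
import numpy as np

# Full QR factorization of A^T (A is m x n, m < n):
# Q is n x n orthogonal, R is n x m upper triangular.
rng = np.random.default_rng(2)
m, n = 3, 8
A = rng.standard_normal((m, n))

Q, R = np.linalg.qr(A.T, mode="complete")

print(Q.shape, R.shape)                    # (8, 8) (8, 3)
print(np.allclose(Q.T @ Q, np.eye(n)))     # columns of Q are orthonormal
print(np.allclose(Q @ R, A.T))             # factorization reproduces A^T
```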
The specific preconditioning being examined in this research is as follows:
We are solving: A D² Aᵀ Δy = −A r
What if we pre-multiplied by A⁻¹?
A⁻¹ A D² Aᵀ Δy = −A⁻¹ A r
This would give us:
D² Aᵀ Δy = −r
A is not square, so it isn't invertible; but A Aᵀ is…
(A Aᵀ)⁻¹ A D² Aᵀ Δy = −(A Aᵀ)⁻¹ A r
Conceptually, this gives us:
(Aᵀ)⁻¹ D² Aᵀ Δy = −(Aᵀ)⁻¹ r
which would be a similarity transformation on D² using Aᵀ.
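A small numerical sketch of this preconditioning idea, with synthetic data rather than the OSD/A&T problem: pre-multiplying the normal equations by (A Aᵀ)⁻¹ leaves the solution Δy unchanged while reshaping the system.

```python
import numpy as np

# Synthetic stand-in data (not the report's problem).
rng = np.random.default_rng(3)
m, n = 4, 10
A = rng.standard_normal((m, n))
x = rng.uniform(0.1, 10.0, n)  # moderate spread here; in late IPM
z = rng.uniform(0.1, 10.0, n)  # iterations the spread is far wider
r = rng.standard_normal(n)

D2 = np.diag(x / z)
M = A @ D2 @ A.T               # original normal-equations matrix
P = A @ A.T                    # preconditioner (A A^T)

# Preconditioned system: (P^{-1} M) dy = -P^{-1} A r
dy_pre = np.linalg.solve(np.linalg.solve(P, M), np.linalg.solve(P, -A @ r))
dy_direct = np.linalg.solve(M, -A @ r)

print(np.allclose(dy_pre, dy_direct))  # same solution either way
```

The point of the transformation is not to change Δy but to make the system behave like D² conjugated by Aᵀ, which is friendlier for an iterative solver such as CG.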
Results were compared with (preconditioned) CG and without it (direct solve via "\").
A General Linear Model (GLM) of the results:
A GLM like the one above fits a multi-dimensional "line" through the data; for example:
Iterations ≈ 101.2 + 44.7·(if_CG) + 24.8·(if_Precond) − 4.7·(if_QR)
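Such a model can be fit by ordinary least squares on 0/1 indicator columns. The sketch below uses synthetic run data generated from the coefficients quoted above purely for illustration (the report's actual runs are not reproduced here), then recovers them with a least-squares fit.

```python
import numpy as np

# Synthetic experiment: 0/1 indicators for which solver options were used.
rng = np.random.default_rng(4)
n_runs = 64
if_cg = rng.integers(0, 2, n_runs)       # indicator: CG used
if_precond = rng.integers(0, 2, n_runs)  # indicator: preconditioning used
if_qr = rng.integers(0, 2, n_runs)       # indicator: QR used

# Generate iteration counts from assumed "true" coefficients plus noise.
true_beta = np.array([101.2, 44.7, 24.8, -4.7])
X = np.column_stack([np.ones(n_runs), if_cg, if_precond, if_qr])
iters = X @ true_beta + rng.normal(0.0, 2.0, n_runs)

# Ordinary least-squares fit of the GLM coefficients.
beta, *_ = np.linalg.lstsq(X, iters, rcond=None)
print(np.round(beta, 1))  # close to [101.2, 44.7, 24.8, -4.7]
```

The intercept is the baseline iteration count; each coefficient is the average change in iterations attributable to turning that option on.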
A ranking of the algorithms:
Average ranking over all problems (1=best, 8=worst)
"Results! Why, man, I have gotten a lot of results. I know several thousand things that won't work." (Thomas A. Edison)
For many helpful suggestions and thought-provoking questions.
Contact: KenRyals “at” aol.com
Kenneth.Ryals “at” jhuapl.edu
Ref: Jonathan Richard Shewchuk, An Introduction to the Conjugate Gradient Method Without the Agonizing Pain, Edition 1¼, School of Computer Science, Carnegie Mellon University, August 4, 1994.