Steepest Descent and Conjugate Gradients (CG)

- Solving the linear equation system $Ax = b$
- Problem: the dimension $n$ is too big, or there is not enough time for Gaussian elimination.
Iterative methods are used to get an approximate solution.

- Definition iterative method: given a starting point $x_0$, do steps $x_{i+1} = x_i + \alpha_i d_i$ that
hopefully converge to the right solution $x$ of $Ax = b$.

- Solving $Ax = b$ is equivalent to minimizing $f(x) = \frac{1}{2} x^T A x - b^T x$.
- $A$ has to be symmetric: $A^T = A$
- If $A$ is also positive definite ($x^T A x > 0$ for all $x \neq 0$), the solution of $Ax = b$ is the minimum of $f$, because $\nabla f(x) = Ax - b$.
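A minimal numpy sketch of this equivalence, using a small hypothetical SPD system as example data: at the solution of $Ax = b$ the gradient of $f$ vanishes, and $f$ is larger at any other point.

```python
import numpy as np

# a small symmetric positive definite example system (hypothetical data)
A = np.array([[3.0, 2.0],
              [2.0, 6.0]])
b = np.array([2.0, -8.0])

def f(x):
    """The quadratic form f(x) = 1/2 x^T A x - b^T x."""
    return 0.5 * x @ A @ x - b @ x

x_star = np.linalg.solve(A, b)    # exact solution of A x = b

# the gradient A x - b vanishes at the solution ...
grad = A @ x_star - b
# ... and f is strictly larger at any perturbed point,
# since f(x* + h) - f(x*) = 1/2 h^T A h > 0 for h != 0
```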

- error: $e_i = x_i - x$
The norm of the error shows how far we are from the exact solution, but it can't be computed without knowing the exact solution $x$.

- residual: $r_i = b - A x_i = -A e_i$
The residual can be calculated without knowing $x$.
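A short sketch of the difference, reusing the same hypothetical example system: the residual needs only $A$, $b$, and the iterate, while the error needs the exact solution.

```python
import numpy as np

A = np.array([[3.0, 2.0],
              [2.0, 6.0]])
b = np.array([2.0, -8.0])

x_i = np.zeros(2)                 # some iterate
r_i = b - A @ x_i                 # residual: needs only A, b and x_i

# the error e_i = x_i - x needs the exact solution x:
x_exact = np.linalg.solve(A, b)
e_i = x_i - x_exact

# residual and error are linked by r_i = -A e_i
```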

- We are at the point $x_i$. How do we reach $x_{i+1}$?
- Idea: go in the direction in which $f$ decreases most quickly ($-\nabla f(x_i) = b - A x_i = r_i$)
- How far should we go?
Choose $\alpha_i$ so that $f(x_i + \alpha_i r_i)$ is minimized:
$\frac{d}{d\alpha} f(x_i + \alpha r_i) = 0 \;\Rightarrow\; r_{i+1}^T r_i = 0 \;\Rightarrow\; \alpha_i = \frac{r_i^T r_i}{r_i^T A r_i}$

One step of steepest descent can be calculated as follows:
$\alpha_i = \frac{r_i^T r_i}{r_i^T A r_i}$
$x_{i+1} = x_i + \alpha_i r_i$
$r_{i+1} = r_i - \alpha_i A r_i$
- stopping criterion: $\|r_i\| < \varepsilon$ or $\|r_i\| < \varepsilon \|r_0\|$ with a given small $\varepsilon > 0$
It would be better to use the error instead of the residual, but you can't calculate the error.

Method of steepest descent:
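The method can be sketched as follows, a minimal numpy implementation of the step and stopping rule above (function name and example system are ours):

```python
import numpy as np

def steepest_descent(A, b, x0, eps=1e-8, max_iter=10_000):
    """Steepest descent for A x = b with A symmetric positive definite.

    Direction: the residual r = b - A x (the negative gradient of f).
    Step size: alpha = (r^T r) / (r^T A r); stop when ||r|| < eps.
    """
    x = x0.astype(float)
    r = b - A @ x
    for _ in range(max_iter):
        if np.linalg.norm(r) < eps:
            break
        Ar = A @ r
        alpha = (r @ r) / (r @ Ar)
        x = x + alpha * r
        r = r - alpha * Ar        # update the residual without a fresh A @ x
    return x

A = np.array([[3.0, 2.0],
              [2.0, 6.0]])
b = np.array([2.0, -8.0])
x = steepest_descent(A, b, np.zeros(2))
# x approximates the exact solution [2, -2]
```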

- As you can see, the starting point is important!
When you know anything about the solution, use it to guess a good starting point. Otherwise you can choose any starting point you want, e.g. $x_0 = 0$.

- Definition energy norm: $\|e\|_A = \sqrt{e^T A e}$
- Definition condition: $\kappa = \frac{\lambda_{max}}{\lambda_{min}}$
($\lambda_{max}$ is the largest and $\lambda_{min}$ the smallest eigenvalue of $A$)

Convergence gets worse when the condition gets larger: for steepest descent, $\|e_i\|_A \le \left(\frac{\kappa - 1}{\kappa + 1}\right)^i \|e_0\|_A$.
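This effect is easy to observe. The sketch below (helper name and example matrices are ours) counts steepest descent iterations on a well-conditioned and a badly conditioned diagonal system:

```python
import numpy as np

def sd_iterations(A, b, eps=1e-8, max_iter=100_000):
    """Count steepest descent iterations until ||r|| < eps."""
    x = np.zeros(len(b))
    r = b - A @ x
    for k in range(max_iter):
        if np.linalg.norm(r) < eps:
            return k
        Ar = A @ r
        alpha = (r @ r) / (r @ Ar)
        x = x + alpha * r
        r = r - alpha * Ar
    return max_iter

b = np.array([1.0, 1.0])
A_good = np.diag([1.0, 2.0])      # condition kappa = 2
A_bad = np.diag([1.0, 100.0])     # condition kappa = 100
# the badly conditioned system needs far more iterations
```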

- Is there a better direction?
- Idea: orthogonal search directions $d_0, d_1, \dots, d_{n-1}$
- Only walk once in each direction and minimize:
at most $n$ steps are needed to reach the exact solution.

$e_{i+1}$ has to be orthogonal to $d_i$: $d_i^T e_{i+1} = 0 \;\Rightarrow\; \alpha_i = -\frac{d_i^T e_i}{d_i^T d_i}$

- Example with the coordinate axes as orthogonal search directions:
Problem: $\alpha_i$ can't be computed because it depends on $e_i$ (you don't know $x$!)

- New idea: A-orthogonal search directions
- Definition A-orthogonal: $d_i, d_j$ A-orthogonal $\Leftrightarrow d_i^T A d_j = 0$
(reminder: orthogonal: $d_i^T d_j = 0$)

- Now $e_{i+1}$ has to be A-orthogonal to $d_i$:
$d_i^T A e_{i+1} = 0 \;\Rightarrow\; \alpha_i = -\frac{d_i^T A e_i}{d_i^T A d_i} = \frac{d_i^T r_i}{d_i^T A d_i}$, which can be computed (since $A e_i = -r_i$)!

- A set of A-orthogonal directions $d_0, \dots, d_{n-1}$ can be found from $n$ linearly independent vectors $u_0, \dots, u_{n-1}$ with conjugate Gram-Schmidt (same idea as Gram-Schmidt).

- Gram-Schmidt: given linearly independent vectors $u_0, \dots, u_{n-1}$:
$d_i = u_i - \sum_{k=0}^{i-1} \frac{u_i^T d_k}{d_k^T d_k} d_k$

- Conjugate Gram-Schmidt:
$d_i = u_i - \sum_{k=0}^{i-1} \frac{u_i^T A d_k}{d_k^T A d_k} d_k$
- CG works by setting $u_i = r_i$ (makes conjugate Gram-Schmidt easy):
$d_{i+1} = r_{i+1} + \beta_{i+1} d_i$ with $\beta_{i+1} = \frac{r_{i+1}^T r_{i+1}}{r_i^T r_i}$

- $\|e_i\|_A \le \left(\frac{\kappa - 1}{\kappa + 1}\right)^i \|e_0\|_A$ for steepest descent, $\|e_i\|_A \le 2\left(\frac{\sqrt{\kappa} - 1}{\sqrt{\kappa} + 1}\right)^i \|e_0\|_A$ for CG.
Convergence of CG is much better!