Finding Roots of Functions

Andrew Tomko
COT 4810
26 February 2008
Functions
• “relation between two sets in which exactly one element of the second set is assigned to each element of the first set”
• Example: y = x^2
• Bad example: y^2 = x^2
/root
• The value of x for which y equals zero
• If the equation is not already in the form f(x) = 0, it can be rearranged so that it is
• Functions can have zero, one, or many roots
Who cares?
• Mathematicians
• Physicists
• Lots of people
• Computer Scientists
Methods for finding a root
• Bisection
• Newton-Raphson
• Secant
• False Position (regula falsi)
• Müller
• Brent
Intermediate Value Theorem
• If y=f(x) is continuous on [a,b], and N is a number between f(a) and f(b), then there is at least one c ∈ [a,b] such that f(c) = N.
Bisection Method
• Easiest
• Start from an interval [a, b] where f(a) and f(b) have opposite signs, then repeatedly bisect, keeping the half where the sign change occurs
• Repeats until the interval is small enough (or f at the midpoint is close enough to zero)
• Absolute error is halved each step
• Converges linearly
• Has problems with roots of even multiplicity, where f does not change sign
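A minimal sketch of the bisection loop in Python (the test function x² − 2 and the interval [1, 2] are illustrative choices, not from the slides):

```python
def bisect(f, a, b, tol=1e-10, max_iter=200):
    """Find a root of f in [a, b], assuming f(a) and f(b) have opposite signs."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        c = (a + b) / 2            # midpoint: the absolute error halves here
        fc = f(c)
        if fc == 0 or (b - a) / 2 < tol:
            return c
        if fa * fc < 0:            # sign change in [a, c]
            b, fb = c, fc
        else:                      # sign change in [c, b]
            a, fa = c, fc
    return (a + b) / 2

# Example: x^2 - 2 has a root at sqrt(2) in [1, 2]
root = bisect(lambda x: x * x - 2, 1.0, 2.0)
```

The sign-change precondition is exactly what the Intermediate Value Theorem requires, which is why the method cannot miss once the bracket is valid.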
Secant Method
• Similar to bisection in that it works from two points, but it does not keep the root bracketed
• Draws the secant line through two points on the function and uses its x-intercept as the next estimate
• Does not always converge; convergence is superlinear (order ≈ 1.618), and it is often faster than Newton's method in practice because it needs only one new function evaluation per step
• Based on the recurrence relation: x_{n+1} = x_n − f(x_n) · (x_n − x_{n−1}) / (f(x_n) − f(x_{n−1}))
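A sketch of the recurrence x_{n+1} = x_n − f(x_n)(x_n − x_{n−1})/(f(x_n) − f(x_{n−1})) in Python (the test function x³ − x − 3 and the starting points are illustrative):

```python
def secant(f, x0, x1, tol=1e-12, max_iter=100):
    """Secant method: step to the x-intercept of the line through the
    last two points; no bracketing, so convergence is not guaranteed."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        denom = f1 - f0
        if denom == 0:
            break                  # flat secant line: cannot continue
        x2 = x1 - f1 * (x1 - x0) / denom
        if abs(x2 - x1) < tol:
            return x2
        x0, f0, x1, f1 = x1, f1, x2, f(x2)
    return x1

root = secant(lambda x: x**3 - x - 3, 1.0, 2.0)
```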
False Position (regula falsi) Method
• Combination of bisection and secant
• Takes two points, one on each side of the root (f values of opposite sign), so the root stays bracketed
• Convergence is typically linear, since one endpoint can get “stuck”
• First recorded use in the 3rd century BC
• Based on: c = (a·f(b) − b·f(a)) / (f(b) − f(a)), keeping whichever of [a, c] or [c, b] still brackets the root
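A sketch in Python; unlike the plain secant method, the bracket [a, b] is preserved at every step (the test function is illustrative):

```python
def false_position(f, a, b, tol=1e-12, max_iter=200):
    """Regula falsi: next point is the secant x-intercept, but the
    interval is updated so it always brackets the root."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    c = a
    for _ in range(max_iter):
        c = (a * fb - b * fa) / (fb - fa)   # secant intercept, lies in [a, b]
        fc = f(c)
        if abs(fc) < tol:
            return c
        if fa * fc < 0:                     # root is in [a, c]
            b, fb = c, fc
        else:                               # root is in [c, b]
            a, fa = c, fc
    return c

root = false_position(lambda x: x**3 - x - 3, 1.0, 2.0)
```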
Müller's Method
• Based on the secant method, but fits a parabola through 3 points
• Converges faster than the secant method (order ≈ 1.84), slower than Newton's method
• The quadratic formula can yield complex values, so the method can find complex roots even from real starting points
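A sketch of Müller's method in Python, following the usual three-point parabola formulation; `cmath.sqrt` is used because the discriminant may be negative (test function and starting points are illustrative):

```python
import cmath

def muller(f, x0, x1, x2, tol=1e-12, max_iter=100):
    """Fit a parabola through three points; step to its root nearest x2."""
    for _ in range(max_iter):
        f0, f1, f2 = f(x0), f(x1), f(x2)
        h0, h1 = x1 - x0, x2 - x1
        d0, d1 = (f1 - f0) / h0, (f2 - f1) / h1
        a = (d1 - d0) / (h1 + h0)             # curvature of the parabola
        b = a * h1 + d1
        c = f2
        disc = cmath.sqrt(b * b - 4 * a * c)  # may be complex
        # choose the sign that maximizes |denominator| (numerical stability)
        denom = b + disc if abs(b + disc) > abs(b - disc) else b - disc
        if denom == 0:
            break
        x3 = x2 - 2 * c / denom
        if abs(x3 - x2) < tol:
            return x3
        x0, x1, x2 = x1, x2, x3
    return x2

root = muller(lambda x: x**3 - x - 3, 0.0, 1.0, 2.0)
```

Because of `cmath.sqrt`, the result is a Python complex number; for this example its imaginary part is essentially zero.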
Polynomial Interpolation
• Given a series of points, find a polynomial function that satisfies these points.
• Useful for approximating more difficult functions (logarithmic, trigonometric, etc.)
• Rarely used on its own, because it can fail easily if the points are not chosen close to the root.
• Inverse quadratic interpolation fits a quadratic to the inverse of the function and evaluates it at y = 0.
• Using linear interpolation of the inverse instead gives the secant method
• Interpolating f itself, rather than its inverse, gives Müller's method
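A single inverse-quadratic-interpolation step can be sketched from the Lagrange form of the inverse, evaluated at y = 0 (the test function and the three points are illustrative):

```python
def iqi_step(f, a, b, c):
    """One inverse quadratic interpolation step: fit x as a quadratic in y
    through (f(a), a), (f(b), b), (f(c), c) and evaluate it at y = 0."""
    fa, fb, fc = f(a), f(b), f(c)
    return (a * fb * fc / ((fa - fb) * (fa - fc))
            + b * fa * fc / ((fb - fa) * (fb - fc))
            + c * fa * fb / ((fc - fa) * (fc - fb)))

f = lambda x: x * x - 2          # root at sqrt(2) ~ 1.41421
s = iqi_step(f, 1.0, 1.5, 2.0)   # one step from three nearby points
```

One step already lands much closer to √2 than any of the three inputs; with badly placed points, though, the denominators can nearly cancel, which is why it is rarely used alone.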
Brent's Method
• Combination of bisection, secant, and inverse quadratic interpolation
• Reliability of bisection, speed of secant/interpolation
• Takes at most on the order of N^2 iterations in the worst case, where N is the number of iterations bisection alone would need
Householder's Methods
• A family of methods for finding a root of a function of one real variable with sufficiently many continuous derivatives; the first member is Newton's method, the second is Halley's method.
• Derived from the geometric series.
• The iterations converge to a zero if the first “guess” is close enough.
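One member of the family is Halley's method (the second-order Householder method), with iteration x_{n+1} = x_n − 2·f·f′ / (2·f′² − f·f″). A sketch for f(x) = x² − 2 with the derivatives supplied by hand (the function choice is illustrative):

```python
def halley(f, df, d2f, x, tol=1e-14, max_iter=50):
    """Halley's method, the second-order member of the Householder family."""
    for _ in range(max_iter):
        fx, dfx, d2fx = f(x), df(x), d2f(x)
        step = 2 * fx * dfx / (2 * dfx * dfx - fx * d2fx)
        x -= step
        if abs(step) < tol:
            return x
    return x

root = halley(lambda x: x * x - 2,   # f
              lambda x: 2 * x,       # f'
              lambda x: 2.0,         # f''
              x=1.0)
```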
Newton-Raphson History
• First described by Isaac Newton in 1669
• First published in 1685 in John Wallis's A Treatise of Algebra both Historical and Practical
• Joseph Raphson published a simplified version in 1690.
• Newton most likely derived the method from the work of the French mathematician François Viète, seigneur de la Bigotière.
Newton-Raphson Method
• Very efficient method for real-valued functions
• Quadratic convergence: the number of correct digits roughly doubles every iteration
• May fail to converge, e.g. when the starting point is far from the root or f' is near zero
• Requires the function's derivative: x_{n+1} = x_n − f(x_n) / f'(x_n)
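A sketch of the iteration x_{n+1} = x_n − f(x_n)/f'(x_n) in Python (the example computes √2 as the positive root of x² − 2; the function choice is illustrative):

```python
def newton(f, df, x, tol=1e-12, max_iter=50):
    """Newton-Raphson: follow the tangent line down to its x-intercept."""
    for _ in range(max_iter):
        fx = f(x)
        dfx = df(x)
        if dfx == 0:
            raise ZeroDivisionError("derivative vanished at x = %g" % x)
        x_new = x - fx / dfx
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: sqrt(2) as the positive root of x^2 - 2
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x=1.0)
```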
Newton-Raphson fail to converge
• f(x) = x^3 – x – 3
• f'(x) = 3x^2 – 1
• Using x0 = 0, the iterates fall into a near-cycle (0 → −3 → −1.96 → −1.15 → −0.006 → −3.00 → …) instead of converging to the true root near x ≈ 1.67
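The behaviour can be checked with a small script (the cutoff of 8 steps is arbitrary):

```python
f = lambda x: x**3 - x - 3
df = lambda x: 3 * x**2 - 1

x = 0.0
iterates = [x]
for _ in range(8):
    x = x - f(x) / df(x)     # plain Newton step
    iterates.append(x)

first_step = iterates[1]     # f(0) = -3, f'(0) = -1, so x1 = 0 - 3 = -3
```

The iterates drift back near −3 and near 0 every few steps rather than approaching the root.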
Newton-Raphson etc.
• Can be used to find multi-dimensional roots.
• Can be used to find roots in systems of non-linear, multivariable functions.
• Can be used to find local minima / maxima of a function, by applying it to the derivative f'.
• Can be used to find complex roots, creates “Newton Fractals”
Roots of Polynomials
• For degree at most four, closed-form formulas exist (the quadratic formula and its cubic and quartic analogues). Higher-degree polynomials can sometimes be rewritten in a lower degree (x^6 – 4x^3 + 8 becomes u^2 – 4u + 8 with u = x^3)
• Sturm's Theorem for finding the number of real roots.
• Laguerre's method has cubic convergence.
• Problems with polynomials: ill-conditioning, as in Wilkinson's polynomial, where tiny changes in the coefficients move the roots dramatically
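The substitution x^6 − 4x^3 + 8 → u^2 − 4u + 8 from the first bullet can be checked numerically with Python's complex arithmetic (a quick sketch):

```python
import cmath

# x^6 - 4x^3 + 8 = 0  with  u = x^3  becomes  u^2 - 4u + 8 = 0
a, b, c = 1, -4, 8
disc = cmath.sqrt(b * b - 4 * a * c)        # sqrt(-16) = 4i
u1 = (-b + disc) / (2 * a)                  # 2 + 2i
u2 = (-b - disc) / (2 * a)                  # 2 - 2i

# each u gives three cube roots x with x^3 = u
omega = cmath.exp(2j * cmath.pi / 3)        # primitive cube root of unity
roots = [u ** (1 / 3) * omega ** k for u in (u1, u2) for k in range(3)]
```

All six roots of the sextic come out complex here, since u^2 − 4u + 8 has no real roots.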
Sturm's Theorem
• Used to determine the number of distinct real roots of a polynomial in an interval.
• Applies Euclid's algorithm to X and X', negating each remainder
• The final nonzero polynomial, Xr, is (up to sign) the GCD of X and X'
• The number of distinct real roots in (a, b] is the number of sign changes of the sequence at a minus the number at b.
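A numeric sketch of the sign-change count (coefficient lists are highest-degree first; repeated roots are not handled; the example polynomial x³ − x is an illustrative choice):

```python
def polyval(p, x):
    """Evaluate a polynomial (coefficients highest-first) via Horner's rule."""
    v = 0.0
    for c in p:
        v = v * x + c
    return v

def polyrem(num, den):
    """Remainder of polynomial division num / den."""
    num = list(num)
    while len(num) >= len(den):
        factor = num[0] / den[0]
        for i in range(len(den)):
            num[i] -= factor * den[i]
        num.pop(0)                       # leading term cancels
    return num or [0.0]

def sturm_chain(p):
    """p, p', then negated remainders until a constant is reached."""
    chain = [p, [c * (len(p) - 1 - i) for i, c in enumerate(p[:-1])]]
    while len(chain[-1]) > 1:
        chain.append([-c for c in polyrem(chain[-2], chain[-1])])
    return chain

def sign_changes(chain, x):
    signs = [v for v in (polyval(q, x) for q in chain) if v != 0]
    return sum(1 for s, t in zip(signs, signs[1:]) if s * t < 0)

p = [1.0, 0.0, -1.0, 0.0]                # x^3 - x: roots -1, 0, 1
chain = sturm_chain(p)
n_roots = sign_changes(chain, -2.0) - sign_changes(chain, 2.0)
```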
Fundamental Theorem of Algebra
• “Every non-constant single-variable polynomial with complex coefficients has at least one complex root”
• This can be reworked to show that every polynomial of degree n can be written in the form f(x) = C(x − x1)(x − x2)...(x − xn), where the roots xi may be complex and may repeat
Laguerre's Method
• In practice, almost always converges to some root of the polynomial, regardless of the initial value.
• Cubic convergence (!)
• Take the natural log of both sides of the factored form on the last slide, then differentiate twice: G = (ln|f|)' = f'/f and H = −(ln|f|)'' = G^2 − f''/f
• Where a = the step from the current guess toward a root, and n = the degree of the polynomial: a = n / (G ± sqrt((n − 1)(nH − G^2))), choosing the sign that makes the denominator larger in magnitude
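A sketch in Python; complex arithmetic is used because the quantity under the square root may be negative, and the method may land on a complex root (the example polynomial x³ − x − 3 is illustrative):

```python
import cmath

def laguerre(coeffs, x=0.0, tol=1e-12, max_iter=100):
    """Laguerre's method; coeffs are highest-degree first, n is the degree."""
    n = len(coeffs) - 1
    x = complex(x)
    for _ in range(max_iter):
        # evaluate p, p', p''/2 simultaneously with Horner's rule
        p, dp, d2p = 0, 0, 0
        for c in coeffs:
            d2p = d2p * x + dp
            dp = dp * x + p
            p = p * x + c
        d2p *= 2                         # now the true second derivative
        if abs(p) < tol:
            return x
        G = dp / p
        H = G * G - d2p / p
        root = cmath.sqrt((n - 1) * (n * H - G * G))
        # pick the sign that maximizes |denominator|
        denom = G + root if abs(G + root) > abs(G - root) else G - root
        x -= n / denom                   # a = n / (G +/- sqrt((n-1)(nH - G^2)))
    return x

r = laguerre([1, 0, -1, -3], x=0.0)      # x^3 - x - 3
```

From this starting point the iteration happens to settle on one of the polynomial's roots (possibly a complex one), which is the sense in which the method is nearly start-point-independent.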
Sources
• Burden, Richard L. & Faires, J. Douglas (2000), Numerical Analysis (7th ed.), Brooks/Cole
• Dewdney, A. K. The New Turing Omnibus. New York: Henry Holt and Company, LLC, 1993.
• http://www.wikipedia.org/
Questions
• 1) Do the next iteration of the first Newton-Raphson example.
• 2) What theorem do many of these methods depend on?