Line Search

Line search techniques are in essence optimization algorithms for one-dimensional minimization problems. They are often regarded as the backbone of nonlinear optimization algorithms. Typically, these techniques search a bracketed interval, and unimodality is often assumed.
Exhaustive search requires N = (b - a)/ε + 1 function evaluations to search an interval [a, b], where ε is the resolution.
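As a rough illustration of this cost, a minimal exhaustive search can be sketched as follows (the objective, interval, and resolution here are illustrative examples, not from the slides):

```python
# Exhaustive search over [a, b] at resolution eps: evaluate f at every
# grid point and keep the minimum. This takes N = (b - a)/eps + 1
# evaluations, which motivates the cheaper bracketing methods below.
def exhaustive_search(f, a, b, eps):
    n = round((b - a) / eps) + 1   # N = (b - a)/eps + 1 grid points
    best_x, best_f = a, f(a)
    for i in range(1, n):
        x = a + i * eps
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x

# Example: minimize f(x) = (x - 2)^2 on [0, 5] with eps = 0.01
xmin = exhaustive_search(lambda x: (x - 2) ** 2, 0.0, 5.0, 0.01)
```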
0) assume an interval [a,b]
1) Find x1 = a + (b-a)/2 - ε/2 and x2 = a + (b-a)/2 + ε/2, where ε is the resolution.
2) Compare ƒ(x1) and ƒ(x2)
3) If ƒ(x1) < ƒ(x2) then eliminate x > x2 and set b = x2
If ƒ(x1) > ƒ(x2) then eliminate x < x1 and set a = x1
If ƒ(x1) = ƒ(x2) then pick another pair of points
4) Continue placing point pairs until the interval length < 2ε.

These steps constitute the basic bracketing algorithm.
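The bracketing steps above can be sketched as follows (a minimal Python sketch, assuming a unimodal objective; when the two function values are equal it narrows the interval to [x1, x2] rather than resampling, which is valid under strict unimodality):

```python
# Dichotomous bracketing search on a unimodal f over [a, b].
# eps is the resolution; the loop stops once the interval is < 2*eps.
def dichotomous_search(f, a, b, eps):
    while (b - a) >= 2 * eps:
        mid = a + (b - a) / 2
        x1, x2 = mid - eps / 2, mid + eps / 2
        if f(x1) < f(x2):
            b = x2           # eliminate x > x2
        elif f(x1) > f(x2):
            a = x1           # eliminate x < x1
        else:
            a, b = x1, x2    # equal values: minimum lies between the pair
    return (a + b) / 2

# Example: minimize f(x) = (x - 2)^2 on [0, 5]
xmin = dichotomous_search(lambda x: (x - 2) ** 2, 0.0, 5.0, 1e-4)
```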
The method is sometimes referred to as a line search by curve fit because it approximates the real (unknown) objective function to be minimized.
Thus, we try to approximate the second-order derivative.

False Position Method or Secant Method
Replace y''(x_k) in Newton-Raphson with the finite-difference approximation

y''(x_k) ≈ (y'(x_k) - y'(x_{k-1})) / (x_k - x_{k-1})
Hence, Newton-Raphson becomes

x_{k+1} = x_k - y'(x_k) (x_k - x_{k-1}) / (y'(x_k) - y'(x_{k-1}))
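The secant update can be sketched as follows (the objective, starting points, and tolerance below are illustrative; dy stands for the first derivative y'):

```python
# Secant (false-position) update for minimization: Newton-Raphson on y'
# with y'' replaced by a finite difference of y' at the last two points.
def secant_minimize(dy, x_prev, x_curr, tol=1e-8, max_iter=100):
    for _ in range(max_iter):
        d_prev, d_curr = dy(x_prev), dy(x_curr)
        if d_curr == d_prev:          # guard against division by zero
            break
        x_next = x_curr - d_curr * (x_curr - x_prev) / (d_curr - d_prev)
        if abs(x_next - x_curr) < tol:
            return x_next
        x_prev, x_curr = x_curr, x_next
    return x_curr

# Example: minimize y(x) = (x - 2)^2, whose derivative is y'(x) = 2(x - 2)
xstar = secant_minimize(lambda x: 2 * (x - 2), 0.0, 1.0)
```

Note that only y' appears in the code: this is exactly the point made below about not needing the second derivative.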
The main advantage is that no second derivative is required.
Question: Why is this an advantage?