
Demo Outline (For reference)


Presentation Transcript


  1. STAM Interactive Solutions

  2. Demo Outline (For reference)

  3. The Gradient Descent Method is a first-order optimization algorithm. To find a local minimum of a function, one takes repeated steps proportional to the negative of the gradient of the function at the current point. This demo illustrates the Gradient Descent Method.
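The update rule on this slide is easy to sketch in code. The following is a minimal illustration; the quadratic f(x) = (x - 3)^2, the step size, and the stopping tolerance are assumed choices for the example, not part of the demo itself.

```python
def gradient_descent(grad, x0, eta=0.1, tol=1e-8, max_iter=10000):
    """Step opposite the gradient until the step becomes negligible."""
    x = x0
    for _ in range(max_iter):
        step = eta * grad(x)   # step proportional to the negative gradient
        x -= step
        if abs(step) < tol:
            break
    return x

# Assumed example: f(x) = (x - 3)^2 has its minimum at x = 3; gradient 2*(x - 3).
print(gradient_descent(lambda x: 2 * (x - 3), x0=0.0))  # prints ~3.0
```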

  4. Gradient Descent Method
  • The gradient is the slope of a function.
  • Optimization is finding the “best” value of a function; here, that is the minimum value of the function.
  • The number of “turning points” of a function depends on the order of the function.
  • Not all turning points are minima.
  • The least of all the minimum points is called the “global” minimum.
  • Every minimum is a “local” minimum.
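To make the local-versus-global distinction concrete, here is a small sketch using an assumed example function, f(x) = x^4 - 4x^2 + x, which has a global minimum near x ≈ -1.47 and a merely local one near x ≈ +1.35. Plain descent settles into whichever minimum's basin contains the starting point.

```python
def descend(grad, x, eta=0.01, steps=5000):
    """Fixed-step gradient descent (illustrative parameters)."""
    for _ in range(steps):
        x -= eta * grad(x)
    return x

grad = lambda x: 4 * x**3 - 8 * x + 1   # derivative of x**4 - 4*x**2 + x

print(descend(grad, x=-2.0))  # ~ -1.47: the global minimum
print(descend(grad, x=+2.0))  # ~ +1.35: only a local minimum
```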

  5. Simulation: Select a function and the starting point, then click START.

  6. Resources
  • Books:
  • Practical Optimization, Philip E. Gill, Walter Murray, and Margaret H. Wright, Academic Press, 1981.
  • Practical Optimization: Algorithms and Engineering Applications, Andreas Antoniou and Wu-Sheng Lu, 2007.
  • Reference Links:
  • www-fp.mcs.anl.gov/OTC/Guide/

  7. The Gradient Descent method will
  • fail for a jump discontinuous objective function
  • fail for a kink discontinuous objective function
  • work for a jump discontinuous objective function
  • work for a kink discontinuous objective function
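The kink case in this question can be illustrated directly. For f(x) = |x| (an assumed example, not from the demo), the gradient has magnitude 1 everywhere away from the kink, so a fixed-step descent never slows down; the iterates end up bouncing across x = 0 instead of converging to it.

```python
def sign(x):
    return (x > 0) - (x < 0)

x, eta = 1.05, 0.2
for i in range(10):
    x -= eta * sign(x)           # gradient of |x| is sign(x) for x != 0
    print(f"step {i}: x = {x:+.2f}")
# the iterates fall to +0.05, then bounce between +0.05 and -0.15 forever
```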

  8. The rate of convergence is
  • dependent on the starting point
  • dependent on the step size
  • independent of the step size
  • independent of the starting point
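A small experiment, with assumed choices (f(x) = x^2, start point 1.0, tolerance 1e-6), shows how strongly the iteration count depends on the step size; moving the start point farther away raises the count as well.

```python
def iters_to_converge(eta, x=1.0, tol=1e-6, cap=1000):
    """Fixed-step descent on f(x) = x**2, whose gradient is 2*x."""
    for i in range(cap):
        if abs(x) < tol:
            return i
        x -= eta * 2 * x
    return cap                    # never converged within the cap

for eta in (0.01, 0.1, 0.45, 1.1):
    print(f"eta = {eta}: {iters_to_converge(eta)} iterations")
# tiny steps crawl (~680 iterations at eta = 0.01); eta = 1.1 overshoots
# so far that the iterates diverge and the loop hits its cap
```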

  9. For a function that has more than one minimum, the method will
  • always find the global minimum
  • find the global minimum only if the start point is selected at the correct location
  • fail to find the global minimum if the function has one or more inflection points
  • find the global minimum only if the step size is correct

  10. The step size is selected
  • based on the continuity of the objective function
  • based on the degree of the objective function
  • based on the gradient of the objective function
  • based on the available computing resources
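One common way to tie the step size to the gradient is backtracking line search under the Armijo sufficient-decrease condition. The sketch below is a standard textbook heuristic, not necessarily the rule the demo uses; the test function f(x) = x^4 and all parameters are assumed for illustration.

```python
def backtracking_step(f, g, x, eta=1.0, beta=0.5, c=1e-4):
    """Shrink eta until the Armijo sufficient-decrease condition holds."""
    while f(x - eta * g) > f(x) - c * eta * g * g:
        eta *= beta
    return eta

f = lambda x: x**4               # steep far from 0, flat near it
x = 2.0
for _ in range(5):
    g = 4 * x**3                 # gradient of x**4
    eta = backtracking_step(f, g, x)
    x -= eta * g
    print(f"x = {x:.4f}  (accepted eta = {eta:g})")
```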

  11. Locating a maximum is
  • not possible
  • possible if the function is convex
  • possible if the function is concave
  • possible if the step size is along the positive gradient
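The last option above is the usual trick: stepping along the positive gradient (gradient ascent) climbs toward a maximum of a concave function. The example f(x) = -(x - 2)^2 + 5, with its peak at x = 2, is assumed for illustration.

```python
grad = lambda x: -2 * (x - 2)    # derivative of f(x) = -(x - 2)**2 + 5

x, eta = 0.0, 0.1
for _ in range(200):
    x += eta * grad(x)           # '+=' instead of '-=': ascend the gradient
print(x)                         # ~2.0, where the concave f peaks
```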
