A note about gradient descent:

Consider the function $f(x) = (x - x_0)^2$.

Its derivative is: $f'(x) = 2\,(x - x_0)$.

By gradient descent: $\frac{dx}{dt} = -\eta\,\frac{df}{dx} = -2\eta\,(x - x_0)$.

[Figure: the parabola $f(x)$ with its minimum at $x_0$; the gradient is positive to the right of $x_0$ and negative to the left, so the update moves $x$ toward $x_0$ from either side.]

Solving the differential equation: $\frac{dx}{dt} = -2\eta\,(x - x_0)$,

or in the general form: $\frac{dx}{dt} = -k\,(x - x_0)$.

What is the solution of this type of equation?

Try: $x(t) = x_0 + A\,e^{-kt}$, which indeed satisfies $\dot{x} = -k\,(x - x_0)$; $x$ decays exponentially to $x_0$.
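As a quick illustration, here is a minimal sketch of the discrete version in Python ($x_0$, the learning rate and the starting point are arbitrary choices, not from the slides):

```python
# Gradient descent on f(x) = (x - x0)^2, whose derivative is 2*(x - x0).
x0 = 3.0     # location of the minimum (arbitrary)
eta = 0.1    # learning rate (arbitrary small step)
x = -2.0     # arbitrary starting point

for step in range(50):
    grad = 2.0 * (x - x0)   # f'(x)
    x -= eta * grad         # move against the gradient

# x decays exponentially toward x0, matching x(t) = x0 + A*exp(-k*t).
print(x)     # ~3.0
```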


THE PERCEPTRON:

(Classification)

Threshold unit: $y^\mu = \Theta\big(\sum_i w_i x_i^\mu - \theta\big)$, with $\Theta(h) = 1$ for $h > 0$ and $\Theta(h) = 0$ otherwise,

where $y^\mu$ is the output for input pattern $x^\mu$, $w_i$ are the synaptic weights and $d^\mu$ is the desired output.

AND

[Figure: a threshold unit with input weights $w_1, \dots, w_5$; for AND, two inputs with weights $w_1 = w_2 = 1$ and a bias weight of $-1.5$ suffice, so the unit fires only for input $(1, 1)$.]

AND is linearly separable.


OR

[Figure: with weights $w_1 = w_2 = 1$ and a bias weight of $-0.5$, the threshold unit fires for every input except $(0, 0)$.]

OR is linearly separable.
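A minimal check of both units in Python (the weights and thresholds follow the values suggested by the figures above; treat them as one choice among many):

```python
import numpy as np

def threshold_unit(x, w, theta):
    """Output 1 if the weighted input sum exceeds the threshold, else 0."""
    return int(np.dot(w, x) > theta)

w = np.array([1.0, 1.0])
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    x = np.array(x)
    print(x, threshold_unit(x, w, 1.5),   # AND: threshold 1.5
             threshold_unit(x, w, 0.5))   # OR:  threshold 0.5
```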


Perceptron learning rule: $\Delta w_i = \eta\,(d^\mu - y^\mu)\,x_i^\mu$.

Convergence proof:

Hertz, Krogh, Palmer (HKP)

- did you receive the email?

Assignment 3a:

Program a perceptron in MATLAB with the perceptron learning rule, and solve the OR, AND and XOR problems. (Due before Feb 27)


Show Demo
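A minimal sketch of the learning rule (in Python rather than MATLAB; the learning rate, epoch count and the bias-as-extra-input convention are my choices, not the slides'):

```python
import numpy as np

def train_perceptron(X, d, eta=0.1, epochs=100):
    """Perceptron learning rule: dw_i = eta * (d - y) * x_i.
    A constant 1 is appended to each pattern so the threshold is
    learned as an ordinary bias weight."""
    X = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, d):
            y = int(np.dot(w, x) > 0)     # threshold unit output
            w += eta * (target - y) * x   # update only when y is wrong
    return w

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
print(train_perceptron(X, np.array([0, 0, 0, 1])))  # AND: converges
print(train_perceptron(X, np.array([0, 1, 1, 1])))  # OR:  converges
# With XOR targets [0, 1, 1, 0] the weights never settle:
# the problem is not linearly separable.
```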



Linear single layer network:

(approximation, curve fitting)

Linear unit: $y^\mu = \sum_i w_i x_i^\mu$, or in vector form $y^\mu = \mathbf{w} \cdot \mathbf{x}^\mu$,

where $y^\mu$ is the output for input pattern $x^\mu$, $w_i$ are the synaptic weights and $d^\mu$ is the desired output.

Minimize the mean square error: $E = \frac{1}{2} \sum_\mu \big(d^\mu - y^\mu\big)^2$.

[Figure: a single-layer network; the inputs feed one linear unit through weights $w_1, \dots, w_5$.]




The best solution is obtained when E is minimal.

For linear neurons there is an exact solution for this called the pseudo-inverse (see HKP).
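A sketch of the exact solution using NumPy's `np.linalg.pinv` (the data here is synthetic, made up purely for illustration):

```python
import numpy as np

# Synthetic data, made up for illustration: 20 patterns, 5 inputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
d = X @ w_true                 # desired outputs

# The pseudo-inverse gives the exact minimum of the mean square error.
w = np.linalg.pinv(X) @ d
print(np.allclose(w, w_true))  # True: here the error can be driven to zero
```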

Looking for a solution by gradient descent: $\Delta w = -\eta\,\nabla_w E$.

[Figure: the error $E$ plotted as a function of $w$; the negative gradient points downhill, so each step moves $w$ toward the minimum.]

Chain rule: $\frac{\partial E}{\partial w_i} = \sum_\mu \frac{\partial E}{\partial y^\mu}\,\frac{\partial y^\mu}{\partial w_i}$.

Since $y^\mu = \sum_j w_j x_j^\mu$: $\frac{\partial y^\mu}{\partial w_i} = x_i^\mu$.

Error: $E = \frac{1}{2}\sum_\mu \big(d^\mu - y^\mu\big)^2$, so $\frac{\partial E}{\partial y^\mu} = -(d^\mu - y^\mu)$.

Therefore: $\Delta w_i = -\eta\,\frac{\partial E}{\partial w_i} = \eta \sum_\mu (d^\mu - y^\mu)\,x_i^\mu$.
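A minimal sketch of this batch update in Python (the learning rate and epoch count are arbitrary choices; the result should match the pseudo-inverse solution above):

```python
import numpy as np

def delta_rule(X, d, eta=0.01, epochs=2000):
    """Batch gradient descent on E = 0.5 * sum_mu (d - y)^2 for a
    linear unit y = X @ w; the update is dw = eta * X.T @ (d - y)."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        y = X @ w                 # outputs for all patterns
        w += eta * X.T @ (d - y)  # the update derived above
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 5))
d = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0])
w = delta_rule(X, d)
print(np.allclose(w, np.linalg.pinv(X) @ d, atol=1e-4))  # should print True
```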

Which types of problems can a linear network solve?


Sigmoidal neurons: $y^\mu = g(h^\mu)$ with $h^\mu = \sum_i w_i x_i^\mu$,

for example: $g(h) = \frac{1}{1 + e^{-h}}$.

Which types of problems can a sigmoidal network solve?

Assignment 3b – Implement a one-layer linear network and a one-layer sigmoidal network, and fit a 1D linear, sigmoid and quadratic function with each network.
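A sketch of gradient descent for a single sigmoidal unit, showing the extra $g'(h)$ factor relative to the linear case (in Python; the target function and hyper-parameters are made up for illustration):

```python
import numpy as np

def g(h):
    return 1.0 / (1.0 + np.exp(-h))   # sigmoid; note g'(h) = g*(1-g)

def train_sigmoid_unit(X, d, eta=0.1, epochs=20000):
    """Gradient descent for y = g(X @ w); compared with the linear
    unit, the update carries an extra g'(h) = y*(1-y) factor."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        y = g(X @ w)
        w += eta * X.T @ ((d - y) * y * (1.0 - y))
    return w

# Made-up 1D target: d(x) = g(3x - 1); a bias column is appended.
x = np.linspace(-2.0, 2.0, 50)
X = np.column_stack([x, np.ones_like(x)])
w = train_sigmoid_unit(X, g(3.0 * x - 1.0))
print(w)   # should approach roughly [3, -1]
```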


Multi layer networks:

[Figure: a feed-forward network with an input layer, a hidden layer and an output layer.]

  • Can solve non-linearly separable classification problems.

  • Can approximate any arbitrary function, given ‘enough’ units in the hidden layer.



Solving linearly inseparable problems

XOR

Hint: $x_1$ XOR $x_2$ = ($x_1$ OR $x_2$) AND NOT ($x_1$ AND $x_2$).


XOR

[Figure: a two-layer network solving XOR; one hidden unit computes OR, the other AND, and the output unit fires for ‘OR and not AND’. The scattered numbers ($\pm 0.5$, $\pm 1$) were the weight and threshold labels in the diagram.]
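Following the hint, a hand-wired sketch in Python (the specific weights are my choices; the slide's figure may use different values):

```python
def theta(h):
    """Threshold unit: 1 if the summed input is positive, else 0."""
    return 1 if h > 0 else 0

def xor_net(x1, x2):
    h_or  = theta(x1 + x2 - 0.5)      # hidden unit computing OR
    h_and = theta(x1 + x2 - 1.5)      # hidden unit computing AND
    return theta(h_or - h_and - 0.5)  # output: OR and not AND

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, xor_net(x1, x2))   # prints the XOR truth table
```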

How do we train a multi-layer network?

The credit assignment problem!


Gradient descent / back-propagation, the solution to the credit assignment problem:

From hidden layer to output weights: $\Delta W_{ij} = -\eta\,\frac{\partial E}{\partial W_{ij}} = \eta \sum_\mu \delta_i^\mu\, V_j^\mu$,

where: $\delta_i^\mu = g'(h_i^\mu)\,\big(d_i^\mu - y_i^\mu\big)$, with $h_i^\mu = \sum_j W_{ij} V_j^\mu$ the input to output unit $i$ and $V_j^\mu$ the activity of hidden unit $j$.

For the input to hidden layer weights, the credit assignment problem is solved by propagating the deltas back through the output weights: $\Delta w_{jk} = -\eta\,\frac{\partial E}{\partial w_{jk}} = \eta \sum_\mu \hat{\delta}_j^\mu\, x_k^\mu$,

where: $\hat{\delta}_j^\mu = g'(h_j^\mu) \sum_i \delta_i^\mu\, W_{ij}$ and $h_j^\mu = \sum_k w_{jk}\, x_k^\mu$.


and credit assignment problem:

For input to hidden layer:

Assignment 3c: Program a two-layer network in MATLAB and solve the XOR problem. Fit the curve x(x−1) between 0 and 1; how many hidden units did you need?
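A minimal back-propagation sketch in Python (rather than MATLAB; the layer size, learning rate, epoch count and bias conventions are arbitrary choices):

```python
import numpy as np

def g(h):
    return 1.0 / (1.0 + np.exp(-h))               # sigmoid; g' = g*(1-g)

def forward(X, W1, W2):
    V = g(X @ W1)                                 # hidden activities
    Vb = np.column_stack([V, np.ones(len(V))])    # append a bias unit
    return V, Vb, g(Vb @ W2)                      # network output

def train(X, d, n_hidden=4, eta=0.5, epochs=20000, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(size=(X.shape[1], n_hidden))  # input -> hidden
    W2 = rng.normal(size=n_hidden + 1)            # hidden(+bias) -> output
    for _ in range(epochs):
        V, Vb, y = forward(X, W1, W2)
        delta = (d - y) * y * (1 - y)             # output-layer delta
        # Back-propagate: hidden deltas weight the output delta by W2.
        delta_h = np.outer(delta, W2[:-1]) * V * (1 - V)
        W2 += eta * Vb.T @ delta
        W1 += eta * X.T @ delta_h
    return W1, W2

# XOR, with a constant 1 appended to each input pattern as a bias.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
d = np.array([0.0, 1.0, 1.0, 0.0])
W1, W2 = train(X, d)
print(np.round(forward(X, W1, W2)[2]))  # typically prints [0. 1. 1. 0.]
```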


  • Formal neural networks can accomplish many tasks, for example:

  • Perform complex classification

  • Learn arbitrary functions

  • Account for associative memory

  • Some applications: robotics, character recognition, speech recognition, medical diagnostics.

  • This is not neuroscience, but it is motivated loosely by neuroscience and carries important information for neuroscience as well.

  • For example: Memory, learning and some aspects of development are assumed to be based on synaptic plasticity.


What did we learn today?

Is BackProp biologically realistic?

