slide1
A note about gradient descent:

Consider the function f(x) = (x - x0)^2

Its derivative is: f'(x) = 2(x - x0)

By gradient descent: dx/dt = -η df/dx = -2η (x - x0)

[Figure: the parabola f(x) = (x - x0)^2 with its minimum at x0; from either side ("+" or "-"), gradient descent moves x toward x0.]

slide2
Solving the differential equation: dx/dt = -2η (x - x0),

or in the general form: dx/dt = -k (x - x0).

What is the solution of this type of equation?

Try: x(t) = x0 + A e^(-kt), which satisfies dx/dt = -kA e^(-kt) = -k (x - x0); x decays exponentially to x0.
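A minimal MATLAB sketch (not part of the original slides) comparing the discrete gradient descent iteration with this exponential solution; x0, the learning rate eta and the starting point are arbitrary illustrative choices:

% Gradient descent on f(x) = (x - x0)^2, compared with the analytic
% solution x(t) = x0 + (x(0) - x0) * exp(-k*t) of the continuous equation.
x0  = 2;                        % location of the minimum (arbitrary choice)
eta = 0.1;                      % learning rate (arbitrary choice)
x   = -1;                       % starting point x(0)
for t = 1:50
    x = x - eta * 2 * (x - x0); % move against the gradient f'(x) = 2(x - x0)
end
fprintf('after descent: x = %.6f (minimum at x0 = %g)\n', x, x0);

Each discrete step multiplies the distance to x0 by (1 - 2*eta), the discrete analogue of the exponential decay above.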

slide3
THE PERCEPTRON:

(Classification)

Threshold unit: y^μ = Θ(Σ_i w_i x_i^μ - θ),

where y^μ is the output for input pattern x^μ, the w_i are the synaptic weights, d^μ is the desired output, and Θ(h) is the step function (1 for h > 0, 0 otherwise).

[Diagram: a threshold unit with inputs feeding through weights w1 ... w5, used to implement AND.]

slide4
[Diagram: the AND problem on the unit square, inputs 0/1 on each axis; a threshold unit with weights (1, 1) and bias -1.5 separates the point (1, 1) from the other three.]

AND is linearly separable.

slide5
[Diagram: the OR problem on the unit square; a threshold unit with weights (1, 1) and bias -0.5 separates (0, 0) from the other three points.]

OR is linearly separable.
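A minimal sketch of these two units in MATLAB, using the weights and biases from the diagrams above (a step function stands in for the threshold):

% AND and OR as threshold units: y = step(w1*x1 + w2*x2 + bias).
step = @(h) h > 0;                 % threshold (step) function
X = [0 0; 0 1; 1 0; 1 1];          % all binary input patterns, one per row
and_out = step(X * [1; 1] - 1.5);  % AND: only (1,1) clears the threshold
or_out  = step(X * [1; 1] - 0.5);  % OR: any active input clears it
disp([X and_out or_out])           % columns: x1, x2, AND, OR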

slide6
Perceptron learning rule: Δw_i = η (d^μ - y^μ) x_i^μ

Convergence proof:

Hertz, Krogh, Palmer (HKP)

- did you receive the email?

Assignment 3a: program in MATLAB a perceptron with the perceptron learning rule and solve the OR, AND and XOR problems. (Due before Feb 27)
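A minimal sketch of the learning rule in MATLAB (an assumed implementation, not the official assignment solution); the learning rate, epoch count and the OR targets are illustrative choices:

% Perceptron learning rule, dw_i = eta * (d - y) * x_i, on the OR problem.
X   = [0 0; 0 1; 1 0; 1 1];        % input patterns, one per row
d   = [0; 1; 1; 1];                % desired outputs (OR; swap in AND to compare)
Xb  = [X ones(4,1)];               % constant input 1 so the bias is a weight
w   = zeros(3,1);                  % weights, last entry acting as the bias
eta = 0.5;                         % learning rate
for epoch = 1:20
    for mu = 1:4
        y = Xb(mu,:) * w > 0;                    % threshold unit output
        w = w + eta * (d(mu) - y) * Xb(mu,:)';   % perceptron learning rule
    end
end
disp((Xb * w > 0)')                % learned outputs; matches d' for OR and AND

For XOR the same loop never settles: no weight vector separates the two classes, which is the point of the assignment.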


Show Demo

slide8
Linear single layer network:

(approximation, curve fitting)

Linear unit: y^μ = Σ_i w_i x_i^μ,

where y^μ is the output for input pattern x^μ, the w_i are the synaptic weights, and d^μ is the desired output.

Minimize the mean square error: E = (1/2) Σ_μ (d^μ - y^μ)^2


slide10
The best solution is obtained when E is minimal.

For linear neurons there is an exact solution for this, called the pseudo-inverse (see HKP).

Looking for a solution by gradient descent: Δw_i = -η ∂E/∂w_i

[Figure: the error E as a function of a weight w; the negative gradient points downhill toward the minimum.]

Chain rule: ∂E/∂w_i = Σ_μ (∂E/∂y^μ) (∂y^μ/∂w_i)

slide11
Since: ∂y^μ/∂w_i = x_i^μ

and the error term is: ∂E/∂y^μ = -(d^μ - y^μ),

Therefore: Δw_i = η Σ_μ (d^μ - y^μ) x_i^μ
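A minimal MATLAB sketch of this batch update (the data, learning rate and iteration count are illustrative choices):

% Delta rule for a single linear unit: dw_i = eta * sum_mu (d - y) * x_i.
x   = linspace(0, 1, 20)';         % 1-D inputs
d   = 2*x + 1;                     % desired outputs, generated by a line
Xb  = [x ones(size(x))];           % input plus constant 1 for the bias weight
w   = zeros(2,1);
eta = 0.05;                        % small enough for stable batch descent
for t = 1:2000
    y = Xb * w;                    % linear unit: y = sum_i w_i x_i
    w = w + eta * Xb' * (d - y);   % gradient step derived above
end
disp(w')                           % approaches [2 1], the generating line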

Which types of problems can a linear network solve?

slide12
Sigmoidal neurons: y^μ = g(h^μ), where h^μ = Σ_i w_i x_i^μ;

for example: g(h) = 1 / (1 + e^(-h))

Which types of problems can a sigmoidal network solve?

Assignment 3b: implement a one-layer linear network and a one-layer sigmoidal network, and fit a 1-D linear, sigmoid and quadratic function with each.
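A minimal sketch for the sigmoidal half of the assignment (an assumed implementation; the target, learning rate and iteration count are illustrative), using the handy identity g'(h) = g(h)(1 - g(h)):

% One sigmoidal unit y = g(w*x + b) fitted by gradient descent.
g   = @(h) 1 ./ (1 + exp(-h));     % logistic sigmoid
x   = linspace(-3, 3, 50)';
d   = g(2*x - 1);                  % target produced by a sigmoidal unit
w   = 0; b = 0; eta = 1;
for t = 1:10000
    y     = g(w*x + b);
    delta = (d - y) .* y .* (1 - y);   % error times g'(h)
    w = w + eta * mean(delta .* x);    % gradient step on the weight
    b = b + eta * mean(delta);         % gradient step on the bias
end
fprintf('w = %.2f, b = %.2f (target w = 2, b = -1)\n', w, b);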

slide13
Multi-layer networks:

[Diagram: a feed-forward network with an input layer, a hidden layer and an output layer.]

  • Can solve non-linearly separable classification problems.
  • Can approximate any arbitrary function, given ‘enough’ units in the hidden layer.

slide15
Solving linearly inseparable problems

XOR

Hint: XOR = (x1 OR x2) AND NOT (x1 AND x2)

slide16
XOR

[Diagram: a two-layer threshold network implementing XOR along the lines of the hint above: hidden units compute OR and AND of the inputs, and the output unit fires when OR is on and AND is off.]
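A minimal sketch checking the hint in MATLAB, with one common choice of weights (OR and AND hidden units, output = OR AND NOT AND); the exact numbers in the slide's diagram may differ:

% Hand-wired two-layer threshold network for XOR.
step  = @(h) h > 0;
X     = [0 0; 0 1; 1 0; 1 1];
h_or  = step(X * [1; 1] - 0.5);    % hidden unit 1: OR
h_and = step(X * [1; 1] - 1.5);    % hidden unit 2: AND
y     = step(h_or - h_and - 0.5);  % output: OR AND NOT AND
disp([X y])                        % last column is XOR(x1, x2)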

How do we learn a multi-layer network?

The credit assignment problem!

slide17
Gradient descent / back propagation, the solution to the credit assignment problem.

From hidden layer to output weights: Δw_ij = -η ∂E/∂w_ij = η Σ_μ δ_i^μ V_j^μ,

where: δ_i^μ = g'(h_i^μ) (d_i^μ - y_i^μ),

with V_j^μ the activity of hidden unit j and h_i^μ the net input to output unit i.

slide19
For the input to hidden layer weights: Δv_jk = -η ∂E/∂v_jk = η Σ_μ δ_j^μ x_k^μ,

where: δ_j^μ = g'(h_j^μ) Σ_i w_ij δ_i^μ,

so each hidden unit's error is the weighted sum of the output errors, propagated back through the weights.
Assignment 3c: program a 2-layer network in MATLAB and solve the XOR problem. Fit the curve x(x-1) between 0 and 1; how many hidden units did you need?
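A minimal MATLAB sketch of the XOR part (an assumed implementation, not the course solution; the seed, learning rate, hidden unit count and epoch count are illustrative, and with few hidden units backprop can occasionally stall in a local minimum, so rerun with another seed if the outputs do not match):

% Two-layer sigmoidal network trained by back propagation on XOR.
g   = @(h) 1 ./ (1 + exp(-h));     % sigmoid; g'(h) = g(h)*(1 - g(h))
X   = [0 0; 0 1; 1 0; 1 1];        % input patterns (rows)
d   = [0; 1; 1; 0];                % XOR targets
nh  = 3;                           % hidden units
rng(1);
V   = 0.5 * randn(3, nh);          % input(+bias)-to-hidden weights
W   = 0.5 * randn(nh + 1, 1);      % hidden(+bias)-to-output weights
eta = 0.5;
Xb  = [X ones(4,1)];               % append the constant bias input
for t = 1:20000
    H  = g(Xb * V);                % hidden activations V_j
    Hb = [H ones(4,1)];
    y  = g(Hb * W);                % network output
    dy = (d - y) .* y .* (1 - y);            % output deltas
    dH = (dy * W(1:nh,:)') .* H .* (1 - H);  % back-propagated hidden deltas
    W  = W + eta * Hb' * dy;       % hidden-to-output update
    V  = V + eta * Xb' * dH;       % input-to-hidden update
end
disp([d round(y)])                 % targets vs. learned outputs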

slide20
Formal neural networks can accomplish many tasks, for example:
  • Perform complex classification
  • Learn arbitrary functions
  • Account for associative memory
  • Some applications: robotics, character recognition, speech recognition, medical diagnostics.
  • This is not neuroscience, but it is loosely motivated by neuroscience and carries important information for neuroscience as well.
  • For example: Memory, learning and some aspects of development are assumed to be based on synaptic plasticity.
slide21
What did we learn today?

Is BackProp biologically realistic?
