Simple Perceptrons




Simple Perceptrons

Or one-layer feed-forward networks



Perceptrons or Layered Feed-Forward Networks


Equation Governing Computation of a Simple Perceptron

O_i = g(h_i) = g( Σ_k w_ik ξ_k )

where g is the activation function, usually nonlinear, e.g. a step function or sigmoid, and ξ (“ksi”) denotes the input pattern.
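For a concrete evaluation (numbers invented for illustration): with g the step function sgn, weights w = (0.5, −1) and input ξ = (1, 1), the net input is h = 0.5·1 + (−1)·1 = −0.5, so the unit outputs O = sgn(−0.5) = −1.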



Threshold or no threshold?

with threshold:

    O_i = g( Σ_k w_ik ξ_k − θ_i )

without threshold: the threshold is simulated with a connection to an extra input terminal permanently tied to −1, whose weight is set to θ_i, so that

    O_i = g( Σ_{k=0}^{N} w_ik ξ_k ),  with ξ_0 = −1 and w_i0 = θ_i
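A quick numerical check of this equivalence, sketched in Python with NumPy (the weight, input, and threshold values are invented for illustration):

    import numpy as np

    w = np.array([0.5, -0.2, 0.1])    # weights into one output unit
    xi = np.array([1.0, 1.0, -1.0])   # one input pattern
    theta = 0.3                       # threshold

    h_with_threshold = w @ xi - theta

    w_aug = np.append(w, theta)       # extra weight equal to the threshold
    xi_aug = np.append(xi, -1.0)      # extra input terminal tied to -1
    h_simulated = w_aug @ xi_aug

    assert np.isclose(h_with_threshold, h_simulated)  # identical net input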



The General Association (Matching) Task:

Is to ask that the actual output pattern equal the target pattern: O_i^μ = ζ_i^μ for every output unit i and every pattern μ.



Threshold Units

  • Start with the simplest case: threshold units, practical for one-layer perceptrons, with O_i = sgn(h_i)

  • Also assume the targets take only the values +1 and −1, with nothing in between, that is, ζ_i^μ = ±1

  • Then all that matters is that, for each input pattern, the net input (weighted sum) h_i^μ to each output unit has the same sign as the target ζ_i^μ



A Notational Simplification

  • To simplify notation, note that the output units are independent

  • [In a multilayer network, however, the hidden (non-output) layers aren’t independent]

  • So let’s consider only one output at a time

  • Drop the i subscripts (write w_k, h, ζ^μ instead of w_ik, h_i, ζ_i^μ)

The weight vector w and each input pattern ξ^μ now live in the same N-dimensional space.

Advantage: the two vectors can be represented geometrically together.



New Form for the General Association Task: Geometric Interpretation

Another form: the matching condition sgn(w · ξ^μ) = ζ^μ is equivalent to requiring ζ^μ (w · ξ^μ) > 0 for every pattern μ.
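For instance (numbers invented for illustration): if ζ^μ = −1 and w · ξ^μ = 0.3, the unit outputs sgn(0.3) = +1 ≠ ζ^μ, and correspondingly ζ^μ (w · ξ^μ) = −0.3 < 0; a pattern is misclassified exactly when this product fails to be positive.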



A Simple Learning Algorithm

  • Also called the Perceptron Rule

  • Go through the input patterns one by one

  • For each pattern, go through the output units one by one, asking whether the output is the desired one

  • If so, leave the weights into that unit alone

  • Otherwise, in the spirit of Hebb, add to each connection something proportional to the product of the input and the desired output (written out below)
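Written out, with a learning-rate constant η (an assumption here; the slide does not name one): Δw_ik = η ζ_i^μ ξ_k^μ if O_i^μ ≠ ζ_i^μ, and Δw_ik = 0 otherwise.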



Simplified Simple Learning Algorithm (for the one-neuron case)

  • Start with w = 0 (not strictly necessary)

  • Cycle through the learning patterns

    • For each pattern ξ

      • If the output O differs from the desired output ζ, add the product of the desired output and the input to w, i.e. w ← w + ζ ξ

  • Keep cycling through the patterns until done (a sketch in code follows this list)

  • Convergence is guaranteed provided the two classes of input points are linearly separable

    • The perceptron convergence theorem guarantees this
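A minimal sketch of this loop in Python with NumPy (the function name, the max_epochs safety cap, and the array layout are assumptions for illustration; the slides specify only the update rule itself):

    import numpy as np

    def perceptron_train(patterns, targets, max_epochs=100):
        # patterns: (P, N) array, one input pattern xi per row
        # targets:  (P,) array of desired outputs zeta, each +1 or -1
        P, N = patterns.shape
        w = np.zeros(N)                          # start with w = 0 (optional)
        for _ in range(max_epochs):              # keep cycling through the patterns
            errors = 0
            for xi, zeta in zip(patterns, targets):
                if np.sign(w @ xi) != zeta:      # output is not the desired one
                    w = w + zeta * xi            # Hebb-style step: w <- w + zeta*xi
                    errors += 1
            if errors == 0:                      # every pattern correct: done
                return w
        return w                                 # classes may not be linearly separable

    # Usage: logical OR, with a third input tied to -1 to simulate a threshold.
    X = np.array([[-1., -1., -1.],
                  [-1., +1., -1.],
                  [+1., -1., -1.],
                  [+1., +1., -1.]])
    t = np.array([-1., +1., +1., +1.])
    w = perceptron_train(X, t)
    print(np.sign(X @ w))                        # matches t: [-1.  1.  1.  1.]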



Weight Update Formula: the “Hebbian” form from the blue book (too complicated)
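The “blue book” is presumably Hertz, Krogh & Palmer’s Introduction to the Theory of Neural Computation. A common “Hebbian” statement of the rule in that style, likely the one meant here, is Δw_ik = η Θ(−ζ_i^μ h_i^μ) ζ_i^μ ξ_k^μ, where Θ is the unit step function; the Θ factor just switches the Hebb term on for misclassified patterns and off otherwise, which is why it reduces to the simpler statement above.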

