
Perceptron Networks and Vector Notation



Presentation Transcript


  1. Perceptron Networks and Vector Notation • CS/PY 231 Lab Presentation # 3 • January 31, 2005 • Mount Union College

  2. A Multiple Perceptron Network for computing the XOR function • We found that a single perceptron could not compute the XOR function • Solution: set up one perceptron to detect if x1 = 1 and x2 = 0 • set up another perceptron for x1 = 0 and x2 = 1 • feed the outputs of these two perceptrons into a third one that produces an output of 1 if either input is a 1
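The three-perceptron construction described above can be sketched in Python with hand-chosen weights. The specific weight and threshold values below are illustrative assumptions, not taken from the slides; any values that make each unit fire on the right inputs would do.

```python
def perceptron(inputs, weights, threshold):
    """Step-activation perceptron: output 1 if the weighted sum exceeds threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0

def xor_network(x1, x2):
    # Hidden perceptron 1 fires only when x1 = 1 and x2 = 0
    h1 = perceptron((x1, x2), (1, -1), 0.5)
    # Hidden perceptron 2 fires only when x1 = 0 and x2 = 1
    h2 = perceptron((x1, x2), (-1, 1), 0.5)
    # Output perceptron fires if either hidden unit fired (an OR)
    return perceptron((h1, h2), (1, 1), 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_network(a, b))
```

Tracing the four binary input pairs shows the network reproduces the XOR truth table, even though no single perceptron can.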

  3. A Nightmare! • Even for this simple example, choosing the weights that cause a network to compute the desired output takes skill and lots of patience • Much more difficult than programming a conventional computer: • OR function: if x1 + x2 ≥ 1, output 1; otherwise output 0 • XOR function: if x1 + x2 = 1, output 1; otherwise output 0
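The two rules can be checked by brute force over the four binary input pairs (a quick sketch; the function names are my own). Note that OR needs only a single threshold test, while XOR's "sum is exactly 1" condition is not a single threshold, which is why one perceptron cannot compute it.

```python
def gate_or(x1, x2):
    # OR: fire when the input sum reaches 1 (a single threshold test)
    return 1 if x1 + x2 >= 1 else 0

def gate_xor(x1, x2):
    # XOR: fire only when the input sum is exactly 1 (not a single threshold)
    return 1 if x1 + x2 == 1 else 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "OR:", gate_or(a, b), "XOR:", gate_xor(a, b))
```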

  4. There must be a better way…. • These labs and demos were designed to show that manually adjusting weights is tedious and difficult • This is not what happens in nature • No creature says, “Hmmm, what weight should I choose for this neural connection?” • Formal training methods exist that allow networks to learn by updating weights automatically (explored next week)

  5. Expanding to More Inputs • artificial neurons may have many more than two input connections • calculation performed is the same: multiply each input by the weight of the connection, and find the sum of all of these products • notation can become unwieldy: • sum = x1·w1 + x2·w2 + x3·w3 + … + x100·w100
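A minimal sketch of that many-input calculation, applied to 100 connections. The sample input and weight values are placeholders, not from the slides.

```python
def weighted_sum(inputs, weights):
    # Multiply each input by its connection weight and add all the products
    return sum(x * w for x, w in zip(inputs, weights))

# 100-input example: inputs 1..100, every weight equal to 1
xs = list(range(1, 101))
ws = [1] * 100
print(weighted_sum(xs, ws))  # 5050
```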

  6. Some Mathematical Notation • Most references (e.g., Plunkett & Elman text) use mathematical summation notation • Sums of large numbers of terms are represented by the symbol Sigma (Σ) • the previous sum is denoted: Σ (k = 1 to 100) xk·wk

  7. Summation Notation Basics • Terms are described once, generally • Index variable shows range of possible values • Example: Σ (k = 3 to 5) k / (k - 1) = 3/2 + 4/3 + 5/4

  8. Summation Notation Example • Write the following sum using Sigma notation: 3·x0 + 4·x1 + 5·x2 + 6·x3 + 7·x4 + 8·x5 + 9·x6 + 10·x7 • Answer: Σ (k = 0 to 7) (k + 3)·xk
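Both summations can be verified numerically. The x values below are arbitrary sample data chosen for the check, not part of the slides.

```python
# Sigma (k = 3 to 5) k / (k - 1) should equal 3/2 + 4/3 + 5/4
s1 = sum(k / (k - 1) for k in range(3, 6))
print(s1)

# Sigma (k = 0 to 7) (k + 3)·xk, checked against the written-out sum
xs = [1, 2, 3, 4, 5, 6, 7, 8]          # arbitrary sample values
s2 = sum((k + 3) * xs[k] for k in range(8))
expanded = (3*xs[0] + 4*xs[1] + 5*xs[2] + 6*xs[3]
            + 7*xs[4] + 8*xs[5] + 9*xs[6] + 10*xs[7])
print(s2 == expanded)  # True
```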

  9. Vector Notation • The most compact way to specify values for inputs and weights when we have many connections • The ORDER in which values are specified is important • Example: if w1 = 3.5, w2 = 1.74, and w3 = 18.2, we say that the weight vector w = (3.5, 1.74, 18.2)

  10. Vector Operations • Vector Addition: adding two vectors means adding the values from the same position in each vector • result is a new vector • Example: (9.2, 0, 17) + (1, 2, 3) = (10.2, 2, 20) • Vector Subtraction: subtract corresponding values • (9.2, 0, 17) - (1, 2, 3) = (8.2, -2, 14)
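A quick sketch of componentwise addition and subtraction, reproducing the slide's two examples (the helper names are my own):

```python
def vec_add(u, v):
    # Add values from the same position in each vector
    return tuple(a + b for a, b in zip(u, v))

def vec_sub(u, v):
    # Subtract corresponding values
    return tuple(a - b for a, b in zip(u, v))

print(vec_add((9.2, 0, 17), (1, 2, 3)))  # the slide's (10.2, 2, 20)
print(vec_sub((9.2, 0, 17), (1, 2, 3)))  # the slide's (8.2, -2, 14)
```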

  11. Vector Operations • Dot Product: mathematical name for what a perceptron does • x · m = x1·m1 + x2·m2 + x3·m3 + … + xlast·mlast • Result of a Dot Product is a single number • example: (9.2, 0, 17) · (1, 2, 3) = 9.2 + 0 + 51 = 60.2
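The same perceptron calculation written as a dot product, reproducing the slide's example:

```python
def dot(u, v):
    # A perceptron's weighted sum is the dot product of its inputs and weights
    return sum(a * b for a, b in zip(u, v))

print(dot((9.2, 0, 17), (1, 2, 3)))  # 9.2 + 0 + 51, the slide's 60.2
```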

  12. Perceptron Networks and Vector Notation • CS/PY 231 Lab Presentation # 3 • January 31, 2005 • Mount Union College
