PSY105 Neural Networks 2/5


Presentation Transcript


  1. PSY105 Neural Networks 2/5

  2. “A universe of numbers”

  2. Lecture 1 recap • We can describe patterns at one level of description that emerge due to rules followed at a lower level of description. • Neural network modellers hope that we can understand behaviour by creating models of networks of artificial neurons.

  3. First artificial neuron model, 1943: Warren McCulloch (neurophysiologist) and Walter Pitts (mathematician)

  4. A simple artificial neuron: the threshold logic unit (TLU). [Diagram: inputs × weights → add → activation → threshold.] Multiply inputs by weights and add. If the sum is larger than a threshold, output 1; otherwise output 0.
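
The TLU rule on this slide can be sketched in a few lines of Python. This is an illustrative sketch, not the course's code; the function name is made up, and ≥ (rather than strict >) is assumed at the threshold so that the worked examples on the later slides come out as shown:

```python
# A minimal sketch of a threshold logic unit (TLU).
def tlu(inputs, weights, threshold):
    """Multiply inputs by weights and add; fire if the threshold is reached."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    # Output 1 when activation reaches the threshold, 0 otherwise.
    # (>= is an assumption; the slides' worked examples require it.)
    return 1 if activation >= threshold else 0
```
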

  5. TLU: the output relation. [Graph: output (0 or 1) against activation, stepping from 0 to 1 at the threshold.] The relation is non-linear: small changes in activation give different changes in the output depending on the initial activation.

  6. Semilinear node. [Diagram: inputs × weights → add → activation → squashing function.]

  7. Semilinear node: the output relation (squashing function). [Graph: output rising smoothly from 0 to 1 with activation, centred on the threshold.]
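
A common choice of squashing function is the logistic sigmoid. The slides do not name a specific function, so this sketch is an assumption; the `steepness` parameter and the centring on the threshold are illustrative choices:

```python
import math

# A semilinear node: weighted sum followed by a smooth squashing function.
def semilinear(inputs, weights, threshold, steepness=4.0):
    activation = sum(i * w for i, w in zip(inputs, weights))
    # Logistic sigmoid squashes activation into (0, 1), centred on the
    # threshold (sigmoid choice and steepness are assumptions, not the slide's).
    return 1.0 / (1.0 + math.exp(-steepness * (activation - threshold)))
```

Unlike the TLU's step, the output here changes gradually with activation, which is what makes gradient-based learning possible in later networks.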

  8. Model neuron function, reminders…
  • Inputs vary; they can be 0 or 1
  • Weights change, effectively ‘interpreting’ inputs
  • There is a weight for each input
  • A weight can be a +ve number (excitation) or a –ve number (inhibition)
  • Weights do not change when inputs change
  • Activation = weighted sum of inputs
  • Activation = input1 × weight1 + input2 × weight2, etc.
  • If activation ≥ threshold, output = 1; otherwise output = 0
  • Threshold = 1

  9. Computing with neurons: identity (1). [Diagram: one input → weight → activation → output; states 1 and 2 shown.] Threshold = 1

  10. Computing with neurons: identity (2). [Diagram: one input → weight → activation → output; states 1 and 2 shown.] Threshold = 1
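
The identity computation can be checked directly. The slide gives threshold = 1 but not the weight; weight = 1 is an assumption consistent with that threshold (and ≥ at the threshold is assumed, as the input-1 case requires):

```python
# Identity with a single TLU: one input, assumed weight 1, threshold 1.
def tlu(inputs, weights, threshold=1):
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

for x in (0, 1):
    print(x, tlu([x], [1]))  # the output simply copies the input
```
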

  11. Computing with neurons: AND. [Diagram: two inputs → weights → activation → output; states 1–4 shown.] Threshold = 1, Weight 1 = 0.5, Weight 2 = 0.5
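
With the weights and threshold given on the slide, the four input states can be enumerated (≥ at the threshold is assumed, since inputs 1 and 1 give an activation of exactly 1):

```python
# AND with a single TLU: weights 0.5 and 0.5, threshold 1 (as on the slide).
def tlu(inputs, weights, threshold=1):
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, tlu([a, b], [0.5, 0.5]))  # fires only when both inputs are 1
```
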

  12. Networks of such neurons are Turing complete. [Photograph: Alan Turing, 1912–1954.]

  13. Question: How could you use these simple neurons (TLUs) to compute the NOR (‘NOT OR’) function?

  14. Computing with neurons: NOR – a clue. [Diagram: Input 1 (varies), Input 2 (varies), and a tonically active input (always = 1) → weights → activation → output.]

  15. Model neuron function, reminders…
  • Inputs vary; they can be 0 or 1
  • Weights change, effectively ‘interpreting’ inputs
  • There is a weight for each input
  • A weight can be a +ve number (excitation) or a –ve number (inhibition)
  • Weights do not change when inputs change
  • Activation = weighted sum of inputs
  • Activation = input1 × weight1 + input2 × weight2, etc.
  • If activation ≥ threshold, output = 1; otherwise output = 0
  • Threshold = 1

  16. Computing with neurons: NOR – one way. [Diagram: Input 1 (varies), Input 2 (varies), and a tonically active input (always = 1) → weights → activation → output.] Threshold = 1, Weight 1 = −1, Weight 2 = −1, Weight 3 = +1
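
The NOR solution on the slide can be verified by enumerating the input states. The tonically active third input is fixed at 1, and ≥ at the threshold is assumed (the all-zeros case gives an activation of exactly 1):

```python
# NOR with a single TLU, as on the slide: the two varying inputs get
# weight -1, and a tonically active input (always 1) gets weight +1.
def tlu(inputs, weights, threshold=1):
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, tlu([a, b, 1], [-1, -1, 1]))  # fires only when both are 0
```

Any active input drags the activation below the threshold, so only the tonic excitation survives when both inputs are 0: exactly NOT OR.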
