Notes on Backpropagation

Alex Churchill


Feed Forward

  • Node C = sigmoid(A * w_ca + B * w_cb)

[Diagram: inputs A and B feed hidden nodes C and D, which feed output node E]


Feed Forward

  • Node C = sigmoid(0.1 * 0.1 + 0.7 * 0.5), with inputs A = 0.1, B = 0.7 and weights w_ca = 0.1, w_cb = 0.5


Feed Forward

  • Node C = sigmoid(0.01 + 0.35) = 0.59


Feed Forward

  • Node D = sigmoid(A * w_da + B * w_db)


Feed Forward

  • Node D = sigmoid(0.1 * 0.3 + 0.7 * 0.2)


Feed Forward

  • Node D = sigmoid(0.03 + 0.14) = 0.54


Feed Forward

  • Node E = sigmoid(C * w_ec + D * w_ed)


Feed Forward

  • Node E = sigmoid(0.59 * 0.2 + 0.54 * 0.1) = 0.542


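The feed-forward pass above can be sketched in a few lines of Python. The inputs, weights, and node names come from the slides; the variable names themselves are mine.

```python
import math

def sigmoid(x):
    """Logistic activation: 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

# Inputs and weights from the worked example
A, B = 0.1, 0.7
w_ca, w_cb = 0.1, 0.5   # weights into hidden node C
w_da, w_db = 0.3, 0.2   # weights into hidden node D
w_ec, w_ed = 0.2, 0.1   # weights into output node E

C = sigmoid(A * w_ca + B * w_cb)   # sigmoid(0.36) ≈ 0.59
D = sigmoid(A * w_da + B * w_db)   # sigmoid(0.17) ≈ 0.54
E = sigmoid(C * w_ec + D * w_ed)   # ≈ 0.542 in the slides, which round C and D first
```

Note that the slides round C and D to two decimal places before computing E; carrying full precision gives E ≈ 0.543 rather than 0.542.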


Backpropagation

  1. Calculate the error for each output neuron at the output layer (L)

  2. For each hidden layer (L-1 to L-n), pass the error backwards from the layer above

  3. Update the weights connecting the last hidden layer (L-1) to the output layer (L)

  4. Update the weights connecting each lower layer


Backpropagation

  1. Calculate the error (δ_k) for each output neuron (k) at the output layer (L).

    This is calculated using:

    δ_k = (y_k - t_k) * g'(x_k)

    where g' is the first derivative of the sigmoid and x_k is the neuron's weighted input before the sigmoid is applied. For the sigmoid, g'(x_k) = y_k * (1 - y_k), so the delta can be computed from the output y_k alone.


Backpropagation

  δ_e = (y_e - t) * g'(x_e) = (0.542 - 1) * 0.542 * (1 - 0.542) = -0.114

  Target = 1, learning rate η = 1
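Step 1 translates directly into code. This sketch uses the sigmoid identity g'(x) = y * (1 - y) from the slide; the function name is mine.

```python
def output_delta(y, t):
    """delta_k = (y_k - t_k) * g'(x_k), with g'(x) = y * (1 - y) for the sigmoid."""
    return (y - t) * y * (1.0 - y)

# Worked example from the slide: output E = 0.542, target = 1
delta_e = output_delta(0.542, 1.0)   # ≈ -0.114
```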




Backpropagation

  2. For each hidden layer (L-1 to L-n), pass the error backwards from the layer above.

    This is calculated using:

    δ_j = (Σ_k w_kj * δ_k) * g'(x_j)

    where j is the hidden neuron and k ranges over the neurons in the layer above that j connects to.


Backpropagation

  δ_c = (w_ec * δ_e) * g'(x_c) = 0.2 * -0.114 * 0.59 * (1 - 0.59) = -0.0055

  δ_d = (w_ed * δ_e) * g'(x_d) = 0.1 * -0.114 * 0.54 * (1 - 0.54) = -0.0028
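Step 2 can be sketched the same way. With a single output neuron the sum over k has one term; the function accepts a list of (weight, delta) pairs so it also covers wider layers. Names are my own.

```python
def hidden_delta(downstream, y):
    """delta_j = (sum_k w_kj * delta_k) * g'(x_j), with g'(x) = y * (1 - y).

    downstream: list of (w_kj, delta_k) pairs for the layer above.
    """
    return sum(w * d for w, d in downstream) * y * (1.0 - y)

delta_e = -0.114                                  # output delta from step 1
delta_c = hidden_delta([(0.2, delta_e)], 0.59)    # ≈ -0.0055
delta_d = hidden_delta([(0.1, delta_e)], 0.54)    # ≈ -0.0028
```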




Backpropagation

  3. Update the weights connecting the last hidden layer (L-1) to the output layer (L).

    This is calculated using:

    w_jk = w_jk - η * δ_k * a_j

    where j is the hidden neuron, k is the output neuron, and a_j is the sigmoided output of the hidden neuron.


Backpropagation

  w_ec = w_ec - η * δ_e * a_c = 0.2 - 1 * -0.114 * 0.59 = 0.267

  w_ed = w_ed - η * δ_e * a_d = 0.1 - 1 * -0.114 * 0.54 = 0.162
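The step-3 update rule w_jk ← w_jk - η * δ_k * a_j, applied to the slide's numbers (function name is mine):

```python
def update_weight(w, eta, delta_k, a_j):
    """Gradient-descent update: w_jk <- w_jk - eta * delta_k * a_j."""
    return w - eta * delta_k * a_j

eta = 1.0                                   # learning rate from the slides
w_ec = update_weight(0.2, eta, -0.114, 0.59)   # ≈ 0.267
w_ed = update_weight(0.1, eta, -0.114, 0.54)   # ≈ 0.162
```

Because δ_e is negative (the output was below the target), subtracting η * δ_e * a_j increases both weights.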




Backpropagation

  4. Update the weights connecting each lower layer.

    This uses the same rule:

    w_jk = w_jk - η * δ_k * a_j

    where j is the neuron (hidden or input) in the layer below and k is the hidden neuron in the layer above; for the first weight layer, a_j is simply the input value.


Backpropagation

  w_ca = w_ca - η * δ_c * A = 0.1 - 1 * -0.0055 * 0.1 = 0.1005

  w_da = w_da - η * δ_d * A = 0.3 - 1 * -0.0028 * 0.1 = 0.3003

  (The weights from input B, w_cb and w_db, are updated in the same way using B = 0.7.)
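Step 4 reuses the step-3 rule, with the raw input standing in for a_j (a sketch; names are mine):

```python
def update_weight(w, eta, delta_k, a_j):
    """Same update rule as the output layer: w_jk <- w_jk - eta * delta_k * a_j."""
    return w - eta * delta_k * a_j

eta = 1.0
A = 0.1   # the input value acts as a_j for the first weight layer
w_ca = update_weight(0.1, eta, -0.0055, A)   # ≈ 0.1005
w_da = update_weight(0.3, eta, -0.0028, A)   # ≈ 0.3003
```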


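Putting the four steps together, one complete training step for the slides' 2-2-1 network might look like the sketch below. All names are mine, and the deltas are computed before any weights change, as the slide order requires; values differ slightly from the slides because no intermediate rounding is done.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(A, B, target, w, eta=1.0):
    """One feed-forward + backpropagation pass; w is a dict of weights, updated in place."""
    # Feed forward
    C = sigmoid(A * w["ca"] + B * w["cb"])
    D = sigmoid(A * w["da"] + B * w["db"])
    E = sigmoid(C * w["ec"] + D * w["ed"])
    # Step 1: output delta
    d_e = (E - target) * E * (1 - E)
    # Step 2: hidden deltas, using the weights *before* they are updated
    d_c = w["ec"] * d_e * C * (1 - C)
    d_d = w["ed"] * d_e * D * (1 - D)
    # Step 3: update hidden-to-output weights
    w["ec"] -= eta * d_e * C
    w["ed"] -= eta * d_e * D
    # Step 4: update input-to-hidden weights
    w["ca"] -= eta * d_c * A
    w["cb"] -= eta * d_c * B
    w["da"] -= eta * d_d * A
    w["db"] -= eta * d_d * B
    return E

w = {"ca": 0.1, "cb": 0.5, "da": 0.3, "db": 0.2, "ec": 0.2, "ed": 0.1}
out = train_step(0.1, 0.7, 1.0, w)   # output ≈ 0.543; weights move toward the target
```

Repeating `train_step` drives the output toward the target of 1, which is the whole point of the procedure the slides walk through.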