
### Notes on Backpropagation

Alex Churchill

Backpropagation

1. Calculate the error for each output neuron at the output layer (L).
2. For each hidden layer (L-1 down to L-n), pass the error backwards from the layer above.
3. Update the weights connecting the last hidden layer (L-1) to the output layer (L).
4. Update the weights connecting each lower layer.

Backpropagation

1. Calculate the error (δk) for each output neuron (k) at the output layer (L).

This is calculated using:

δk = (yk - tk) * g'(xk)

Where g' is the first derivative of the sigmoid and xk is the neuron's pre-sigmoid input. Since g'(x) = g(x) * (1 - g(x)), the derivative can be evaluated directly from the neuron's output as yk * (1 - yk), which is the form used in the worked example.
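The sigmoid and its derivative can be sketched in Python (a minimal illustration; the function names are mine, not from the slides):

```python
import math

def g(x):
    """Sigmoid activation: g(x) = 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

def g_prime(x):
    """First derivative of the sigmoid: g'(x) = g(x) * (1 - g(x))."""
    y = g(x)
    return y * (1.0 - y)
```

Because g'(x) depends only on g(x), the worked examples that follow evaluate it straight from each neuron's output as y * (1 - y).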

Backpropagation

δk = (yk - tk) * g'(xk)

δe = (0.542 - 1) * g'(xe) = (0.542 - 1) * 0.542 * (1 - 0.542) = -0.114

[Network diagram: hidden neurons C (output 0.59) and D (output 0.54) feed output neuron E (output 0.542); Target = 1; Learning rate = 1; δe = -0.114]
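The output-layer error above can be checked numerically; a small sketch, assuming the y * (1 - y) form of the sigmoid derivative (names are mine):

```python
def output_delta(y, t):
    """delta_k = (y_k - t_k) * g'(x_k), using g'(x) = y * (1 - y) for the sigmoid."""
    return (y - t) * y * (1.0 - y)

# Output neuron E: output y = 0.542, target t = 1
delta_e = output_delta(0.542, 1.0)  # ~ -0.114
```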

Backpropagation

2. For each hidden layer (L-1 down to L-n), pass the error backwards from the layer above.

This is calculated using:

δj = (Σk wkj * δk) * g'(xj)

Where j is the hidden neuron and k ranges over the neurons in the layer above.

Backpropagation

δc = (wec * δe) * g'(xc) = 0.2 * -0.114 * 0.59 * (1 - 0.59) = -0.0055

δd = (wed * δe) * g'(xd) = 0.1 * -0.114 * 0.54 * (1 - 0.54) = -0.0028

[Network diagram: hidden neurons C (output 0.59) and D (output 0.54) feed output neuron E (output 0.542); Target = 1; Learning rate = 1; δe = -0.114, δc = -0.0055, δd = -0.0028]
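The hidden-layer errors can be reproduced the same way; since E is the only neuron downstream of C and D, the sum over k has a single term (a sketch with hypothetical names, reusing the slides' rounded δe):

```python
def hidden_delta(w_kj, delta_k, y_j):
    """delta_j = (sum_k w_kj * delta_k) * g'(x_j); one downstream neuron here."""
    return w_kj * delta_k * y_j * (1.0 - y_j)

delta_e = -0.114                            # rounded value carried over from step 1
delta_c = hidden_delta(0.2, delta_e, 0.59)  # ~ -0.0055
delta_d = hidden_delta(0.1, delta_e, 0.54)  # ~ -0.0028
```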

Backpropagation

3. Update the weights connecting the last hidden layer (L-1) to the output layer (L).

This is calculated using:

wkj = wkj - η * δk * aj

Where j is the hidden neuron, k is the output neuron, and aj is the sigmoided output of the hidden neuron.

Backpropagation

wec = wec - η * δe * ac = 0.2 - 1 * (-0.114) * 0.59 = 0.267

wed = wed - η * δe * ad = 0.1 - 1 * (-0.114) * 0.54 = 0.162

[Network diagram: hidden neurons C (output 0.59) and D (output 0.54) feed output neuron E (output 0.542) with updated weights wec = 0.267 and wed = 0.162; Target = 1; Learning rate = 1; δe = -0.114, δc = -0.0055, δd = -0.0028]
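The hidden-to-output weight updates can be sketched as follows (names are mine; the slides' rounded δe = -0.114 is reused):

```python
def update_weight(w, eta, delta_k, a_j):
    """w_kj <- w_kj - eta * delta_k * a_j."""
    return w - eta * delta_k * a_j

eta, delta_e = 1.0, -0.114
w_ec = update_weight(0.2, eta, delta_e, 0.59)  # ~ 0.267
w_ed = update_weight(0.1, eta, delta_e, 0.54)  # ~ 0.162
```

Because the error is negative (the output 0.542 fell short of the target 1) and the hidden activations are positive, both weights increase.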

Backpropagation

4. Update the weights connecting each lower layer.

This is calculated using the same rule:

wkj = wkj - η * δk * aj

Where j is the neuron (hidden or input) in the layer below and k is the hidden neuron in the layer above.

Backpropagation

wca = wca - η * δc * aa = 0.1 - 1 * (-0.0055) * 0.1 = 0.1005

wda = wda - η * δd * aa = 0.3 - 1 * (-0.0028) * 0.1 = 0.3003

[Network diagram: input A (activation 0.1) feeds hidden neurons C and D with updated weights wca = 0.1005 and wda = 0.3003; C (output 0.59) and D (output 0.54) feed output neuron E (output 0.542) with weights wec = 0.267 and wed = 0.162; Target = 1; Learning rate = 1]
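Putting the four steps together, one full backward pass over this example network can be sketched as follows (the input activation aa = 0.1 is inferred from the arithmetic on this slide; the deltas are rounded before reuse, as the slides do):

```python
# Forward-pass activations taken from the slides
a_a, a_c, a_d, y_e = 0.1, 0.59, 0.54, 0.542
target, eta = 1.0, 1.0

def g_prime(y):
    """Sigmoid derivative expressed via the neuron's output: y * (1 - y)."""
    return y * (1.0 - y)

# Step 1: error at the output layer
delta_e = round((y_e - target) * g_prime(y_e), 3)   # -0.114

# Step 2: pass the error back to the hidden layer
w_ec, w_ed = 0.2, 0.1
delta_c = round(w_ec * delta_e * g_prime(a_c), 4)   # -0.0055
delta_d = round(w_ed * delta_e * g_prime(a_d), 4)   # -0.0028

# Step 3: update hidden-to-output weights
w_ec -= eta * delta_e * a_c                         # ~ 0.267
w_ed -= eta * delta_e * a_d                         # ~ 0.162

# Step 4: update input-to-hidden weights
w_ca, w_da = 0.1, 0.3
w_ca -= eta * delta_c * a_a                         # ~ 0.1005
w_da -= eta * delta_d * a_a                         # ~ 0.3003
```

Repeating this pass over many examples (recomputing the forward pass each time) is ordinary gradient-descent training.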
