
Structure learning with deep neuronal networks






Presentation Transcript


  1. Structure learning with deep neuronal networks. 6th Network Modeling Workshop, 6/6/2013, Patrick Michl

  2. Agenda

  3. Autoencoders (Dataset): Real-world data is usually high dimensional …

  4. … which makes structural analysis and modeling complicated!

  5. Dimensionality reduction techniques like PCA …

  6. … cannot preserve complex structures!

  7. Therefore the analysis of unknown structures …

  8. … needs more considerate nonlinear techniques!
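The PCA limitation stated on slides 5 and 6 can be illustrated with a small sketch: projecting points that lie on a nonlinear structure (here a circle, which is a made-up example dataset, not the one from the slides) onto the first principal component loses roughly half the variance, no matter which direction PCA picks.

```python
import numpy as np

# Illustrative assumption: data on a unit circle stands in for the
# "complex structure" that a linear projection cannot preserve.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 500)
X = np.column_stack([np.cos(theta), np.sin(theta)])

# PCA via SVD: project onto the first principal component, reconstruct.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X1 = (Xc @ Vt[0:1].T) @ Vt[0:1]  # rank-1 (1-dimensional) reconstruction

# The circle collapses onto a line, so a large reconstruction error
# remains regardless of the chosen direction.
err = np.mean(np.sum((Xc - X1) ** 2, axis=1))
print(round(err, 2))
```

A nonlinear encoder, by contrast, could map the circle to its angle and back with almost no loss.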

  9. Autoencoders: input data X → Autoencoder → output data X′. Autoencoders are artificial neural networks …

  10. … built from perceptrons (binary outputs in {0, 1}) and Gaussian units (real-valued outputs in R).

  11. Autoencoders are artificial neural networks …

  12. … with multiple hidden layers. Perceptrons form the hidden layers; Gaussian units form the visible layers.

  13. Such networks are called deep networks.

  14. Definition (deep network): Deep networks are artificial neural networks with multiple hidden layers.

  15. Such networks are called deep networks.

  16. Autoencoders have a symmetric topology …

  17. … with an odd number of hidden layers.

  18. The small layer in the center works like an information bottleneck …

  19. … that creates a low-dimensional code for each sample in the input data.

  20. The upper stack, the encoder, does the encoding …

  21. … and the lower stack, the decoder, does the decoding.

  22. Definition (autoencoder): Autoencoders are deep networks with a symmetric topology and an odd number of hidden layers, consisting of an encoder, a low-dimensional representation, and a decoder.
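The definition above can be sketched as a toy network. The layer sizes, random weights, and sigmoid nonlinearity here are illustrative assumptions, not taken from the slides; the point is the symmetric topology with a bottleneck in the center.

```python
import numpy as np

# Symmetric topology 4 -> 3 -> 2 -> 3 -> 4: the 2-unit center is the
# information bottleneck, the upper stack encodes, the lower decodes.
rng = np.random.default_rng(0)
sizes = [4, 3, 2, 3, 4]
weights = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def encode(x):
    # Upper stack: input -> low-dimensional code.
    for W in weights[:2]:
        x = sigmoid(x @ W)
    return x

def decode(code):
    # Lower stack: code -> reconstruction X'.
    for W in weights[2:]:
        code = sigmoid(code @ W)
    return code

x = rng.normal(size=(1, 4))      # one input sample X
code = encode(x)                  # 2-dimensional bottleneck code
x_rec = decode(code)              # reconstruction X'
print(code.shape, x_rec.shape)
```

Note the reconstruction has the input's dimension; only the code in the center is low dimensional.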

  23. Problem: the dimensionality of the data. Idea: train the autoencoder to minimize the distance between input X and output X′: encode X to a low-dimensional code Y, then decode Y to the output X′. The code Y is low dimensional. Autoencoders can be used to reduce the dimension of data …

  24. … if we can train them!

  25. Training: in feedforward ANNs, backpropagation is a good approach.

  26. Backpropagation: the distance (error) between the current output X′ and the desired output Y is computed. This gives an error function.

  27. Example (linear neural unit with two inputs).

  28. By calculating the gradient of the error function we get a vector that points in a direction which decreases the error. We update the parameters to decrease the error.

  29. We repeat that.
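The loop described on slides 26 to 29, applied to the slides' example of a linear unit with two inputs, might look like the sketch below. The data, learning rate, and iteration count are made-up assumptions.

```python
import numpy as np

# A linear unit y' = w1*x1 + w2*x2 trained by gradient descent on the
# squared error E = (y - y')^2, repeated until the weights converge.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))        # inputs x1, x2 (made-up data)
true_w = np.array([2.0, -1.0])       # hidden "wanted" mapping
y = X @ true_w                        # wanted outputs

w = np.zeros(2)                       # initial parameters
lr = 0.1                              # learning rate (assumption)
for _ in range(200):                  # "we repeat that"
    err = X @ w - y                   # error of the current output
    grad = 2 * X.T @ err / len(X)     # gradient of the error function
    w -= lr * grad                    # step against the gradient
print(w)
```

For this shallow, linear case the loop recovers the true weights quickly; the slides' point is that the same procedure degrades once many hidden layers are stacked.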

  30. Backpropagation problem in deep networks: … the problem is the multiple hidden layers!

  31. Very slow training: backpropagation is known to be slow far away from the output layer …

  32. … and can converge to poor local minima.

  33. Idea: the task is to initialize the parameters close to a good solution!

  34. Therefore the training of autoencoders has a pretraining phase …

  35. … which uses Restricted Boltzmann Machines (RBMs).

  36. Restricted Boltzmann Machines: RBMs are Markov random fields.

  37. Markov random field: every unit influences every neighbor, and the coupling is undirected. Motivation (Ising model): a set of magnetic dipoles (spins) is arranged in a graph (lattice) where neighbors are coupled with a given strength.

  38. RBMs have a bipartite topology: visible units (v1 … v4) and hidden units (h1 … h3). The local energy is used to calculate the probabilities of the unit values. Training: contrastive divergence (Gibbs sampling).

  39. Restricted Boltzmann Machine training: Gibbs sampling.
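One contrastive-divergence step (CD-1) with Gibbs sampling, as outlined on slides 38 and 39, can be sketched for a small binary RBM. The unit counts, batch, and learning rate are illustrative assumptions; the slides' top-layer RBM actually uses Gaussian visible units, which this binary sketch omits.

```python
import numpy as np

# Bipartite RBM with 4 visible and 3 hidden units, as on the slide.
rng = np.random.default_rng(0)
W = rng.normal(0, 0.1, (4, 3))    # undirected couplings v <-> h
b, c = np.zeros(4), np.zeros(3)   # visible and hidden biases

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

v0 = rng.integers(0, 2, size=(10, 4)).astype(float)  # a data batch

# Gibbs sampling: v -> h -> v' -> h', using conditional probabilities
# derived from the local energy.
ph0 = sigmoid(v0 @ W + c)
h0 = (rng.random(ph0.shape) < ph0).astype(float)
pv1 = sigmoid(h0 @ W.T + b)
v1 = (rng.random(pv1.shape) < pv1).astype(float)
ph1 = sigmoid(v1 @ W + c)

# CD-1 update: data-driven minus model-driven correlations.
lr = 0.1
W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
b += lr * (v0 - v1).mean(axis=0)
c += lr * (ph0 - ph1).mean(axis=0)
print(W.shape)
```

Repeating this step over many batches moves the RBM's distribution toward the data distribution without ever computing the intractable partition function.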

  40. Top layer: the top-layer RBM transforms real-valued data into binary codes.

  41. Therefore the visible units are modeled with Gaussians to encode the data …

  42. … and many hidden units with sigmoids to encode the dependencies.

  43. Local energy: the objective function is the sum of the local energies.
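The local-energy formula on the original slide was an image and did not survive extraction; the standard textbook energy of a Gaussian-Bernoulli RBM (real-valued visibles $v_i$, binary hiddens $h_j$) is given below, which may differ from the author's exact notation. Here $b_i$ and $\sigma_i$ are the visible biases and standard deviations, $c_j$ the hidden biases, and $w_{ij}$ the couplings.

```latex
E(v, h) = \sum_i \frac{(v_i - b_i)^2}{2\sigma_i^2}
        - \sum_j c_j h_j
        - \sum_{i,j} \frac{v_i}{\sigma_i}\, w_{ij}\, h_j
```

The conditional distributions used in Gibbs sampling follow from this energy: each hidden unit is a sigmoid of its weighted input, and each visible unit is a Gaussian centered on its weighted input.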

  44. Reduction: the next RBM layer maps the dependency encoding …

  45. … from the upper layer …

  46. … to a smaller number of sigmoids …

  47. … which can be trained faster than the top layer.

  48. Unrolling: the symmetric topology allows us to skip further training.

  49. Unrolling: the symmetric topology allows us to skip further training.

  50. Pretraining: top RBM (GRBM), reduction RBMs, unrolling. Finetuning: backpropagation. After pretraining, backpropagation usually finds good solutions.
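The unrolling step summarized above can be sketched as reusing the pretrained RBM weights for the encoder stack and their transposes for the mirrored decoder stack, so no separate decoder pretraining is needed. The layer sizes below are made-up assumptions.

```python
import numpy as np

# Pretend these weights came out of greedy RBM pretraining:
# a top (G)RBM mapping 8 -> 4 and a reduction RBM mapping 4 -> 2.
rng = np.random.default_rng(0)
rbm_weights = [rng.normal(0, 0.1, (8, 4)),
               rng.normal(0, 0.1, (4, 2))]

# Unrolling: encoder reuses the RBM weights, decoder mirrors them
# with transposed weights, giving the symmetric 8-4-2-4-8 topology.
encoder = rbm_weights
decoder = [W.T for W in reversed(rbm_weights)]

# The unrolled autoencoder is then finetuned end to end by backprop.
shapes = [W.shape for W in encoder + decoder]
print(shapes)
```

Because the decoder starts as the exact inverse direction of the pretrained encoder, the whole network already reconstructs its input reasonably well, which is what puts backpropagation close to a good solution.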
