
Presentation Transcript



  2. The Generalised Mapping Regressor (GMR) neural network for inverse discontinuous problems. Student: Chuan LU. Promotor: Prof. Sabine Van Huffel. Daily Supervisor: Dr. Giansalvo Cirrincione.

  3. Mapping Approximation Problem. Feedforward neural networks are: • universal approximators of nonlinear continuous functions (many-to-one, one-to-one) • unable to yield multiple solutions • unable to yield infinitely many solutions • unable to approximate mapping discontinuities.

  4. Inverse and Discontinuous Problems • mapping: multi-valued, with complex structure • poor representation of the mapping by the least-squares approach (sum-of-squares error function) in feedforward neural networks, whose output approximates the conditional average of the target data • mappings with discontinuities.
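
A minimal illustration of this least-squares failure mode (a toy sketch, not from the presentation): fitting a single-valued regressor to a two-branched inverse mapping makes it collapse onto the conditional average of the targets, which lies on neither branch.

```python
import numpy as np

# A two-valued inverse problem: for each x in (0, 1], both t = +sqrt(x)
# and t = -sqrt(x) are valid targets. (Hypothetical toy data.)
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=2000)
branch = rng.choice([-1.0, 1.0], size=x.shape)   # pick a branch at random
t = branch * np.sqrt(x)

# Least-squares fit (a cubic polynomial standing in for a feedforward
# network trained on a sum-of-squares error).
coeffs = np.polyfit(x, t, deg=3)
pred = np.polyval(coeffs, np.linspace(0.1, 0.9, 5))

# The fit collapses onto the conditional average of the targets (= 0 here),
# which is not a solution of the inverse problem.
print(pred)   # values near 0, not near +/- sqrt(x)
```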

  5. Mixture-of-experts (Jacobs and Jordan; Bishop's ME extension). It partitions the solution between several networks: a separate network determines the parameters of each kernel, with a further gating network determining the coefficients. Recall can blend the kernels or act winner-take-all. [Diagram: the input feeds Network 1, Network 2, Network 3 and a gating network, whose coefficients blend the expert outputs into the final output.]
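
A minimal sketch of the mixture-of-experts forward pass described above; the expert and gating parameters below are made up for illustration, and a real ME trains both so that the gating network partitions the data between the experts.

```python
import numpy as np

def moe_forward(x, experts, gate_W, gate_b):
    """Blend expert outputs with gating coefficients (softmax).
    experts: list of callables x -> scalar; gate_W, gate_b: gating net params."""
    logits = gate_W @ x + gate_b              # one logit per expert
    g = np.exp(logits - logits.max())
    g /= g.sum()                              # gating coefficients, sum to 1
    y = np.array([e(x) for e in experts])     # each expert's prediction
    return g @ y                              # kernel blending; use y[g.argmax()]
                                              # for winner-take-all recall

# Toy usage: three linear "experts" on a 2-D input (parameters hypothetical).
experts = [lambda x: 1.0 * x[0], lambda x: -1.0 * x[0], lambda x: x[1]]
gate_W = np.array([[2.0, 0.0], [-2.0, 0.0], [0.0, 2.0]])
gate_b = np.zeros(3)
print(moe_forward(np.array([0.5, -0.2]), experts, gate_W, gate_b))
```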

  6. ME MLP Example #1

  7. ME MLP Example #2

  8. ME MLP Example #3

  9. ME MLP Example #4

  10. Generalised Mapping Regressor (GMR) (G. Cirrincione and M. Cirrincione, 1998). Characteristics: • approximates every kind of function or relation • input: a collection of components of x and y; output: estimation of the remaining components • outputs all solutions, the mapping branches, and the equilevel hypersurfaces.

  11. GMR Basic Ideas. Function approximation is turned into pattern recognition in the Z (augmented) space: unsupervised learning maps clusters onto mapping branches. • coarse-to-fine learning: incremental, competitive, based on mapping recovery (curse of dimensionality) • topological neuron linking: by distance and direction • linking tracking: branches, contours • open architecture.

  12. GMR four phases: Learning, Linking, Merging, Recalling. [Diagram: the training set (INPUT) feeds a pool of neurons; learning and linking produce objects 1–3 with links; merging fuses them into merged objects; recalling extracts branch 1 and branch 2.]

  13. EXIN Segmentation Neural Network (EXIN SNN) (G. Cirrincione, 1998) • clustering in the input/weight space • a vigilance threshold decides whether the input x is assigned to the winner or recruits a new neuron (e.g. w4 = x4). [Diagram: input/weight space with the vigilance region around each weight.]
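
The EXIN SNN update rules are not spelled out on the slide, so the following is only a generic vigilance-based incremental clustering sketch in its spirit: a new neuron is recruited at the input (w = x) whenever the winner fails the vigilance test. The threshold rho and the learning rate are assumptions.

```python
import numpy as np

def vigilance_clustering(samples, rho):
    """Incremental, competitive clustering with a vigilance threshold rho.
    A generic sketch, not the actual EXIN SNN rule: if the winner is farther
    than rho, a new neuron is created at the input; otherwise the winner
    moves towards the input."""
    weights = [samples[0].copy()]             # first neuron = first sample
    for x in samples[1:]:
        d = [np.linalg.norm(x - w) for w in weights]
        win = int(np.argmin(d))
        if d[win] > rho:                      # outside the vigilance region
            weights.append(x.copy())          # recruit a new neuron, w = x
        else:
            weights[win] += 0.1 * (x - weights[win])   # move the winner
    return np.array(weights)

rng = np.random.default_rng(1)
data = np.vstack([rng.normal(c, 0.1, (100, 2)) for c in ((0, 0), (2, 2))])
print(vigilance_clustering(rng.permutation(data), rho=0.8).shape)
```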

  14. GMR Learning • EXIN SNN in the Z (augmented) space • high vigilance rz (say r1): coarse quantization, one branch (object) neuron per cluster.

  15. GMR Learning • production phase • the Voronoi sets of the object neurons in the Z (augmented) space provide the domain setting.

  16. GMR Learning • secondary EXIN SNNs, one per Voronoi set (TS#1 … TS#5) • rz = r2 < r1: fine quantization of the Z (augmented) space • other levels are possible.
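
A sketch of the coarse-to-fine scheme under the same generic-quantizer assumption as above (the true algorithm uses EXIN SNNs): a coarse pass with vigilance r1 yields the object neurons, and each of their Voronoi sets TS#i is re-quantized by a secondary pass with r2 < r1.

```python
import numpy as np

def _vq(Z, rho):
    """One vigilance-based pass (same sketch as before): new neuron if the
    winner is farther than rho, else the winner moves towards the input."""
    W = [Z[0].copy()]
    for z in Z[1:]:
        d = np.linalg.norm(np.asarray(W) - z, axis=1)
        if d.min() > rho:
            W.append(z.copy())
        else:
            W[d.argmin()] += 0.1 * (z - W[d.argmin()])
    return np.asarray(W)

def coarse_to_fine(Z, r1, r2):
    """Coarse pass (vigilance r1) -> object neurons; each Voronoi set TS#i is
    then re-quantized by a secondary pass with r2 < r1 (fine quantization)."""
    coarse = _vq(Z, r1)
    labels = np.argmin(np.linalg.norm(Z[:, None] - coarse[None], axis=2), axis=1)
    fine = [_vq(Z[labels == i], r2) for i in range(len(coarse))
            if np.any(labels == i)]           # skip empty Voronoi sets
    return coarse, fine
```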

  17. GMR Coarse-to-fine Learning (example). [Figure: two object neurons, the Voronoi set of one of them, and the fine VQ neurons inside it.]

  18. GMR Linking, Task 1 • Voronoi set: setup of the neuron radius (domain variable); each neuron i gets an asymmetric radius ri.

  19. GMR Linking, Task 2 • for one TS presentation zi: find the linking candidates by a k-nn branch-and-bound search in weight space • distance test • direction test • create a link or strengthen an existing one. [Diagram: weights w1–w5 with distances d1–d5 and the linking direction.]
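
A sketch of one linking step as the slide outlines it; the concrete thresholds (eps for the distance test, cos_min for the direction test) and the exact form of the direction criterion are assumptions, not the paper's definitions.

```python
import numpy as np

def link_step(z, W, links, eps=2.0, cos_min=0.0, k=5):
    """One TS presentation z. W: (n, d) neuron weights in Z space;
    links: dict of symmetric link strengths, keyed by neuron pairs."""
    d = np.linalg.norm(W - z, axis=1)
    order = np.argsort(d)
    win, d1 = order[0], d[order[0]]
    for c in order[1:k + 1]:                       # k-nn linking candidates
        if d[c] >= eps * d1:                       # distance test
            continue
        u, v = z - W[win], W[c] - W[win]
        denom = np.linalg.norm(u) * np.linalg.norm(v)
        if denom > 0 and (u @ v) / denom < cos_min:
            continue                               # direction test failed
        key = (int(min(win, c)), int(max(win, c)))
        links[key] = links.get(key, 0) + 1         # create or strengthen link
    return links
```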

  20. Branch and Bound Accelerated Linking • neuron tree constructed during the learning phase (multilevel EXIN SNN learning) • methods for the linking-candidate step (k-nearest-neighbours computation): • ε-BnB: keep candidates at distance < ε·d1 (ε: predefined linking factor) • k-BnB: k predefined.
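
To make the bound concrete, here is a minimal branch-and-bound nearest-neighbour search over a one-level tree of groups (e.g. k-means cells); the real GMR tree is multilevel and built during learning, but the pruning rule, a triangle inequality on the group radius, is the same idea.

```python
import numpy as np

def bnb_nn(q, groups):
    """Branch-and-bound 1-nn sketch. groups: list of (center, points) pairs.
    A group is pruned when dist(q, center) - radius >= current best distance,
    since no point inside it can then beat the incumbent."""
    best_d, best_p = np.inf, None
    # visit the most promising groups first
    for center, pts in sorted(groups, key=lambda g: np.linalg.norm(q - g[0])):
        radius = max(np.linalg.norm(p - center) for p in pts)
        if np.linalg.norm(q - center) - radius >= best_d:
            continue                              # bound: skip the whole group
        for p in pts:                             # branch: scan the survivors
            d = np.linalg.norm(q - p)
            if d < best_d:
                best_d, best_p = d, p
    return best_p, best_d
```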

  21. GMR Linking • branch and bound in linking, experimental results: 83% of the flops are saved.

  22. Branch and bound (cont.) Apply branch and bound in the learning phase (labelling): • tree construction: k-means, EXIN SNN • experimental results (in the 3-D example): 50% of the labelling flops are saved.

  23. GMR Linking Example. [Figure: links created between neurons.]

  24. GMR Merging Example

  25. GMR Recalling Example • level one neurons: input within their domain • level two neurons: only the connected ones • level zero neurons: isolated (noise). [Diagram: level 1 and level 2 neurons on branch 1 and branch 2.]
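
The three-level recall rule lends itself to a short sketch; radii stands for the per-neuron domain radius set during learning, and links is the set of neuron pairs created in the linking phase (both names are hypothetical).

```python
import numpy as np

def recall_levels(x, W_in, radii, links):
    """Sketch of the slide's rule: level one if the input falls within a
    neuron's domain (distance in input space <= radius), level two if linked
    to a level-one neuron, level zero otherwise (isolated: noise)."""
    n = len(W_in)
    level = np.zeros(n, dtype=int)
    d = np.linalg.norm(W_in - x, axis=1)
    level[d <= radii] = 1                          # level one: inside domain
    for i, j in links:                             # level two: connected only
        if level[i] == 1 and level[j] == 0:
            level[j] = 2
        if level[j] == 1 and level[i] == 0:
            level[i] = 2
    return level
```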

  26. Experiments • spiral of Archimedes: ρ = aθ (a = 1).
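
Generating such a training set in the augmented space is a few lines; the number of turns and the sampling density below are assumptions, since the slide gives only ρ = aθ with a = 1.

```python
import numpy as np

# Training set for the spiral of Archimedes rho = a * theta, a = 1, sampled
# as points of the augmented space Z = (x, y): the x -> y relation along the
# curve is multi-valued, which is what GMR is meant to recover.
theta = np.linspace(0.0, 6.0 * np.pi, 1000)   # three turns (assumption)
rho = 1.0 * theta                             # a = 1
Z = np.column_stack((rho * np.cos(theta), rho * np.sin(theta)))
```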

  27. Experiments • sparse regions: further normalizing + higher mapping resolution.

  28. Experiments • noisy data.

  29. Experiments

  30. Experiments • GMR mapping of 8 spheres in a 3-D scene • contours: links among level one neurons.

  31. Conclusions. GMR is able to: • solve inverse discontinuous problems • approximate every kind of mapping • yield all the solutions and the corresponding branches. GMR can be accelerated by applying tree search techniques. GMR needs: • interpolation techniques • kernels or projection techniques for high-dimensional data • adaptive parameters.

  32. Thank you! (shi-a shi-a)

  33. GMR Recall • restricted distance r1 • level one test • linking tracking. [Diagram: the input falls within radius r1 of w1, which passes the level one test (l1 = 1, b1 = 1); the neuron w3 connected to it becomes level two and inherits the winner's branch (l3 = 2, b3 = 1); all other neurons w2, w4–w8 stay at level zero (li = 0, bi = 0).]

  34. GMR Recall • level one test • linking tracking. [Diagram: a second winner w2 passes the level one test within its restricted distance r2 (l2 = 1, b2 = 1); at a branch cross its branch label is updated (b2 = 2); w1 (l1 = 1, b1 = 1) and w3 (l3 = 2, b3 = 1) keep their labels; the remaining neurons stay at level zero.]

  35. GMR Recall: Two Branches • … until completion of the candidates • level one neurons: input within their domain • level two neurons: only the connected ones • level zero neurons: isolated (noise) • clipping. [Diagram: tracking continues along the links until the two branches are recovered; w4, w5, w6 receive level and branch labels (e.g. l4 = 1, b4 = 4; l5 = 2, b5 = 4; l6 = 1, b6 = 4), with clipping where branches end.]

  36. GMR Recall • output = the weight complements of the level one neurons • output interpolation. [Diagram: the level one neurons (w1, w2, w4, w6 with their li, bi labels) supply the outputs.]
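
A sketch of the output step under the usual augmented-space convention Z = (x, y): the recalled solutions are the y-parts (weight complements) of the level-one neurons, here crudely averaged per branch, since the slide names "output interpolation" without giving a formula.

```python
import numpy as np

def recall_output(level, branch, W, n_in):
    """level, branch: per-neuron labels from recall; W: (n, d) weights in
    Z = (x, y); n_in: dimension of the x-part. Returns one solution per
    recalled branch (the mean is a stand-in for the interpolation step)."""
    out = {}
    for b in set(branch[level == 1]):
        ys = W[(level == 1) & (branch == b), n_in:]   # weight complements
        out[b] = ys.mean(axis=0)                      # one solution per branch
    return out
```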
