
Back to Basics: Classification and Inference Based on Input Feedback Structure


Presentation Transcript


  1. Back to Basics: Classification and Inference Based on Input Feedback Structure. Tsvi Achler, Eyal Amir. Department of Computer Science, University of Illinois at Urbana-Champaign.

  2. AI -> AGI • Ability to generalize, even if only the basics were learned • Training distribution ≠ test distribution • Avoiding combinatorial explosion allows complex networks

  3. New Basic Computational Structure • Based on massive feedback to inputs • No emphasis on weight parameters • Input Feedback during testing

  4. Avoids Combinatorial Explosion via Simple Connectivity [figure: output nodes Y1-Y4 over input nodes I1-I4 and features x1-x4, with positive and negative connections marked]

  5. Forward Connections: Iterative [figure: outputs Y1, Y2 over inputs I1, I2 and features x1, x2]. Forward weights to a node are uniform and normalized: $\sum_{x=1}^{N} w_{xy} = 1$, thus $w_{xy} = \frac{1}{N}$.
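
For example (a worked instance, assuming Y1 connects only to x1 while Y2 connects to both x1 and x2 as drawn): Y1 has N = 1 input, so its single weight is 1; Y2 has N = 2 inputs, so

$$\sum_{x=1}^{2} w_{x,Y_2} = 1 \quad\Rightarrow\quad w_{1,Y_2} = w_{2,Y_2} = \tfrac{1}{2}.$$

The only free parameter is thus the connectivity itself, not learned weight values.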

  6. Back [figure: feedback step; outputs Y1, Y2 project back to inputs I1, I2 over features x1, x2]

  7. Forward [figure: forward step; inputs I1, I2 project up to outputs Y1, Y2]

  8. Back [figure: feedback step, as in slide 6]

  9.-15. [figures: the same network stepped through the iteration, each node marked Active (1) or Inactive (0) as the activity settles]

  16. Steady State [graph of dynamics: activity (0 to 1) of Y1, Y2, I1, I2 over simulation time T = 0 to 5, settling to steady state]
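
These dynamics can be reproduced directly. Below is a minimal simulation sketch in Python, assuming the two-node configuration of slides 5-15 (Y1 receives input I1 only; Y2 receives I1 and I2) and the update equations of slide 28; the function and variable names are mine, not from the slides.

    # Minimal input-feedback simulation sketch.
    # Assumed network: Y1 <- {I1}, Y2 <- {I1, I2}, as in slides 5-15.
    def ifn_step(Y, I, conn):
        """One synchronous update: Y_a <- (Y_a / n_a) * sum_{b in N_a} I_b / Q_b,
        where Q_b sums every output that feeds back to input b (slide 28)."""
        Q = [sum(Y[a] for a in range(len(Y)) if b in conn[a])
             for b in range(len(I))]
        # Skip inactive inputs: they contribute 0 and may have Q_b = 0.
        return [Y[a] / len(conn[a]) * sum(I[b] / Q[b] for b in conn[a] if I[b])
                for a in range(len(Y))]

    conn = [{0}, {0, 1}]             # connectivity: Y1 <- {I1}, Y2 <- {I1, I2}
    Y, I = [0.5, 0.5], [1.0, 1.0]    # half-active start; both inputs on
    for t in range(500):
        Y = ifn_step(Y, I, conn)
    print([round(y, 2) for y in Y])  # approaches [0.0, 1.0]

This matches the steady state predicted on slide 17 for equal inputs: (0, (P_A + P_B)/2) = (0, 1). Y2, which alone accounts for both active inputs, takes over while Y1 decays to zero.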

  17. Resolving Pattern Interactions. Network configuration: node 1 receives input A; node 2 receives inputs A and B.

  Inputs     Results (Node → Value)
  A          1 → 1
  A, B       2 → 1

  Steady state for graded inputs $(P_A, P_B)$ with results $(C_1, C_2)$: if $P_A \ge P_B$, $(C_1, C_2) = (P_A - P_B,\, P_B)$; if $P_A \le P_B$, $(C_1, C_2) = (0,\, (P_A + P_B)/2)$. Examples: A = 1 and B = ½ gives (½, ½) (half activation, half response); B = 1 alone gives (0, ½); A = 0.12 alone gives C1 = 0.12; A = 2.5 and B = 1 give C1 = 1.5, C2 = 1.
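
As a quick consistency check against the slide-28 update (a worked verification; node 1 ← {A}, node 2 ← {A, B} as above), substitute $(C_1, C_2) = (P_A - P_B,\, P_B)$ for the case $P_A \ge P_B$:

$$Q_A = C_1 + C_2 = P_A, \qquad Q_B = C_2 = P_B,$$

$$C_1 \leftarrow C_1 \cdot \frac{P_A}{Q_A} = C_1, \qquad C_2 \leftarrow \frac{C_2}{2}\left(\frac{P_A}{Q_A} + \frac{P_B}{Q_B}\right) = \frac{C_2}{2}(1 + 1) = C_2,$$

so the claimed values are a fixed point. The $P_A \le P_B$ case works the same way with $C_1 = 0$ and $Q_A = Q_B = (P_A + P_B)/2$; for $P_A = 2.5$, $P_B = 1$ the formulas give $C_1 = 1.5$, $C_2 = 1$, matching the example above.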

  18. Resolving Pattern Interactions Based on Available Representations. Configuration: cell 2 receives inputs {A, B}; cell 3 receives inputs {B, C}.

  Inputs     Results (Cell → Value)
  A          2 → ½
  A, B       2 → 1
  A, B, C    2, 3 → ¾
  B, C       3 → 1
  B          2, 3 → ¼

  19. 'Binding': Resolving Pattern Interactions. Configuration: cell 1 receives {A}; cell 2 receives {A, B}; cell 3 receives {B, C}.

  Inputs     Results (Cell → Value)
  A          1 → 1
  A, B       2 → 1
  A, B, C    1, 3 → 1
  B, C       3 → 1

  The network settles on the most efficient configuration of representations, which is not possible with one-vs-all (OvA) or all-vs-all (AvA) classifiers. A numeric check follows below.
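
The binding rows can be checked numerically with the same update rule; a short sketch reusing the ifn_step function from the sketch after slide 16 (configuration as above):

    # Binding check: cell 1 <- {A}, cell 2 <- {A, B}, cell 3 <- {B, C}.
    # Uses ifn_step from the slide-16 sketch; inputs indexed A=0, B=1, C=2.
    conn = [{0}, {0, 1}, {1, 2}]
    for I in ([1, 1, 1], [1, 1, 0], [0, 1, 1], [1, 0, 0]):
        Y = [0.5, 0.5, 0.5]
        for _ in range(3000):
            Y = ifn_step(Y, I, conn)
        print(I, [round(y, 2) for y in Y])
    # A,B,C -> cells 1 and 3 active, cell 2 suppressed ("binding");
    # A,B -> cell 2; B,C -> cell 3; A alone -> cell 1, as in the table.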

  20. Can Be Chained Ad Infinitum [diagrams: the same construction extended to N nodes over inputs A, B, C, …, N, O]

  21. New Data: Recognize a Scene When Trained on Individuals • Train on single letters • Test on multiple simultaneous letters • A scene is beyond the training distribution

  22. Feature Extraction • Bag-of-features, 512 features • Found in visual cortex • Pixels are separated into features, which are the inputs to the model [figure (Fig 4): simple feature extractor presenting non-spatial information from the visual field; collective pixel patterns are presented to the network]
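
The slides do not specify the extractor beyond "bag of features, 512 features, non-spatial". One hypothetical reading, noting that 512 = 2^9 is exactly the number of distinct binary 3x3 pixel patterns, is sketched below; the patch size and encoding are my assumptions, not the authors' method.

    # Hypothetical bag-of-features front end for binary letter images.
    import numpy as np

    def extract_features(image):
        """Binary 2-D image -> 512-dim binary vector marking which 3x3
        pixel patterns occur anywhere; position is discarded (non-spatial)."""
        feats = np.zeros(512)
        h, w = image.shape
        for r in range(h - 2):
            for c in range(w - 2):
                bits = image[r:r+3, c:c+3].astype(int).ravel()
                code = int("".join(map(str, bits)), 2)  # 9 bits -> 0..511
                feats[code] = 1.0
        return feats

    def scene(*letters):
        """A multi-letter scene activates the union of each letter's features."""
        return np.maximum.reduce([extract_features(im) for im in letters])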

  23. Two Stimuli Simultaneously (A B) [bar charts of % of combinations vs. letters correctly classified; legends include NN, RFNN, SVM, KNN, IFN. Figure 5: NN with two-letter retraining (0/2 to 2/2 correct). Figure 7: five-letter classification (0/5 to 5/5 correct)]

  24. Four Stimuli Simultaneously, i.e. (A B C D) [bar chart, Figure 6: four-letter classification; % of combinations vs. letters correctly classified (0/4 to 4/4), comparing NN, SVM, KNN, and IFN]

  25. Difficulty • Nonlinear Equations • Can’t mathematically prove general properties

  26. Steps Towards AGI • Generalize Outside Training Distribution • Structure Avoids Combinatorial Explosion

  27. Acknowledgements • Cyrus Omar • National Geospatial-Intelligence Agency HM1582-06-BAA-0001

  28. Equations

  Activation:
  $$Y_a(t + \Delta t) = \frac{Y_a(t)}{n_a} \sum_{i \in N_a} X_i(t)$$

  Feedback inhibition (shunting at each input):
  $$Q_b(t) = \sum_{j \in M_b} Y_j(t), \qquad X_b(t) = \frac{I_b}{Q_b(t)}$$

  Combined:
  $$Y_a(t + \Delta t) = \frac{Y_a(t)}{n_a} \sum_{i \in N_a} \frac{I_i}{\sum_{j \in M_i} Y_j(t)}$$

  Notation: C is the collection of all output cells; C_a is cell "a" (its activation at time t is written Y_a(t)); N_a is the set of input connections to cell C_a; n_a is the number of processes in set N_a of cell C_a; P denotes primary inputs (not affected by shunting inhibition); I is the collection of all inputs; I_b is input cell "b"; M_b is the set of recurrent feedback connections to input I_b; m_b is the number of connections in set M_b; Q is shunting inhibition; Q_b is the shunting inhibition at input b.
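
Iterating the combined equation directly reproduces the fractional steady states of slide 18; a self-contained sketch (cell 2 ← {A, B}, cell 3 ← {B, C} as on that slide):

    # Check slide 18's steady states by iterating the combined update:
    # Y_a <- (Y_a / n_a) * sum_{i in N_a} I_i / sum_{j in M_i} Y_j(t).
    conn = [{0, 1}, {1, 2}]             # cell 2 <- {A, B}; cell 3 <- {B, C}
    for I in ([1, 1, 1], [0, 1, 0]):    # rows "A, B, C" and "B" of the table
        Y = [0.5, 0.5]
        for _ in range(100):
            Q = [sum(Y[a] for a in (0, 1) if b in conn[a]) for b in (0, 1, 2)]
            Y = [Y[a] / len(conn[a]) * sum(I[b] / Q[b] for b in conn[a] if I[b])
                 for a in (0, 1)]
        print(I, [round(y, 2) for y in Y])  # -> [0.75, 0.75] and [0.25, 0.25]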
