  1. Analog recurrent neural network simulation, Θ(log2 n) unordered search with an optically-inspired model of computation

  2. Index • Continuous Space Machine Structure • Analog Recurrent Neural Network Simulation and Complexity Result • Unordered Search Algorithm

  3. The Continuous Space Machine (CSM) • Definition: a CSM is specified by its grid dimensions; the addresses of sta, a, and b; the addresses of the k input images; the r programming symbols and their addresses; and the addresses of the l output images.

  4. Instructions of CSM • h and v : h gives the 1-D Fourier transform of its argument image in the x-direction, and v gives the 1-D Fourier transform in the y-direction.

  5. Instructions of CSM (II) • * : * gives the complex conjugate of its argument image; it maps f to its complex conjugate f*.

  6. Instructions of CSM (III) • · and + : · gives the pointwise complex product of its two argument images, and + gives the pointwise complex sum of its two argument images.

  7. Instructions of CSM (IV) • ρ: ρ performs amplitude thresholding on its first image argument using its other two real-valued image arguments as lower and upper amplitude thresholds, respectively.

  8. Instructions of CSM (V) • ld and st : ld copies the ‘rectangle’ of images specified by its parameters p1 to p4 into the image at well-known address a; st copies the image at well-known address a to the ‘rectangle’ of images specified by its parameters p1 to p4.

  9. Instructions of CSM (VI) • br and hlt : br performs an unconditional jump to the address indicated by its parameters; hlt terminates the program.

  10. Instructions of CSM (Review)
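
As an informal aside (not part of the original slides), the elementary operations above can be mimicked on discretely sampled images with numpy. The sampling, the array layout, the scalar thresholds for ρ, and the function names are all assumptions made for illustration; the CSM itself operates on continuous complex-valued images.

    import numpy as np

    # Illustrative stand-ins for the CSM atomic operations on images sampled
    # as complex 2-D arrays (rows = y, columns = x). The real CSM acts on
    # continuous images; this sketch only conveys the intent of each operation.

    def h(f):
        """1-D Fourier transform of each row (the x-direction)."""
        return np.fft.fft(f, axis=1)

    def v(f):
        """1-D Fourier transform of each column (the y-direction)."""
        return np.fft.fft(f, axis=0)

    def conj(f):
        """* : complex conjugate of the argument image."""
        return np.conj(f)

    def mul(f, g):
        """· : pointwise complex product of two images."""
        return f * g

    def add(f, g):
        """+ : pointwise complex sum of two images."""
        return f + g

    def rho(f, zl, zu):
        """ρ : amplitude thresholding; clamp |f| between the lower and upper
        thresholds zl and zu (taken here as real scalars for simplicity),
        keeping the phase of f unchanged."""
        return np.clip(np.abs(f), zl, zu) * np.exp(1j * np.angle(f))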

  11. The relation between images and data • Complex-valued image A complex-valued image is a function f : [0, 1] × [0, 1] → ℂ, where [0, 1] is the real unit interval. • Zero Image An image that has value 0 everywhere represents 0.

  12. The relation between images and data (II) • Binary symbol image The symbol ψ ∈ {0, 1} is represented by the binary symbol image fψ. • Real number image The real number r ∈ ℝ is represented by the real number image fr.

  13. Two kinds of binary-word images • Stack images ld and st are used in place of push and pop. • List images All the images are loaded at once.

  14. Matrix image for ARNN simulation • R × C matrix image The R × C matrix A with real-valued components aij is represented by the R × C matrix image fA.

  15. Complexity measures • Time The number of instructions executed in the program. • Space The total space needed to execute the program. • Resolution The maximum resolution of the grid images in the computation sequence. • Range The maximum amplitude precision needed.

  16. ARNN ARNNs are finite-size, first-order feedback neural networks with real weights. The state of each neuron xi at time t + 1 is given by an update equation of the form xi(t + 1) = σ( Σj aij xj(t) + Σj bij uj(t) + ci ), where σ is the saturated-linear activation and u is the input vector. A designated subset of p of the neurons xi is read as the output.
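
A minimal numpy sketch of this update step under the standard Siegelmann–Sontag formulation; the names A, B, c, u and the helper functions are illustrative assumptions, not notation taken from the slides.

    import numpy as np

    def sigma(x):
        """Saturated-linear activation: clamp each component to [0, 1]."""
        return np.clip(x, 0.0, 1.0)

    def arnn_step(x, u, A, B, c):
        """One ARNN update. x is the current state (N,), u the current input (M,);
        A (N x N), B (N x M) and c (N,) are the real-valued weights and biases."""
        return sigma(A @ x + B @ u + c)

    def run(x0, inputs, A, B, c, p):
        """Iterate the network over an input sequence and return, for each step,
        the values of the first p neurons (the designated output neurons)."""
        x, outputs = x0, []
        for u in inputs:
            x = arnn_step(x, u, A, B, c)
            outputs.append(x[:p].copy())
        return outputs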

  17. ARNN (II) • The CSM model can simulate the ARNN. The simulation is expressed as CSM pseudocode.

  18. ARNN (III) • Complexity If the ARNN being simulated is defined for times t = 1, 2, 3, …, has M inputs and N neurons, and k is the number of stack image elements used to encode the active input to the simulator, the four complexity measures are Time = O((N + M + 1)t + 1), Space = O(1), Resolution = max(2^(k+M-1), 2^(2N-2), 2^(N+M-2), 2^(t+N-1)), Range = infinite (a real value needs infinitely many bits).

  19. ARNN Conclusion • Because the ARNN can be simulated by the CSM, the computational power of the CSM is at least that of a Turing machine (TM).

  20. Unordered Search (Needle-in-the-haystack problem) L = {w : w ∈ 0*10*}. Let ω ∈ L be written as ω = ω0ω1…ωn-1. • Input: ω • Output: binary representation of i, where ωi = 1.

  21. Solving NIH in other models • In the classical model the problem can be solved naïvely in O(n) time, and the naïve scan appears to be essentially the best possible in that model. • On a quantum computer it can be solved in O(√n) queries with Grover's algorithm, and Ω(√n) queries are necessary.
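
For concreteness, the naïve classical scan is just a left-to-right pass (a trivial sketch, not taken from the slides):

    def find_one(w):
        """Return the index i with w[i] == 1 by scanning left to right: O(n) time."""
        for i, bit in enumerate(w):
            if bit == 1:
                return i
        raise ValueError("input is not of the form 0*10*")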

  22. NIH in the CSM model • Idea: use a binary list image to represent ω, and a binary stack image to represent n with log2 n bits. Because ω has only one non-zero point, some convenient CSM instructions let us solve the problem in far fewer than n steps…

  23. Pseudo code of Θ(log2 n) unordered search
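
The CSM pseudocode itself is not reproduced in this transcript. As an illustration of the underlying idea only, the conventional-model sketch below halves the word at every step and records one bit of the index, so the loop runs log2 n times; in the CSM the per-step test and halving would be carried out with a constant number of image operations, which is what yields the Θ(log2 n) bound. The helper name and the power-of-two length assumption are mine, not the authors'.

    import numpy as np

    def search_log_n(w):
        """Locate the unique 1 in w (length n, assumed a power of two here).
        Each iteration keeps the half containing the 1 and records one bit of
        its index, most significant bit first."""
        w = np.asarray(w)
        lo, hi = 0, len(w)            # current window [lo, hi)
        bits = []                     # binary representation of the index
        while hi - lo > 1:
            mid = (lo + hi) // 2
            if w[lo:mid].any():       # the 1 lies in the left half
                bits.append(0)
                hi = mid
            else:                     # the 1 lies in the right half
                bits.append(1)
                lo = mid
        return bits                   # w[lo] == 1, and bits encode lo in binary

    # Example: n = 8 and the 1 sits at index 5 (binary 101).
    print(search_log_n([0, 0, 0, 0, 0, 1, 0, 0]))   # -> [1, 0, 1]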
