
Random Neural Network Texture Model



Presentation Transcript


  1. Random Neural Network Texture Model Erol Gelenbe, Khaled Hussain, and Hossam Abdelbaki

  2. A. Introduction • B. Color Change Mechanism in Chameleon • C. Texture Learning Algorithm • D. Texture Generation Algorithm • E. Experimental Evaluation • F. Conclusions

  3. A. Introduction • There is no generally accepted definition of texture. • Texture analysis is considered one of the most important subjects in image processing and computer vision. • The task of extracting texture features is crucial: if one could model and quantify the process by which humans recognize texture, one could construct a highly successful recognition system. • Unfortunately, the process by which we recognize textures is not fully understood, and researchers are left to consider alternative techniques.

  4. A. Introduction • Regarding the texture synthesis problem, Markov random fields (MRFs) have been used extensively because they can capture the local (spatial) contextual information in a texture and generate a similar texture from the extracted parameters. However, the operations performed during texture generation are very time consuming. • The idea of using neural networks to learn and regenerate textures was inspired by the color change mechanisms of certain animals, which can alter their colors and color patterns so that they are effectively camouflaged against a variety of different backgrounds.

  5. A. Introduction • In this presentation, we introduce a novel method for texture modeling (learning) and synthesis using the random neural network model. This model has been successfully applied to generating synthetic textures that have features similar to those produced by the MRF model, such as granularity and inclination, but with a tremendous reduction in generation time compared with the MRF.

  6. B. Color Change Mechanism in Chameleon

  7. B. Color Change Mechanism in Chameleon

  8. C. Texture Learning Algorithm • Here we describe the procedure used for extracting the features of a given texture image by training the RNN and encoding those features into the weight matrices of the network. The resulting weights can then be used to generate textures that have characteristics similar to those of the initial texture.

  9. C. Texture Learning Algorithm • 1 - Initialize the weight matrices $W^+$ and $W^-$ to random values between 0 and 1. • 2 - Set $\Lambda_k$ and $y_k$ to the normalized pixel values in the window and set $\lambda_k$ to 0.0. • 3 - Solve the nonlinear system $q_i = \lambda^+(i) / \big(r(i) + \lambda^-(i)\big)$, with $\lambda^+(i) = \sum_j q_j w^+_{ji} + \Lambda_i$ and $\lambda^-(i) = \sum_j q_j w^-_{ji} + \lambda_i$, to obtain the actual neuron outputs $q$. • 4 - Adjust the network parameters to minimize the cost function $E_k = \tfrac{1}{2}\sum_{i=1}^{n} a_i (q_i - y_{ik})^2$, with $a_i \geq 0$. • 5 - For each successive desired input-output pair, indexed by $k$, the $n \times n$ weight matrices must be adjusted after applying each input.
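
The Python sketch below illustrates these five steps on a single training window, under stated assumptions: the function names (rnn_outputs, cost, learn_window) and all hyperparameters are illustrative, the RNN fixed point is solved by plain repeated substitution, and a finite-difference gradient update stands in for the analytic gradient descent used in the original learning algorithm.

```python
import numpy as np

def rnn_outputs(W_plus, W_minus, Lambda, lam, n_iter=100):
    """Solve the RNN fixed point q_i = lambda_plus(i) / (r(i) + lambda_minus(i))
    by repeated substitution, keeping q inside [0, 1)."""
    r = W_plus.sum(axis=1) + W_minus.sum(axis=1)       # firing rates r(i)
    q = np.zeros_like(Lambda, dtype=float)
    for _ in range(n_iter):
        lam_plus = q @ W_plus + Lambda                 # total excitatory arrival rates
        lam_minus = q @ W_minus + lam                  # total inhibitory arrival rates
        q = np.clip(lam_plus / (r + lam_minus + 1e-12), 0.0, 1.0 - 1e-6)
    return q

def cost(W_plus, W_minus, Lambda, lam, y):
    """Quadratic cost E_k = 1/2 * sum_i (q_i - y_i)^2 (all a_i taken as 1)."""
    q = rnn_outputs(W_plus, W_minus, Lambda, lam)
    return 0.5 * np.sum((q - y) ** 2)

def learn_window(pixels, eta=0.05, epochs=50, eps=1e-4, seed=0):
    """One illustrative training pass on a single window of n pixels:
    Lambda_k = y_k = normalized pixel values, lambda_k = 0 (step 2 above).
    The weight update is a finite-difference gradient step, standing in for
    the analytic RNN gradient descent of the original algorithm."""
    rng = np.random.default_rng(seed)
    n = pixels.size
    Wp = rng.random((n, n))                            # step 1: random W+ in [0, 1)
    Wm = rng.random((n, n))                            # step 1: random W- in [0, 1)
    Lambda, lam, y = pixels.astype(float), np.zeros(n), pixels.astype(float)
    for _ in range(epochs):
        for W in (Wp, Wm):                             # numerical gradient on each matrix
            base = cost(Wp, Wm, Lambda, lam, y)
            grad = np.zeros_like(W)
            for i in range(n):
                for j in range(n):
                    W[i, j] += eps
                    grad[i, j] = (cost(Wp, Wm, Lambda, lam, y) - base) / eps
                    W[i, j] -= eps
            W -= eta * grad                            # step 4: move against the gradient
            np.clip(W, 0.0, None, out=W)               # rates must stay non-negative
    return Wp, Wm

# Example: learn weights from one normalized 3 x 3 window (step 5 would repeat
# this update for every successive window extracted from the texture image).
window = np.array([0.1, 0.9, 0.1, 0.9, 0.1, 0.9, 0.1, 0.9, 0.1])
Wp, Wm = learn_window(window)
```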

  10. D. Texture Synthesis (Generation) Procedure • The random neural network that we propose for generating artificial textures associates a neuron I(i, j) with each pixel (i, j) in the plane. The state f(i, j) can be interpreted as the gray level value of the pixel at (i, j). The topology of the proposed random network for texture generation is shown below.

  11. D. Texture Synthesis (Generation) Procedure • We see that each neuron in the network is, in general, connected to at most eight neighbors. We shall use the labels X1, …, X8 for the neighboring positions in the network, where x denotes any pixel (i, j). For instance, X1 denotes the (i-1, j+1) pixel.
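
As a small illustration of this neighborhood convention, the snippet below fixes one possible mapping from the labels X1, …, X8 to pixel offsets. Only X1 = (i-1, j+1) is stated on the slide; the remaining assignments are an assumed clockwise ordering.

```python
# One possible labelling of the eight neighbor offsets (di, dj) around a pixel
# x = (i, j). The slide only fixes X1 = (i-1, j+1); the rest is assumed.
NEIGHBOR_OFFSETS = {
    "X1": (-1, +1), "X2": (0, +1), "X3": (+1, +1), "X4": (+1, 0),
    "X5": (+1, -1), "X6": (0, -1), "X7": (-1, -1), "X8": (-1, 0),
}
```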

  12. D. Texture Generation Algorithm • 1 - Specify the values of the weights $w^+(y, x)$ and $w^-(y, x)$ between neighboring pixels. • 2 - Generate at random a value between 0 and 1 for each pixel x = (i, j) and assign it to the variable q(x). • 3 - Start with an image that is generated by coloring each point with level l, where l is chosen with equal probability from the set {0, 1, 2, …, G-1}, and G is the number of gray levels. • 4 - Starting with k = 0 and up to k = K (the stopping condition), iterate on the equations $q_{k+1}(x) = \lambda^+(x) / \big(r(x) + \lambda^-(x)\big)$, where $\lambda^+(x) = \sum_{y} w^+(y, x)\, q_k(y)$ and $\lambda^-(x) = \sum_{y} w^-(y, x)\, q_k(y)$ are taken over the neighbors y of x, and then compute the gray level f(x) from the final state $q_K(x)$. • It should be noted that the network weights can either be chosen according to some criteria so as to generate synthetic textures with predefined characteristics, or result from training the RNN on a given texture image.
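
A minimal Python sketch of this generation procedure is given below, assuming a wrap-around (toroidal) lattice, a single scalar weight per neighbor offset shared by all pixels, and a simple quantization of the final neuron states into G gray levels; the function name, the example weight values, and the mapping from q(x) to f(x) are illustrative assumptions rather than the paper's exact choices.

```python
import numpy as np

def generate_texture(w_plus, w_minus, shape=(128, 128), n_gray=2, K=10, seed=0):
    """Illustrative RNN texture generation on a wrap-around lattice.

    w_plus / w_minus map a neighbor offset (di, dj) to the excitatory /
    inhibitory weight from neighbor (i+di, j+dj) into pixel (i, j); the
    values may be hand-picked or taken from a trained RNN."""
    rng = np.random.default_rng(seed)
    q = rng.random(shape)                               # step 2: random q(x) in (0, 1)
    r = sum(w_plus.values()) + sum(w_minus.values())    # firing rate r(x), same everywhere
    for _ in range(K):                                  # step 4: iterate the RNN equations
        lam_plus = np.zeros(shape)
        lam_minus = np.zeros(shape)
        for (di, dj), w in w_plus.items():
            lam_plus += w * np.roll(q, shift=(di, dj), axis=(0, 1))
        for (di, dj), w in w_minus.items():
            lam_minus += w * np.roll(q, shift=(di, dj), axis=(0, 1))
        q = np.clip(lam_plus / (r + lam_minus + 1e-12), 0.0, 1.0 - 1e-6)
    # Quantize the final neuron states into G gray levels (assumed mapping q -> f).
    return np.minimum((q * n_gray).astype(int), n_gray - 1)

# Example: a binary texture from hand-picked weights that excite horizontal
# neighbors and inhibit vertical ones.
w_plus = {(0, 1): 1.0, (0, -1): 1.0}
w_minus = {(1, 0): 1.0, (-1, 0): 1.0}
img = generate_texture(w_plus, w_minus, n_gray=2)
```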

  13. Modeling Synthetic Textures • In these simulations, we begin by assuming specific weights for the RNN and generate a synthetic texture. The texture learning procedure is then applied to the generated synthetic texture image, with a 3 x 3 training window, and the RNN weights are obtained. • These weights are then used to generate another synthetic texture (again using the generation procedure). • The original and final textures are then compared. Although all our experiments yield visually similar realizations, as shown in the figures, we also calculate some of the statistical features derived from the co-occurrence matrix, such as the energy, contrast, entropy, and homogeneity of the textures.

  14. Synthetic binary textures generated with specified (left) and estimated (right) $W^+$ and $W^-$ parameters

  15. Synthetic gray-level textures generated with specified (left) and estimated (right) $W^+$ and $W^-$ parameters

  16. Natural (left) and synthetic (right) textures, panels (a)-(d)

  17. Co-occurrence Matrix • Given the following 4 x 4 image that contains 3 different gray levels • The 3 x 3 gray level co-occurrence matrix for a displacement vector d = (dx, dy) = (1, 0) is given by:
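
The sketch below shows one straightforward way to compute such a co-occurrence matrix in Python; the 4 x 4 example image is made up for illustration and is not the image from the slide.

```python
import numpy as np

def cooccurrence_matrix(img, d=(1, 0), levels=3):
    """Gray level co-occurrence matrix for displacement d = (dx, dy):
    C[a, b] counts the pixel pairs with img[y, x] == a and img[y+dy, x+dx] == b."""
    dx, dy = d
    H, W = img.shape
    C = np.zeros((levels, levels), dtype=int)
    for y in range(H):
        for x in range(W):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < H and 0 <= x2 < W:     # count only pairs inside the image
                C[img[y, x], img[y2, x2]] += 1
    return C

# A made-up 4 x 4 image with gray levels {0, 1, 2} (not the slide's example).
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 1, 0]])
print(cooccurrence_matrix(img, d=(1, 0), levels=3))
```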

  18. Co-occurrence Matrix Features • Basic co-occurrence matrix statistical features

  19. Co-occurrence Matrix Features
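
Since the slides' formulas are not reproduced in this transcript, the sketch below uses the standard textbook definitions of energy, contrast, entropy, and homogeneity, computed from a normalized co-occurrence matrix such as the one produced above.

```python
import numpy as np

def glcm_features(C):
    """Energy, contrast, entropy, and homogeneity of a co-occurrence matrix C,
    using the standard textbook definitions (assumed, since the slides' own
    formulas are not shown in the transcript)."""
    P = C / C.sum()                                    # normalize to joint probabilities
    i, j = np.indices(P.shape)
    return {
        "energy":      float(np.sum(P ** 2)),
        "contrast":    float(np.sum((i - j) ** 2 * P)),
        "entropy":     float(-np.sum(P[P > 0] * np.log2(P[P > 0]))),
        "homogeneity": float(np.sum(P / (1.0 + np.abs(i - j)))),
    }

# Example: compare an original and a regenerated texture by computing
# glcm_features on their respective co-occurrence matrices.
```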

  20. Binary synthetic texture (iterations 1, 3, 6, and 9); binary synthetic texture after training the RNN (iterations 1, 3, 6, and 9)

  21. Binary synthetic texture after training (iterations 1, 3, 6, and 9)

  22. Binary synthetic texture after training (iterations 1, 3, 6, and 9)

  23. Conclusions
