Presentation Transcript


  1. Content-Based Image Retrieval Using Grey Relational Analysis. Dept. of Computer Engineering, Tatung University. Presenter: Tienwei Tsai (蔡殿偉)

  2. Outline • Introduction • Feature Extraction • Color Space Transformation • Discrete Cosine Transform (DCT) • Similarity Measurement • Problem Description • Grey Relational Analysis • Grey Similarity Measurement • Experimental Results • Conclusions

  3. Introduction (1/4) • Query-by-text has two major drawbacks • Each image in the database has to be described by keywords, which is extremely time consuming. • The expressive power of keywords is limited and cannot be exhaustive. • Retrieving images according to the image contents is called content-based image retrieval (CBIR). • In CBIR, image objects are typically represented by • Low-level features such as color, texture, and shape. • Usually stored in the form of a feature vector.

  4. Introduction (2/4) • Searching for the k most-similar images to a query image involves • Comparing the feature vectors of all the images in the database with that of the query image using some pre-selected similarity measure. • This requires linear time with respect to the size of the database and quickly becomes impractical for large databases. • A compact representation of the features needs to be carefully chosen before applying any indexing technique.

  5. Introduction (3/4) • We intend to achieve an acceptable retrieval result using • A content descriptor formed from a single feature: the Y-component in the YUV color space. • The first 16 DCT coefficients transformed from the Y-component of an image. • The run-time complexity of this method is low, since the feature vector is short.

  6. Introduction (4/4) • While retrieving images, the query results are uncertain to some extent; therefore, the retrieval process can be regarded as a grey system. • The Grey Relational Analysis (GRA) method • Describes the relationship between a main factor and all the other factors in an uncertain and incomplete grey environment. • We develop a CBIR matching technique based on the GRA method. • This method achieves highly efficient retrieval with acceptable accuracy.

  7. Feature Extraction (1/5) • CBIR often comprises both indexing and retrieval. • Feature extraction finds the suitable properties of interest and converts them into mathematical “feature vectors”. • The feature vectors are stored in the database along with the original images. • It is a critical phase in CBIR, because the subsequent retrieval process depends upon the correctness of the selected features.

  8. Feature Extraction (2/5) • DCT is one of the best filters for feature extraction in the spatial frequency domain. • The steps of the proposed feature extraction • Transform an RGB image into the YUV color space. • Calculate the low-frequency DCT coefficients for the Y-component.

  9. Feature Extraction (3/5) • YUV color space • Y: luminance (or brightness) Y(x, y) = 0.299 R(x, y) + 0.587 G(x, y) + 0.114 B(x, y) • U: blue chrominance U(x, y) = 0.492 (B(x, y) - Y(x, y)) • V: red chrominance V(x, y) = 0.877 (R(x, y) - Y(x, y)) • Psycho-perceptual studies have shown that the human brain perceives images largely based on their luminance value, and only secondarily based on their color information. (A conversion sketch follows below.)
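The following minimal sketch (not part of the original slides) applies the formulas above to an RGB image stored as a NumPy float array with values in 0..255; the helper name rgb_to_yuv is an assumption.

```python
import numpy as np

def rgb_to_yuv(rgb):
    """Convert an H x W x 3 RGB array (float) into Y, U, V channels
    using the luminance/chrominance formulas quoted on the slide."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance
    u = 0.492 * (b - y)                     # blue chrominance
    v = 0.877 * (r - y)                     # red chrominance
    return y, u, v
```

Only the Y channel is needed later, since the proposed descriptor is built from the luminance component alone.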

  10. Feature Extraction (4/5) • Discrete Cosine Transform • DCT coefficients are generated for an N×N image on a pixel-by-pixel basis. The N×N DCT coefficients thus capture the textural energy of the image. C(u, v) = α(u) α(v) Σ(i = 0..N-1) Σ(j = 0..N-1) x(i, j) cos[(2i+1)uπ / 2N] cos[(2j+1)vπ / 2N], where x(i, j) is the pixel value at the (i, j) coordinate position in the image, C(u, v) is the DCT-domain representation of x(i, j), and α(u) = √(1/N) for u = 0 and √(2/N) for u = 1, …, N-1 (α(v) is defined in the same way).
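A direct NumPy implementation of this transform for one square N×N block could look like the sketch below (not from the slides; the function name dct2 is an assumption).

```python
import numpy as np

def dct2(x):
    """2-D DCT of a square N x N block, following C(u, v) as defined above."""
    n = x.shape[0]
    i = np.arange(n)
    # basis[u, i] = cos((2i + 1) * u * pi / (2N))
    basis = np.cos(np.outer(np.arange(n), 2 * i + 1) * np.pi / (2 * n))
    alpha = np.full(n, np.sqrt(2.0 / n))   # alpha(u) = sqrt(2/N) for u > 0
    alpha[0] = np.sqrt(1.0 / n)            # alpha(0) = sqrt(1/N)
    return np.outer(alpha, alpha) * (basis @ x @ basis.T)
```

For real images, library routines such as scipy.fft.dct (applied along both axes with norm='ortho') compute the same orthonormal DCT-II much more efficiently.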

  11. Feature Extraction (5/5) • The coefficients with small u and v correspond to low-frequency components. • For most images, much of the signal energy lies at low frequencies. • We generate a feature vector formed from the low-frequency DCT terms transformed from the Y-component of an image. • Our experiments have shown that using a 4×4 block of low-frequency coefficients performs well in terms of retrieval quality (see the sketch below).
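Putting the steps together, a minimal sketch of the feature extraction could look as follows, assuming the Y channel from the conversion step above is passed in and using SciPy's orthonormal DCT-II; the name extract_feature is hypothetical, not from the slides.

```python
import numpy as np
from scipy.fft import dct

def extract_feature(y_channel):
    """16-element feature vector: the 4 x 4 lowest-frequency 2-D DCT
    coefficients of the Y (luminance) channel, flattened."""
    coeffs = dct(dct(y_channel, norm='ortho', axis=0), norm='ortho', axis=1)
    return coeffs[:4, :4].flatten()   # low-frequency corner (small u, v)
```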

  12. Similarity Measurement (1/5) • The query feature extraction first converts a user's exemplary image into a mathematical feature vector compatible with the feature vectors stored in the database. • Similarity matching is typically defined by a distance function, e.g. the Euclidean distance (see the sketch below). • Fewer items in a feature vector lead to faster matching at the expense of accuracy.
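As a baseline for the grey-relational measure introduced next, a Euclidean-distance ranking over the stored feature vectors could be sketched as follows (not from the slides; names are hypothetical, and the database features are assumed to be stacked row-wise in a NumPy array).

```python
import numpy as np

def top_k_euclidean(query_vec, db_vecs, k=10):
    """Return indices of the k database images whose feature vectors
    are closest to the query vector under Euclidean distance."""
    dists = np.linalg.norm(db_vecs - query_vec, axis=1)  # one distance per image
    return np.argsort(dists)[:k]                         # smallest distances first
```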

  13. Similarity Measurement (2/5) • Grey system theory can perform grey relational analysis (GRA) for data sequences. • GRA can be used to determine the relational grade between the reference sequence and each sequence in the given set. • The most similar sequence can then be found by sorting the resulting relational grades.

  14. Similarity Measurement (3/5) • The GRA method can be described as follows. • Suppose the reference sequence is y0 = (y0(1), y0(2), …, y0(n)), where n stands for the number of items in a data sequence. • Denote the m sequences to be compared by xi = (xi(1), xi(2), …, xi(n)), i = 1, …, m. • Define the grey relational coefficient between y0 and xi at the kth item as follows: γ(y0(k), xi(k)) = (Δmin + ζ Δmax) / (Δ0i(k) + ζ Δmax), where Δ0i(k) = |y0(k) - xi(k)|, Δmin = min over i, k of Δ0i(k), and Δmax = max over i, k of Δ0i(k).

  15. Similarity Measurement (4/5) where i = 1, …, m, k = 1, …, n, and ζ is a distinguishing coefficient that controls the resolution between Δmax and Δmin. It is set to 0.5 in this study.

  16. Similarity Measurement (5/5) • The grey relational grade for an entire sequence is Γ(y0, xi) = (1/n) Σ(k = 1..n) γ(y0(k), xi(k)). • Γ(y0, xi) represents the degree of similarity between the sequence xi and the reference sequence y0. • The larger the grey relational grade is, the higher the relative similarity of xi to y0 (a sketch of the computation follows below).
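A minimal NumPy sketch of this measurement, assuming the query and database feature vectors are the 16-element DCT vectors from the feature-extraction step; the function and variable names are hypothetical.

```python
import numpy as np

def grey_relational_grades(query_vec, db_vecs, zeta=0.5):
    """Grey relational grade of each database vector x_i w.r.t. the query y0.

    query_vec : reference sequence y0, shape (n,)
    db_vecs   : compared sequences x_i, shape (m, n)
    zeta      : distinguishing coefficient (0.5, as on the slides)
    """
    delta = np.abs(db_vecs - query_vec)      # delta_0i(k) = |y0(k) - xi(k)|
    d_min, d_max = delta.min(), delta.max()  # global min / max over all i and k
    coeff = (d_min + zeta * d_max) / (delta + zeta * d_max)  # relational coefficients
    return coeff.mean(axis=1)                # average over k: one grade per sequence

# The most similar images are those with the largest grades, e.g.:
# top_10 = np.argsort(-grey_relational_grades(q, db))[:10]
```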

  17. Experimental Results • Two image databases are used in the system: • 1000 images downloaded from the WBIIS database. • 1000 images downloaded from the test database used in the SIMPLIcity paper. • No pre-processing was done on the images. • We conducted three queries using the proposed GRA method in this system. • There has been very little systematic evaluation of CBIR, particularly in situations in which the user does not have a target image in mind.

  18. Figure 1. The main screen of the proposed system

  19. The first query Figure 2. Retrieved results using the GRA method. An image of a flower was given in the first query. We obtained eight correct answers (related to flowers) in the top 10 images.

  20. The second query Figure 3. Retrieved results using the GRA method. An image of the sunset was given in the second query. We obtained three correct answers (related to sunsets) in the top 10 images.

  21. The third query Figure 4. Retrieved results using the GRA method. An image of some people was given in the third query. We obtained nine correct answers (related to people) in the top 10 images.

  22. Conclusions (1/2) • A CBIR method based on GRA is proposed. • Each image is first transformed from the standard RGB color space to the YUV space; then the Y component of the image is further transformed to its DCT domain. • The low-frequency DCT coefficients are used to extract low-level features from the images because of the DCT's superior energy-compaction property.

  23. Conclusions (2/2) • The retrieval process is treated as a grey system. • We define a measure using the GRA method to indicate the degree of similarity between the query image and the candidate images in the database. • The images with the largest grey relational grades are chosen as the query results. • The experimental system demonstrated the efficiency and effectiveness of the proposed method.
