Image Retrieval Based on the Wavelet Features of Interest

Te-Wei Chiang, Tienwei Tsai, and Yo-Ping Huang

2006/10/10

Outline

1. Introduction

2. Proposed Image Retrieval System

3. Experimental Results

4. Conclusions

1. Introduction
  • Two approaches for image retrieval:
    • query-by-text (QBT): annotation-based image retrieval (ABIR)
    • query-by-example (QBE): content-based image retrieval (CBIR)
  • Standard CBIR techniques can only find images that exactly match the user query.
In QBE, images are retrieved by measuring the similarity between the query image and every candidate image in the database.
    • e.g., using the Euclidean distance between feature vectors (a minimal sketch follows this list).
  • Transform-based feature extraction techniques:
    • Wavelet, Walsh, Fourier, 2-D moment, DCT, and Karhunen-Loève.
  • In our approach, the wavelet transform is used to extract low-level texture features.
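A minimal sketch of this distance-based ranking, assuming the features have already been extracted into fixed-length vectors; the function and variable names are ours, not the paper's:

```python
import numpy as np

def rank_by_euclidean_distance(query_features, database_features):
    """Rank database images by Euclidean distance to the query feature vector.

    query_features    : 1-D NumPy array of features from the query image
    database_features : dict mapping image id -> 1-D feature array
    Returns (image_id, distance) pairs sorted from most to least similar.
    """
    ranked = []
    for image_id, features in database_features.items():
        distance = np.linalg.norm(query_features - features)  # Euclidean distance
        ranked.append((image_id, distance))
    return sorted(ranked, key=lambda pair: pair[1])
```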
In this paper, we focus on the QBE approach: the user gives an example image similar to the one he/she is looking for.
  • Finally, the images in the database with the smallest distances to the query image are returned, ranked according to their similarity.
System Architecture
  • This system consists of two major modules:
    • the feature extraction module
    • the similarity measuring module.
  • In the image database establishing phase:
    • each image is first transformed from the standard RGB color space to the YUV space;
    • then each component (i.e., Y, U, and V) of the image is further transformed to the wavelet domain.
  • In the image retrieving phase:
    • the similarity measuring module compares the most significant wavelet coefficients of the Y, U, and V components of the query image with those of the images in the database and finds good matches.
  • To benefit from user-machine interaction, a GUI is developed that allows users to adjust the weight of each feature according to their preferences (the two phases are sketched below).
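A minimal sketch of this two-phase architecture, using a hypothetical `ImageRetrievalSystem` class and a trivial stand-in feature extractor (per-channel means) in place of the YUV/wavelet features described above:

```python
import numpy as np

class ImageRetrievalSystem:
    """Skeleton of the two phases: database establishing (feature extraction)
    and retrieval (weighted distance comparison). The feature extractor below
    is a trivial stand-in, not the wavelet features used in the paper."""

    def __init__(self):
        self.index = {}  # image id -> feature vector

    def extract_features(self, image):
        # Stand-in feature: mean of each channel of an (H, W, 3) array.
        return image.reshape(-1, 3).mean(axis=0)

    def add_image(self, image_id, image):
        # Database establishing phase: extract and store features once.
        self.index[image_id] = self.extract_features(image)

    def query(self, query_image, weights=(1.0, 1.0, 1.0)):
        # Retrieving phase: rank stored images by weighted squared difference;
        # the weights model the user-adjustable preferences of the GUI.
        q = self.extract_features(query_image)
        w = np.asarray(weights, dtype=float)
        scores = {iid: float(np.sum(w * (q - f) ** 2))
                  for iid, f in self.index.items()}
        return sorted(scores.items(), key=lambda kv: kv[1])
```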
Feature Extraction
  • Features are functions of the measurements performed on a class of objects (or patterns) that enable that class to be distinguished from other classes in the same general category.
  • Color Space Transformation:
    • RGB (Red, Green, and Blue) -> YUV (luminance and chroma channels)

YUV color space
  • YUV is based on the Y primary and chrominance.
    • The Y primary was specifically designed to follow the luminous efficiency function of human eyes.
    • Chrominance is the difference between a color and a reference white at the same luminance.
  • The following equations are used to convert from the RGB to the YUV color space (implemented in the sketch after this list):
    • Y(x, y) = 0.299 R(x, y) + 0.587 G(x, y) + 0.114 B(x, y),
    • U(x, y) = 0.492 (B(x, y) - Y(x, y)), and
    • V(x, y) = 0.877 (R(x, y) - Y(x, y)).
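For illustration, a direct NumPy implementation of these three equations (the function name is ours):

```python
import numpy as np

def rgb_to_yuv(rgb):
    """Convert an (H, W, 3) RGB image to YUV using the equations above."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance
    u = 0.492 * (b - y)                     # blue-difference chrominance
    v = 0.877 * (r - y)                     # red-difference chrominance
    return np.stack([y, u, v], axis=-1)
```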
Discrete Wavelet Transform
  • Mallat's pyramid algorithm is used to compute the multi-level wavelet decomposition (a one-level sketch follows).
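As an illustration of one step of the pyramid decomposition, the sketch below computes a single level of a 2-D Haar decomposition of one channel; the transcript does not state which wavelet filter is used, so the Haar filter here is an assumption made for simplicity:

```python
import numpy as np

def haar_dwt2_level(channel):
    """One level of a 2-D Haar wavelet decomposition of a single channel.

    Assumes both side lengths are even. Returns the LL, LH, HL, HH subbands;
    applying the function again to LL yields the next level of the pyramid.
    """
    x = np.asarray(channel, dtype=float)
    # Filter along rows: low-pass (pairwise sums) and high-pass (differences).
    lo = (x[:, 0::2] + x[:, 1::2]) / np.sqrt(2.0)
    hi = (x[:, 0::2] - x[:, 1::2]) / np.sqrt(2.0)
    # Filter along columns of each half to obtain the four subbands.
    ll = (lo[0::2, :] + lo[1::2, :]) / np.sqrt(2.0)
    lh = (lo[0::2, :] - lo[1::2, :]) / np.sqrt(2.0)
    hl = (hi[0::2, :] + hi[1::2, :]) / np.sqrt(2.0)
    hh = (hi[0::2, :] - hi[1::2, :]) / np.sqrt(2.0)
    return ll, lh, hl, hh
```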
Similarity Measurement
  • In our experimental system, we define a measure called the sum of squared differences (SSD) to indicate the degree of distance (or dissimilarity).
  • The distance between the query image Q and a database image Xn under the Y component and the LL(k) subband can be defined as:
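The formula itself appears only as an image in the original slides; a plausible reconstruction from the SSD definition, in our own notation, is

$$D^{Y}_{LL^{(k)}}(Q, X_n) = \sum_{i}\sum_{j}\Big(Y^{Q}_{LL^{(k)}}(i,j) - Y^{X_n}_{LL^{(k)}}(i,j)\Big)^{2},$$

where Y^{Q}_{LL^{(k)}}(i,j) denotes the (i, j)-th wavelet coefficient of the LL(k) subband of the Y component of the query image Q, and Y^{X_n}_{LL^{(k)}}(i,j) the corresponding coefficient of the n-th database image Xn.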
The distance between Q and Xn under the Y component can be defined as the weighted combination of the LL(k), LH(k), HL(k), and HH(k) subband distances:
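This formula is likewise an image in the original; a plausible reconstruction, where the subband weights w are our own notation (corresponding to the user-adjustable weights mentioned earlier), is

$$D^{Y}(Q, X_n) = w_{LL}\,D^{Y}_{LL^{(k)}}(Q, X_n) + w_{LH}\,D^{Y}_{LH^{(k)}}(Q, X_n) + w_{HL}\,D^{Y}_{HL^{(k)}}(Q, X_n) + w_{HH}\,D^{Y}_{HH^{(k)}}(Q, X_n).$$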
Likewise, the distances between Q and Xn under the U and V components can be defined.
  • Then, the overall distance between Q and Xn can be defined as:
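A plausible reconstruction of this overall distance, again with component weights w_Y, w_U, w_V as our own notation, is

$$D(Q, X_n) = w_{Y}\,D^{Y}(Q, X_n) + w_{U}\,D^{U}(Q, X_n) + w_{V}\,D^{V}(Q, X_n).$$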
3. Experimental Results
  • 1000 images downloaded from the WBIIS database are used to demonstrate the effectiveness of our system.
  • The images are mostly photographic and have various contents, such as natural scenes, animals, insects, buildings, people, and so on.
4. Conclusions
  • In this paper, a content-based image retrieval method based on the DWT is proposed.
  • To achieve QBE, the system compares the most significant wavelet coefficients of the Y, U, and V components of the query image with those of the images in the database and finds good matches with the help of the user's cognitive ability.
  • Since there is no feature capable of covering all aspects of an image, the discrimination performance is highly dependent on the selection of features and the images involved.
  • Since several features are used simultaneously, it is necessary to integrate similarity scores resulting from the matching processes.
Future Work
  • For each type of feature, we will continue investigating and improving its ability to describe the image and its performance in similarity measurement.