
Color-Based Image Retrieval Using Perceptually Modified Hausdorff Distance


In most content-based image retrieval systems, color information is used extensively for its simplicity and generality. Owing to its compactness in characterizing global information, a uniform quantization of colors, or histogram, has been the most commonly used color descriptor. However, a cluster-based representation, or signature, has been proven more compact and theoretically sounder than a histogram for increasing discriminatory power and reducing the gap between human perception and computer-aided retrieval systems. Despite these advantages, only a few papers have broached dissimilarity measures based on the cluster-based nonuniform quantization of colors. In this paper, we extract a perceptual representation of an original color image, a statistical signature, by modifying the general color signature; it consists of a set of points with statistical volumes. We also present a novel dissimilarity measure for statistical signatures, called the perceptually modified Hausdorff distance (PMHD), which is based on the Hausdorff distance. As a result, the proposed retrieval system views an image as a statistical signature and uses the PMHD as the metric between statistical signatures. Precision-versus-recall results show that the proposed dissimilarity measure generally outperforms all other dissimilarity measures on an unmodified commercial image database.

1. Introduction

With the explosive growth of digital image collections, content-based image retrieval (CBIR) has emerged as one of the most active and challenging problems in computer vision as well as multimedia applications. Content-based image retrieval differs from traditional text-based image retrieval in that images are indexed by visual features, such as color, texture, and shape [1–3]. In order to reflect human perception precisely, many image retrieval systems based on the query-by-example scheme have been developed, including QBIC [4], PhotoBook [5], VisualSEEK [6], and MARS [7]. In practice, low-level visual features do not properly capture human perceptual concepts, so closing the gap between them remains an ongoing problem. However, a series of psychophysical experiments reported a significant correlation between visual features and semantically relevant information [8]. Based on these findings, many techniques have been introduced to improve perceptual visual features and dissimilarity measures, enabling semantically correct retrieval [1, 9–14].

Among the variety of visual features, color information is the most frequently used. The color histogram (or fixed-binning histogram) is widely employed as a color descriptor owing to its simplicity of implementation and insensitivity to similarity transformations [9, 15]. However, in some cases these simple histogram-based indexing methods fail to match perceptual (dis)similarity [16]. Moreover, since the color histogram is sensitive to variation in color distribution, the performance of these methods usually depends severely on the quantization of the color space. To overcome these drawbacks, a clustering-based representation, the signature (or adaptive-binning color histogram), has been proposed [12–14, 16–21]. Based on the psychophysical finding that at the first stage of perception the human visual system identifies the dominant colors and cannot simultaneously perceive a large number of colors [12], cluster-based techniques generally extract dominant colors and their proportions to describe the overall color information. A signature compactly represents a set of clusters in a color space together with the distribution of color features, and can therefore reduce the complexity of representation and the cost of the retrieval process.

Once two sets of visual features, represented by histograms or signatures, are given, we need to determine how similar one is to the other. A number of different dissimilarity measures have been proposed in various areas of computer vision. Specifically for histograms, the Jeffrey divergence, histogram intersection, and χ²-statistics have been known to work well. However, these dissimilarity measures cannot be directly applied to signatures. As an alternative, Rubner and Tomasi [16] proposed a novel dissimilarity measure for matching signatures, the Earth Mover's Distance (EMD), which overcomes most of the drawbacks of histogram-based dissimilarity measures and handles partial matching between two images. Dorado and Izquierdo [17] also used the EMD as a metric to compare fuzzy color signatures. However, the computational complexity of the EMD is very high compared with other dissimilarity measures. Leow and Li [19] proposed a new dissimilarity measure for signatures called weighted correlation (WC), which is more reliable than the Euclidean distance and computationally more efficient than EMD. Generally, WC performed better than EMD; however, in some cases it produced worse results than the Jeffrey divergence (JD) [22]. Mojsilović et al. [12] introduced a perceptual color distance metric, the optimal color composition distance (OCCD), based on the optimal mapping between the dominant color components, with their area percentages, of two images.

In this paper, we extract a compact representation of an original color image, a statistical signature, by modifying the general color signature; it consists of representative color features and their statistical volumes. We then propose a novel dissimilarity measure for matching statistical signatures, based on the Hausdorff distance. The Hausdorff distance is an effective metric for measuring the dissimilarity between two sets of points [23–25], and is also robust to outliers and geometric variations to a certain degree. Recently, it has been applied to video indexing and retrieval [26]; however, it was designed simply for the color histogram model. To overcome this drawback, we propose a new perceptually modified Hausdorff distance (PMHD) as a measure of dissimilarity between statistical signatures that is consistent with human perception. Moreover, to cope with the partial matching problem, a partial PMHD metric is designed by incorporating an outlier detection scheme. Experimental results on a real image database show that the proposed metric outperforms other conventional dissimilarity measures.

This paper is organized as follows. In Section 2, we introduce a statistical signature as a color descriptor. Section 3 proposes a novel dissimilarity measure, PMHD, and partial PMHD for partial matching. Then, Section 4 presents the experimental results and discussions on the effectiveness of the proposed metric. Finally, conclusions are drawn in Section 5.

2. A Color Image Descriptor: A Statistical Signature

In order to retrieve images visually similar to a query image using color information, a proper color descriptor for the images should be designed. It has recently been proven that a signature can describe a color distribution more efficiently than a color histogram [16, 17, 19]. Moreover, a signature describes each image independently of the other images in an image database.

In this paper, we represent an original color image by a statistical signature defined as

S = { (m_k, n_k, Σ_k) | k = 1, …, N },    (1)

where N is the number of clusters, m_k is the mean feature vector of the k-th cluster, n_k is the number of features that belong to the k-th cluster, and Σ_k is the covariance matrix of the k-th cluster. A variety of clustering methods can be used to construct a statistical signature from a color image. In this paper, we used the k-means algorithm [27] to cluster color features in the CIELab color space.
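As an illustration, the signature-extraction step above can be sketched in Python. This is a minimal NumPy k-means; the function and parameter names are ours, and a real implementation would first convert pixels to CIELab before clustering:

```python
import numpy as np

def statistical_signature(pixels, n_clusters=10, n_iter=50, seed=0):
    """Build a statistical signature: a list of (mean, size, covariance)
    triples, one per nonempty color cluster. Minimal k-means sketch;
    `pixels` is an (N, 3) array of color features."""
    pixels = np.asarray(pixels, dtype=float)
    rng = np.random.default_rng(seed)
    # initialize centers with randomly chosen distinct pixels
    centers = pixels[rng.choice(len(pixels), n_clusters, replace=False)]
    for _ in range(n_iter):
        # assign every feature to its nearest cluster center
        dists = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centers[k] = pixels[labels == k].mean(axis=0)
    signature = []
    for k in range(n_clusters):
        members = pixels[labels == k]
        if len(members) == 0:
            continue  # drop empty clusters
        cov = (np.cov(members, rowvar=False) if len(members) > 1
               else np.zeros((pixels.shape[1], pixels.shape[1])))
        signature.append((members.mean(axis=0), len(members), cov))
    return signature
```

Each triple corresponds to one (m_k, n_k, Σ_k) entry of the signature described above.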

Figure 1 shows two sample images quantized using the proposed statistical signature. Observe that little perceptual color degradation occurs, despite the large reduction of representation data in color space achieved by the clustering.

Figure 1

Sample images quantized using k-means clustering: (a) original image with 256 758 colors, and quantized images based on a statistical signature with (b) 10 colors, and (c) 30 colors.

3. A Novel Dissimilarity Measure for a Statistical Signature

3.1. Hausdorff Distance

It has been shown in the computer vision literature [23–25, 28] that the Hausdorff distance (HD) is an effective metric for measuring the dissimilarity between two sets of points, while being insensitive to variations and noise.

In this section, we briefly describe the HD; more details can be found in [23–25, 28]. Given two finite point sets A = {a_1, …, a_p} and B = {b_1, …, b_q}, the HD is defined as

H(A, B) = max( h(A, B), h(B, A) ),    (2)

where

h(A, B) = max_{a ∈ A} min_{b ∈ B} ||a − b||,    (3)

and the function h(A, B) is the directed HD between the two point sets.
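The definition above translates directly into code. A small NumPy sketch (the function names are illustrative):

```python
import numpy as np

def directed_hd(A, B):
    """Directed HD: h(A, B) = max over a in A of min over b in B of ||a - b||."""
    # pairwise Euclidean distances between every a in A and every b in B
    dists = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return float(dists.min(axis=1).max())

def hausdorff(A, B):
    """Undirected HD: H(A, B) = max(h(A, B), h(B, A))."""
    return max(directed_hd(A, B), directed_hd(B, A))
```

Note the asymmetry of the directed distance: h(A, B) and h(B, A) generally differ, which is why the undirected HD takes their maximum.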

3.2. Perceptually Modified Hausdorff Distance

In this paper, we propose a novel dissimilarity measure based on the HD, called the perceptually modified Hausdorff distance (PMHD), for comparing statistical signatures.

Given two statistical signatures A = { (m_k^A, n_k^A, Σ_k^A) } and B = { (m_l^B, n_l^B, Σ_l^B) }, the dissimilarity between them is defined by

PMHD(A, B) = max( h_PM(A, B), h_PM(B, A) ),    (4)

where h_PM(A, B) and h_PM(B, A) are directed Hausdorff distances between the two statistical signatures.

The directed Hausdorff distance is defined as

h_PM(A, B) = ( Σ_k n_k^A · min_l [ d(m_k^A, m_l^B) / min(n_k^A, n_l^B) ] ) / Σ_k n_k^A,    (5)

where d(m_k^A, m_l^B) is the distance between the two color features m_k^A in A and m_l^B in B. In this paper, we consider three different choices for d: the Euclidean distance, the CIE94 color difference, and the Mahalanobis distance. In order to guarantee that the distance is perceptually uniform, the CIE94 color-difference equation is used instead of the Euclidean distance in CIELab color space [29, 30]. While the Euclidean distance and CIE94 simply measure the geometric distance between two feature vectors without considering the distribution of color features, the Mahalanobis distance explicitly accounts for the distribution of color features after the clustering process [31]. The three distances are defined as follows.

(i) Euclidean distance:

d_E(m^A, m^B) = ||m^A − m^B|| = ( Σ_j (m_j^A − m_j^B)² )^{1/2},    (6)

where m_j^A and m_j^B are the elements of m^A and m^B, respectively.

(ii) CIE94 color difference:

d_94(m^A, m^B) = [ (ΔL/(k_L S_L))² + (ΔC/(k_C S_C))² + (ΔH/(k_H S_H))² ]^{1/2},    (7)

where ΔL, ΔC, and ΔH are the differences in lightness, chroma, and hue between m^A and m^B, k_L, k_C, and k_H are parametric factors, and S_L, S_C, and S_H are weighting functions.

(iii) Mahalanobis distance:

d_M(m^A, m^B) = [ (m^A − m^B)^T Σ^{-1} (m^A − m^B) ]^{1/2},    (8)

where Σ is the covariance matrix of the corresponding cluster.
Note that, in order to take the size of clusters into account during matching, we penalize the distance between two color feature vectors by the minimum of their corresponding sizes, as in (5). This reflects the fact that color features with a large size influence the perceptual similarity between images more than smaller ones [12]. Consider the example in Figure 2(a). There are two pairs of feature vectors, denoted by circles centered at the mean feature vectors; the radius of each circle represents the size of the corresponding feature. If we compute only the geometric distance, without considering the sizes of the feature vectors, the two pairwise distances will be equal. Perceptually, however, the distance between the pair of large features must be smaller than that between the pair of small ones. Another example is given in Figure 2(b), where three feature vectors are shown. Again, if we consider only the geometric distances, the ordering of the two pairwise distances is the reverse of the perceptual ordering, because the denser features dominate the perceived similarity.

Figure 2

An example of perceptual dissimilarity based on the densities of two color features.

Thus, by combining a set-theoretic metric with this perceptual notion in the dissimilarity measure, the proposed PMHD becomes relatively insensitive to variations of the mean color features in a signature and consistent with human perception.
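Putting the pieces together, the PMHD can be sketched as below. The size-penalized ground distance and the size-weighted averaging follow the description above, though the exact normalization is our reconstruction, not a verbatim transcription of the paper's equation:

```python
import numpy as np

def directed_pmhd(sig_a, sig_b):
    """Directed PMHD between two signatures, each given as a list of
    (mean_vector, size) pairs. The ground distance is divided by the
    minimum of the two cluster sizes, so matches between large clusters
    count as perceptually closer; the per-cluster minima are then
    averaged with cluster-size weights."""
    num, den = 0.0, 0.0
    for ma, na in sig_a:
        best = min(float(np.linalg.norm(ma - mb)) / min(na, nb)
                   for mb, nb in sig_b)
        num += na * best
        den += na
    return num / den

def pmhd(sig_a, sig_b):
    """Undirected PMHD: the maximum of the two directed distances."""
    return max(directed_pmhd(sig_a, sig_b), directed_pmhd(sig_b, sig_a))
```

The sketch reproduces the Figure 2(a) intuition: two large clusters a fixed geometric distance apart are scored as more similar than two small clusters the same distance apart.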

3.3. Partial PMHD Metric for Partial Matching

In certain cases, a user may have only partial information about the target images as the query, or may want to retrieve all images containing partial information of the query. In these cases, conventional techniques with a global descriptor are not appropriate. Like a color histogram, a signature is a global descriptor of the whole image, so the direct application of the HD as in (4) cannot cope with occlusion and clutter in image retrieval or object recognition [16, 28, 32]. To handle partial matching, Huttenlocher et al. [23] proposed a rank-based partial HD, which measures the difference between portions of point sets, and Azencott et al. [25] further modified the rank-based partial HD using order statistics. However, these distances were shown to be sensitive to parameter changes. To address these problems, Sim et al. [28] proposed two robust HD measures, M-HD and LTS-HD, based on robust statistics such as M-estimation and the least trimmed square (LTS). Unfortunately, they are not appropriate for an image retrieval system because they are computationally too complex for searching a large database.

In this paper, in order to remedy the partial matching problem, we first detect and exclude outliers by an outlier test function, and then apply the proposed PMHD to the remaining feature points. Let us define the outlier test function for a color feature a of signature A against signature B by

O(a) = 1 if min_{b ∈ B} d(a, b) > θ, and O(a) = 0 otherwise,    (9)

where θ is a prespecified threshold for the outlier detection. The above function indicates that a is an inlier if O(a) = 0, and an outlier otherwise.

Now let us define two directed Hausdorff distances, without and with outliers, by

h_in(A, B): the directed distance of (5), computed over only the inlier clusters of A,    (10)

h_all(A, B): the directed distance of (5), computed over all clusters of A.    (11)

Then the new modified directed partial PMHD is obtained by

h_P(A, B) = h_in(A, B) if the total size of the excluded outlier clusters is at most a fraction λ of the total cluster size of A, and h_P(A, B) = h_all(A, B) otherwise,    (12)

where λ is a prespecified threshold that controls the fraction of information loss.
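The partial matching scheme can be sketched as follows, under the assumption that a cluster is an outlier when its nearest raw ground distance to the other signature exceeds a threshold (called `theta` here), and that the inlier-only distance is used only while the discarded cluster weight stays below a fraction (`lam`) of the total; both names and the exact fallback rule are our reconstruction:

```python
import numpy as np

def partial_directed_pmhd(sig_a, sig_b, theta, lam):
    """Partial directed PMHD sketch over signatures given as lists of
    (mean_vector, size) pairs. Clusters of sig_a whose nearest raw
    ground distance to sig_b exceeds `theta` are excluded as outliers,
    provided the discarded weight stays below `lam` times the total;
    otherwise the full cluster set is used."""
    ground = lambda x, y: float(np.linalg.norm(x - y))
    total = sum(n for _, n in sig_a)
    nearest = [(min(ground(m, mb) for mb, _ in sig_b), m, n) for m, n in sig_a]
    inliers = [(m, n) for d, m, n in nearest if d <= theta]
    dropped = total - sum(n for _, n in inliers)
    if not inliers or dropped > lam * total:
        inliers = sig_a  # too much information would be lost: keep all
    # size-weighted average of size-penalized nearest distances
    num = sum(n * min(ground(m, mb) / min(n, nb) for mb, nb in sig_b)
              for m, n in inliers)
    den = sum(n for _, n in inliers)
    return num / den
```

With a tight `theta`, a cluster that has no counterpart in the other signature (e.g., an occluding object) no longer inflates the distance.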

4. Experimental Results

4.1. The Database and Queries

To evaluate the retrieval precision and recall of the proposed retrieval system, several experiments were conducted on a real database. We used 5200 images selected, without any modification, from the commercially available Corel color image database. There are 52 semantic categories, each containing 100 images. Among these, we chose four categories, Cheetah, Eagle, Pyramids, and Royal guards, as queries. Some example query images are shown in Figure 3. Note in Figure 3 that, since the original categorization of the images was not based on color information, a substantial amount of color variation exists even within the same category. Nonetheless, in this experiment we used all images in these four categories as queries. For every query we computed a precision and recall pair, which is the most commonly used retrieval performance measurement [33]. The precision and recall are defined as


Precision = R_r / T_r,    Recall = R_r / R_t,

where R_r is the number of retrieved relevant images, T_r is the total number of retrieved images, and R_t is the total number of relevant images in the whole database. Precision measures the accuracy of the retrieval, and recall measures the effectiveness of the retrieval performance.
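These two quantities are straightforward to compute from the sets of retrieved and relevant image identifiers; a minimal sketch:

```python
def precision_recall(retrieved, relevant):
    """Precision = relevant retrieved / total retrieved;
    recall = relevant retrieved / total relevant in the database."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    return hits / len(retrieved), hits / len(relevant)
```

Sweeping the number of retrieved images and recording each (precision, recall) pair produces the precision-recall curves reported below.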

Figure 3

Example query images from four categories in the Corel database. (a) Eagle, (b) Cheetah, (c) Pyramids, and (d) Royal guards.

4.2. Retrieval Results for Queries

The performance of the proposed PMHD was compared with five well-known dissimilarity measures: histogram intersection (HI), χ²-statistics, Jeffrey divergence (JD), and the quadratic-form (QF) distance for the fixed-binning histogram, and EMD for the signature.

Let H^A and H^B represent two color histograms or signatures. These five dissimilarity measures are then defined as follows.

(1) Histogram intersection (HI) [34]:

HI(H^A, H^B) = Σ_i min(h_i^A, h_i^B) / Σ_i h_i^B,

where h_i^A is the number of elements in the i-th bin of H^A.

(2) χ²-statistics:

χ²(H^A, H^B) = Σ_i (h_i^A − ĥ_i)² / ĥ_i,

where ĥ_i = (h_i^A + h_i^B) / 2.

(3) Jeffrey divergence (JD) [22]:

JD(H^A, H^B) = Σ_i ( h_i^A log(h_i^A / ĥ_i) + h_i^B log(h_i^B / ĥ_i) ),

where again ĥ_i = (h_i^A + h_i^B) / 2.

(4) Quadratic-form (QF) distance [4, 35]:

QF(H^A, H^B) = ( (h^A − h^B)^T S (h^A − h^B) )^{1/2},

where h^A and h^B are the histograms in vector form and S = [s_ij] is a similarity matrix that encodes the cross-bin relationships based on the perceptual similarity of the representative colors of bins i and j.

(5) EMD [16, 36]:

EMD(H^A, H^B) = Σ_i Σ_j f_ij d_ij / Σ_i Σ_j f_ij,

where d_ij denotes the dissimilarity between the i-th and j-th bins, and f_ij is the optimal flow between the two distributions. The total cost Σ_i Σ_j f_ij d_ij is minimized subject to the constraints

f_ij ≥ 0,   Σ_j f_ij ≤ h_i^A,   Σ_i f_ij ≤ h_j^B,   Σ_i Σ_j f_ij = min( Σ_i h_i^A, Σ_j h_j^B ).
As reported in [36], EMD yielded very good retrieval performance for small sample sizes, while JD and χ² performed very well for larger sample sizes. Leow and Li [19] proposed a novel dissimilarity measure, weighted correlation (WC), which can compare two histograms with different binnings. In image retrieval, the performance of WC was comparable to other dissimilarity measures, but not as good as JD. Therefore, in this paper, we evaluated only the performance of JD.

In order to represent a color image as a fixed histogram, the RGB color space was uniformly partitioned into color bins, and each color was quantized to the centroid of its cubic bin. Meanwhile, as mentioned in Section 2, a statistical signature was extracted by applying k-means clustering. To compare the performance of the signature-based dissimilarity with the fixed-histogram-based ones, the quantization level was matched by clustering each color image into only 10 color feature clusters. The mean color quantization error of the fixed-bin histogram was 5.99 CIE94 units, and that of the quantized image based on a statistical signature containing 10 color feature vectors was 5.26 CIE94 units. Note that the difference between the two quantization errors is smaller than the perceptibility threshold of 2.2 CIE94 units [37], below which two colors are perceptually indistinguishable [19].

The retrieval results of the proposed metric and the other dissimilarity measures are summarized by the precision-recall curves in Figure 4. The proposed PMHD dissimilarity measure significantly outperformed the other dissimilarity measures for all query images: the precision of PMHD is, on average, higher than the second-highest precision rate over the meaningful recall values. The performance of PMHD with the Euclidean distance is almost the same as that of PMHD with CIE94, and it usually performed best in the image retrieval.

Somewhat surprisingly, EMD performed worse than the other dissimilarity measures in all query categories except "Eagle." This is not consistent with the results reported in [16, 36], where EMD performed very well for small sample sizes and compact representations but less well for large sample sizes and wide representations. As indicated in [19], the image size, the number of color features in a signature, and the ground distance may degrade the overall performance of EMD. However, as mentioned before, we used a signature with only 10 color features in this experiment, which is a very compact representation. We therefore suspect that the large image size of 98 304 pixels or so and the Euclidean ground distance severely degraded the performance of EMD.

Figure 4

Precision-recall curves for various dissimilarity measures on four query categories: (a) Eagle, (b) Cheetah, (c) Pyramids, and (d) Royal guards.

4.3. Dependency on the Number of Color Features in a Signature

In general, the quantization level of a color space, that is, the number of clusters in a signature or the number of bins in a fixed histogram, has an important effect on the overall image retrieval performance. To investigate this effect, we examined the performance of the proposed method as a function of the number of color features in a signature. In this experiment, two quantization levels, 10 and 30, were compared. The mean color error for 30 color features was 3.38 CIE94 units, much smaller than the 5.26 CIE94 units of the statistical signature with 10 color features. Figures 1(b) and 1(c) show two sample quantizations of Figure 1(a) with 10 and 30 colors, respectively. Note that the quantized image with 30 color features is almost indistinguishable from the original image, which contains 256 758 colors.

Figure 5 plots the precision-recall curves of the image retrieval results as the number of color features in a signature varies. We compared the retrieval performance of the proposed PMHD only with EMD, since EMD was the only other dissimilarity measure applicable to signatures. As depicted in Figure 5, the precision of EMD did not vary significantly as the number of color features in a signature increased. However, the precision of PMHD (especially with the Euclidean and CIE94 distances) with 30 color features became higher than that with 10 color features. From this result, we can expect the performance of the proposed PMHD to improve as the quantization error decreases. Moreover, this implies that PMHD performs especially well for large sample sizes as well as for compact representations.

Figure 5

Comparison of the retrieval performance for varying the number of color features in a signature: (a) Eagle, (b) Cheetah, (c) Pyramids, and (d) Royal guards.

4.4. Partial Matching

To assess the performance of the proposed partial PMHD, the same four queries in Figure 3 were used. The precision-recall performance was obtained by varying the two thresholds, the outlier-detection threshold and the information-loss threshold. Figure 6 plots the best performances, and the corresponding parameters are shown in Table 1.

Table 1 The best parameters (outlier-detection and information-loss thresholds) for partial matching.
Figure 6

Precision-recall curves for the partial matching: (a) Eagle, (b) Cheetah, (c) Pyramids, and (d) Royal guards.

Note that although the differences between the retrieval performances of the two metrics were not significantly large, the largest occurring in the case of Eagle, the partial PMHD mostly outperformed the full PMHD.

There are some problems in employing the partial PMHD. First, as can be seen in Table 1, it is difficult to automatically determine appropriate parameters that suit all queries; the parameter values depend heavily on the type of query. Second, the performance of the partial PMHD can be worse than that of the PMHD at high recall rates, as shown in Figure 6(a). Moreover, the complexity of the partial PMHD is somewhat higher than that of the PMHD. Thus, in order to exploit the advantages of the partial PMHD for CBIR, these drawbacks should be properly addressed.

5. Conclusion

In this paper, we proposed a novel dissimilarity measure for color signatures, the perceptually modified Hausdorff distance (PMHD), based on the Hausdorff distance. The PMHD is insensitive to changes in the characteristics of the mean color features in a signature, and is theoretically sound in incorporating human perception into the metric. Also, to deal with partial matching, the partial PMHD was defined, which explicitly removes outliers using an outlier detection function.

Extensive experimental results on a real database showed that the proposed PMHD outperformed other conventional dissimilarity measures; on average, its precision was higher than that of the second-best measure. The performance of the partial PMHD was also tested on the same database. Although some problems remain, including its higher complexity and the need to find optimal parameters, the partial PMHD mostly outperformed the PMHD and showed great potential for general CBIR applications.

In this paper, we have used only color information for the signature. However, recent studies have shown that combining multiple cues, including color, texture, scale, and relevance feedback, can drastically improve results and close the semantic gap. Thus, combining these multiple sources of information in a multiresolution framework will be our future work.


  1. Rui Y, Huang TS, Chang S-F: Image retrieval: current techniques, promising directions, and open issues. Journal of Visual Communication and Image Representation 1999,10(1):39-62. 10.1006/jvci.1999.0413


  2. Ma WY, Zhang HJ: Content-Based Image Indexing and Retrieval, Handbook of Multimedia Computing. CRC Press, Boca Raton, Fla, USA; 1999.


  3. Ionescu B, Lambert P, Coquin D, Buzuloiu V: Color-based content retrieval of animation movies: a study. Proceedings of the International Workshop on Content-Based Multimedia Indexing (CBMI '07), June 2007, Talence, France 295-302.


  4. Flickner M, Sawhney H, Niblack W, et al.: Query by image and video content: the QBIC system. Computer 1995,28(9):23-32. 10.1109/2.410146


  5. Pentland A, Picard RW, Sclaroff S: Photobook: content-based manipulation of image databases. International Journal of Computer Vision 1996,18(3):233-254. 10.1007/BF00123143


  6. Smith JR, Chang S-F: VisualSEEk: a fully automated content-based image query system. Proceedings of the 4th ACM International Conference on Multimedia (MULTIMEDIA '96), November 1996, Boston, Mass, USA 87-98.


  7. Rui Y, Huang TS, Mehrotra S: Content-based image retrieval with relevance feedback in MARS. Proceedings of the International Conference on Image Processing (ICIP '97), October 1997, Santa Barbara, Calif, USA 2: 815-818.


  8. Rogowitz BE, Frese T, Smith JR, Bouman CA, Kalin EB: Perceptual image similarity experiments. Human Vision and Electronic Imaging III, January 1998, San Jose, Calif, USA, Proceedings of SPIE 3299: 576-590.


  9. Smeulders AWM, Worring M, Santini S, Gupta A, Jain R: Content-based image retrieval at the end of the early years. IEEE Transactions on Pattern Analysis and Machine Intelligence 2000,22(12):1349-1380. 10.1109/34.895972


  10. Wang T, Rui Y, Sun J-G: Constraint based region matching for image retrieval. International Journal of Computer Vision 2004,56(1-2):37-45.


  11. Tieu K, Viola P: Boosting image retrieval. International Journal of Computer Vision 2004,56(1-2):17-36.


  12. Mojsilović A, Hu J, Soljanin E: Extraction of perceptually important colors and similarity measurement for image matching, retrieval, and analysis. IEEE Transactions on Image Processing 2002,11(11):1238-1248. 10.1109/TIP.2002.804260


  13. Chen J, Pappas TN, Mojsilović A, Rogowitz BE: Adaptive perceptual color-texture image segmentation. IEEE Transactions on Image Processing 2005,14(10):1524-1536.


  14. Huang X, Zhang S, Wang G, Wang H: A new image retrieval method based on optimal color matching. Proceedings of the International Conference on Image Processing, Computer Vision & Pattern Recognition (IPCV '06), June 2006, Las Vegas, Nev, USA 1: 276-281.


  15. Qiu G, Lam K-M: Frequency layered color indexing for content-based image retrieval. IEEE Transactions on Image Processing 2003,12(1):102-113. 10.1109/TIP.2002.806228


  16. Rubner Y, Tomasi C: Perceptual Metrics for Image Database Navigation. Kluwer Academic Publishers, Norwell, Mass, USA; 2001.


  17. Dorado A, Izquierdo E: Fuzzy color signatures. Proceedings of the International Conference on Image Processing (ICIP '02), September 2002, Rochester, NY, USA 1: 433-436.


  18. Wan X, Jay Kuo C-C: A new approach to image retrieval with hierarchical color clustering. IEEE Transactions on Circuits and Systems for Video Technology 1998,8(5):628-643. 10.1109/76.718509


  19. Leow WK, Li R: The analysis and applications of adaptive-binning color histograms. Computer Vision and Image Understanding 2004,94(1–3):67-91.


  20. Theoharatos C, Economou G, Fotopoulos S, Laskaris NA: Color-based image retrieval using vector quantization and multivariate graph matching. Proceedings of the IEEE International Conference on Image Processing (ICIP '05), September 2005, Genova, Italy 1: 537-540.


  21. Sun J, Zhang X, Cui J, Zhou L: Image retrieval based on color distribution entropy. Pattern Recognition Letters 2006,27(10):1122-1126. 10.1016/j.patrec.2005.12.014


  22. Puzicha J, Hofmann T, Buhmann JM: Non-parametric similarity measures for unsupervised texture segmentation and image retrieval. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '97), June 1997, San Juan, Puerto Rico 267-272.


  23. Huttenlocher DP, Klanderman GA, Rucklidge WJ: Comparing images using the Hausdorff distance. IEEE Transactions on Pattern Analysis and Machine Intelligence 1993,15(9):850-863. 10.1109/34.232073


  24. Dubuisson M-P, Jain AK: A modified Hausdorff distance for object matching. Proceedings of the 12th IAPR International Conference on Pattern Recognition, Conference A: Computer Vision & Image Processing (ICPR '94), October 1994, Jerusalem, Israel 1: 566-568.


  25. Azencott R, Durbin F, Paumard J: Multiscale identification of building in compressed large aerial scenes. Proceedings of 13th International Conference on Pattern Recognition (ICPR '96), August 1996, Vienna, Austria 3: 974-978.


  26. Kim SH, Park R-H: A novel approach to video sequence matching using color and edge features with the modified Hausdorff distance. Proceedings of the International Symposium on Circuits and Systems (ISCAS '04), May 2004, Vancouver, Canada 2: 57-60.


  27. Duda RO, Hart PE, Stork DG: Pattern Classification. John Wiley & Sons, New York, NY, USA; 2001.


  28. Sim D-G, Kwon O-K, Park R-H: Object matching algorithms using robust Hausdorff distance measures. IEEE Transactions on Image Processing 1999,8(3):425-429. 10.1109/83.748897


  29. Plataniotis KN, Venetsanopoulos AN: Color Image Processing and Applications. Springer, New York, NY, USA; 2000.


  30. Melgosa M: Testing CIELAB-based color-difference formulas. Color Research & Application 2000,25(1):49-55. 10.1002/(SICI)1520-6378(200002)25:1<49::AID-COL7>3.0.CO;2-4


  31. Imai FH, Tsumura N, Miyake Y: Perceptual color difference metric for complex images based on Mahalanobis distance. Journal of Electronic Imaging 2001,10(2):385-393. 10.1117/1.1350559


  32. Gouet V, Boujemaa N: About optimal use of color points of interest for content-based image retrieval. In Research Report RR-4439. INRIA Rocquencourt, Paris, France; 2002.


  33. Del Bimbo A: Visual Information Retrieval. Morgan Kaufmann, San Francisco, Calif, USA; 1999.


  34. Swain MJ, Ballard DH: Color indexing. International Journal of Computer Vision 1991,7(1):11-32. 10.1007/BF00130487


  35. Hafner J, Sawhney HS, Equitz W, Flickner M, Niblack W: Efficient color histogram indexing for quadratic form distance functions. IEEE Transactions on Pattern Analysis and Machine Intelligence 1995,17(7):729-736. 10.1109/34.391417


  36. Puzicha J, Buhmann JM, Rubner Y, Tomasi C: Empirical evaluation of dissimilarity measures for color and texture. Proceedings of the 7th IEEE International Conference on Computer Vision (ICCV '99), September 1999, Kerkyra, Greece 2: 1165-1172.


  37. Song T, Luo R: Testing color-difference formulae on complex images using a CRT monitor. Proceedings of the 8th IS&T/SID Color Imaging Conference (IS&T '00), November 2000, Scottsdale, Ariz, USA 44-48.




Acknowledgments

This work was supported in part by the ITRC program by Ministry of Information and Communication and in part by Defense Acquisition Program Administration and Agency for Defense Development, Korea, through the Image Information Research Center under Contract no. UD070007AD.

Author information



Corresponding author

Correspondence to Kyoung Mu Lee.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Park, B.G., Lee, K.M. & Lee, S.U. Color-Based Image Retrieval Using Perceptually Modified Hausdorff Distance. J Image Video Proc 2008, 263071 (2007).



  • Image Retrieval
  • Color Feature
  • Hausdorff Distance
  • Color Histogram
  • Retrieval Performance