In this paper we present a method for query-by-example retrieval of color images that takes into account human perception of the number of colors present in the images being compared. First, we perform HSV-space segmentation to extract regions of perceptually relevant color and build a low-level color representation. The segmented result is then used to build image indices by taking the representative color vector of each extracted color region. For retrieval, instead of color histograms, we implement a vector-angular similarity measure together with a perceptually tuned membership function, which yields results consistent with experimentally obtained human judgments. Perceptual data collected through human testing governs how many colors two images may have relative to each other for the images to be considered in similarity calculations. Initial results show that there is a well-defined range for this color cardinality, which increases as the number of colors grows.
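As an illustration of the angle-based comparison described above, the following sketch computes a similarity score from the angle between two representative color vectors. This is a minimal, hypothetical stand-in for the paper's vector-angular measure, not its exact formulation; the normalization to [0, 1] via the arccosine is an assumption.

```python
import math

def angular_similarity(a, b):
    """Angle-based similarity between two representative color vectors.

    Returns a value in [0, 1]: 1.0 for vectors pointing in the same
    direction (same chromatic direction, regardless of intensity),
    0.0 for orthogonal vectors. Illustrative sketch only; the paper's
    actual measure may differ.
    """
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos_theta = max(-1.0, min(1.0, dot / (norm_a * norm_b)))
    return 1.0 - (2.0 / math.pi) * math.acos(cos_theta)
```

Because the score depends only on the angle, two regions with the same hue direction but different brightness compare as identical, which is one reason angle-based measures are sometimes preferred over histogram distances.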