- Research Article
- Open access
A Multimodal Constellation Model for Object Image Classification
EURASIP Journal on Image and Video Processing volume 2010, Article number: 426781 (2010)
Abstract
We present an efficient method for object image classification. The method is an extension of the constellation model, which is a part-based model. The constellation model generally has two weak points: (1) it is essentially a unimodal model, which makes it unsuitable for categories with many types of appearances, and (2) the probability function that represents the model is computationally expensive to evaluate. We introduce multimodalization and a speed-up technique to the constellation model to overcome these weak points. The proposed model consists of multiple subordinate constellation models, each describing a different type of appearance of an object category, which increases the description accuracy and consequently improves the classification performance. In this paper, we present how to describe each type of appearance as a subordinate constellation model without any prior knowledge of the types of appearances, and how to implement the learning of the extended model in realistic time. In experiments, we confirmed the effectiveness of the proposed model by comparison with methods based on Bag-of-Features (BoF), and also confirmed that the model learning can be carried out in realistic time.
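To illustrate the multimodalization idea at a high level, the sketch below models a category as a weighted mixture of subordinate components and classifies a feature vector by the highest mixture likelihood. This is only a simplified illustration under strong assumptions, not the authors' actual formulation: the class names (`SubordinateModel`, `MultimodalModel`, `classify`) are hypothetical, each subordinate constellation model is reduced here to a diagonal Gaussian over a fixed-length part-feature vector, and the paper's part-shape terms and speed-up technique are not shown.

```python
import numpy as np


class SubordinateModel:
    """One subordinate component, simplified to a diagonal Gaussian over a
    part-feature vector (the full constellation model also covers part
    positions, occlusion, etc.)."""

    def __init__(self, mean, var):
        self.mean = np.asarray(mean, dtype=float)
        self.var = np.asarray(var, dtype=float)

    def log_likelihood(self, x):
        diff = np.asarray(x, dtype=float) - self.mean
        d = self.mean.size
        return -0.5 * (np.sum(diff ** 2 / self.var)
                       + np.sum(np.log(self.var))
                       + d * np.log(2.0 * np.pi))


class MultimodalModel:
    """Mixture of subordinate models: each component is meant to capture one
    appearance type of the category."""

    def __init__(self, components, weights):
        self.components = components
        self.weights = np.asarray(weights, dtype=float)

    def log_likelihood(self, x):
        # log of the weighted sum of component likelihoods (log-sum-exp).
        logs = np.log(self.weights) + np.array(
            [c.log_likelihood(x) for c in self.components])
        m = logs.max()
        return m + np.log(np.exp(logs - m).sum())


def classify(x, category_models):
    """Return the category whose multimodal model assigns the highest likelihood."""
    return max(category_models, key=lambda name: category_models[name].log_likelihood(x))


# Toy usage: two categories, each with two appearance types.
category_models = {
    "car":  MultimodalModel([SubordinateModel([0, 0], [1, 1]),
                             SubordinateModel([5, 5], [1, 1])], [0.5, 0.5]),
    "face": MultimodalModel([SubordinateModel([0, 5], [1, 1]),
                             SubordinateModel([5, 0], [1, 1])], [0.5, 0.5]),
}
print(classify([4.8, 4.7], category_models))  # -> "car"
```

In this toy setup, a unimodal model forced to cover both appearance types of a category with a single Gaussian would describe neither type well; splitting the category into subordinate components is what the multimodalization aims to exploit.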
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
About this article
Cite this article
Kamiya, Y., Takahashi, T., Ide, I. et al. A Multimodal Constellation Model for Object Image Classification. J Image Video Proc 2010, 426781 (2010). https://doi.org/10.1155/2010/426781
DOI: https://doi.org/10.1155/2010/426781