Adaptive Information Selection in Images: Efficient Naive Bayes Nearest Neighbor Classification
by Thomas Reineking, Tobias Kluth, David Nakath
Abstract:
We propose different methods for adaptively selecting information in images during object recognition. In contrast to standard feature selection, we consider this problem in a Bayesian framework where features are sequentially selected based on the current belief distribution over object classes. We define three different selection criteria and provide efficient Monte Carlo algorithms for the selection. In particular, we extend the successful Naive Bayes Nearest Neighbor (NBNN) classification approach, which is very costly to compute in its original form. We show that the proposed information selection methods result in a significant speed-up because only a small number of features needs to be extracted for accurate classification. In addition to adaptive methods based on the current belief distribution, we also consider image-based selection methods and we evaluate the performance of the different methods on a standard object recognition data set.
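For context, the vanilla NBNN decision rule that the paper extends can be sketched as follows. This is a minimal NumPy illustration of the standard rule from the literature, not code from the paper; the function name, brute-force nearest-neighbor search, and toy data layout are our own. The exhaustive per-descriptor search shown here is precisely the cost the paper's adaptive selection methods aim to avoid.

```python
import numpy as np

def nbnn_classify(descriptors, class_databases):
    """Vanilla NBNN: assign the image to the class minimizing the sum of
    squared distances from each image descriptor to its nearest neighbor
    in that class's descriptor database.

    descriptors: (n, d) array of local descriptors from the test image.
    class_databases: dict mapping class label -> (m_c, d) array of
    training descriptors for that class (a hypothetical data layout).
    """
    best_class, best_cost = None, np.inf
    for label, db in class_databases.items():
        # Pairwise squared Euclidean distances, shape (n, m_c).
        dists = ((descriptors[:, None, :] - db[None, :, :]) ** 2).sum(axis=-1)
        # For each descriptor, take the distance to its nearest neighbor
        # in this class, then sum over all descriptors.
        cost = dists.min(axis=1).sum()
        if cost < best_cost:
            best_class, best_cost = label, cost
    return best_class
```

Note that every descriptor of the test image is compared against every training descriptor of every class, which is why extracting and matching only a small, adaptively chosen subset of features (as the paper proposes) yields a significant speed-up.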
Reference:
Adaptive Information Selection in Images: Efficient Naive Bayes Nearest Neighbor Classification (Thomas Reineking, Tobias Kluth, David Nakath), Chapter in Computer Analysis of Images and Patterns, Springer Science + Business Media, 2015.
Bibtex Entry:
@InCollection{Reineking2015,
  author    = {Thomas Reineking and Tobias Kluth and David Nakath},
  title     = {Adaptive Information Selection in Images: Efficient Naive {Bayes} Nearest Neighbor Classification},
  booktitle = {Computer Analysis of Images and Patterns},
  publisher = {Springer Science + Business Media},
  year      = {2015},
  pages     = {350--361},
  abstract  = {We propose different methods for adaptively selecting information in images during object recognition. In contrast to standard feature selection, we consider this problem in a Bayesian framework where features are sequentially selected based on the current belief distribution over object classes. We define three different selection criteria and provide efficient Monte Carlo algorithms for the selection. In particular, we extend the successful Naive Bayes Nearest Neighbor (NBNN) classification approach, which is very costly to compute in its original form. We show that the proposed information selection methods result in a significant speed-up because only a small number of features needs to be extracted for accurate classification. In addition to adaptive methods based on the current belief distribution, we also consider image-based selection methods and we evaluate the performance of the different methods on a standard object recognition data set.},
  doi       = {10.1007/978-3-319-23192-1_29},
  keywords  = {former_inproceedings},
  url       = {http://dx.doi.org/10.1007/978-3-319-23192-1_29},
}