Séminaire Données et APprentissage Artificiel

The Burbea-Rao and Bhattacharyya centroids

04/11/2010
Speaker(s): Frank Nielsen (Laboratoire d'Informatique (LIX), École Polytechnique)
We study the centroid with respect to the class of information-theoretic distortion measures called Burbea-Rao divergences. Burbea-Rao divergences generalize the Jensen-Shannon divergence by measuring the non-negative Jensen difference induced by a strictly convex and differentiable function expressing a measure of entropy. We first show that a symmetrization of Bregman divergences, called Jensen-Bregman distances, yields a natural definition of Burbea-Rao divergences. We then define skew Burbea-Rao divergences, and prove that skew Burbea-Rao divergences amount to computing Bregman divergences in the asymptotic cases. We prove that Burbea-Rao centroids are always unique, and we design a generic iterative algorithm for efficiently estimating these centroids with guaranteed convergence.

In statistics, the Bhattacharyya distance is widely used to measure the degree of overlap between probability distributions. This distance is all the more useful as it provides both upper and lower bounds on the Bayes misclassification error, and coincides at the infinitesimal level with the Fisher information. We show that computing the Bhattacharyya distance between members of the same exponential family amounts to computing a Burbea-Rao divergence. As a byproduct, we obtain an efficient algorithm for computing the Bhattacharyya centroid of a set of parametric distributions belonging to a common exponential family, improving on former specialized methods that were mostly limited to univariate or "diagonal" multivariate Gaussians.
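For reference, the quantities in the abstract can be written out explicitly (standard definitions consistent with the description above; the notation is a sketch, not necessarily the talk's):

\[ BR_F(p, q) = \frac{F(p) + F(q)}{2} - F\!\left(\frac{p + q}{2}\right) \;\geq\; 0 \]

\[ BR_F^{(\alpha)}(p, q) = \alpha F(p) + (1 - \alpha) F(q) - F(\alpha p + (1 - \alpha) q), \qquad \alpha \in (0, 1) \]

\[ D_B(p_{\theta_1}, p_{\theta_2}) = -\ln \int \sqrt{p_{\theta_1}(x)\, p_{\theta_2}(x)}\, \mathrm{d}x = BR_F(\theta_1, \theta_2) \]

In the last identity, F is the log-normalizer of the exponential family; this is the link between Bhattacharyya distances and Burbea-Rao divergences that the abstract refers to. For the skew case, a first-order Taylor expansion shows that \( \frac{1}{\alpha} BR_F^{(\alpha)}(p, q) \) tends to the Bregman divergence \( B_F(p, q) \) as \( \alpha \to 0 \), which is one way to read the asymptotic statement above.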
http://www.informationgeometry.org/BurbeaRao/
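As an illustration, here is a minimal Python sketch of one plausible fixed-point (CCCP-style) iteration for the Burbea-Rao centroid, obtained by zeroing the gradient of the average Jensen difference. The generator F (negative Shannon entropy, giving the Jensen-Shannon case), the helper names, and the unconstrained positive-orthant setting are assumptions made for this example, not necessarily the algorithm presented in the talk.

import numpy as np

def F(x):
    # Assumed generator: negative Shannon entropy (Jensen-Shannon case).
    return np.sum(x * np.log(x))

def grad_F(x):
    # Componentwise gradient of F: log(x) + 1.
    return np.log(x) + 1.0

def grad_F_inv(y):
    # Inverse of the gradient map: exp(y - 1).
    return np.exp(y - 1.0)

def burbea_rao(p, q):
    # Jensen difference induced by F (non-negative by convexity of F).
    return 0.5 * (F(p) + F(q)) - F(0.5 * (p + q))

def burbea_rao_centroid(points, iters=100):
    # Fixed point of: grad_F(c) = mean_i grad_F((c + p_i) / 2),
    # the stationarity condition of the average Jensen difference.
    c = np.mean(points, axis=0)  # start at the arithmetic mean
    for _ in range(iters):
        c = grad_F_inv(np.mean([grad_F(0.5 * (c + p)) for p in points],
                               axis=0))
    return c

# Usage: centroid of three positive vectors. A simplex constraint, as for
# the Jensen-Shannon centroid proper, would need an extra normalization step.
pts = np.array([[0.2, 0.8], [0.5, 0.5], [0.7, 0.3]])
print(burbea_rao_centroid(pts))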

Sahar.Changuel (at) lip6.fr