Empirical Inference Conference Paper 2011

Risk-Based Generalizations of f-divergences

We derive a generalized notion of f-divergences, called (f,l)-divergences. We show that this generalization enjoys many of the nice properties of f-divergences, although it is a richer family. It also provides alternative definitions of standard divergences in terms of surrogate risks. As a first practical application of this theory, we derive a new estimator for the Kullback-Leibler divergence that we use for clustering sets of vectors.
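For context, the objects being generalized are the standard f-divergence and its Kullback-Leibler special case; the definitions below are textbook ones, not quoted from the paper. For a convex function f with f(1) = 0 and distributions P, Q with densities p, q:

D_f(P \,\|\, Q) = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx, \qquad \mathrm{KL}(P \,\|\, Q) = \int p(x) \log \frac{p(x)}{q(x)}\, dx \quad \text{(the case } f(t) = t \log t\text{)}.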

Author(s): García-García, D. and von Luxburg, U. and Santos-Rodríguez, R.
Pages: 417-424
Year: 2011
Month: July
Editors: Getoor, L. and Scheffer, T.
Publisher: International Machine Learning Society
BibTeX Type: Conference Paper (inproceedings)
Address: Madison, WI, USA
Event Name: 28th International Conference on Machine Learning (ICML 2011)
Event Place: Bellevue, WA, USA
ISBN: 978-1-4503-0619-5

BibTeX

@inproceedings{GarciaGarciavS2011,
  title = {Risk-Based Generalizations of f-divergences},
  booktitle = {Proceedings of the 28th International Conference on Machine Learning (ICML 2011)},
  abstract = {We derive a generalized notion of f-divergences, called (f,l)-divergences. We show that this generalization enjoys many of the nice properties of f-divergences, although it is a richer family. It also provides alternative definitions of standard divergences in terms of surrogate risks. As a first practical application of this theory, we derive a new estimator for the Kullback-Leibler divergence that we use for clustering sets of vectors.},
  pages = {417-424},
  editor = {Getoor, L. and Scheffer, T.},
  publisher = {International Machine Learning Society},
  address = {Madison, WI, USA},
  month = jul,
  year = {2011},
  slug = {garciagarciavs2011},
  author = {García-García, D. and von Luxburg, U. and Santos-Rodríguez, R.},
  month_numeric = {7}
}