Empirical Inference Conference Paper 2008

Learning with Transformation Invariant Kernels

This paper considers kernels invariant to translation, rotation and dilation. We show that no non-trivial positive definite (p.d.) kernels exist which are both radial and dilation invariant, only conditionally positive definite (c.p.d.) ones. Accordingly, we discuss the c.p.d. case and provide some novel analysis, including an elementary derivation of a c.p.d. representer theorem. On the practical side, we give a support vector machine (s.v.m.) algorithm for arbitrary c.p.d. kernels. For the thin-plate kernel this leads to a classifier with only one parameter (the amount of regularisation), which we demonstrate to be as effective as an s.v.m. with the Gaussian kernel, even though the Gaussian involves a second parameter (the length scale).
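The abstract's central claim, that the radial, dilation-invariant thin-plate kernel is only conditionally positive definite, can be checked numerically. The sketch below (not from the paper; the random data and tolerance are illustrative) builds the thin-plate kernel matrix k(r) = r² log r on points in R² and verifies that the quadratic form zᵀKz is non-negative for coefficient vectors z that annihilate polynomials of degree ≤ 1, i.e. c.p.d. of order two, while the unrestricted matrix generally has negative eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40
X = rng.standard_normal((n, 2))  # random points in R^2

# Pairwise distances and the thin-plate kernel k(r) = r^2 log r, with k(0) = 0.
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
with np.errstate(divide="ignore", invalid="ignore"):
    K = np.where(D > 0, D**2 * np.log(D), 0.0)

# c.p.d. of order 2: z^T K z >= 0 whenever z annihilates degree-1 polynomials,
# i.e. P^T z = 0 for the polynomial basis P = [1, x1, x2].
P = np.column_stack([np.ones(n), X])
Q, _ = np.linalg.qr(P)
Proj = np.eye(n) - Q @ Q.T          # orthogonal projector onto {z : P^T z = 0}
M = Proj @ K @ Proj
eigs = np.linalg.eigvalsh((M + M.T) / 2)

print("min eigenvalue on the constrained subspace:", eigs.min())
print("min eigenvalue of the raw kernel matrix:  ", np.linalg.eigvalsh(K).min())
```

On the constrained subspace the smallest eigenvalue is zero up to rounding, whereas the raw matrix is indefinite, which is why a c.p.d.-aware s.v.m. formulation (as the paper develops) is needed rather than plugging this kernel into a standard p.d. solver.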

Author(s): Walder, C. and Chapelle, O.
Book Title: Advances in neural information processing systems 20
Journal: Advances in Neural Information Processing Systems 20: 21st Annual Conference on Neural Information Processing Systems 2007
Pages: 1561-1568
Year: 2008
Month: September
Editors: Platt, J. C., Koller, D., Singer, Y., Roweis, S.
Publisher: Curran
Bibtex Type: Conference Paper (inproceedings)
Address: Red Hook, NY, USA
Event Name: Twenty-First Annual Conference on Neural Information Processing Systems (NIPS 2007)
Event Place: Vancouver, BC, Canada
ISBN: 978-1-605-60352-0
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik

BibTex

@inproceedings{4736,
  title = {Learning with Transformation Invariant Kernels},
  journal = {Advances in Neural Information Processing Systems 20: 21st Annual Conference on Neural Information Processing Systems 2007},
  booktitle = {Advances in neural information processing systems 20},
  abstract = {This paper considers kernels invariant to translation, rotation and dilation. We show that no non-trivial positive definite (p.d.) kernels exist which are both radial and dilation invariant, only conditionally positive definite (c.p.d.) ones. Accordingly, we discuss the c.p.d. case and provide some novel analysis, including an elementary derivation of a c.p.d. representer theorem. On the practical side, we give a support vector machine (s.v.m.) algorithm for arbitrary c.p.d. kernels. For the thin-plate kernel this leads to a classifier with only one parameter (the amount of regularisation), which we demonstrate to be as effective as an s.v.m. with the Gaussian kernel, even though the Gaussian involves a second parameter (the length scale).},
  pages = {1561-1568},
  editors = {Platt, J. C. and Koller, D. and Singer, Y. and Roweis, S.},
  publisher = {Curran},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  address = {Red Hook, NY, USA},
  month = sep,
  year = {2008},
  slug = {4736},
  author = {Walder, C. and Chapelle, O.},
  month_numeric = {9}
}