Empirical Inference Book Chapter 2007

Training a Support Vector Machine in the Primal

Most literature on Support Vector Machines (SVMs) concentrates on the dual optimization problem. In this paper, we would like to point out that the primal problem can also be solved efficiently, both for linear and non-linear SVMs, and that there is no reason to ignore this possibility. On the contrary, from the primal point of view new families of algorithms for large scale SVM training can be investigated.
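As a minimal illustration of the idea in the abstract, the sketch below trains a linear SVM directly in the primal by gradient descent on a regularized squared hinge loss. This is a simplified stand-in, not the chapter's exact algorithm; the function name, hyperparameters, and toy data are all assumptions for the example.

```python
import numpy as np

def train_primal_svm(X, y, lam=0.1, lr=0.1, n_iter=200):
    """Minimize lam*||w||^2 + (1/n) * sum_i max(0, 1 - y_i w.x_i)^2
    by plain gradient descent on w (illustrative sketch only)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = 1.0 - y * (X @ w)
        active = margins > 0  # only margin-violating points contribute to the loss
        grad = 2 * lam * w - 2 * (X[active].T @ (y[active] * margins[active])) / n
        w -= lr * grad
    return w

# Hypothetical toy data: two well-separated Gaussian clouds, labels -1 / +1
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)

w = train_primal_svm(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

Because the squared hinge loss is differentiable, the primal objective can be attacked with standard unconstrained optimizers (gradient descent here; the chapter discusses faster Newton-type schemes), with no dual variables or constraints in sight.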

Author(s): Chapelle, O.
Book Title: Large Scale Kernel Machines
Pages: 29-50
Year: 2007
Month: September
Series: Neural Information Processing
Editors: Bottou, L., Chapelle, O., DeCoste, D., Weston, J.
Publisher: MIT Press
Bibtex Type: Book Chapter (inbook)
Address: Cambridge, MA, USA
Language: en
Note: This is a slightly updated version of the Neural Computation paper
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik

BibTex

@inbook{4178,
  title = {Training a Support Vector Machine in the Primal},
  booktitle = {Large Scale Kernel Machines},
  abstract = {Most literature on Support Vector Machines (SVMs) concentrates on
  the dual optimization problem. In this paper, we would like to point out
  that the primal problem can also be solved efficiently, both for linear
  and non-linear SVMs, and that there is no reason to ignore this possibility.
  On the contrary, from the primal point of view new families of algorithms for
  large scale SVM training can be investigated.},
  pages = {29--50},
  series = {Neural Information Processing},
  editor = {Bottou, L. and Chapelle, O. and DeCoste, D. and Weston, J.},
  publisher = {MIT Press},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  address = {Cambridge, MA, USA},
  month = sep,
  year = {2007},
  note = {This is a slightly updated version of the Neural Computation paper},
  slug = {4178},
  author = {Chapelle, O.}
}