Empirical Inference Conference Paper 2007

Branch and Bound for Semi-Supervised Support Vector Machines

Semi-supervised SVMs (S3VMs) attempt to learn low-density separators by maximizing the margin over labeled and unlabeled examples. The associated optimization problem is non-convex. To examine the full potential of S3VMs modulo local minima problems in current implementations, we apply branch and bound techniques for obtaining exact, globally optimal solutions. Empirical evidence suggests that the globally optimal solution can return excellent generalization performance in situations where other implementations fail completely. While our current implementation is only applicable to small datasets, we discuss variants that can potentially lead to practically useful algorithms.
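The search described in the abstract, branching over the possible ±1 assignments of the unlabeled points and pruning any subtree whose partial objective already exceeds the best complete solution found so far, can be sketched as follows. This is an illustrative toy, not the paper's algorithm: to keep it self-contained, the inner SVM retraining is replaced by picking the best of a small fixed set of candidate linear separators, and all names and data (`candidates`, `hinge`, the 1-D points) are hypothetical.

```python
# Hypothetical sketch of branch and bound over labelings of the
# unlabeled points. The expensive inner SVM training is replaced by
# choosing the best of a few fixed candidate separators (w, b); this is
# an assumption made purely to keep the example self-contained.

def hinge(w, b, x, y):
    """Hinge loss of a linear separator on one 1-D point."""
    return max(0.0, 1.0 - y * (w * x + b))

def objective(candidates, xs, ys, C=1.0):
    """Best regularized hinge-loss objective over the candidate separators."""
    return min(0.5 * w * w + C * sum(hinge(w, b, x, y) for x, y in zip(xs, ys))
               for w, b in candidates)

def branch_and_bound(candidates, x_lab, y_lab, x_unl, C=1.0):
    """Exact minimization over all +/-1 labelings of the unlabeled points."""
    n = len(x_unl)
    best = {"obj": float("inf"), "labels": None}

    def recurse(assigned):
        k = len(assigned)
        # Lower bound: objective restricted to the labeled and
        # already-assigned points. This is valid because the hinge terms
        # of the still-unassigned points are nonnegative.
        lb = objective(candidates, x_lab + x_unl[:k], y_lab + assigned, C)
        if lb >= best["obj"]:
            return                      # prune the whole subtree
        if k == n:                      # complete labeling reached
            best["obj"], best["labels"] = lb, list(assigned)
            return
        for lab in (+1, -1):            # branch on the next unlabeled point
            recurse(assigned + [lab])

    recurse([])
    return best["obj"], best["labels"]

# Toy data: two labeled points and three unlabeled points on a line.
x_lab, y_lab = [-2.0, 2.0], [-1, +1]
x_unl = [-1.5, 1.5, -2.5]
candidates = [(1.0, 0.0), (0.1, 0.0)]
obj, labels = branch_and_bound(candidates, x_lab, y_lab, x_unl)
# The optimal labeling places each unlabeled point on the side of the
# gap at x = 0, consistent with a low-density separator.
```

The pruning step is what distinguishes this from brute-force enumeration: since the partial objective can only grow as more points are assigned, any branch whose bound already exceeds the incumbent can be discarded without expanding its completions.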

Author(s): Chapelle, O. and Sindhwani, V. and Keerthi, S. S.
Book Title: Advances in Neural Information Processing Systems 19
Journal: Advances in Neural Information Processing Systems 19: Proceedings of the 2006 Conference
Pages: 217-224
Year: 2007
Month: September
Editors: Schölkopf, B., Platt, J., Hofmann, T.
Publisher: MIT Press
Bibtex Type: Conference Paper (inproceedings)
Address: Cambridge, MA, USA
Event Name: Twentieth Annual Conference on Neural Information Processing Systems (NIPS 2006)
Event Place: Vancouver, BC, Canada
ISBN: 0-262-19568-2
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik

BibTex

@inproceedings{4146,
  title = {Branch and Bound for Semi-Supervised Support Vector Machines},
  journal = {Advances in Neural Information Processing Systems 19: Proceedings of the 2006 Conference},
  booktitle = {Advances in Neural Information Processing Systems 19},
  abstract = {Semi-supervised SVMs (S3VMs) attempt to learn low-density separators by maximizing the margin over labeled and unlabeled examples. The associated optimization problem is non-convex. To examine the full potential of S3VMs modulo local minima problems in current implementations, we apply branch and bound techniques for obtaining exact, globally optimal solutions. Empirical evidence suggests that the globally optimal solution can return excellent generalization performance in situations where other implementations fail completely. While our current implementation is only applicable to small datasets, we discuss variants that can potentially lead to practically useful algorithms.},
  pages = {217--224},
  editors = {Sch{\"o}lkopf, B. and Platt, J. and Hofmann, T.},
  publisher = {MIT Press},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  address = {Cambridge, MA, USA},
  month = sep,
  year = {2007},
  slug = {4146},
  author = {Chapelle, O. and Sindhwani, V. and Keerthi, S. S.},
  month_numeric = {9}
}