Empirical Inference Talk 2010

Solving Large-Scale Nonnegative Least Squares

We study the fundamental problem of nonnegative least squares. This problem was apparently introduced by Lawson and Hanson [1] under the name NNLS. As is evident from its name, NNLS seeks least-squares solutions that are also nonnegative. Owing to its wide applicability, numerous algorithms have been derived for NNLS, beginning with the active-set approach of Lawson and Hanson [1] and leading up to the sophisticated interior-point method of Bellavia et al. [2]. We present a new algorithm for NNLS that combines projected subgradients with the non-monotonic gradient descent idea of Barzilai and Borwein [3]. Our resulting algorithm is called BBSG, and we guarantee its convergence by exploiting properties of NNLS in conjunction with projected subgradients. BBSG is surprisingly simple and scales well to large problems. We substantiate our claims by empirically evaluating BBSG and comparing it with established convex solvers and specialized NNLS algorithms. The numerical results suggest that BBSG is a practical method for solving large-scale NNLS problems.
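For intuition, the sketch below combines the two ingredients the abstract names, projection onto the nonnegative orthant and Barzilai-Borwein (BB) step sizes, applied to the smooth objective 0.5*||Ax - b||^2. It is a minimal illustration under stated assumptions, not the authors' BBSG: the function name, the BB1 step-size rule, the fallback step, and the stopping criterion are all illustrative choices.

import numpy as np

def nnls_bb_sketch(A, b, max_iter=500, tol=1e-8):
    """Illustrative projected gradient with BB1 steps for min ||Ax - b||^2, x >= 0.

    Not the authors' BBSG; a generic sketch of the ingredients the abstract names.
    """
    x = np.zeros(A.shape[1])
    g = A.T @ (A @ x - b)                        # gradient of 0.5 * ||Ax - b||^2
    alpha = 1.0                                  # initial step size (assumption)
    for _ in range(max_iter):
        x_new = np.maximum(x - alpha * g, 0.0)   # project step onto x >= 0
        g_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, g_new - g
        x, g = x_new, g_new
        if np.linalg.norm(s) < tol:              # crude stopping rule (assumption)
            break
        sy = s @ y
        alpha = (s @ s) / sy if sy > 0 else 1.0  # BB1 step; fall back if sy <= 0
    return x

# Example on random data
rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 200))
b = rng.standard_normal(1000)
x = nnls_bb_sketch(A, b)
assert (x >= 0).all()

Note that the talk speaks of projected subgradients; the sketch uses the ordinary gradient of the smooth least-squares objective, so the talk's treatment may differ in detail.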

Author(s): Sra, S.
Year: 2010
Month: June
BibTeX Type: Talk (talk)
Event Name: 16th Conference of the International Linear Algebra Society (ILAS 2010)
Event Place: Pisa, Italy
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik
BibTeX

@talk{6839,
  title = {Solving Large-Scale Nonnegative Least Squares},
  abstract = {We study the fundamental problem of nonnegative least squares. This problem
  was apparently introduced by Lawson and Hanson [1] under the name NNLS.
  As is evident from its name, NNLS seeks least-squares solutions that are also
  nonnegative. Owing to its wide applicability, numerous algorithms have been
  derived for NNLS, beginning with the active-set approach of Lawson and
  Hanson [1] and leading up to the sophisticated interior-point method of Bellavia et al. [2].
  We present a new algorithm for NNLS that combines projected subgradients
  with the non-monotonic gradient descent idea of Barzilai and Borwein [3]. Our
  resulting algorithm is called BBSG, and we guarantee its convergence by
  exploiting properties of NNLS in conjunction with projected subgradients. BBSG
  is surprisingly simple and scales well to large problems. We substantiate our
  claims by empirically evaluating BBSG and comparing it with established
  convex solvers and specialized NNLS algorithms. The numerical results suggest
  that BBSG is a practical method for solving large-scale NNLS problems.},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  month = jun,
  year = {2010},
  slug = {6839},
  author = {Sra, S.},
  month_numeric = {6}
}