Empirical Inference Talk 2008

New Projected Quasi-Newton Methods with Applications

Box-constrained convex optimization problems are central to applications in a variety of fields such as statistics, psychometrics, signal processing, medical imaging, and machine learning. Two fundamental examples are the non-negative least squares (NNLS) problem and the non-negative Kullback-Leibler (NNKL) divergence minimization problem. The non-negativity constraints are usually based on an underlying physical restriction: for example, in applications in astronomy, tomography, statistical estimation, or image restoration, the underlying parameters represent physical quantities such as concentration, weight, intensity, or frequency counts, and are therefore interpretable only with non-negative values. Several modern optimization methods can be inefficient for simple problems such as NNLS and NNKL because they are designed to handle far more general and complex problems. In this work we develop two simple quasi-Newton methods for solving box-constrained (differentiable) convex optimization problems that utilize the well-known BFGS and limited-memory BFGS updates. We position our methods between projected gradient (Rosen, 1960) and projected Newton (Bertsekas, 1982) methods, and prove their convergence under a simple Armijo step-size rule. We illustrate our methods with applications to image deblurring, Positron Emission Tomography (PET) image reconstruction, and Non-negative Matrix Approximation (NMA). On medium-sized data we observe performance competitive with established procedures, while for larger data the results are even better.
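To make the setting concrete, here is a minimal projected-gradient sketch for NNLS with an Armijo backtracking rule, in the spirit of the methods the abstract positions itself between. This is an illustrative baseline only, not the talk's quasi-Newton method; the function name and parameter defaults are our own choices.

```python
import numpy as np

def nnls_projected_gradient(A, b, x0=None, max_iter=500, tol=1e-8,
                            beta=0.5, sigma=1e-4):
    """Projected gradient with Armijo backtracking for
        min_x 0.5 * ||A x - b||^2   subject to   x >= 0.
    Illustrative sketch only; the talk's method additionally uses
    BFGS / L-BFGS curvature information."""
    m, n = A.shape
    x = np.zeros(n) if x0 is None else np.maximum(x0, 0.0)
    f = lambda z: 0.5 * np.dot(A @ z - b, A @ z - b)
    for _ in range(max_iter):
        g = A.T @ (A @ x - b)                    # gradient of the objective
        fx, t = f(x), 1.0
        while True:                              # Armijo backtracking
            x_new = np.maximum(x - t * g, 0.0)   # projection onto x >= 0
            # Sufficient-decrease test along the projected step
            if f(x_new) <= fx + sigma * g.dot(x_new - x):
                break
            t *= beta
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x
```

The projection onto the non-negative orthant is just a componentwise `max(., 0)`, which is what makes this constraint set so cheap to handle compared with general convex constraints.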

Author(s): Sra, S.
Year: 2008
Month: December
Day: 0
Bibtex Type: Talk (talk)
Digital: 0
Electronic Archiving: grant_archive
Event Name: Microsoft Research Tech-talk
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik

BibTex

@talk{5651,
  title = {New Projected Quasi-Newton Methods with Applications},
  abstract = {Box-constrained convex optimization problems are central to
  applications in a variety of fields such as statistics, psychometrics,
  signal processing, medical imaging, and machine learning. Two fundamental
  examples are the non-negative least squares (NNLS) problem and the
  non-negative Kullback-Leibler (NNKL) divergence minimization problem. The
  non-negativity constraints are usually based on an underlying physical
  restriction: for example, in applications in astronomy, tomography,
  statistical estimation, or image restoration, the underlying parameters
  represent physical quantities such as concentration, weight, intensity, or
  frequency counts, and are therefore interpretable only with non-negative
  values. Several modern optimization methods can be inefficient for simple
  problems such as NNLS and NNKL because they are designed to handle far more
  general and complex problems. In this work we develop two simple
  quasi-Newton methods for solving box-constrained (differentiable) convex
  optimization problems that utilize the well-known BFGS and limited-memory
  BFGS updates. We position our methods between projected gradient (Rosen,
  1960) and projected Newton (Bertsekas, 1982) methods, and prove their
  convergence under a simple Armijo step-size rule. We illustrate our methods
  with applications to image deblurring, Positron Emission Tomography (PET)
  image reconstruction, and Non-negative Matrix Approximation (NMA). On
  medium-sized data we observe performance competitive with established
  procedures, while for larger data the results are even better.},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  month = dec,
  year = {2008},
  slug = {5651},
  author = {Sra, S.},
  month_numeric = {12}
}