On Fast Approximate Submodular Minimization
We are motivated by an application to extract a representative subset of machine learning training data and by the poor empirical performance we observe of the popular minimum norm algorithm. In fact, for our application, minimum norm can have a running time of about O(n^7) (O(n^5) oracle calls). We therefore propose a fast approximate method to minimize arbitrary submodular functions. For a large sub-class of submodular functions, the algorithm is exact. Other submodular functions are iteratively approximated by tight submodular upper bounds, and then repeatedly optimized. We show theoretical properties, and empirical results suggest significant speedups over minimum norm while retaining higher accuracies.
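The abstract's strategy of "iteratively approximated by tight submodular upper bounds, and then repeatedly optimized" can be illustrated with a generic majorize-minimize sketch: at each step, replace f with a modular bound that is tight at the current set, minimize that bound exactly (keep exactly the negative-weight elements), and repeat. This is a minimal illustration of the general idea, not the paper's algorithm (which uses more structured graph-representable approximations); all function names here are illustrative.

```python
def modular_upper_bound(f, X, V):
    """Per-element weights of a modular upper bound on f, tight at X:
    m(Y) = const + sum_{j in Y} w_j, where
      w_j = f(X) - f(X \ {j})   for j in X      (gain of keeping j)
      w_j = f({j}) - f({})      for j not in X  (cost of adding j).
    This is the classic Nemhauser-Wolsey-Fisher modular bound."""
    fX = f(X)
    f0 = f(frozenset())
    w = {}
    for j in V:
        if j in X:
            w[j] = fX - f(X - {j})
        else:
            w[j] = f(frozenset({j})) - f0
    const = fX - sum(w[j] for j in X)  # ensures m(X) == f(X), i.e. tightness
    return w, const

def mmin(f, V, X0=frozenset(), max_iters=100):
    """Majorize-minimize submodular minimization (illustrative sketch).
    Each modular bound is minimized exactly by keeping only the elements
    with negative weight; iterating converges to a local minimum of f."""
    X = frozenset(X0)
    for _ in range(max_iters):
        w, _ = modular_upper_bound(f, X, V)
        X_new = frozenset(j for j in V if w[j] < 0)
        if f(X_new) >= f(X):  # bound minimization yields no improvement
            return X
        X = X_new
    return X
```

Note that such a scheme only guarantees a local minimum, and the result depends on the starting set X0; the paper's contribution is a faster and more accurate method than the minimum norm point algorithm, which this sketch does not reproduce.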
@inproceedings{JegelkaLB2011,
  title     = {On Fast Approximate Submodular Minimization},
  author    = {Jegelka, S. and Lin, H. and Bilmes, J.},
  booktitle = {Advances in Neural Information Processing Systems 24},
  editor    = {J. Shawe-Taylor and R. S. Zemel and P. Bartlett and F. Pereira and K. Q. Weinberger},
  pages     = {460--468},
  year      = {2011},
  abstract  = {We are motivated by an application to extract a representative subset of machine learning training data and by the poor empirical performance we observe of the popular minimum norm algorithm. In fact, for our application, minimum norm can have a running time of about $O(n^7)$ ($O(n^5)$ oracle calls). We therefore propose a fast approximate method to minimize arbitrary submodular functions. For a large sub-class of submodular functions, the algorithm is exact. Other submodular functions are iteratively approximated by tight submodular upper bounds, and then repeatedly optimized. We show theoretical properties, and empirical results suggest significant speedups over minimum norm while retaining higher accuracies.},
  slug      = {jegelkalb2011}
}