Lower Bounds on the Performance of Polynomial-time Algorithms for Sparse Linear Regression
Y. Zhang, M. Wainwright, and M. Jordan. Proceedings of The 27th Conference on Learning Theory, volume 35 of Proceedings of Machine Learning Research, pages 921--948, Barcelona, Spain, PMLR (13--15 Jun 2014).
Abstract
Under a standard assumption in complexity theory (NP not in P/poly), we demonstrate a gap between the minimax prediction risk for sparse linear regression that can be achieved by polynomial-time algorithms, and that achieved by optimal algorithms. In particular, when the design matrix is ill-conditioned, the minimax prediction loss achievable by polynomial-time algorithms can be substantially greater than that of an optimal algorithm. This result is the first known gap between polynomial and optimal algorithms for sparse linear regression, and does not depend on conjectures in average-case complexity.
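A minimal sketch, not taken from the paper, of the problem setup the abstract refers to: sparse linear regression y = X theta* + w with an ill-conditioned design X, comparing a polynomial-time estimator (here the Lasso, an assumed stand-in) against exhaustive best-subset least squares, whose runtime grows exponentially in the dimension. The specific design, noise level, and regularization parameter below are illustrative assumptions; the paper's result concerns the gap in minimax prediction risk over problem classes, not any single instance.

# Sketch under the assumptions stated above; requires numpy and scikit-learn.
from itertools import combinations

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, d, k = 50, 10, 2                      # samples, dimension, sparsity level

# Ill-conditioned design: make two columns nearly collinear.
X = rng.normal(size=(n, d))
X[:, 1] = X[:, 0] + 1e-3 * rng.normal(size=n)

theta_star = np.zeros(d)
theta_star[[0, 3]] = [1.0, -1.0]         # k-sparse ground truth
y = X @ theta_star + 0.1 * rng.normal(size=n)

def prediction_loss(theta_hat):
    """In-sample prediction loss (1/n) * ||X (theta_hat - theta_star)||_2^2."""
    return np.sum((X @ (theta_hat - theta_star)) ** 2) / n

# Polynomial-time estimator: Lasso (illustrative choice).
lasso = Lasso(alpha=0.05, fit_intercept=False).fit(X, y)
print("Lasso prediction loss:      ", prediction_loss(lasso.coef_))

# Exponential-time estimator: least squares over every size-k support.
best_rss, best_theta = np.inf, None
for support in combinations(range(d), k):
    cols = list(support)
    coef, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
    theta = np.zeros(d)
    theta[cols] = coef
    rss = np.sum((y - X @ theta) ** 2)
    if rss < best_rss:
        best_rss, best_theta = rss, theta
print("Best-subset prediction loss:", prediction_loss(best_theta))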
%0 Conference Paper
%1 pmlr-v35-zhang14
%A Zhang, Yuchen
%A Wainwright, Martin J.
%A Jordan, Michael I.
%B Proceedings of The 27th Conference on Learning Theory
%C Barcelona, Spain
%D 2014
%E Balcan, Maria Florina
%E Feldman, Vitaly
%E Szepesvári, Csaba
%I PMLR
%K bounds complexity computation sparsity
%P 921--948
%T Lower Bounds on the Performance of Polynomial-time Algorithms for Sparse Linear Regression
%U http://proceedings.mlr.press/v35/zhang14.html
%V 35
%X Under a standard assumption in complexity theory (NP not in P/poly), we demonstrate a gap between the minimax prediction risk for sparse linear regression that can be achieved by polynomial-time algorithms, and that achieved by optimal algorithms. In particular, when the design matrix is ill-conditioned, the minimax prediction loss achievable by polynomial-time algorithms can be substantially greater than that of an optimal algorithm. This result is the first known gap between polynomial and optimal algorithms for sparse linear regression, and does not depend on conjectures in average-case complexity.
@inproceedings{pmlr-v35-zhang14,
abstract = {Under a standard assumption in complexity theory (NP not in P/poly), we demonstrate a gap between the minimax prediction risk for sparse linear regression that can be achieved by polynomial-time algorithms, and that achieved by optimal algorithms. In particular, when the design matrix is ill-conditioned, the minimax prediction loss achievable by polynomial-time algorithms can be substantially greater than that of an optimal algorithm. This result is the first known gap between polynomial and optimal algorithms for sparse linear regression, and does not depend on conjectures in average-case complexity.},
added-at = {2019-11-14T14:10:41.000+0100},
address = {Barcelona, Spain},
author = {Zhang, Yuchen and Wainwright, Martin J. and Jordan, Michael I.},
biburl = {https://www.bibsonomy.org/bibtex/2e9f9c7b0d8cfc4bdaa9de1021bf40e41/kirk86},
booktitle = {Proceedings of The 27th Conference on Learning Theory},
description = {Lower Bounds on the Performance of Polynomial-time Algorithms for Sparse Linear Regression},
editor = {Balcan, Maria Florina and Feldman, Vitaly and Szepesvári, Csaba},
interhash = {a9f75004c4135c2b5b0522f882311997},
intrahash = {e9f9c7b0d8cfc4bdaa9de1021bf40e41},
keywords = {bounds complexity computation sparsity},
month = {13--15 Jun},
pages = {921--948},
pdf = {http://proceedings.mlr.press/v35/zhang14.pdf},
publisher = {PMLR},
series = {Proceedings of Machine Learning Research},
timestamp = {2019-11-14T14:10:41.000+0100},
title = {Lower Bounds on the Performance of Polynomial-time Algorithms for Sparse Linear Regression},
url = {http://proceedings.mlr.press/v35/zhang14.html},
volume = 35,
year = 2014
}