{"id":163159,"date":"2013-01-01T00:00:00","date_gmt":"2013-01-01T00:00:00","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/msr-research-item\/nearly-optimal-minimax-estimator-for-high-dimensional-sparse-linear-regression\/"},"modified":"2018-10-16T20:01:37","modified_gmt":"2018-10-17T03:01:37","slug":"nearly-optimal-minimax-estimator-for-high-dimensional-sparse-linear-regression","status":"publish","type":"msr-research-item","link":"https:\/\/www.microsoft.com\/en-us\/research\/publication\/nearly-optimal-minimax-estimator-for-high-dimensional-sparse-linear-regression\/","title":{"rendered":"Nearly optimal minimax estimator for high dimensional sparse linear regression"},"content":{"rendered":"
\n

We present nearly optimal minimax estimators for the classical problem of linear regression with soft sparsity constraints. Our result applies to any design matrix and represents the first result of this kind.

In the linear regression problem, one is given an m × n design matrix X and a noisy observation y + g in R^m, where y = Xθ for some unknown θ in R^n and g is noise drawn from an m-dimensional multivariate Gaussian distribution. In addition, we assume that θ satisfies the soft sparsity constraint, i.e., θ lies in the unit L_p ball for some p in (0, 1]. We are interested in designing estimators that minimize the maximum error (or risk), measured in terms of the squared loss.
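
Concretely, the model and objective above can be written as follows. This is only a sketch: the abstract does not specify the noise covariance or the exact form of the squared loss, so the i.i.d. N(0, σ²I_m) noise and the squared prediction loss ‖Xθ̂ − Xθ‖² used here are assumptions.

    % Observation model (sketch): one observes y + g, where y = X\theta.
    \[
      y = X\theta, \qquad g \sim \mathcal{N}(0, \sigma^2 I_m), \qquad
      \theta \in B_p(1) = \Bigl\{ t \in \mathbb{R}^n : \sum_{i=1}^n |t_i|^p \le 1 \Bigr\},
      \quad p \in (0, 1].
    \]
    % Minimax risk: the best achievable worst-case expected squared loss over the ball.
    \[
      \mathcal{R}^*(X) = \inf_{\hat\theta} \; \sup_{\theta \in B_p(1)} \;
      \mathbb{E}_g \bigl\| X \hat\theta(y + g) - X\theta \bigr\|_2^2 .
    \]

A nearly optimal estimator is then one whose worst-case risk is within a small multiplicative factor of R*(X); the factor achieved in this paper is stated in the next paragraph.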

The main result of this paper is the construction of a novel family of estimators, which we call the hybrid estimator, whose risk is within an O((log n)^{1-p/2}) factor of the optimal for any m × n design matrix X, as long as n = Ω(m / log m). The hybrid estimator is a combination of two classic estimators: the truncated series estimator and the least squares estimator. The analysis is motivated by two recent works, by Raskutti, Wainwright and Yu and by Javanmard and Zhang, respectively.
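
As an illustrative sketch (not the paper's construction), the two classic ingredients can be written down as follows in Python. Reading "truncated series estimator" as projection onto the leading singular directions of X, and the function names, truncation level k, and synthetic data, are all assumptions of this sketch; the rule by which the hybrid estimator combines the two is the content of the paper and is not reproduced here.

    # Illustrative sketch (not the paper's construction): the two classic
    # estimators combined by the hybrid estimator. Names, the truncation
    # level k, and the synthetic data are hypothetical.
    import numpy as np

    def least_squares_estimator(X, y_obs):
        # Ordinary least squares: a minimizer of ||X t - y_obs||_2^2.
        theta_hat, *_ = np.linalg.lstsq(X, y_obs, rcond=None)
        return theta_hat

    def truncated_series_estimator(X, y_obs, k):
        # One reading of "truncated series": keep only the top-k singular
        # directions of X and invert the model on that subspace.
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        Uk, sk, Vk = U[:, :k], s[:k], Vt[:k, :]
        return Vk.T @ ((Uk.T @ y_obs) / sk)

    # Synthetic data from the model: observe y + g with y = X theta.
    rng = np.random.default_rng(0)
    m, n = 50, 200
    X = rng.standard_normal((m, n))
    theta = np.zeros(n)
    theta[:3] = [0.5, -0.3, 0.2]          # inside the unit L_1 ball (p = 1)
    y_obs = X @ theta + 0.1 * rng.standard_normal(m)

    # Squared loss of each ingredient on this draw.
    for estimate in (least_squares_estimator(X, y_obs),
                     truncated_series_estimator(X, y_obs, k=10)):
        print(np.linalg.norm(X @ estimate - X @ theta) ** 2)

The sketch only fixes what the two ingredients are; choosing the truncation level and combining the two estimators into a single nearly optimal one is what the paper does.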

<\/p>\n","protected":false},"excerpt":{"rendered":"

We present nearly optimal minimax estimators for the classical problem of linear regression with soft sparsity constraints. Our result applies to any design matrix and represents the first result of this kind. In the linear regression problem, one is given an m*n design matrix X and a noisy observation y+g in Rm where y=X\u03b8 for […]<\/p>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"footnotes":""},"msr-content-type":[3],"msr-research-highlight":[],"research-area":[13546],"msr-publication-type":[193715],"msr-product-type":[],"msr-focus-area":[],"msr-platform":[],"msr-download-source":[],"msr-locale":[268875],"msr-field-of-study":[],"msr-conference":[],"msr-journal":[],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-163159","msr-research-item","type-msr-research-item","status-publish","hentry","msr-research-area-computational-sciences-mathematics","msr-locale-en_us"],"msr_publishername":"","msr_edition":"Annals of Statistics","msr_affiliation":"","msr_published_date":"2013-01-01","msr_host":"","msr_duration":"","msr_version":"","msr_speaker":"","msr_other_contributors":"","msr_booktitle":"","msr_pages_string":"","msr_chapter":"","msr_isbn":"","msr_journal":"Annals of Statistics","msr_volume":"","msr_number":"","msr_editors":"","msr_series":"","msr_issue":"","msr_organization":"","msr_how_published":"","msr_notes":"To appear.","msr_highlight_text":"","msr_release_tracker_id":"","msr_original_fields_of_study":"","msr_download_urls":"","msr_external_url":"","msr_secondary_video_url":"","msr_longbiography":"","msr_microsoftintellectualproperty":1,"msr_main_download":"218884","msr_publicationurl":"http:\/\/arxiv.org\/abs\/1206.6536","msr_doi":"10.1214\/13-AOS1141","msr_publication_uploader":[{"type":"file","title":"mmlp.pdf","viewUrl":"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2013\/01\/mmlp.pdf","id":218884,"label_id":0},{"type":"url","title":"http:\/\/arxiv.org\/abs\/1206.6536","viewUrl":false,"id":false,"label_id":0},{"type":"doi","title":"10.1214\/13-AOS1141","viewUrl":false,"id":false,"label_id":0}],"msr_related_uploader":"","msr_attachments":[{"id":0,"url":"http:\/\/arxiv.org\/abs\/1206.6536"},{"id":218884,"url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2013\/01\/mmlp.pdf"}],"msr-author-ordering":[{"type":"user_nicename","value":"lzha","user_id":32760,"rest_url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=lzha"}],"msr_impact_theme":[],"msr_research_lab":[],"msr_event":[],"msr_group":[],"msr_project":[171268],"publication":[],"video":[],"download":[],"msr_publication_type":"article","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/163159"}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-research-item"}],"version-history":[{"count":1,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/163159\/revisions"}],"predecessor-version":[{"id":519364,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/163159\/revisions\/519364"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=163159"}],"wp:term":[{"taxonomy"
:"msr-content-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-content-type?post=163159"},{"taxonomy":"msr-research-highlight","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-highlight?post=163159"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=163159"},{"taxonomy":"msr-publication-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-publication-type?post=163159"},{"taxonomy":"msr-product-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-product-type?post=163159"},{"taxonomy":"msr-focus-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-focus-area?post=163159"},{"taxonomy":"msr-platform","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-platform?post=163159"},{"taxonomy":"msr-download-source","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-download-source?post=163159"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=163159"},{"taxonomy":"msr-field-of-study","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-field-of-study?post=163159"},{"taxonomy":"msr-conference","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-conference?post=163159"},{"taxonomy":"msr-journal","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-journal?post=163159"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=163159"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=163159"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}