
Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model

Journal contribution posted on 2020-03-11 by Junwei Lu, Mladen Kolar, and Han Liu.

We develop a novel procedure for constructing confidence bands for components of a sparse additive model. Our procedure is based on a new kernel-sieve hybrid estimator that combines two of the most popular nonparametric estimation methods in the literature, kernel regression and the spline method, and is of interest in its own right. Existing methods for fitting sparse additive models are primarily based on sieve estimators, while the literature on confidence bands for nonparametric models is primarily based on kernel or local polynomial estimators. Our kernel-sieve hybrid estimator combines the best of both worlds and yields a simple procedure for constructing confidence bands in high-dimensional sparse additive models. We prove that the confidence bands are asymptotically honest by studying their approximation by a Gaussian process. Thorough numerical results on both synthetic data and real-world neuroscience data demonstrate the efficacy of the theory. Supplementary materials for this article are available online.
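To make the two-stage idea concrete, here is a minimal NumPy sketch of a kernel-sieve hybrid in the spirit of the abstract: a spline-sieve group-lasso fit for the sparse additive model, followed by a kernel refit of one component on partial residuals with a pointwise normal band. This is an illustrative assumption-laden simplification, not the paper's estimator: the basis, penalty level, bandwidth, and the pointwise (rather than honest simultaneous, Gaussian-process-based) band are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def spline_basis(x, knots):
    """Centered cubic truncated-power spline basis (illustrative sieve)."""
    cols = [x, x**2, x**3] + [np.clip(x - k, 0.0, None)**3 for k in knots]
    B = np.column_stack(cols)
    return B - B.mean(axis=0)

# Synthetic sparse additive model: only the first of d covariates matters.
n, d = 400, 5
X = rng.uniform(0.0, 1.0, (n, d))
f1 = lambda t: np.sin(2.0 * np.pi * t)
y = f1(X[:, 0]) + 0.3 * rng.standard_normal(n)
yc = y - y.mean()

# --- Stage 1: spline sieve + group lasso via proximal gradient ----------
knots = np.linspace(0.2, 0.8, 4)
blocks = [spline_basis(X[:, j], knots) for j in range(d)]
p = blocks[0].shape[1]
Z = np.hstack(blocks)
L = np.linalg.norm(Z, 2)**2 / n          # Lipschitz constant of the gradient
lam = 0.05                               # illustrative penalty level
beta = np.zeros(d * p)
for _ in range(500):
    beta -= Z.T @ (Z @ beta - yc) / (n * L)   # gradient step on squared loss
    for g in range(d):                        # group soft-thresholding
        sl = slice(g * p, (g + 1) * p)
        nrm = np.linalg.norm(beta[sl])
        beta[sl] *= max(0.0, 1.0 - lam / (L * nrm)) if nrm > 0 else 0.0

# --- Stage 2: kernel refit of component 1 on partial residuals ----------
others = sum(blocks[j] @ beta[j * p:(j + 1) * p] for j in range(1, d))
r = yc - others                          # partial residual for x1
sigma = np.std(yc - Z @ beta)            # crude noise-scale estimate
h = 0.1                                  # illustrative bandwidth
grid = np.linspace(0.05, 0.95, 19)
f_hat = np.empty_like(grid)
se = np.empty_like(grid)
for i, t in enumerate(grid):
    w = np.exp(-0.5 * ((t - X[:, 0]) / h)**2)   # Gaussian kernel weights
    f_hat[i] = w @ r / w.sum()                  # Nadaraya-Watson estimate
    se[i] = sigma * np.sqrt((w**2).sum()) / w.sum()

# Pointwise 95% normal band (NOT the paper's honest simultaneous band).
band = (f_hat - 1.96 * se, f_hat + 1.96 * se)
```

The sieve stage screens out irrelevant covariates via the group penalty; the kernel stage then re-estimates the target component locally, which is what makes a tractable confidence-band construction possible.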

Funding

The authors are grateful for the support of NSF DMS1916211, NIH R35CA220523, NSF CAREER Award DMS1454377, NSF IIS1408910, NSF IIS1332109, NIH R01MH102339, NIH R01GM083084, and NIH R01HG06841. This work is also supported by an IBM Corporation Faculty Research Fund at the University of Chicago Booth School of Business. This work was completed in part with resources provided by the University of Chicago Research Computing Center.
