Lu, Junwei; Kolar, Mladen; Liu, Han
Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model

We develop a novel procedure for constructing confidence bands for components of a sparse additive model. Our procedure is based on a new kernel-sieve hybrid estimator that combines the two most popular nonparametric estimation methods in the literature, kernel regression and the spline method, and is of interest in its own right. Existing methods for fitting sparse additive models are primarily based on sieve estimators, while the literature on confidence bands for nonparametric models is primarily based upon kernel or local polynomial estimators. Our kernel-sieve hybrid estimator combines the best of both worlds and allows us to provide a simple procedure for constructing confidence bands in high-dimensional sparse additive models. We prove that the confidence bands are asymptotically honest by studying a Gaussian process approximation. Thorough numerical results on both synthetic data and real-world neuroscience data are provided to demonstrate the efficacy of the theory. Supplementary materials for this article are available online at https://doi.org/10.1080/01621459.2019.1689984.

Keywords: Confidence band; Kernel method; Sieve estimator; Sparse additive model
2020-03-11
    https://tandf.figshare.com/articles/journal_contribution/Kernel_Meets_Sieve_Post-Regularization_Confidence_Bands_for_Sparse_Additive_Model/10274576
10.6084/m9.figshare.10274576.v2
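
The following is a minimal, self-contained Python sketch of the kind of kernel-sieve hybrid workflow the abstract describes: a sparse additive model is first fit with a per-covariate sieve expansion and an l1 penalty, one component is then re-estimated locally with a kernel on the partial residuals, and a uniform band is calibrated with a multiplier bootstrap. Everything here (the polynomial stand-in for a spline sieve, the plain Lasso in place of a group penalty, the function names, bandwidth, and bootstrap calibration) is an illustrative assumption, not the authors' estimator or implementation.

# Illustrative sketch (not the paper's code) of a kernel-sieve hybrid idea:
# 1) fit y ~ sum_j f_j(x_j) with a sieve basis per covariate and an l1 penalty,
# 2) re-fit one component with a kernel smoother on the partial residuals,
# 3) calibrate a uniform confidence band with a Gaussian multiplier bootstrap.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

def sieve_basis(x, degree=3):
    """Simple polynomial sieve basis for one covariate (stand-in for B-splines)."""
    return np.column_stack([x**k for k in range(1, degree + 1)])

def fit_sparse_additive(X, y, degree=3, alpha=0.05):
    """Sparse additive fit: stacked sieve bases with an l1 penalty (Lasso here)."""
    d = X.shape[1]
    B = np.hstack([sieve_basis(X[:, j], degree) for j in range(d)])
    return Lasso(alpha=alpha, fit_intercept=True).fit(B, y)

def kernel_band(X, y, model, j, grid, h=0.2, n_boot=500, level=0.95, degree=3):
    """Kernel re-fit of component j on partial residuals + multiplier-bootstrap band."""
    n, d = X.shape
    coef = model.coef_.reshape(d, degree)
    # remove the fitted contribution of every component except the j-th
    other = sum(sieve_basis(X[:, k], degree) @ coef[k] for k in range(d) if k != j)
    r = y - model.intercept_ - other
    xj = X[:, j]
    fhat, se, weights = np.empty(len(grid)), np.empty(len(grid)), []
    for i, x0 in enumerate(grid):
        w = np.exp(-0.5 * ((xj - x0) / h) ** 2)   # Gaussian kernel weights
        w /= w.sum()
        fhat[i] = w @ r                            # local constant (Nadaraya-Watson) fit
        se[i] = np.sqrt(np.sum(w**2 * (r - fhat[i]) ** 2))
        weights.append(w)
    # multiplier bootstrap for the uniform critical value
    eps = r - np.interp(xj, grid, fhat)
    sup_stats = np.empty(n_boot)
    for b in range(n_boot):
        xi = rng.standard_normal(n)
        sup_stats[b] = max(abs(np.sum(w * xi * eps)) / s for w, s in zip(weights, se))
    c = np.quantile(sup_stats, level)
    return fhat, fhat - c * se, fhat + c * se

# Toy example: only the first two of ten covariates matter.
n, d = 400, 10
X = rng.uniform(-1, 1, size=(n, d))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.3 * rng.standard_normal(n)
model = fit_sparse_additive(X, y)
grid = np.linspace(-0.9, 0.9, 30)
fhat, lo, hi = kernel_band(X, y, model, j=0, grid=grid)
print(np.round(np.c_[grid[:5], fhat[:5], lo[:5], hi[:5]], 3))

In this sketch the band for the first component should roughly track sin(pi * x) on the toy data; the paper's actual procedure differs in the choice of sieve, penalty, and the theoretical calibration of the critical value, so this is only meant to convey the two-stage structure.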