Cross-Validation With Confidence
Jing Lei
DOI: 10.6084/m9.figshare.9976901.v2
https://tandf.figshare.com/articles/dataset/Cross-Validation_with_Confidence/9976901

<p>Cross-validation is one of the most popular model and tuning parameter selection methods in statistics and machine learning. Despite its wide applicability, traditional cross-validation methods tend to overfit because they ignore the uncertainty in the testing sample. We develop a novel, statistically principled inference tool based on cross-validation that takes this uncertainty into account. The method outputs a set of highly competitive candidate models that contains the optimal one with guaranteed probability. As a consequence, our method achieves consistent variable selection in a classical linear regression setting, in which existing cross-validation methods require unconventional split ratios. When used for tuning parameter selection, the method provides a different trade-off between prediction accuracy and model interpretability from that of existing variants of cross-validation. We demonstrate the performance of the proposed method in several simulated and real data examples. Supplemental materials for this article can be found online.</p>

2019-10-31 21:40:07
Keywords: Cross-validation; Hypothesis testing; Model selection; Overfitting; Tuning parameter selection
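The core idea in the abstract, that instead of picking the single model with the smallest cross-validated loss one should keep every model whose loss is not significantly worse than the apparent best, can be sketched with a simplified stand-in for the paper's test: a paired one-sided test on per-observation held-out losses with a normal approximation. The function name `cv_confidence_set` and the construction of the loss matrix are illustrative assumptions, not the author's exact procedure.

```python
import numpy as np
from statistics import NormalDist


def cv_confidence_set(loss_matrix, alpha=0.05):
    """Return indices of models not significantly worse than the best one.

    loss_matrix : array of shape (n_samples, n_models), where entry (i, j)
        is the held-out (cross-validated) loss of model j on observation i.
    alpha : level of each one-sided paired test (simplified stand-in for
        the paper's testing procedure, not the exact method).
    """
    n, m = loss_matrix.shape
    best = int(np.argmin(loss_matrix.mean(axis=0)))  # apparent best model
    z = NormalDist().inv_cdf(1 - alpha)              # one-sided critical value
    keep = []
    for j in range(m):
        if j == best:
            keep.append(j)
            continue
        # Paired loss differences against the apparent best model.
        diff = loss_matrix[:, j] - loss_matrix[:, best]
        t = diff.mean() / (diff.std(ddof=1) / np.sqrt(n) + 1e-12)
        # Keep model j unless its loss is significantly larger than the best's.
        if t <= z:
            keep.append(j)
    return keep
```

The returned set plays the role of the "set of highly competitive candidate models" in the abstract: a clearly inferior model is rejected, while models statistically indistinguishable from the best are retained, leaving the final choice to a secondary criterion such as interpretability.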