ABSTRACT

Nonparametric methods play a central role in modern empirical work. While they provide inference procedures that are more robust to parametric misspecification bias, they may be quite sensitive to tuning parameter choices. We study the effects of bias correction on confidence interval coverage in the context of kernel density and local polynomial regression estimation, and prove that bias correction can be preferred to undersmoothing for minimizing coverage error and increasing robustness to tuning parameter choice. This is achieved using a novel, yet simple, Studentization, which leads to a new way of constructing kernel-based bias-corrected confidence intervals. In addition, for practical cases, we derive coverage error optimal bandwidths and discuss easy-to-implement bandwidth selectors. For interior points, we show that the mean-squared error (MSE)-optimal bandwidth for the original point estimator (before bias correction) delivers the fastest coverage error decay rate after bias correction when second-order (equivalent) kernels are employed, but is otherwise suboptimal because it is too “large.” Finally, for odd-degree local polynomial regression, we show that, as with point estimation, coverage error adapts to boundary points automatically when appropriate Studentization is used; however, the MSE-optimal bandwidth for the original point estimator is suboptimal. All the results are established using valid Edgeworth expansions and illustrated with simulated data. Our findings have important consequences for empirical work as they indicate that bias-corrected confidence intervals, coupled with appropriate standard errors, have smaller coverage error and are less sensitive to tuning parameter choices in practically relevant cases where additional smoothness is available. Supplementary materials for this article are available online.
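As a concrete illustration of the approach described above, the following Python sketch forms a robust bias-corrected confidence interval for a kernel density estimate at a single point: the point estimate is bias-corrected using an estimated second derivative, and the interval is Studentized by a standard error that also accounts for the variability of the bias estimate. This is only a schematic sketch, not the authors' software; the Gaussian kernel, the rule-of-thumb bandwidth, the choice b = h for the pilot bandwidth, and all function and variable names (e.g., rbc_density_ci) are assumptions made for the example.

# Minimal sketch: robust bias-corrected (RBC) confidence interval for a kernel
# density estimate at a single point. Illustration only; kernel, bandwidths,
# and names are assumptions, not the authors' implementation.
import numpy as np
from scipy import stats

def rbc_density_ci(x_data, x0, h, b, alpha=0.05):
    """Bias-corrected density estimate and RBC confidence interval at x0."""
    n = x_data.size
    u = (x_data - x0) / h          # scaled data for the main estimate
    v = (x_data - x0) / b          # scaled data for the pilot (bias) estimate

    # Gaussian kernel K and its second derivative K''; mu2 = \int t^2 K(t) dt = 1.
    K = stats.norm.pdf
    Kpp = lambda t: (t**2 - 1.0) * stats.norm.pdf(t)
    mu2 = 1.0

    f_hat = np.mean(K(u)) / h                 # classical kernel density estimate
    f2_hat = np.mean(Kpp(v)) / b**3           # estimate of f''(x0) with pilot bandwidth b
    bias_hat = 0.5 * h**2 * mu2 * f2_hat      # estimated leading smoothing bias
    f_bc = f_hat - bias_hat                   # bias-corrected point estimate

    # Robust Studentization: treat the bias-corrected estimator as the sample
    # average of the combined influence terms L_i, so the standard error also
    # reflects the variability of the estimated bias.
    L = K(u) / h - 0.5 * h**2 * mu2 * Kpp(v) / b**3
    se_rbc = np.std(L, ddof=1) / np.sqrt(n)

    z = stats.norm.ppf(1 - alpha / 2)
    return f_bc, (f_bc - z * se_rbc, f_bc + z * se_rbc)

# Usage: standard normal data, evaluation at the interior point x0 = 0.
rng = np.random.default_rng(0)
X = rng.standard_normal(500)
h = 1.06 * X.std() * X.size**(-1/5)           # Silverman-type rule of thumb, a stand-in for an MSE-flavored bandwidth
est, ci = rbc_density_ci(X, x0=0.0, h=h, b=h) # b = h is one simple pilot-bandwidth choice for illustration
print(f"bias-corrected estimate: {est:.3f}, 95% CI: ({ci[0]:.3f}, {ci[1]:.3f})")

Studentizing by the empirical variance of the combined terms L_i, rather than by the conventional standard error of the uncorrected estimate, is what makes the interval robust to the additional noise introduced by the bias correction.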

Supplementary Materials

The supplemental appendix contains technical and notational details omitted from the main text, proofs of all results, further derivations, and additional simulation results and numerical analyses. The main results are Edgeworth expansions of the distribution functions of the t statistics T_us, T_bc, and T_rbc for density estimation and local polynomial regression; stating and proving these expansions is the central purpose of the supplement. The higher-order expansions of confidence interval coverage probabilities in the main paper follow immediately by evaluating the Edgeworth expansions at the interval endpoints.
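For reference, the three statistics have, schematically, the following forms in the density case; this display is a reader's aid with generic notation, and the precise centerings and standard errors are those defined in the paper and supplement:

$$
T_{\mathrm{us}} = \frac{\hat f(x) - f(x)}{\widehat{\mathrm{se}}\,[\hat f(x)]}, \qquad
T_{\mathrm{bc}} = \frac{\hat f_{\mathrm{bc}}(x) - f(x)}{\widehat{\mathrm{se}}\,[\hat f(x)]}, \qquad
T_{\mathrm{rbc}} = \frac{\hat f_{\mathrm{bc}}(x) - f(x)}{\widehat{\mathrm{se}}\,[\hat f_{\mathrm{bc}}(x)]},
$$

where \hat f is the original kernel estimator, \hat f_bc its bias-corrected version, and the rbc statistic uses a standard error that reflects the bias correction itself.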

Acknowledgments

The authors thank Ivan Canay, Xu Cheng, Joachim Freyberger, Bruce Hansen, Joel Horowitz, Michael Jansson, Francesca Molinari, Ulrich Müller, and Andres Santos for thoughtful comments and suggestions, as well as seminar participants at Cornell, Cowles Foundation, CREST Statistics, London School of Economics, Northwestern, Ohio State University, Princeton, Toulouse School of Economics, University of Bristol, and University College London. The associate editor and three reviewers also provided very insightful comments that improved this manuscript.

Funding

The second author gratefully acknowledges financial support from the National Science Foundation (SES 1357561 and SES 1459931).
