Generalizing Covariate-tightened Trimming Bounds for Sample Selection Using Adaptive Kernels
We present methods for tightening trimming bounds on average treatment effects to account for potentially endogenous sample selection. Bounds are an attractive approach because they allow researchers to avoid disputable parametric assumptions. However, basic methods often yield bounds that are very wide and therefore minimally informative. Methods exist to use covariates to tighten bounds, but they cannot easily incorporate large numbers of covariates, so researchers cannot take full advantage of the available information. We show how to address this problem using a random forest, and we develop methods for honest inference. A simulation study demonstrates the benefits of our approach in terms of both the width of the bounds and the width of confidence intervals for the bounds.
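The covariate-tightening idea behind the paper can be sketched as follows. This is a minimal illustration, not the paper's estimator: it computes Lee (2009)-style trimming bounds in the pooled sample and within cells of a single binary covariate (standing in for the data-driven cells a random forest would produce), averaging cell-specific bounds with simple frequency weights. All variable names and the simulated data-generating process are assumptions made for the example.

```python
import numpy as np

def lee_bounds(y, t, s):
    """Lee (2009)-style trimming bounds for the treatment effect among
    always-selected units, assuming treatment weakly increases selection.
    y: outcomes (NaN where unobserved), t: treatment, s: selection."""
    q1 = s[t == 1].mean()
    q0 = s[t == 0].mean()
    p = (q1 - q0) / q1                       # fraction of treated outcomes to trim
    y1 = np.sort(y[(t == 1) & (s == 1)])
    y0_mean = y[(t == 0) & (s == 1)].mean()
    k = int(np.floor(p * len(y1)))
    lower = y1[: len(y1) - k].mean() - y0_mean   # drop largest treated outcomes
    upper = y1[k:].mean() - y0_mean              # drop smallest treated outcomes
    return lower, upper

# Simulated data: selection depends on treatment, outcomes depend strongly
# on the covariate, so covariate cells carry real information.
rng = np.random.default_rng(0)
n = 20_000
x = rng.integers(0, 2, n)                    # binary covariate (stand-in for forest cells)
t = rng.integers(0, 2, n)
s = np.where(t == 1, 1, (rng.random(n) < 0.8).astype(int))
y = 5.0 * x + 1.0 * t + 0.5 * rng.standard_normal(n)   # true effect = 1
y = np.where(s == 1, y, np.nan)              # outcome unobserved when not selected

# Basic (pooled) bounds.
lo_all, hi_all = lee_bounds(y, t, s)

# Covariate-tightened bounds: bound within each cell, then average.
cells = [lee_bounds(y[x == v], t[x == v], s[x == v]) for v in (0, 1)]
w = [np.mean(x == v) for v in (0, 1)]
lo_cell = sum(wi * c[0] for wi, c in zip(w, cells))
hi_cell = sum(wi * c[1] for wi, c in zip(w, cells))

print(f"basic bounds:     [{lo_all:.2f}, {hi_all:.2f}]")
print(f"tightened bounds: [{lo_cell:.2f}, {hi_cell:.2f}]")
```

Because most of the outcome variation here is between cells, trimming within cells removes far less information than trimming in the pooled sample, so the cell-averaged bounds are markedly narrower; a random forest generalizes this by constructing the cells adaptively from many covariates.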