
Nonparametric Imputation by Data Depth

Dataset posted on 2019-04-11, 19:41, authored by Pavlo Mozharovskyi, Julie Josse and François Husson

We present a single imputation method for missing values that borrows the idea of data depth, a measure of centrality defined for an arbitrary point of a space with respect to a probability distribution or a data cloud. The procedure iteratively maximizes the depth of each observation with missing values and can be employed with any properly defined statistical depth function. Within each iteration, imputation reduces to the optimization of quadratic, linear, or quasiconcave functions, which are solved analytically, by linear programming, or by the Nelder–Mead method, respectively. Because it accounts for the underlying data topology, the procedure is distribution free, imputes close to the data geometry, can make predictions in situations where local imputation (k-nearest neighbors, random forest) cannot, and has attractive robustness and asymptotic properties under elliptical symmetry. We show that a special case, imputation with the Mahalanobis depth, has a direct connection to well-known methods for the multivariate normal model, such as iterated regression and regularized PCA. The methodology is extended to multiple imputation for data stemming from an elliptically symmetric distribution. Simulation and real-data studies show good results compared with existing popular alternatives. The method is implemented as an R package. Supplementary materials for the article are available online.
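To make the Mahalanobis-depth special case concrete: maximizing the Mahalanobis depth 1/(1 + (x - mu)' Sigma^{-1} (x - mu)) over the missing coordinates of a row, with its observed coordinates held fixed, is a quadratic problem whose closed-form maximizer is the conditional (regression) mean. The following is a minimal Python sketch of that special case under a plug-in estimate of mu and Sigma; it is an illustration under our own assumptions, not the authors' R package, and the function name and defaults are ours.

import numpy as np

def impute_mahalanobis_depth(X, n_iter=50, tol=1e-8):
    """Iteratively impute NaN entries of X by maximizing the Mahalanobis
    depth of each incomplete row, which amounts to the conditional
    (regression) mean given the row's observed coordinates.
    Sketch of the Mahalanobis special case only; not the authors' code."""
    X = np.asarray(X, dtype=float)
    miss = np.isnan(X)
    Xc = X.copy()
    # start from the column means of the observed entries
    col_means = np.nanmean(X, axis=0)
    Xc[miss] = np.take(col_means, np.nonzero(miss)[1])
    for _ in range(n_iter):
        mu = Xc.mean(axis=0)                  # current location estimate
        Sigma = np.cov(Xc, rowvar=False)      # current scatter estimate
        X_prev = Xc.copy()
        for i in np.nonzero(miss.any(axis=1))[0]:
            m = miss[i]   # missing coordinates of row i
            o = ~m        # observed coordinates of row i
            # Maximizing 1 / (1 + (x - mu)' Sigma^{-1} (x - mu)) over x[m]
            # is a quadratic problem with the closed-form solution
            # x[m] = mu[m] + Sigma[m,o] Sigma[o,o]^{-1} (x[o] - mu[o]).
            S_oo = Sigma[np.ix_(o, o)]
            S_mo = Sigma[np.ix_(m, o)]
            Xc[i, m] = mu[m] + S_mo @ np.linalg.solve(S_oo, X[i, o] - mu[o])
        if np.max(np.abs(Xc - X_prev)) < tol:  # stop once imputations stabilize
            break
    return Xc

# Toy usage: trivariate Gaussian sample with entries removed in two columns.
rng = np.random.default_rng(0)
A = rng.multivariate_normal([0.0, 0.0, 0.0],
                            [[1.0, 0.5, 0.2], [0.5, 1.0, 0.3], [0.2, 0.3, 1.0]],
                            size=200)
A[::10, 0] = np.nan
A[::7, 2] = np.nan
A_completed = impute_mahalanobis_depth(A)

This recovers the iterated-regression connection mentioned in the abstract: each pass replaces the missing entries of every incomplete row by their regression prediction from the observed entries, with mu and Sigma re-estimated from the current completed data. Other depth functions replace the closed-form step by linear programming or Nelder–Mead optimization.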

Funding

The major part of this project was conducted during Pavlo Mozharovskyi's postdoctoral stay at Agrocampus Ouest (Rennes), funded by the Centre Henri Lebesgue under program PIA-ANR-11-LABX-0020-01.

History