Asymptotically Optimal Bias Reduction for Parametric Models

by Stéphane Guerrier, Mucyo Karemera, Samuel Orso, Maria-Pia Victoria-Feser


An important challenge in statistical analysis concerns the control of the finite sample bias of estimators. This problem is magnified in high-dimensional settings where the number of variables p diverges with the sample size n, as well as for nonlinear models and/or models with discrete data. For these complex settings, we propose to use a general simulation-based approach and show that the resulting estimator has a bias of order O(0), hence providing an asymptotically optimal bias reduction. It is based on an initial estimator that can be slightly asymptotically biased, making the approach very generally applicable. This is particularly relevant when classical estimators, such as the maximum likelihood estimator, can only be (numerically) approximated. We show that the iterative bootstrap of Kuk (1995) provides a computationally efficient approach to compute this bias-reduced estimator. We illustrate our theoretical results in simulation studies for which we develop new bias-reduced estimators for the logistic regression, with and without random effects. These estimators enjoy additional properties such as robustness to data contamination and to the problem of separability.
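The iterative bootstrap mentioned in the abstract can be illustrated with a minimal sketch. The example below is not the paper's implementation: it uses a deliberately simple toy model (the variance MLE for normal data, which carries a finite-sample bias of -sigma^2/n), and all function and parameter names are illustrative assumptions. The scheme repeatedly shifts the current parameter value by the gap between the observed estimate and the average of the same estimator recomputed on samples simulated at that value.

```python
import numpy as np

def pi_hat(sample):
    # Initial (biased) estimator: the variance MLE, with bias -sigma^2/n.
    return sample.var(ddof=0)

def simulate(theta, n, rng):
    # Draw a sample of size n from the assumed model at parameter theta.
    return rng.normal(0.0, np.sqrt(theta), size=n)

def iterative_bootstrap(pi_obs, n, H=200, iters=50, seed=1):
    # Kuk-style iterative bootstrap (illustrative sketch):
    #   theta_{k+1} = theta_k + pi_obs - mean_h pi_hat(simulate(theta_k))
    # i.e. correct theta by the simulated bias of the initial estimator.
    rng = np.random.default_rng(seed)
    theta = pi_obs  # start at the initial (biased) estimate
    for _ in range(iters):
        sims = np.array([pi_hat(simulate(theta, n, rng)) for _ in range(H)])
        theta = theta + (pi_obs - sims.mean())
    return theta
```

In this toy case the fixed point can be checked analytically: since the simulated estimates average to roughly theta*(n-1)/n, the iteration converges (up to simulation noise) to pi_obs * n/(n-1), i.e. the usual unbiased sample variance, which is what a bias-correction scheme should recover here.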

Released as an article
Version v1
Release Date 2020-02-19
Primary Language  en

Known Files and URLs

application/pdf  562.7 kB
sha1:6b5d290496741495884e...
Type  article
Stage   submitted
Date   2020-02-19
Version   v1
arXiv  2002.08757v1