Robust density power divergence based tests in multivariate analysis: A comparative overview of different approaches
Hypothesis testing is one of the fundamental paradigms of statistical inference. The three canonical hypothesis testing procedures available in the statistical literature are the likelihood ratio (LR) test, the Wald test and the Rao (score) test. All of them have good optimality properties, and past research has not identified any of these three procedures as a clear winner over the other two. However, the classical versions of these tests are based on the maximum likelihood estimator (MLE), which, although asymptotically the most efficient estimator, is known for its lack of robustness under outliers and model misspecification. In the present paper we provide an overview of the analogues of these tests based on the minimum density power divergence estimator (MDPDE), which presents us with an alternative option that is strongly robust and highly efficient. Since these tests have, so far, been mostly studied for univariate responses, here we primarily focus on their performance for several important hypothesis testing problems in the multivariate context under the multivariate normal model family. (C) 2021 Elsevier Inc. All rights reserved.
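To make the estimator underlying these tests concrete, the sketch below numerically computes the MDPDE for a multivariate normal model by minimizing the standard empirical density power divergence objective with tuning parameter alpha > 0 (whose minimizer tends to the MLE as alpha tends to 0). This is a minimal illustration, not the authors' implementation: the function names (dpd_objective, mdpde_mvnorm), the Cholesky parameterization of Sigma, the choice alpha = 0.3, and the use of a generic Nelder-Mead optimizer are all assumptions made here for illustration only.

```python
# Minimal sketch (not from the paper) of the minimum density power divergence
# estimator (MDPDE) for a multivariate normal model N_p(mu, Sigma).
# Assumed empirical DPD objective with tuning parameter alpha > 0:
#   H_n(theta) = int f_theta^{1+alpha}(x) dx - (1 + 1/alpha) * mean_i f_theta(X_i)^alpha
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal


def dpd_objective(params, X, alpha):
    """Empirical DPD objective; Sigma = L L' with L lower triangular, exp-diagonal."""
    n, p = X.shape
    mu = params[:p]
    L = np.zeros((p, p))
    L[np.tril_indices(p)] = params[p:]
    L[np.diag_indices(p)] = np.exp(np.diag(L))  # positive diagonal => Sigma > 0
    Sigma = L @ L.T
    dens = multivariate_normal(mean=mu, cov=Sigma).pdf(X)
    # Closed form of int f^{1+alpha} for the multivariate normal density.
    integral = ((2 * np.pi) ** (-p * alpha / 2)
                * np.linalg.det(Sigma) ** (-alpha / 2)
                * (1 + alpha) ** (-p / 2))
    return integral - (1 + 1 / alpha) * np.mean(dens ** alpha)


def mdpde_mvnorm(X, alpha=0.3):
    """Minimize the DPD objective numerically; returns (mu_hat, Sigma_hat)."""
    n, p = X.shape
    # Start from the sample mean and covariance (the MLE).
    mu0 = X.mean(axis=0)
    L0 = np.linalg.cholesky(np.cov(X, rowvar=False))
    L0[np.diag_indices(p)] = np.log(np.diag(L0))  # match exp-diagonal parameterization
    x0 = np.concatenate([mu0, L0[np.tril_indices(p)]])
    res = minimize(dpd_objective, x0, args=(X, alpha), method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-8, "fatol": 1e-10})
    mu_hat = res.x[:p]
    L = np.zeros((p, p))
    L[np.tril_indices(p)] = res.x[p:]
    L[np.diag_indices(p)] = np.exp(np.diag(L))
    return mu_hat, L @ L.T


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], size=200)
    X[:10] += 8.0  # contaminate a few observations with gross outliers
    mu_hat, Sigma_hat = mdpde_mvnorm(X, alpha=0.3)
    print("MDPDE of mu:", mu_hat)
    print("MDPDE of Sigma:\n", Sigma_hat)
```

The tuning parameter alpha controls the trade-off emphasized in the abstract: larger values downweight observations with low model density and hence increase robustness to outliers, while values near zero recover the MLE and its full asymptotic efficiency. The Wald-, Rao- and LR-type test statistics discussed in the paper are then built from such an MDPDE in place of the MLE.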
Keywords: Likelihood ratio-type tests; Minimum density power divergence estimator; Multivariate normal model; Power and level influence function; Rao-type tests; Wald-type tests
Indexed terms: Likelihood ratio; Statistical hypotheses; Wald's test; Parameters; Efficiency; Inference