Pearson vs. deviance residuals in logistic regression



I know that the standardized Pearson residual is obtained in the traditional probabilistic way:

$$r_i = \frac{y_i - \hat{\pi}_i}{\sqrt{\hat{\pi}_i (1 - \hat{\pi}_i)}}$$

and that the deviance residual is obtained in a more statistical way (each point's contribution to the likelihood):

$$d_i = s_i \sqrt{-2\left[y_i \log \hat{\pi}_i + (1 - y_i) \log(1 - \hat{\pi}_i)\right]}$$

where $s_i = 1$ if $y_i = 1$ and $s_i = -1$ if $y_i = 0$.
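
As a concrete illustration, here is a minimal numpy sketch computing both residuals; the outcomes `y` and fitted probabilities `pi_hat` are hypothetical, not taken from any real model:

```python
import numpy as np

# Hypothetical outcomes and fitted probabilities (not from any real model)
y = np.array([1, 0, 1, 1, 0])
pi_hat = np.array([0.8, 0.3, 0.6, 0.9, 0.2])

# Pearson residual: raw residual scaled by the binomial standard deviation
r = (y - pi_hat) / np.sqrt(pi_hat * (1 - pi_hat))

# Deviance residual: signed square root of the case's contribution to
# -2 * log-likelihood, with s_i = +1 when y_i = 1 and -1 when y_i = 0
s = np.where(y == 1, 1.0, -1.0)
d = s * np.sqrt(-2 * (y * np.log(pi_hat) + (1 - y) * np.log(1 - pi_hat)))

print(r)  # e.g. first case: 0.2 / sqrt(0.16) = 0.5
print(d)  # e.g. first case: sqrt(-2 * ln 0.8) ≈ 0.668
```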

Could you explain to me, intuitively, how to interpret the formula for the deviance residual?

Also, if I had to choose one of them, which one is more appropriate, and why?

By the way, some references claim that we derive the deviance residual based on the term

$$\frac{1}{2} r_i^2$$

where $r_i$ is defined above.


Any ideas would be greatly appreciated.
Jack Shi

When you say "some references"... which references, and how do they proceed?
Glen_b - Reinstate Monica

Answers:



Logistic regression seeks to maximize the log-likelihood function

$$LL = \sum_k \ln(P_i) + \sum_r \ln(1 - P_i)$$

where $P_i$ is the predicted probability that case $i$ has $\hat{Y} = 1$; $k$ is the number of cases observed as $Y = 1$, and $r$ is the number of (the remaining) cases observed as $Y = 0$. The first sum runs over those $k$ cases and the second over the $r$ cases.

That expression is equal to

$$LL = -\left(\sum_k d_i^2 + \sum_r d_i^2\right) \Big/ \, 2$$

because a case's deviance residual is defined as:

$$d_i = \begin{cases} \sqrt{-2\ln(P_i)} & \text{if } Y_i = 1 \\ -\sqrt{-2\ln(1 - P_i)} & \text{if } Y_i = 0 \end{cases}$$

Thus, binary logistic regression seeks directly to minimize the sum of squared deviance residuals: it is the deviance residuals that are implicit in the ML algorithm of the regression.
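
As a quick numeric check of this identity, here is a small numpy sketch (with made-up `y` and `P`) showing that the log-likelihood equals minus half the sum of squared deviance residuals:

```python
import numpy as np

# Made-up outcomes and fitted probabilities, for illustration only
y = np.array([1, 0, 1, 1, 0])
P = np.array([0.8, 0.3, 0.6, 0.9, 0.2])

# Log-likelihood summed directly over the two groups of cases
LL = np.sum(np.log(P[y == 1])) + np.sum(np.log(1 - P[y == 0]))

# Deviance residuals as defined above
d = np.where(y == 1, np.sqrt(-2 * np.log(P)), -np.sqrt(-2 * np.log(1 - P)))

# LL equals minus half the sum of squared deviance residuals
print(LL, -np.sum(d**2) / 2)  # both ≈ -1.419
```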

The chi-square statistic of the model fit is $2(LL_{\text{full model}} - LL_{\text{reduced model}})$, where the full model contains the predictors and the reduced model does not.
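
As a sketch of how to compute this in practice, a fitted statsmodels `Logit` result exposes the full-model log-likelihood as `llf` and the intercept-only log-likelihood as `llnull`; the data below are simulated purely for illustration:

```python
import numpy as np
import statsmodels.api as sm

# Simulated data for illustration: one predictor, binary outcome
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + 1.2 * x))))

# Full model: intercept + predictor; statsmodels also stores the
# log-likelihood of the intercept-only (reduced) model
full = sm.Logit(y, sm.add_constant(x)).fit(disp=0)

chi_sq = 2 * (full.llf - full.llnull)  # 2 * (LL_full - LL_reduced)
print(chi_sq)  # matches full.llr, the likelihood-ratio statistic
```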
