

18

What is the easiest way to prove the following statement?

Suppose $Y_1, \dots, Y_n \overset{\text{iid}}{\sim} \text{Exp}(1)$. Show that $\sum_{i=1}^n (Y_i - Y_{(1)}) \sim \text{Gamma}(n-1, 1)$.

Note that $Y_{(1)} = \min_{1 \leq i \leq n} Y_i$.

By $X \sim \text{Exp}(\beta)$ we mean that $f_X(x) = \frac{1}{\beta}e^{-x/\beta}\mathbf{1}\{x > 0\}$.

It is easy to see that $Y_{(1)} \sim \text{Exponential}(1/n)$. Furthermore, we also have $\sum_{i=1}^n Y_i \sim \text{Gamma}(\alpha = n, \beta = 1)$ under the parametrization

$$f_Y(y) = \frac{1}{\Gamma(\alpha)\beta^{\alpha}}\, y^{\alpha - 1} e^{-y/\beta}\,\mathbf{1}\{y > 0\}, \qquad \alpha, \beta > 0.$$
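Before attempting a proof, the two facts just quoted can be sanity-checked by simulation. The sketch below (with arbitrary choices of $n$, the number of replications, and the seed) verifies that the minimum of $n$ i.i.d. $\text{Exp}(1)$ draws has mean near $1/n$, and that their sum has the mean and variance of a $\text{Gamma}(n, 1)$ variable.

```python
import numpy as np

# Monte Carlo sanity check of the facts above (a sketch, not a proof):
# Y_(1) ~ Exponential(1/n) and sum(Y_i) ~ Gamma(alpha=n, beta=1).
# n, reps, and the seed are arbitrary choices.
rng = np.random.default_rng(0)
n, reps = 5, 200_000
y = rng.exponential(scale=1.0, size=(reps, n))

min_mean = y.min(axis=1).mean()   # should be close to 1/n
sum_mean = y.sum(axis=1).mean()   # Gamma(n, 1) has mean n
sum_var = y.sum(axis=1).var()     # ... and variance n

print(min_mean, sum_mean, sum_var)
```

The printed values should be close to $1/n$, $n$, and $n$ respectively, up to Monte Carlo error.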

Edit (answer adapted from Xi'an's solution below, using the notation of the original question):

$$\begin{aligned}
\sum_{i=1}^n [Y_i - Y_{(1)}] &= \sum_{i=1}^n [Y_{(i)} - Y_{(1)}] = \sum_{i=1}^n Y_{(i)} - nY_{(1)} \\
&= \sum_{i=1}^n \left\{Y_{(i)} - Y_{(i-1)} + Y_{(i-1)} - \cdots - Y_{(1)} + Y_{(1)}\right\} - nY_{(1)} \\
&= \sum_{i=1}^n \sum_{j=1}^i \left\{Y_{(j)} - Y_{(j-1)}\right\} - nY_{(1)}, \quad \text{where } Y_{(0)} := 0, \\
&= \sum_{j=1}^n \sum_{i=j}^n \left\{Y_{(j)} - Y_{(j-1)}\right\} - nY_{(1)} \\
&= \sum_{j=1}^n (n - j + 1)\left[Y_{(j)} - Y_{(j-1)}\right] - nY_{(1)} \\
&= \sum_{i=1}^n (n - i + 1)\left[Y_{(i)} - Y_{(i-1)}\right] - nY_{(1)} \\
&= \sum_{i=2}^n (n - i + 1)\left[Y_{(i)} - Y_{(i-1)}\right] + nY_{(1)} - nY_{(1)} \\
&= \sum_{i=2}^n (n - i + 1)\left[Y_{(i)} - Y_{(i-1)}\right].
\end{aligned}$$
By the theorem in Xi'an's answer, the $n-1$ terms $(n - i + 1)[Y_{(i)} - Y_{(i-1)}]$, $i = 2, \dots, n$, are i.i.d. $\text{Exp}(1)$, and therefore $$\sum_{i=2}^n (n - i + 1)\left[Y_{(i)} - Y_{(i-1)}\right] \sim \text{Gamma}(n-1, 1).$$
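The chain of equalities above is pure algebra, so it holds exactly for every sample, which can be checked numerically. A small sketch (with an arbitrary $n$ and seed):

```python
import numpy as np

# Exact (up to floating point) check of the identity
# sum_i (Y_i - Y_(1)) == sum_{i=2}^n (n - i + 1)(Y_(i) - Y_(i-1)).
rng = np.random.default_rng(1)
n = 7
y = rng.exponential(size=n)
ys = np.sort(y)                        # order statistics Y_(1) <= ... <= Y_(n)

lhs = np.sum(y - ys[0])                # sum of Y_i - Y_(1)
spacings = np.diff(ys)                 # Y_(i) - Y_(i-1) for i = 2..n
weights = n - np.arange(2, n + 1) + 1  # n - i + 1 for i = 2..n
rhs = np.sum(weights * spacings)

print(np.isclose(lhs, rhs))  # → True
```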

1
@MichaelChernick 1) I'm not sure whether my proof of independence is correct, and 2) I'm not sure whether the result I guessed above about differences of Gamma-distributed variables is correct. It seems to contradict what is given here, but maybe this case is different because the difference involves one of the order statistics? I'm not sure.
Clarinetist

1
@Clarinetist, I'm not certain. Maybe try to work with $\sum_{i=2}^n (Y_{(i)} - Y_{(1)})$, which clearly equals the sum you're working with. The answer here might be helpful: math.stackexchange.com/questions/80475/…
gammer

3
Have you tried to prove that each $(Y_i - Y_{(1)}) \sim \text{Exp}(1)$ -- except for one $i$, for which $Y_i - Y_{(1)} = 0$ -- and, then, using the fact that the sum of $(n-1)$ i.i.d. Exponential variates will be Gamma distributed?
Marcelo Ventura

1
@jbowman We have
$$f_{Z_i}(z_i) = f_{Y_i}(z_i + a) = e^{-(z_i + a)},$$
and conditioned on $Y_i \geq a$, we divide this by $e^{-a}$, giving $e^{-z_i}$, hence we have $(Z_i \mid Y_i \geq a) \sim \text{Exp}(1)$. Now here's what's bugged me about this proof: I regarded $a$ as a constant. But $Y_{(1)}$ isn't a constant. Why would this work?
Clarinetist

1
The point is that it doesn't matter what $a$ is! The distribution is always $\text{Exp}(1)$! Remarkable, isn't it? And from this you can conclude that the distribution of $Y_i - Y_{(1)}$ is always $\text{Exp}(1)$ for $i > 1$, regardless of the actual value of $Y_{(1)}$.
jbowman
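jbowman's point above lends itself to a quick simulation check: pooling the differences $Y_i - Y_{(1)}$ over the $n-1$ indices whose $Y_i$ is not the minimum should reproduce the moments of an $\text{Exp}(1)$ variable. A sketch with arbitrary $n$, replication count, and seed:

```python
import numpy as np

# Empirical check: for each i with Y_i not the minimum, Y_i - Y_(1)
# behaves like a fresh Exp(1) draw (mean 1, variance 1).
rng = np.random.default_rng(2)
n, reps = 4, 100_000
y = rng.exponential(size=(reps, n))
ymin = y.min(axis=1, keepdims=True)

mask = y > ymin              # True for the n-1 non-minimum entries in each row
diffs = (y - ymin)[mask]     # pooled differences, dropping the zero terms

print(diffs.mean(), diffs.var())  # both should be close to 1
```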

Answers:


15

The proof is given on p. 211 of the mother of all random generation books, Devroye's Non-Uniform Random Variate Generation (and it is a very elegant one!):

If we set $E_{(0)} = 0$, then the normalized exponential spacings
$$(n - i + 1)\left(E_{(i)} - E_{(i-1)}\right), \qquad i = 1, \dots, n,$$
derived from the order statistics $E_{(1)} \leq \cdots \leq E_{(n)}$ of an i.i.d. exponential sample of size $n$ are themselves i.i.d. exponential variables.

Proof. Since

$$\sum_{i=1}^n e_i = \sum_{i=1}^n e_{(i)} = \sum_{i=1}^n \sum_{j=1}^i \left(e_{(j)} - e_{(j-1)}\right) = \sum_{j=1}^n \sum_{i=j}^n \left(e_{(j)} - e_{(j-1)}\right) = \sum_{j=1}^n (n - j + 1)\left(e_{(j)} - e_{(j-1)}\right),$$
the joint density of the order statistics $(E_{(1)}, \dots, E_{(n)})$ writes as
$$f(\mathbf{e}) = n!\,\exp\left\{-\sum_{i=1}^n e_{(i)}\right\} = n!\,\exp\left\{-\sum_{i=1}^n (n - i + 1)\left(e_{(i)} - e_{(i-1)}\right)\right\}.$$
Setting $Y_i = (n - i + 1)\left(E_{(i)} - E_{(i-1)}\right)$, the change of variables from $(E_{(1)}, \dots, E_{(n)})$ to $(Y_1, \dots, Y_n)$ has a constant Jacobian [incidentally equal to $1/n!$, but this does not need to be computed], and hence the density of $(Y_1, \dots, Y_n)$ is proportional to
$$\exp\left\{-\sum_{i=1}^n y_i\right\},$$
which establishes the result. Q.E.D.
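The theorem just proved can be illustrated by simulation: each normalized spacing should be marginally $\text{Exp}(1)$, so each should have mean close to $1$. A sketch with arbitrary $n$, replication count, and seed:

```python
import numpy as np

# Simulation sketch of the spacings theorem: the normalized spacings
# (n - i + 1)(E_(i) - E_(i-1)), with E_(0) = 0, are i.i.d. Exp(1).
rng = np.random.default_rng(3)
n, reps = 5, 200_000
e = np.sort(rng.exponential(size=(reps, n)), axis=1)

spacings = np.diff(e, axis=1, prepend=0.0)   # E_(i) - E_(i-1), with E_(0) = 0
normalized = spacings * (n - np.arange(n))   # multiply the i-th by n - i + 1

print(normalized.mean(axis=0))  # each column mean should be close to 1
```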

An alternative suggested to me by Gérard Letac is to check that

$$(E_{(1)}, \dots, E_{(n)})$$
has the same distribution as
$$\left(\frac{E_1}{n},\ \frac{E_1}{n} + \frac{E_2}{n-1},\ \dots,\ \frac{E_1}{n} + \frac{E_2}{n-1} + \cdots + \frac{E_n}{1}\right)$$
(by virtue of the memoryless property), which makes the derivation of
$$\sum_{k=1}^n \left(E_k - E_{(1)}\right) \sim \sum_{k=1}^{n-1} E_k$$
straightforward.
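Letac's distributional identity can also be checked empirically by comparing moments of the two sides, both of which should match the $\text{Gamma}(n-1, 1)$ values (mean and variance both $n-1$). A sketch with arbitrary $n$, replication count, and seed:

```python
import numpy as np

# Compare sum_k (E_k - E_(1)) against a sum of n-1 fresh Exp(1) draws:
# both should have mean and variance n - 1, as for Gamma(n-1, 1).
rng = np.random.default_rng(4)
n, reps = 6, 200_000
e = rng.exponential(size=(reps, n))

lhs = e.sum(axis=1) - n * e.min(axis=1)               # sum of E_k - E_(1)
rhs = rng.exponential(size=(reps, n - 1)).sum(axis=1)  # sum of n-1 Exp(1)

print(lhs.mean(), rhs.mean())  # both close to n - 1
print(lhs.var(), rhs.var())    # both close to n - 1
```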

1
Thank you for this answer! I'd like to fill in some details for anyone who is reading this in the future: the $e_i$ are observed values of the $E_i$, and the easiest way to see that $\sum_{i=1}^n e_{(i)} = \sum_{i=1}^n (n - i + 1)(e_{(i)} - e_{(i-1)})$ is to write out $\sum_{i=1}^n (n - i + 1)(e_{(i)} - e_{(i-1)})$ term by term. Because the density of $(Y_1, \dots, Y_n)$ is proportional to $\exp(-\sum_{i=1}^n y_i)$, separate the $y_i$ to see that the density is proportional to $\prod_{i=1}^n e^{-y_i}$, hence $Y_1, \dots, Y_n \overset{\text{iid}}{\sim} \text{Exp}(1)$.
Clarinetist

5

I lay out here what has been suggested in comments by @jbowman.

Let $a \geq 0$ be a constant. Let $Y_i$ follow an $\text{Exp}(1)$ and consider $Z_i = Y_i - a$. Then

$$\begin{aligned}
\Pr(Z_i \leq z_i \mid Y_i \geq a) &= \Pr(Y_i - a \leq z_i \mid Y_i \geq a) \\
&= \Pr(Y_i \leq z_i + a \mid Y_i \geq a) = \frac{\Pr(Y_i \leq z_i + a,\ Y_i \geq a)}{1 - \Pr(Y_i \leq a)} \\
&= \frac{\Pr(a \leq Y_i \leq z_i + a)}{1 - \Pr(Y_i \leq a)} = \frac{1 - e^{-z_i - a} - 1 + e^{-a}}{e^{-a}} = 1 - e^{-z_i},
\end{aligned}$$

which is the distribution function of Exp(1).

Let's describe this: the probability that an $\text{Exp}(1)$ r.v. will fall in a specific interval (the numerator in the last line), given that it will exceed the interval's lower bound (the denominator), depends only on the length of the interval and not on where this interval is placed on the real line. This is an incarnation of the "memorylessness" property of the Exponential distribution, here in a more general setting, free of time interpretations (and it holds for the Exponential distribution in general).
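The conditional distribution function computed above can be evaluated numerically to see the memorylessness directly: the result is the same for every lower bound $a$. A small sketch (the particular values of $z$ and $a$ are arbitrary):

```python
import math

# For Y ~ Exp(1): P(a <= Y <= z + a) / P(Y >= a) = 1 - exp(-z), for any a >= 0.
def cond_cdf(z: float, a: float) -> float:
    # numerator: CDF difference over [a, z + a]; denominator: survival at a
    return ((1 - math.exp(-(z + a))) - (1 - math.exp(-a))) / math.exp(-a)

for a in (0.0, 0.5, 3.0):
    print(round(cond_cdf(1.2, a), 6))  # → 0.698806 each time, i.e. 1 - exp(-1.2)
```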

Now, by conditioning on $\{Y_i \geq a\}$ we force $Z_i$ to be non-negative, and crucially, the obtained result holds for all $a \in \mathbb{R}^+$. So we can state the following:

If $Y_i \sim \text{Exp}(1)$, then for any $Q \geq 0$ such that $Z_i = Y_i - Q \geq 0$, we have $Z_i \sim \text{Exp}(1)$.

Can we find a $Q \geq 0$ that is free to take all non-negative real values and for which the required inequality always holds (almost surely)? If we can, then we can dispense with the conditioning argument.

And indeed we can. It is the minimum order statistic, $Q = Y_{(1)}$: $\Pr(Y_i \geq Y_{(1)}) = 1$. So we have obtained

$$Y_i \sim \text{Exp}(1) \implies Y_i - Y_{(1)} \sim \text{Exp}(1).$$

This means that

$$\Pr\left(Y_i - Y_{(1)} \leq y_i - y_{(1)}\right) = \Pr(Y_i \leq y_i).$$

So if the probabilistic structure of $Y_i$ remains unchanged when we subtract the minimum order statistic, it follows that the random variables $Z_i = Y_i - Y_{(1)}$ and $Z_j = Y_j - Y_{(1)}$, where $Y_i, Y_j$ are independent, are also independent, since the possible link between them, $Y_{(1)}$, does not have an effect on their probabilistic structure.

Then the sum $\sum_{i=1}^n (Y_i - Y_{(1)})$ contains $n - 1$ i.i.d. $\text{Exp}(1)$ random variables (and one zero), and so
$$\sum_{i=1}^n \left(Y_i - Y_{(1)}\right) \sim \text{Gamma}(n-1, 1).$$
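As a final sanity check, the full claim can be verified by comparing the simulated distribution of $\sum_i (Y_i - Y_{(1)})$ against independent $\text{Gamma}(n-1, 1)$ draws, for example at a few quantiles. A sketch with arbitrary $n$, replication count, and seed:

```python
import numpy as np

# Monte Carlo sketch of the conclusion: sum_i (Y_i - Y_(1)) should be
# distributed as Gamma(n - 1, 1); compare a few quantiles of each.
rng = np.random.default_rng(5)
n, reps = 5, 200_000
y = rng.exponential(size=(reps, n))

stat = y.sum(axis=1) - n * y.min(axis=1)              # sum of Y_i - Y_(1)
gam = rng.gamma(shape=n - 1, scale=1.0, size=reps)    # reference Gamma(n-1, 1)

qs = [0.1, 0.5, 0.9]
print(np.quantile(stat, qs))
print(np.quantile(gam, qs))  # the two rows should agree closely
```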