Chapter 4 (Further Topics on Random Variables): Sum of a Random Number of Independent Random Variables

These are reading notes for *Introduction to Probability*.

Sum of a Random Number of Independent Random Variables

  • We consider the sum
    $Y = X_1 + \cdots + X_N$
    where $N$ is a random variable that takes nonnegative integer values, and $X_1, X_2, \dots$ are identically distributed random variables. (If $N = 0$, we let $Y = 0$.) We assume that $N, X_1, X_2, \dots$ are independent, meaning that any finite subcollection of these random variables is independent.
  • Let us denote by $E[X]$ and $\mathrm{var}(X)$ the common mean and variance, respectively, of the $X_i$. We wish to derive formulas for the mean, variance, and transform of $Y$.

$E[Y]$

  • The method that we follow is to first condition on the event $\{N = n\}$, which brings us to the more familiar case of a *fixed* number of random variables:
    $E[Y \mid N = n] = E[X_1 + \cdots + X_N \mid N = n] = E[X_1 + \cdots + X_n \mid N = n] = E[X_1 + \cdots + X_n] = nE[X]$
    where the conditioning can be dropped in the third step because $N$ is independent of the $X_i$.
  • This is true for every nonnegative integer $n$, so
    $E[Y \mid N] = NE[X]$
  • Using the law of iterated expectations, we obtain
    $E[Y] = E\big[E[Y \mid N]\big] = E\big[NE[X]\big] = E[N]E[X]$
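
This formula is easy to check numerically. Below is a minimal Monte Carlo sketch with hypothetical choices $N \sim \text{Poisson}(3)$ and $X_i$ exponential with mean $1/2$; any distributions with known means would do.

```python
import numpy as np

rng = np.random.default_rng(0)
mean_N, mean_X = 3.0, 0.5       # E[N] and E[X] for the assumed distributions
trials = 200_000

# Y = X_1 + ... + X_N, drawn fresh each trial (an empty draw gives Y = 0)
y = np.array([rng.exponential(mean_X, rng.poisson(mean_N)).sum()
              for _ in range(trials)])

print(y.mean())           # empirical E[Y], roughly 1.5
print(mean_N * mean_X)    # E[N] * E[X] = 1.5
```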

$\mathrm{var}(Y)$

  • Similarly, conditioning on $\{N = n\}$ and using the same independence argument,
    $\mathrm{var}(Y \mid N = n) = \mathrm{var}(X_1 + \cdots + X_n) = n\,\mathrm{var}(X)$
  • Since this is true for every nonnegative integer $n$,
    $\mathrm{var}(Y \mid N) = N\,\mathrm{var}(X)$
  • We now use the law of total variance to obtain
    $\mathrm{var}(Y) = E\big[\mathrm{var}(Y \mid N)\big] + \mathrm{var}\big(E[Y \mid N]\big) = E[N]\,\mathrm{var}(X) + (E[X])^2\,\mathrm{var}(N)$
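
The same hypothetical setup ($N \sim \text{Poisson}(3)$, $X_i$ exponential with mean $1/2$) verifies the total-variance formula:

```python
import numpy as np

rng = np.random.default_rng(1)
mean_N = var_N = 3.0        # Poisson(3): mean and variance coincide
mean_X, var_X = 0.5, 0.25   # exponential with mean 1/2 has variance 1/4
trials = 200_000

y = np.array([rng.exponential(mean_X, rng.poisson(mean_N)).sum()
              for _ in range(trials)])

print(y.var())                                # empirical var(Y), roughly 1.5
print(mean_N * var_X + mean_X**2 * var_N)     # E[N]var(X) + (E[X])^2 var(N) = 1.5
```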

$M_Y(s)$

  • The calculation of the transform proceeds along similar lines. Conditioning and then using the independence of the $X_i$,
    $E[e^{sY} \mid N = n] = E[e^{sX_1} \cdots e^{sX_n} \mid N = n] = E[e^{sX_1} \cdots e^{sX_n}] = E[e^{sX_1}] \cdots E[e^{sX_n}] = (M_X(s))^n$
    Using the law of iterated expectations, the (unconditional) transform associated with $Y$ is
    $M_Y(s) = E[e^{sY}] = E\big[E[e^{sY} \mid N]\big] = E\big[(M_X(s))^N\big] = \sum_{n=0}^\infty (M_X(s))^n p_N(n)$
  • Using the observation
    $(M_X(s))^n = e^{\log\left((M_X(s))^n\right)} = e^{n \log M_X(s)}$
    we have
    $M_Y(s) = \sum_{n=0}^\infty e^{n \log M_X(s)}\, p_N(n)$
  • Comparing with the formula
    $M_N(s) = \sum_{n=0}^\infty e^{sn}\, p_N(n)$
    we see that
    $M_Y(s) = M_N\big(\log M_X(s)\big)$
    In words: $M_Y(s)$ is obtained by starting with $M_N(s)$ and replacing each occurrence of $e^s$ with $M_X(s)$. A numeric sanity check appears below.
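
Here is a sketch of that sanity check, again with the hypothetical $N \sim \text{Poisson}(3)$, whose transform is $M_N(s) = e^{3(e^s - 1)}$, and $X_i$ exponential with parameter $\lambda = 2$, so $M_X(s) = 2/(2-s)$ for $s < 2$:

```python
import numpy as np

rng = np.random.default_rng(2)
s = 0.5
M_X = 2.0 / (2.0 - s)             # transform of X at s
M_Y = np.exp(3.0 * (M_X - 1.0))   # M_N(log M_X(s)) = e^{3(e^{log M_X(s)} - 1)}

y = np.array([rng.exponential(0.5, rng.poisson(3)).sum()
              for _ in range(200_000)])

print(np.exp(s * y).mean())   # empirical E[e^{sY}], roughly e = 2.718...
print(M_Y)                    # closed form, exactly e here
```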

Example 4.35. Sum of a Geometric Number of Independent Exponential Random Variables.
Jane visits a number of bookstores, looking for *Great Expectations*. Any given bookstore carries the book with probability $p$, independent of the others. In a typical bookstore visited, Jane spends a random amount of time, exponentially distributed with parameter $\lambda$, until she either finds the book or determines that the bookstore does not carry it. We assume that Jane will keep visiting bookstores until she buys the book and that the time spent in each is independent of everything else. We wish to find the mean, variance, and PDF of the total time spent in bookstores.

SOLUTION

  • The total number $N$ of bookstores visited is geometrically distributed with parameter $p$. Hence, the total time $Y$ spent in bookstores is the sum of a geometrically distributed number $N$ of independent exponential random variables $X_1, X_2, \dots$ We have
    $E[Y] = E[N]E[X] = \frac{1}{p} \cdot \frac{1}{\lambda}$
  • Using the formulas for the variance of geometric and exponential random variables, we also obtain
    $\mathrm{var}(Y) = E[N]\,\mathrm{var}(X) + (E[X])^2\,\mathrm{var}(N) = \frac{1}{p} \cdot \frac{1}{\lambda^2} + \frac{1}{\lambda^2} \cdot \frac{1-p}{p^2} = \frac{1}{\lambda^2 p^2}$
  • In order to find the transform $M_Y(s)$, let us recall that
    $M_X(s) = \frac{\lambda}{\lambda - s}, \qquad M_N(s) = \frac{p e^s}{1 - (1-p)e^s}$
    Then, $M_Y(s)$ is found by starting with $M_N(s)$ and replacing each occurrence of $e^s$ with $M_X(s)$. This yields
    $M_Y(s) = \dfrac{p \frac{\lambda}{\lambda - s}}{1 - (1-p)\frac{\lambda}{\lambda - s}} = \dfrac{p\lambda}{p\lambda - s}$
  • We recognize this as the transform associated with an exponentially distributed random variable with parameter $p\lambda$, and therefore,
    $f_Y(y) = p\lambda e^{-p\lambda y}, \qquad y \geq 0$
    This result can be surprising, because the sum of a *fixed* number $n$ of independent exponential random variables is not exponentially distributed.
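
A short simulation makes the result concrete (hypothetical values $p = 0.3$, $\lambda = 1$; the total time should then behave like an exponential with parameter $p\lambda = 0.3$):

```python
import numpy as np

rng = np.random.default_rng(3)
p, lam = 0.3, 1.0
trials = 200_000

n = rng.geometric(p, trials)    # bookstores visited until the book is found
y = np.array([rng.exponential(1.0 / lam, k).sum() for k in n])

print(y.mean(), 1.0 / (p * lam))                 # both roughly 3.33
print((y > 5).mean(), np.exp(-p * lam * 5.0))    # tail P(Y > 5), both roughly 0.223
```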

Example 4.36. Sum of a Geometric Number of Independent Geometric Random Variables.

  • This example is a discrete counterpart of the preceding one. We let $N$ be geometrically distributed with parameter $p$. We also let each random variable $X_i$ be geometrically distributed with parameter $q$. We assume that all of these random variables are independent. Let $Y = X_1 + \cdots + X_N$. We have
    $M_N(s) = \frac{p e^s}{1 - (1-p)e^s}, \qquad M_X(s) = \frac{q e^s}{1 - (1-q)e^s}$
    and therefore
    $M_Y(s) = \frac{p M_X(s)}{1 - (1-p) M_X(s)} = \frac{pq e^s}{1 - (1 - pq)e^s}$
  • We conclude that $Y$ is geometrically distributed, with parameter $pq$.
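
This too is easy to confirm by simulation (hypothetical $p = 0.5$, $q = 0.4$, so $pq = 0.2$):

```python
import numpy as np

rng = np.random.default_rng(4)
p, q, trials = 0.5, 0.4, 200_000
pq = p * q

y = np.array([rng.geometric(q, rng.geometric(p)).sum()
              for _ in range(trials)])

# Empirical PMF vs. the Geometric(pq) PMF pq(1 - pq)^{k-1}
for k in range(1, 6):
    print(k, round((y == k).mean(), 4), round(pq * (1 - pq) ** (k - 1), 4))
```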

Problem 43.
A motorist goes through 4 lights, each of which is found to be red with probability $1/2$. The waiting times at each light are modeled as independent normal random variables with mean $1$ minute and standard deviation $1/2$ minute. Let $X$ be the total waiting time at the red lights.

  • (a) Use the total probability theorem to find the PDF and the transform associated with $X$, and the probability that $X$ exceeds 4 minutes. Is $X$ normal?
  • (b) Find the transform associated with $X$ by viewing $X$ as a sum of a random number of random variables.

SOLUTION

  • (a) The conditional PDF of $X$, given that $k$ lights are red, is normal with mean $k$ minutes and standard deviation $(1/2)\sqrt{k}$. Thus, $X$ is a mixture of normal random variables, and the transform associated with its (unconditional) PDF is the corresponding mixture of the transforms associated with the (conditional) normal PDFs. However, $X$ is not normal, because a mixture of normal PDFs need not be normal. The probability $P(X > 4 \mid k \text{ lights are red})$ can be computed from the normal tables for each $k \geq 1$ (for $k = 0$ we have $X = 0$, so this conditional probability is zero), and
    $P(X > 4) = \sum_{k=0}^{4} P(k \text{ lights are red})\, P(X > 4 \mid k \text{ lights are red})$
  • (b) Let $K$ be the number of traffic lights that are found to be red; $K$ is binomial with parameters $4$ and $1/2$. We can view $X$ as the sum of $K$ independent normal random variables. Thus the transform associated with $X$ can be found by replacing, in the binomial transform $M_K(s) = \left(\tfrac{1}{2} + \tfrac{1}{2}e^s\right)^4$, each occurrence of $e^s$ with the normal transform corresponding to $\mu = 1$ and $\sigma = 1/2$. Thus
    $M_X(s) = \left(\frac{1}{2} + \frac{1}{2}\, e^{\frac{(1/2)^2 s^2}{2} + s}\right)^4$
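
For concreteness, here is a sketch that evaluates $P(X > 4)$ from part (a) with the standard normal CDF; this numerical evaluation is my addition, not part of the original solution.

```python
from math import comb, erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

total = 0.0
for k in range(1, 5):                              # k = 0 contributes nothing: X = 0 there
    p_k = comb(4, k) * 0.5**4                      # P(k lights are red), binomial(4, 1/2)
    tail = 1.0 - phi((4.0 - k) / (0.5 * sqrt(k)))  # P(X > 4 | k red), X ~ N(k, k/4)
    total += p_k * tail

print(total)   # roughly 0.063
```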

Reposted from blog.csdn.net/weixin_42437114/article/details/113858371