Convergence in probability and the limiting behavior of moments: A useful (elementary) reminder

Convergence in probability is a stochastic analog of the convergence of a sequence of numbers. A sequence of random variables Y_n converges in probability to a constant c, written

Y_n \rightarrow^P c

if, for every \epsilon > 0,

P ( |Y_n - c | < \epsilon ) \rightarrow 1 as n \rightarrow \infty

In words: for sufficiently large n, Y_n lies within \epsilon of c with probability arbitrarily close to one.
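As a quick numerical illustration (a minimal sketch using numpy, not part of the original note): take Y_n to be the sample mean of n i.i.d. Uniform(0,1) draws, so that Y_n \rightarrow^P 1/2 by the weak law of large numbers. The estimated probability P(|Y_n - 1/2| < \epsilon) climbs toward one as n grows.

import numpy as np

rng = np.random.default_rng(0)
eps = 0.01
c = 0.5        # E[Uniform(0, 1)] = 1/2
reps = 1_000   # Monte Carlo replications per n

# Y_n = sample mean of n i.i.d. Uniform(0, 1) draws; Y_n ->^P c by the weak law of large numbers.
for n in (10, 100, 1_000, 10_000):
    draws = rng.uniform(0.0, 1.0, size=(reps, n))
    y_n = draws.mean(axis=1)
    coverage = np.mean(np.abs(y_n - c) < eps)   # estimate of P(|Y_n - c| < eps)
    print(f"n = {n:6d}   P(|Y_n - c| < {eps}) ~ {coverage:.3f}")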


Note that, by Chebyshev's inequality, a sufficient condition for Y_n \rightarrow^P c is that Y_n converges to c in quadratic mean, i.e.

E(Y_n-c)^2 \rightarrow 0
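Indeed, applying Chebyshev's inequality to Y_n - c gives, for every \epsilon > 0,

P\big( |Y_n - c| \geq \epsilon \big) \leq \frac{E(Y_n - c)^2}{\epsilon^2} \rightarrow 0

so convergence in quadratic mean implies convergence in probability.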

Chebyshev's inequality also implies that if a random variable has a small variance, then its distribution is closely concentrated about its mean. However, the converse is not true*: the fact that a random variable is concentrated about a point tells us nothing about its moments! This means that

Y_n \rightarrow^P c

does not imply that

E(Y_n - c)^i \rightarrow 0, \qquad i = 1, 2, \ldots


Convergence in probability says that, for large n, Y_n is very likely to be close to c. However, it says nothing about where the remaining small probability mass, the part lying away from c, is located. That small mass can significantly affect the value of the mean and of the other moments, as the example below shows.
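A standard counterexample (not spelled out in the original note, but classical) makes this concrete. Let Y_n take the value n^2 with probability 1/n and the value 0 otherwise. Then, for any fixed \epsilon \in (0,1),

P(|Y_n - 0| \geq \epsilon) = \frac{1}{n} \rightarrow 0

so Y_n \rightarrow^P 0, while

E(Y_n - 0) = n^2 \cdot \frac{1}{n} = n \rightarrow \infty

The vanishing probability mass sitting at n^2 runs away fast enough to drag the mean (and every higher moment) to infinity. A short simulation, a minimal sketch using numpy with the illustrative Y_n just defined, shows both effects numerically:

import numpy as np

rng = np.random.default_rng(0)
eps = 0.5
reps = 1_000_000   # Monte Carlo replications per n

# Y_n = n^2 with probability 1/n, and 0 otherwise:
# Y_n ->^P 0, yet E[Y_n] = n grows without bound.
for n in (10, 100, 1_000):
    y = np.where(rng.random(reps) < 1.0 / n, float(n) ** 2, 0.0)
    p_close = np.mean(np.abs(y) < eps)   # estimate of P(|Y_n - 0| < eps)
    print(f"n = {n:5d}   P(|Y_n| < {eps}) ~ {p_close:.4f}   E[Y_n] ~ {y.mean():.1f}")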


*Unless the Y_n are uniformly bounded, i.e.

\exists M \ \text{such that} \ P\big(|Y_n - c| < M\big) = 1 \quad \forall n,

in which case convergence in probability does imply convergence of the moments (by bounded convergence).
