Convergence in probability is a stochastic analog of the convergence of a sequence of numbers. Hence, a sequence of random variables $(Y_n)$ converges in probability to a constant $c$, i.e.

$$Y_n \xrightarrow{P} c,$$

if

$$P\bigl(|Y_n - c| \geq \varepsilon\bigr) \to 0 \quad \text{for every } \varepsilon > 0$$

as

$$n \to \infty,$$

which suggests that for sufficiently large $n$, $Y_n$ will be sufficiently close to $c$.
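As a concrete illustration (a Python sketch, assuming NumPy is available; the choice of Uniform(0,1) samples is my own, not from the post), the sample mean of $n$ i.i.d. Uniform(0,1) draws converges in probability to $\mu = 1/2$ by the weak law of large numbers, and the probability of landing farther than $\varepsilon$ from $\mu$ can be estimated by simulation:

```python
import numpy as np

rng = np.random.default_rng(42)
mu = 0.5    # true mean of Uniform(0, 1)
eps = 0.05  # tolerance in the definition of convergence in probability

# Y_n = mean of n i.i.d. Uniform(0,1) draws; estimate P(|Y_n - mu| >= eps)
# from 5,000 independent replications, for increasing n.
for n in [10, 100, 1000]:
    means = rng.random((5_000, n)).mean(axis=1)
    p = np.mean(np.abs(means - mu) >= eps)
    print(f"n={n:5d}  estimated P(|Y_n - mu| >= {eps}) = {p:.4f}")
```

The estimated probabilities shrink toward zero as $n$ grows, which is exactly the defining property above.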

Note that by the Chebyshev inequality, a sufficient condition for $Y_n \xrightarrow{P} c$ is that $Y_n$ tends to $c$ in *quadratic mean*, i.e.

$$E\bigl[(Y_n - c)^2\bigr] \to 0 \quad \text{as } n \to \infty.$$

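The sufficiency is a one-line consequence of the Chebyshev (Markov) inequality applied to $(Y_n - c)^2$: for any $\varepsilon > 0$,

```latex
P\bigl(|Y_n - c| \geq \varepsilon\bigr)
  = P\bigl((Y_n - c)^2 \geq \varepsilon^2\bigr)
  \;\leq\; \frac{E\bigl[(Y_n - c)^2\bigr]}{\varepsilon^2},
```

so if $E[(Y_n - c)^2] \to 0$, the left-hand side vanishes for every fixed $\varepsilon$.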
The Chebyshev inequality also implies that if a random variable has a small variance, then its distribution is closely concentrated about the mean. **However, the converse is not true\*: the fact that a random variable is concentrated about its mean tells us nothing about its moments!** This means that

$$Y_n \xrightarrow{P} c$$

does not imply that

$$E[Y_n] \to c.$$
Convergence in probability says that for large $n$, $Y_n$ is **very likely** to be close to $c$. However, it says nothing about where the remaining small probability mass, the part away from $c$, is located. This small mass can significantly affect the value of the mean and of other moments.
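A standard counterexample makes this concrete (a Python sketch, assuming NumPy; the specific distribution is my illustration, not from the post): let $Y_n = c$ with probability $1 - 1/n$ and $Y_n = n^2$ with probability $1/n$. Then $P(|Y_n - c| \geq \varepsilon) = 1/n \to 0$, so $Y_n \xrightarrow{P} c$, yet $E[Y_n] = c(1 - 1/n) + n \to \infty$:

```python
import numpy as np

rng = np.random.default_rng(0)
c = 2.0    # the constant limit in probability
eps = 0.1

def sample_Y(n, size):
    """Draw Y_n = c with prob 1 - 1/n, and Y_n = n**2 with prob 1/n."""
    u = rng.random(size)
    return np.where(u < 1.0 / n, float(n) ** 2, c)

for n in [10, 100, 1000]:
    y = sample_Y(n, 100_000)
    p_far = np.mean(np.abs(y - c) >= eps)  # empirical estimate of 1/n
    print(f"n={n:5d}  P(|Y_n - c| >= {eps}) ~ {p_far:.4f}  "
          f"sample mean of Y_n ~ {np.mean(y):.1f}")
```

The escaping mass at $n^2$ is tiny in probability, but it drags the mean off to infinity.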

\*Unless the $Y_n$'s are uniformly bounded, i.e.

$$\exists\, M < \infty \ \text{ s.th. } \ |Y_n| \leq M \ \text{ for all } n,$$

in which case $Y_n \xrightarrow{P} c$ does imply $E[Y_n] \to c$.
