Convergence in probability is a stochastic analog of the convergence of a sequence of numbers. A sequence of random variables $X_1, X_2, \ldots$ converges in probability to a constant $c$, written $X_n \xrightarrow{P} c$, if
$$P(|X_n - c| > \varepsilon) \to 0 \quad \text{as } n \to \infty \quad \text{for every } \varepsilon > 0,$$
which means that for sufficiently large $n$, $X_n$ will be close to $c$ with high probability.
Note that by the Chebyshev inequality,
$$P(|X_n - c| > \varepsilon) \le \frac{E(X_n - c)^2}{\varepsilon^2},$$
a sufficient condition for $X_n \xrightarrow{P} c$ is that $X_n$ tends to $c$ in quadratic mean, i.e. $E(X_n - c)^2 \to 0$ as $n \to \infty$.
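The two facts above can be checked numerically. The following sketch (the uniform example, $\varepsilon = 0.05$, and the trial counts are illustrative choices, not from the text) takes $X_n$ to be the mean of $n$ independent Uniform(0,1) draws, so $c = 0.5$ and $E(X_n - c)^2 = \mathrm{Var}(X_n) = 1/(12n) \to 0$. It estimates $P(|X_n - c| > \varepsilon)$ by Monte Carlo and compares it with the Chebyshev bound $\mathrm{Var}(X_n)/\varepsilon^2$.

```python
import random

random.seed(0)

def tail_prob(n, eps=0.05, c=0.5, trials=2000):
    """Monte Carlo estimate of P(|X_n - c| > eps), where X_n is
    the mean of n independent Uniform(0,1) draws."""
    hits = 0
    for _ in range(trials):
        xbar = sum(random.random() for _ in range(n)) / n
        if abs(xbar - c) > eps:
            hits += 1
    return hits / trials

for n in [10, 100, 1000]:
    est = tail_prob(n)
    # Chebyshev bound: Var(X_n) / eps^2, with Var(X_n) = 1/(12n)
    bound = (1 / (12 * n)) / 0.05**2
    print(f"n={n:5d}  estimated tail prob={est:.3f}  "
          f"Chebyshev bound={min(bound, 1):.3f}")
```

The estimated tail probability shrinks toward zero as $n$ grows, staying below the (often loose) Chebyshev bound, illustrating that convergence in quadratic mean forces convergence in probability.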