Expectation: Useful properties and inequalities

 


 

If {X \geq 0} is a random variable on {(\Omega, \mathcal{F}, P)}, the expected value of {X} is defined as

\displaystyle \mathbb{E}(X) \equiv \int_{\Omega} X dP = \int_{\Omega} X(\omega) P (d \omega)
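To make the definition concrete, here is a minimal sketch (an illustration added here, assuming numpy is available; the values and probabilities are arbitrary): for a simple random variable the integral reduces to a finite weighted sum, and averaging many draws of {X(\omega)} approximates it.

import numpy as np

rng = np.random.default_rng(0)

# A simple random variable: X takes the value x_i with probability p_i.
values = np.array([0.0, 1.0, 3.0])
probs = np.array([0.5, 0.3, 0.2])

# E(X) = sum_i x_i P(X = x_i), i.e. the integral of X over Omega against P.
exact = np.sum(values * probs)

# Monte Carlo approximation: average X(omega) over many independent draws.
samples = rng.choice(values, size=100_000, p=probs)
print(exact, samples.mean())  # the two numbers should agree closely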

 

Inequalities

  • Jensen’s inequality. If {\varphi} is convex and {E|X|, E|\varphi(X)| < \infty}, then

\displaystyle \mathbb{E} (\varphi(X)) \geq \varphi(\mathbb{E}X)

  • Hölder’s inequality. If {p,q \in [1, \infty]} with {1/p + 1/q = 1}, then

\displaystyle \mathbb{E}|XY| \leq \|X\|_p \|Y\|_q

  • Cauchy-Schwarz Inequality: For {p=q=2}

\displaystyle \mathbb{E}|XY| \leq \left( \mathbb{E}(X^2) \mathbb{E}(Y^2) \right)^{1/2}

As an application, let {Y \geq 0} with {\mathbb{E}(Y^2) < \infty}. Since {Y = Y \mathbb{I}_{Y>0}}, Cauchy-Schwarz gives

\displaystyle \mathbb{E}(Y)^2 = \mathbb{E}(Y \mathbb{I}_{Y>0})^2 \leq \mathbb{E}(Y^2) P(Y>0)

hence

\displaystyle P(Y>0) \geq \mathbb{E}(Y)^2/\mathbb{E}(Y^2)

(this and the other inequalities here are checked numerically in the sketch after this list).

 

  • Markov’s/Chebyshev’s inequality. Suppose {\varphi: \mathbb{R} \rightarrow \mathbb{R}} has {\varphi \geq 0}, let {A \in \mathcal{R}} and let {i_A = \inf \{\varphi(y) : y \in A\}}. Then

\displaystyle i_A P(X \in A) \leq \mathbb{E}(\varphi(X); X \in A) \leq \mathbb{E}(\varphi(X))

where

\displaystyle \mathbb{E}(X;A) = \int_A X dP, \qquad A \in \mathcal{F}

Alternatively, suppose {X \in m \mathcal{F}} and that {\varphi: \mathbb{R} \rightarrow [0,\infty]} is {\mathcal{B}}-measurable and non-decreasing (note {\varphi (X) = \varphi \circ X \in (m \mathcal{F})^+}). Then

 

\displaystyle \varphi(a)P(X \geq a) \leq \mathbb{E}(\varphi(X); X \geq a) \leq \mathbb{E}(\varphi(X))

Example:

\displaystyle X \in \mathcal{L}^1, \qquad aP(|X| \geq a) \leq \mathbb{E}(|X|) \qquad (a>0)
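A quick numerical sanity check of the inequalities above (Jensen, Cauchy-Schwarz, the second-moment bound, Markov); a minimal sketch added here, assuming numpy, with exponential samples as a purely illustrative choice:

import numpy as np

rng = np.random.default_rng(1)
X = rng.exponential(scale=2.0, size=200_000)  # X >= 0, illustrative choice
Y = rng.exponential(scale=1.0, size=200_000)

# Jensen with the convex function phi(x) = x^2: E(phi(X)) >= phi(E X).
assert np.mean(X**2) >= np.mean(X) ** 2

# Cauchy-Schwarz: E|XY| <= (E(X^2) E(Y^2))^{1/2}.
assert np.mean(np.abs(X * Y)) <= np.sqrt(np.mean(X**2) * np.mean(Y**2))

# Second-moment bound: P(Y > 0) >= E(Y)^2 / E(Y^2).
assert np.mean(Y > 0) >= np.mean(Y) ** 2 / np.mean(Y**2)

# Markov: a P(|X| >= a) <= E|X| for any a > 0.
a = 3.0
assert a * np.mean(np.abs(X) >= a) <= np.mean(np.abs(X))

print("all empirical inequalities hold")

Each assertion holds exactly for the empirical distribution of the samples, so the checks pass deterministically.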

 

Integration to the limit

 

A sequence {\{X_n\}_{n \geq 1}} is uniformly integrable if {\forall \epsilon>0, \exists \delta>0} s.th.

\displaystyle \sup_{n \geq 1} \int_A |X_n| dP < \epsilon

whenever {P(A)< \delta } and

\displaystyle \sup_{n \geq 1} \mathbb{E} |X_n| < \infty.
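A standard non-example, added here for illustration, on {\Omega = (0,1)} with Lebesgue measure: take {X_n = n \mathbb{I}_{(0,1/n)}}. Then {\sup_{n \geq 1} \mathbb{E}|X_n| = 1 < \infty}, but with {A_n = (0,1/n)} we have {P(A_n) = 1/n \rightarrow 0} while

\displaystyle \int_{A_n} |X_n| dP = 1 \qquad \forall n,

so no choice of {\delta} works and the family is not uniformly integrable.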

 

Monotone Convergence Theorem: If {0 \leq X_n \uparrow X} then {\mathbb{E}(X_n) \uparrow \mathbb{E}(X)}.
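For instance (an illustration added here), for any {X \geq 0} the truncations {X_n = \min(X, n)} satisfy {0 \leq X_n \uparrow X}, so

\displaystyle \mathbb{E}(\min(X,n)) \uparrow \mathbb{E}(X),

which is a common way to compute {\mathbb{E}(X)} or to decide whether it is finite.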

 

Lemma: A measurable function {X} is integrable iff {\forall \epsilon>0} {\exists \delta>0} s.th. {\mathbb{E}|X| \leq \frac{1}{\delta}} and {A \in \mathcal{F}, P(A)< \delta} implies

\displaystyle \int_A |X| dP< \epsilon .
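A one-line sketch of the "only if" direction (added here): for integrable {X} and any {K>0},

\displaystyle \int_A |X| dP \leq K P(A) + \mathbb{E}(|X|; |X| > K),

and the last term can be made {< \epsilon/2} by choosing {K} large (dominated convergence), after which any sufficiently small {\delta \leq \epsilon/(2K)} works.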

Fatou’s lemma. If {X_n \geq 0} then

\displaystyle \liminf_{n \rightarrow \infty} \mathbb{E}(X_n) \geq \mathbb{E}(\liminf_{n \rightarrow \infty}X_n)
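The inequality can be strict: for the family {X_n = n \mathbb{I}_{(0,1/n)}} on {(0,1)} used above, {\liminf_n X_n = 0} a.s. while {\mathbb{E}(X_n) = 1} for every {n}.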

Dominated Convergence Theorem. If {X_n \xrightarrow{a.s.} X}, {|X_n| \leq Y \text{ } \forall n}, and {\mathbb{E}(Y) < \infty}, then

\displaystyle \mathbb{E}(X_n) \rightarrow \mathbb{E}(X).

 

Note: when {Y} is constant this gives the Bounded Convergence Theorem.
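A typical application (added as an illustration): if {X \in \mathcal{L}^1} then {X \mathbb{I}_{|X| \leq n} \xrightarrow{a.s.} X} and {|X \mathbb{I}_{|X| \leq n}| \leq |X|}, so by dominated convergence

\displaystyle \mathbb{E}(X \mathbb{I}_{|X| \leq n}) \rightarrow \mathbb{E}(X) \quad \text{and} \quad \mathbb{E}(|X|; |X| > n) \rightarrow 0.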

 

Theorem: Suppose {X_n \xrightarrow{a.s.} X} and {g,h} are continuous functions with i) {g \geq 0} and {g(x) > 0} when {|x|} is large, ii) {|h(x)|/g(x) \rightarrow 0} as {|x| \rightarrow \infty}, and iii) {\mathbb{E}(g(X_n)) \leq K < \infty} {\forall n}. Then

{\mathbb{E}(h(X_n)) \rightarrow \mathbb{E}(h(X))}.
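For example (an illustration added here), take {g(x) = x^2} and {h(x) = x}: if {X_n \xrightarrow{a.s.} X} and {\sup_n \mathbb{E}(X_n^2) \leq K < \infty}, then {\mathbb{E}(X_n) \rightarrow \mathbb{E}(X)}; a uniform second-moment bound is enough to pass from a.s. convergence to convergence of means.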

 

Sums of non-negative random variables: a collection of useful results

  • If { X \in (m \mathcal{F})^+} and { \mathbb{E}(X) < \infty} then

{P(X< \infty)=1}.

  • If {\{Z_n\}_n \in (m \mathcal{F})^+}, then (see Monotone Convergence Theorem)

\displaystyle \mathbb{E}(\sum Z_n) = \sum \mathbb{E}(Z_n) \leq \infty.

 

  • If {\{Z_n\}_n \in (m \mathcal{F})^+} s.th. {\mathbb{E}(\sum Z_n) < \infty}, then

\displaystyle \sum Z_n < \infty \text{ a.s. and so } Z_n \xrightarrow{a.s.}0

  • First Borel-Cantelli Lemma. Suppose {\{A_k\} \in \mathcal{F}} sequence of events s.th. { \sum P(A_k) < \infty}. Set {Z_k = \mathbb{I}_{A_k}}. Then

{\mathbb{E}(Z_k) = P(A_k)}, so {\mathbb{E}(\sum Z_k) = \sum P(A_k) < \infty},

and by previous result

\displaystyle \sum Z_k = \sum \mathbb{I}_{A_k} < \infty \text{ a.s.}

Note {\{\sum_k \mathbb{I}_{A_k} = \infty\} = \limsup_k A_k = \cap_{n=1}^{\infty} \cup_{k=n}^{\infty} A_k}, so the conclusion is that {P(\limsup_k A_k) = 0}, i.e. a.s. only finitely many of the {A_k} occur.
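A minimal simulation sketch of the first Borel-Cantelli lemma (added here, assuming numpy; the events {A_k = \{U_k \leq k^{-2}\}} for independent uniforms {U_k} are an illustrative choice with {\sum_k P(A_k) = \pi^2/6 < \infty}):

import numpy as np

rng = np.random.default_rng(2)
K = 1_000_000   # number of events per run
runs = 20

for _ in range(runs):
    U = rng.random(K)               # U_k ~ Uniform(0,1), independent
    k = np.arange(1, K + 1)
    occurred = U <= 1.0 / k**2      # indicator of A_k, with P(A_k) = 1/k^2
    # Sum of indicators = number of A_k that occurred; Borel-Cantelli says
    # this is finite a.s. (here it is typically 1 or 2, not growing with K).
    print(int(occurred.sum()), end=" ")
print()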

 

Computing Expected Values

 

Change of variables formula. Let {X: (\Omega, \mathcal{F}) \rightarrow (S, \mathcal{S})} have distribution {\mu}, i.e. {\mu (A) = P(X \in A)}. If {f: (S, \mathcal{S}) \rightarrow (\mathbb{R}, \mathcal{R})} is measurable with {f \geq 0} or {\mathbb{E}|f(X)| < \infty}, then

\displaystyle \mathbb{E} (f(X)) = \int_S f(y) \mu (dy)
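A small numerical illustration of the formula (a sketch added here, assuming numpy; a standard normal {X} and {f(y) = y^2} are illustrative choices): the left side is a Monte Carlo average of {f(X)} over {\Omega}, the right side a quadrature of {f} against the density of {\mu} on {S = \mathbb{R}}.

import numpy as np

rng = np.random.default_rng(3)

f = lambda y: y**2   # f: R -> R, f >= 0

# Left-hand side: E(f(X)) for X ~ N(0,1), estimated by averaging over omega.
X = rng.standard_normal(500_000)
lhs = np.mean(f(X))

# Right-hand side: integral of f(y) mu(dy), with mu the N(0,1) law, computed
# as a Riemann sum of f against the normal density on a truncated grid.
y = np.linspace(-8.0, 8.0, 20_001)
dy = y[1] - y[0]
density = np.exp(-y**2 / 2) / np.sqrt(2 * np.pi)
rhs = np.sum(f(y) * density) * dy

print(lhs, rhs)   # both should be close to E(X^2) = 1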
