Moments of Continuous Random Variables

Definition: The expected value of a continuous random variable is $$\mathbb{E}(X) = \int_{-\infty}^\infty tf(t)dt. $$ The expected value can be thought of as the mean of the distribution, or as its center of mass: if you were to cut a perfect graph of the pdf $f(t)$ out of wood, it would balance exactly at the expected value.

Definition: The variance of a continuous random variable is $$\mathbb{V}(X) = \mathbb{E}(X^2)-\mathbb{E}(X)^2,$$ where $\mathbb{E}(X^n) = \int_{-\infty}^\infty t^n f(t) dt$ is the $n$th moment for any $n\in \mathbb{N},$ just as for discrete random variables. The variance is a positive number that describes how spread out the distribution is. If the variance is high, then values far from the mean are more probable.
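These definitions translate directly into numerical integrals. The sketch below (assuming SciPy is available; the helper names `moment` and `variance` are ours, not library functions) computes moments of an arbitrary pdf by quadrature:

```python
# A minimal sketch of the definitions above, using scipy.integrate.quad.
# The helpers `moment` and `variance` are illustrative, not standard names.
from scipy.integrate import quad

def moment(pdf, n, lo=float("-inf"), hi=float("inf")):
    """n-th moment E(X^n) = integral of t^n f(t) dt over the support."""
    value, _ = quad(lambda t: t**n * pdf(t), lo, hi)
    return value

def variance(pdf, lo=float("-inf"), hi=float("inf")):
    """V(X) = E(X^2) - E(X)^2."""
    return moment(pdf, 2, lo, hi) - moment(pdf, 1, lo, hi) ** 2

# Example: the uniform pdf f(t) = 1 on [0, 1] has mean 1/2 and variance 1/12.
uniform = lambda t: 1.0
print(moment(uniform, 1, 0, 1), variance(uniform, 0, 1))
```

Passing explicit limits restricts the integral to the pdf's support, which keeps the quadrature accurate for piecewise-defined densities.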

Example: Let $$f(x) = \begin{cases}\frac{3}{4a^3}(a^2-x^2) & x\in[-a,a]\\ 0 & x\notin[-a,a],\end{cases}$$ for some $a>0$. Find $\mathbb{E}(X)$ and $\mathbb{V}(X)$.

Solution: \begin{align} \mathbb{E}(X) &= \int_{-a}^{a}xf(x)dx \\ & = \frac{3}{4a^3}\int_{-a}^{a}x(a^2-x^2)dx\\ & = \frac{3}{4a^3}\int_{-a}^{a}(a^2x-x^3) dx \\ &= 0, \end{align} since this is the integral of an odd function over a symmetric interval. Because $\mathbb{E}(X)=0$, the variance reduces to the second moment: \begin{align} \mathbb{V}(X) &= \mathbb{E}(X^2) - \mathbb{E}(X)^2\\ &= \frac{3}{4a^3}\int_{-a}^a x^2 (a^2-x^2)dx\\ &= \frac{3}{4a^3}\int_{-a}^a (a^2x^2-x^4)dx \\ &= \left.\frac{3}{4a^3}\left(\frac{a^2}{3}x^3 - \frac{1}{5}x^5\right)\right\vert_{-a}^a\\ &= \left. \frac{3}{4a^3}\left(\frac{5a^2x^3-3x^5}{15}\right)\right|_{-a}^a\\ &= \frac{a^2}{5}. \end{align} Therefore as $a$ gets larger, so does the variance. This agrees with our intuition about the variance, since increasing $a$ spreads out the distribution.
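The results $\mathbb{E}(X)=0$ and $\mathbb{V}(X)=a^2/5$ can be sanity-checked numerically; the sketch below uses `scipy.integrate.quad` with the arbitrary choice $a=2$:

```python
# Numerical check of E(X) = 0 and V(X) = a^2/5 for the pdf
# f(x) = 3/(4a^3) * (a^2 - x^2) on [-a, a]; a = 2 is an arbitrary choice.
from scipy.integrate import quad

a = 2.0
f = lambda x: 3.0 / (4.0 * a**3) * (a**2 - x**2)

total, _ = quad(f, -a, a)                       # integral of the pdf: 1
mean, _ = quad(lambda x: x * f(x), -a, a)       # E(X): 0
second, _ = quad(lambda x: x**2 * f(x), -a, a)  # E(X^2)
var = second - mean**2                          # V(X): a^2/5 = 0.8

print(total, mean, var)
```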

Example: Recall that the exponential distribution has pdf $$f(x) = \begin{cases} \lambda e^{-\lambda x } & x \ge 0 \\ 0 & x < 0 \end{cases}$$ for some $\lambda >0.$ As we said last class, $1/\lambda$ is the "mean waiting time". If this interpretation is correct, we should find that $1/\lambda$ is the expected value for this distribution. Integrating by parts, \begin{align} \mathbb{E}(X) & = \int_0^\infty xf(x)dx\\ &=\int_0^\infty \lambda x e^{-\lambda x}dx\\ &= -xe^{-\lambda x }\bigg\vert_0^\infty +\int_0^\infty e^{-\lambda x}dx\\ &= \left(-xe^{-\lambda x}-\frac{1}{\lambda}e^{-\lambda x}\right) \bigg\vert_0^\infty\\ &= \lim_{x\to\infty}\left( -x e^{-\lambda x} -\frac{1}{\lambda}e^{-\lambda x} \right)+ 0 +\frac{1}{\lambda}\\ &= \frac{1}{\lambda}, \end{align} where we used L'Hôpital's rule to show that $\lim_{x\to\infty} xe^{-\lambda x} = 0$ in the second-to-last line.
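As before, the integration by parts can be double-checked numerically; `quad` handles the infinite upper limit directly, and $\lambda = 1.5$ below is an arbitrary choice:

```python
# Check E(X) = 1/lambda for f(x) = lambda * e^{-lambda x} on [0, inf);
# lam = 1.5 is an arbitrary choice.
import numpy as np
from scipy.integrate import quad

lam = 1.5
mean, _ = quad(lambda x: lam * x * np.exp(-lam * x), 0, np.inf)
print(mean, 1 / lam)  # the two values agree
```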

Since $\lambda$ controls the rate of decay of the exponential distribution, it seems sensible that if we decrease $\lambda$, the pdf will decay slower, and therefore the variance should increase. Calculating the variance gives \begin{align} \mathbb{V}(X) &= \mathbb{E}(X^2) -\mathbb{E}(X)^2\\ & = \int_0^\infty x^2 \lambda e^{-\lambda x}dx - \frac{1}{\lambda^2}\\ & = -x^2e^{-\lambda x}\bigg \vert_0^\infty + \int_0^\infty 2x e^{-\lambda x}dx - \frac{1}{\lambda^2} \\ & = 0 + \frac{2}{\lambda}\int_0^\infty \lambda x e^{-\lambda x}dx - \frac{1}{\lambda ^2}\\ & = \frac{2}{\lambda}\mathbb{E}(X) - \frac{1}{\lambda^2} \\ & = \frac{1}{\lambda^2}. \end{align}
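The claim $\mathbb{V}(X)=1/\lambda^2$ can be verified two independent ways, by quadrature and by simulating exponential waiting times (the value $\lambda = 1.5$ and the seed below are arbitrary):

```python
# Check V(X) = 1/lambda^2 by quadrature and by simulation;
# lam = 1.5 and the RNG seed are arbitrary choices.
import numpy as np
from scipy.integrate import quad

lam = 1.5
mean, _ = quad(lambda x: lam * x * np.exp(-lam * x), 0, np.inf)
second, _ = quad(lambda x: lam * x**2 * np.exp(-lam * x), 0, np.inf)
var = second - mean**2  # E(X^2) - E(X)^2, should equal 1/lam**2

# Simulated waiting times; NumPy parametrizes by the scale 1/lambda.
samples = np.random.default_rng(0).exponential(scale=1 / lam, size=200_000)
print(var, samples.var(), 1 / lam**2)
```

Decreasing `lam` and rerunning shows the sample variance growing like $1/\lambda^2$, matching the intuition about slower decay above.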

Exercises:

  1. For each of the following distributions, find the expected value and variance.
    1. $$f(x) = \begin{cases} \frac{1}{a} & x \in[0,a]\\ 0 & x \notin [0,a].\end{cases}$$
    2. $$f(x) = \begin{cases} \tfrac{1}{2\pi}(1+\cos( x)) & x \in [-\pi,\pi]\\ 0 & x\notin [-\pi,\pi]\end{cases} $$
    3. $$f(x) = \frac{\lambda}{2}e^{-\lambda|x|}$$