The Bernoulli Distribution

Definition: A Bernoulli random variable is a discrete random variable that takes only the values 0 and 1. Typically, we think of the value 0 as representing failure and the value 1 as representing success. The distribution of a Bernoulli random variable is described by a single parameter, the probability of success $p$; the probability of failure is then $1-p$. The pmf is given by \begin{equation} f(x;p) = \begin{cases} 1-p & x = 0\\ p & x = 1\\ 0 & \text{otherwise.} \end{cases} \end{equation} The expected value is $\mathbb{E}(X) = f(0)\cdot 0 + f(1)\cdot 1 = p$, and the variance is $\mathbb{V}(X) = \left(0^2(1-p)+1^2p\right)-p^2 = p - p^2 = p(1-p)$.
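As a quick sanity check of these formulas, here is a small Python sketch (the helper names `bernoulli_pmf`, `mean`, and `variance` are mine, not from any library; exact arithmetic via `Fraction` avoids floating-point noise):

```python
from fractions import Fraction

def bernoulli_pmf(p):
    """pmf of a Bernoulli(p) random variable, as a dict {value: probability}."""
    return {0: 1 - p, 1: p}

def mean(pmf):
    """E(X) = sum over x of x * f(x)."""
    return sum(x * fx for x, fx in pmf.items())

def variance(pmf):
    """V(X) = E(X^2) - E(X)^2."""
    return sum(x**2 * fx for x, fx in pmf.items()) - mean(pmf)**2

p = Fraction(1, 3)            # an arbitrary probability of success
f = bernoulli_pmf(p)
assert mean(f) == p           # E(X) = p
assert variance(f) == p * (1 - p)   # V(X) = p(1 - p)
```

Any other value of $p$ in $[0,1]$ passes the same checks, since the identities hold symbolically.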

We are going to use these Bernoulli random variables as building blocks to create other useful distributions. In order to do so, we need to talk about how to add random variables together. While this sounds simple in principle, adding random variables does something nontrivial to the distribution of the outcomes: the pmf of a sum is not simply the sum of the pmfs.

Definition: Let $X_1$ and $X_2$ be independent random variables taking values in $\{x_1,x_2,\dots\}$, with pmfs $f_1$ and $f_2$, respectively. The pmf of $Y = X_1+X_2$ is given by the discrete convolution of $f_1$ and $f_2$: $$ h(y) = (f_1*f_2)(y) = \sum_{j=1}^{\infty}f_1(y-x_j)f_2(x_j).$$
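In code, the convolution is a double loop over the supports of the two pmfs (a sketch assuming pmfs are stored as dictionaries mapping values to probabilities; since only values with nonzero probability are stored, the infinite sum becomes finite):

```python
from collections import defaultdict

def convolve(f1, f2):
    """Discrete convolution of two pmfs given as {value: probability} dicts:
    h(y) = sum_j f1(y - x_j) f2(x_j), i.e. P(X1 + X2 = y) for independent X1, X2."""
    h = defaultdict(int)
    for x1, p1 in f1.items():
        for x2, p2 in f2.items():
            h[x1 + x2] += p1 * p2   # every pair (x1, x2) with x1 + x2 = y contributes
    return dict(h)
```

Grouping by the sum $x_1+x_2$ is exactly the change of index in the definition: for each target value $y$, we accumulate $f_1(y-x_j)f_2(x_j)$ over all $x_j$ in the support of $f_2$.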

Example: Find the pmf for the number of heads given by tossing a coin twice.
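One tentative way to carry out this computation is to model each toss as a Bernoulli($1/2$) pmf and convolve the two (the dict representation of a pmf is my own convention, not part of the text):

```python
from fractions import Fraction
from collections import defaultdict

def convolve(f1, f2):
    """pmf of X1 + X2 for independent X1, X2 (pmfs as dicts value -> probability)."""
    h = defaultdict(Fraction)
    for x1, p1 in f1.items():
        for x2, p2 in f2.items():
            h[x1 + x2] += p1 * p2
    return dict(h)

coin = {0: Fraction(1, 2), 1: Fraction(1, 2)}   # one fair toss: 0 = tails, 1 = heads
two_tosses = convolve(coin, coin)               # number of heads in two tosses
assert two_tosses == {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
```

The middle value gets probability $1/2$ because two outcomes, HT and TH, both give exactly one head.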

Example: Let $X_1$, $X_2$, and $X_3$ be independent Bernoulli random variables, each with probability of success $p$. Find the pmf of $Y_2 = X_1+X_2$. Do the same for $Y_3 = X_1+X_2+X_3$.
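The pmfs of $Y_2$ and $Y_3$ can be obtained by repeated convolution. The sketch below uses $p = 1/4$ purely for illustration (the value is my choice, not from the text):

```python
from fractions import Fraction
from collections import defaultdict

def convolve(f1, f2):
    """pmf of X1 + X2 for independent X1, X2 (pmfs as dicts value -> probability)."""
    h = defaultdict(Fraction)
    for x1, p1 in f1.items():
        for x2, p2 in f2.items():
            h[x1 + x2] += p1 * p2
    return dict(h)

p = Fraction(1, 4)                 # illustrative probability of success
bern = {0: 1 - p, 1: p}
y2 = convolve(bern, bern)          # pmf of Y2 = X1 + X2
y3 = convolve(y2, bern)            # pmf of Y3 = X1 + X2 + X3

# The general pattern: P(Y2 = k) = C(2,k) p^k (1-p)^(2-k), and similarly for Y3.
assert y2 == {0: (1 - p)**2, 1: 2 * p * (1 - p), 2: p**2}
assert y3[0] == (1 - p)**3 and y3[3] == p**3
```

These are the binomial pmfs with $n=2$ and $n=3$; the binomial coefficients count how many orderings of successes and failures produce each total.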

We can also use the discrete convolution to derive some properties about the moments of a distribution:

  • If $X$ and $Y$ are random variables, then $\mathbb{E}(X+Y) = \mathbb{E}(X)+\mathbb{E}(Y)$
  • If $X$ and $Y$ are independent random variables, then $\mathbb{V}(X+Y) = \mathbb{V}(X)+\mathbb{V}(Y)$
  • If $X$ is a random variable, and $a$ is a real number, then $\mathbb{E}(aX) = a\mathbb{E}(X)$
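These three properties can be checked numerically on a small example, say a fair die plus a fair coin (a sketch under my dict-based pmf convention; all helper names are illustrative):

```python
from fractions import Fraction
from collections import defaultdict

def convolve(f1, f2):
    """pmf of X + Y for independent X, Y (pmfs as dicts value -> probability)."""
    h = defaultdict(Fraction)
    for x, px in f1.items():
        for y, py in f2.items():
            h[x + y] += px * py
    return dict(h)

def mean(pmf):
    return sum(x * fx for x, fx in pmf.items())

def variance(pmf):
    return sum(x**2 * fx for x, fx in pmf.items()) - mean(pmf)**2

die = {k: Fraction(1, 6) for k in range((1), 7)}      # fair six-sided die
coin = {0: Fraction(1, 2), 1: Fraction(1, 2)}         # fair coin (Bernoulli 1/2)
s = convolve(die, coin)                               # pmf of die + coin

assert mean(s) == mean(die) + mean(coin)              # E(X+Y) = E(X) + E(Y)
assert variance(s) == variance(die) + variance(coin)  # needs independence!
assert mean({3 * x: fx for x, fx in die.items()}) == 3 * mean(die)  # E(3X) = 3 E(X)
```

Note that the variance identity relies on independence, which the convolution construction builds in; for dependent variables a covariance term appears.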

Exercises

  • Let $X$ be the sum of $n$ independent fair six-sided dice. Find $\mathbb{E}(X).$
  • Constants can be thought of as random variables that have only one possible value. I.e. the constant $c$ can be thought of as having the pmf $$f(x) =\begin{cases} 1 &\text{if}~x=c\\ 0 & \text{otherwise}.\end{cases}$$ Use this interpretation to show that if $X$ is a discrete random variable, then $\mathbb{E}(X+c) =\mathbb{E}(X)+c$ and $\mathbb{V}(X+c) = \mathbb{V}(X)$.
  • Let $X_1$ and $X_2$ be independent random variables, each with pmf $$f(x) = \begin{cases} \frac{1}{3} & \text{if}~ x = 1,2,3\\ 0 &\text{otherwise}. \end{cases}$$ Find the pmf of $Y=X_1+X_2$.
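Answers to exercises like the last one can be double-checked by computing the convolution directly (a Python sketch; the dict representation of a pmf is my own convention):

```python
from fractions import Fraction
from collections import defaultdict

def convolve(f1, f2):
    """pmf of X1 + X2 for independent X1, X2 (pmfs as dicts value -> probability)."""
    h = defaultdict(Fraction)
    for x1, p1 in f1.items():
        for x2, p2 in f2.items():
            h[x1 + x2] += p1 * p2
    return dict(h)

f = {1: Fraction(1, 3), 2: Fraction(1, 3), 3: Fraction(1, 3)}  # uniform on {1, 2, 3}
y = convolve(f, f)   # pmf of the sum of two independent copies
for value in sorted(y):
    print(value, y[value])
```

Counting the pairs $(x_1, x_2)$ that produce each sum explains the triangular shape of the result.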