Expected Value and Variance of Normal Distribution

2018-04-30

This article presents the definition and properties of the Expected Value $E(X)$ and the Variance $Var(X)$, and shows why the Normal Distribution $X \sim N(\mu, \sigma^2)$ has Expected Value $= \mu$ and Variance $= \sigma^2$.


1. Expected Value

Also called the mean or the average.

  • Discrete Random Variable:
    $$E(X) = \sum_{i=1}^{n}{x_ip(x_i)}$$
  • Continuous Random Variable:
    $$E(X) = \int_{-\infty}^{\infty}{x f(x)}dx$$
    where $f(x)$ is the probability density function of $X$.
  • properties:
    1. $E(C) = C$
    2. $X$ and $Y$ are random variables on a sample space $\Omega$, then
      $E(X + Y) = E(X) + E(Y)$
    3. If $X$ and $Y$ are independent random variables, then
      $E(XY) = E(X)E(Y)$
    4. For $Y = g(X)$, $E(Y) = E(g(X)) = \int_{-\infty}^{+\infty}{g(x)f(x)}dx$ (see the numerical check after this list)
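To make these properties concrete, here is a minimal numerical sketch. It assumes NumPy is available; the die example and the particular distributions and sample sizes are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discrete definition: a fair six-sided die, E(X) = sum of x_i * p(x_i)
x = np.arange(1, 7)
p = np.full(6, 1 / 6)
print(np.sum(x * p))  # 3.5

# Property 2 (linearity): E(X + Y) = E(X) + E(Y)
X = rng.exponential(2.0, size=1_000_000)   # E(X) = 2
Y = rng.uniform(0.0, 1.0, size=1_000_000)  # E(Y) = 0.5
print(np.mean(X + Y))  # ~2.5

# Property 3 (independent X and Y): E(XY) = E(X)E(Y)
print(np.mean(X * Y), np.mean(X) * np.mean(Y))  # both ~1.0
```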

2. Variance

Let $X$ be a continuous random variable with mean $\mu$. The variance of $X$ is
$$Var(X) = E((X - μ)^2)$$
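Expanding the square and applying the linearity of expectation gives an equivalent form that is often more convenient to compute:
$$Var(X) = E(X^2 - 2\mu X + \mu^2) = E(X^2) - 2\mu E(X) + \mu^2 = E(X^2) - \mu^2$$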

  • properties:
    1. If $X$ and $Y$ are independent random variables, then
      $Var(X + Y) = Var(X) + Var(Y)$
    2. $Var(aX + b) = a^2Var(X)$ (both properties are checked numerically below)
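Both properties can be checked with the same kind of sketch (again assuming NumPy; all parameters are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(0.0, 2.0, size=1_000_000)  # Var(X) = 4
Y = rng.normal(0.0, 3.0, size=1_000_000)  # Var(Y) = 9, independent of X

# Property 1 (independent X, Y): Var(X + Y) = Var(X) + Var(Y)
print(np.var(X + Y))  # ~13

# Property 2: Var(aX + b) = a^2 Var(X); the shift b drops out
print(np.var(5 * X - 7))  # ~100 = 5^2 * 4
```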

3. Normal Distribution

As its name suggests, the Normal Distribution, also known as the Gaussian Distribution, is a very common distribution in daily life. For example, students' grades, as well as many random variables in the natural sciences, approximately follow a normal distribution.

The central limit theorem is actually a family of very important theorems in probability. Their shared core idea is:
Under certain (fairly common) conditions, the sum of many random variables will have an approximately normal distribution.
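To see the theorem at work, here is a small sketch (assuming NumPy; the choice of 48 uniform summands and the trial count are arbitrary): sum many independent $\mathrm{Uniform}(0, 1)$ variables, standardize the sum, and check it against the familiar coverage probabilities of $N(0, 1)$.

```python
import numpy as np

rng = np.random.default_rng(2)

# Sum of n iid Uniform(0, 1) variables: mean n/2, variance n/12
n, trials = 48, 100_000
s = rng.uniform(0.0, 1.0, size=(trials, n)).sum(axis=1)
z = (s - n / 2) / np.sqrt(n / 12)  # standardize the sum

# For N(0, 1), about 68.3% of the mass lies within 1, and 95.4% within 2
print(np.mean(np.abs(z) < 1))  # ~0.683
print(np.mean(np.abs(z) < 2))  # ~0.954
```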

Definition: $X \sim N(\mu, \sigma^2)$, whose pdf is
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{(x - \mu)^2}{2\sigma^2}\right)$$
where $\mu$ is its mean, $\sigma$ is its standard deviation.
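The formula can be checked directly against a standard implementation. Below is a minimal sketch assuming NumPy and SciPy are available; the values of $\mu$, $\sigma$, and the grid of $x$ points are arbitrary illustrative choices.

```python
import numpy as np
from scipy.stats import norm

mu, sigma = 1.5, 0.8
x = np.linspace(-2.0, 5.0, 7)

# The pdf written out exactly as in the definition above
f = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# SciPy's norm takes loc = mean and scale = standard deviation
print(np.allclose(f, norm.pdf(x, loc=mu, scale=sigma)))  # True
```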

Proof 1. $E(X)=μ$
First let’s standardize $X$: let $Z = \frac{X - \mu}{\sigma}$, so that $Z \sim N(0, 1)$.
$∴ X = σZ + μ$

$∴ E(X) = E(σZ + μ) = σE(Z) + μ $

$∵ E(Z) = ∫_{-∞}^{∞}z\frac{1}{\sqrt{2\pi}}e^{-\frac{z^2}{2}}dz$

$ = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty}ze^{-\frac{z^2}{2}}dz$

$= \frac{1}{\sqrt{2\pi}} \left(-e^{-\frac{z^2}{2}}\right) \bigg|_{-\infty}^{+\infty}$

$∵ e^{-\frac{z^2}{2}}$ is an even function that tends to $0$ as $z → ±∞$

$∴ \left(-e^{-\frac{z^2}{2}}\right) \bigg|_{-\infty}^{+\infty} = 0 $

$∴ E(Z) = 0$

$∴ E(X) = σ ⋅ 0 + μ = μ$
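We can confirm this numerically by integrating $x f(x)$ over the real line. A minimal sketch assuming SciPy is available; the values $\mu = 1.5$ and $\sigma = 0.8$ are arbitrary.

```python
import numpy as np
from scipy.integrate import quad

mu, sigma = 1.5, 0.8

def pdf(x):
    """Normal pdf with mean mu and standard deviation sigma."""
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# E(X) = integral of x * f(x) over the whole real line
mean, _ = quad(lambda x: x * pdf(x), -np.inf, np.inf)
print(mean)  # ~1.5, i.e. mu
```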

Proof 2. $Var(X)=σ^2$
Similarly, we standardize $X$: again let $Z = \frac{X - \mu}{\sigma}$.
$∴ X = σZ + μ$

$∴ Var(X) = Var(σZ + μ) = σ^2Var(Z) = σ^2E((Z-0)^2) = σ^2E(Z^2)$

$∵ E(Z^2) = ∫_{-∞}^{+∞}z^2f(z)dz$

$ = ∫_{-∞}^{+∞}z^2 \frac{1}{√{2\pi}}e^{-\frac{z^2}{2}}dz$

here we use integration by parts to calculate the integral:

$\int uv'dx = uv - \int u'vdx$

Let $u = z, v' = ze^{-\frac{z^2}{2}}$ (keeping the constant $\frac{1}{\sqrt{2\pi}}$ outside the integral),

then $u' = 1, v = -e^{-\frac{z^2}{2}}$

$= \frac{1}{\sqrt{2\pi}} \left[z\left(-e^{-\frac{z^2}{2}}\right) \bigg|_{-\infty}^{+\infty} + \int_{-\infty}^{+\infty} 1 \cdot e^{-\frac{z^2}{2}}dz\right]$

$= -\frac{1}{\sqrt{2\pi}}z⋅e^{-\frac{z^2}{2}}\bigg|_{-\infty}^{+\infty} + \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{+\infty} e^{-\frac{z^2}{2}}dz$

$= 0 + 1$

For the first term, $e^{-\frac{z^2}{2}}$ decays to $0$ far faster than $z$ grows, so the term vanishes at both limits; for the second term, $\frac{1}{\sqrt{2\pi}}e^{-\frac{z^2}{2}}$ is the pdf of $Z$, so it integrates to $1$. (The total probability is 1.)

$∴ E(Z^2) = 1$

$∴ Var(X) = σ^2E(Z^2) = σ^2$
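The same numerical check works for the variance, under the same assumptions as the sketch after Proof 1 (SciPy available, arbitrary $\mu$ and $\sigma$):

```python
import numpy as np
from scipy.integrate import quad

mu, sigma = 1.5, 0.8

def pdf(x):
    """Normal pdf with mean mu and standard deviation sigma."""
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# Var(X) = integral of (x - mu)^2 * f(x) over the whole real line
var, _ = quad(lambda x: (x - mu) ** 2 * pdf(x), -np.inf, np.inf)
print(var)  # ~0.64, i.e. sigma^2
```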