
Expected Value and Variance

Expected Value

The expectation or mean of the random variable $X$, denoted $\mathbb{E}[X]$, is defined by equation [1].

$\mathbb{E}[X] = \int_{-\infty}^{\infty}xp_{X}(x)dx$ [1]

The $k^{th}$ moment of the random variable $X$, denoted $\mathbb{E}[X^k]$, is defined by equation [2].

$\mathbb{E}[X^{k}] = \int_{-\infty}^{\infty}x^{k}p_{X}(x)dx$ [2]

The first moment is the expectation, and the second moment is the mean square or average power of the random variable.

The expectation of a function of the random variable $X$, denoted $\mathbb{E}[g(X)]$, is defined by equation [3].

$\mathbb{E}[g(X)] = \int_{-\infty}^{\infty}g(x)p_{X}(x)dx$ [3]
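As a quick numerical illustration of equations [1]–[3], the sketch below approximates the first two moments and $\mathbb{E}[g(X)]$ for a hypothetical exponential random variable using SciPy quadrature. The density, the rate `lam`, and the choice $g(x) = \cos(x)$ are assumptions made only for this example, chosen because the exact values ($1/\lambda$, $2/\lambda^{2}$, and $\lambda^{2}/(\lambda^{2}+1)$) are known in closed form.

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0
p_X = lambda x: lam * np.exp(-lam * x)   # assumed density of X ~ Exp(lam), x >= 0

mean, _ = quad(lambda x: x * p_X(x), 0, np.inf)             # equation [1], E[X]
second, _ = quad(lambda x: x**2 * p_X(x), 0, np.inf)        # equation [2], E[X^2]
g_mean, _ = quad(lambda x: np.cos(x) * p_X(x), 0, np.inf)   # equation [3], E[cos(X)]

print(mean, 1 / lam)                    # ~0.5
print(second, 2 / lam**2)               # ~0.5
print(g_mean, lam**2 / (lam**2 + 1))    # ~0.8
```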

If the random variable $X$ has a joint distribution with $Y$, the expectation of a function of the random variable $X$, denoted $\mathbb{E}[g(X)]$, can also be computed from the joint density, since integrating $p_{XY}(x, y)$ over $y$ yields the marginal density $p_{X}(x)$. This gives equation [4].

$\mathbb{E}[g(X)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty}g(x)p_{XY}(x,y)dxdy$

$= \int_{-\infty}^{\infty}g(x)p_{X}(x)dx$ [4]

The expectation of a function of the random variables $X$ and $Y$, denoted $\mathbb{E}[g(X, Y)]$, is defined by equation [5].

$\mathbb{E}[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty}g(x, y)p_{XY}(x,y)dxdy$ [5]
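A minimal sketch of equation [5], assuming (purely for illustration) that $X$ and $Y$ are independent exponentials so the joint density factors as $p_{XY}(x, y) = p_{X}(x)p_{Y}(y)$, with $g(x, y) = xy$:

```python
import numpy as np
from scipy.integrate import dblquad

lam, mu = 2.0, 3.0
# Assumed joint density: X and Y independent exponentials, p_XY(x, y) = p_X(x) p_Y(y).
p_XY = lambda x, y: lam * np.exp(-lam * x) * mu * np.exp(-mu * y)

# dblquad integrates func(y, x) over x in [0, inf) and y in [0, inf).
exy, _ = dblquad(lambda y, x: x * y * p_XY(x, y), 0, np.inf, 0, np.inf)
print(exy, 1 / (lam * mu))   # both ~0.1667, i.e. E[XY] = E[X]E[Y] here
```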





Variance

The variance of a random variable is defined by equation [6].

$Var(X) = \mathbb{E}[(X-\mathbb{E}[X])^{2}]$

$= \int_{-\infty}^{\infty}(x-\mathbb{E}[X])^{2}p_{X}(x)dx$

$= \mathbb{E}[X^{2}] - (\mathbb{E}[X])^{2}$ [6]

The standard deviation of $X$ is defined as $\sigma_{X} = \sqrt{Var(X)}$. The $k^{th}$ central moment, denoted $\mathbb{E}[(X - \mu_{X})^{k}]$, is defined by equation [7].

$\mathbb{E}[(X - \mathbb{E}[X])^{k}] = \int_{-\infty}^{\infty}(x-\mathbb{E}[X])^{k}p_{X}(x)dx$ [7]

where $\mathbb{E}[X] = \mu_{X}$. The first central moment is 0, and the second central moment is the variance of the random variable.
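The sketch below checks equations [6] and [7] by Monte Carlo for a hypothetical $X \sim Exp(\lambda)$ with $\lambda = 2$, for which $Var(X) = 1/\lambda^{2}$ and the third central moment is $2/\lambda^{3}$; both the distribution and the sample size are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0
x = rng.exponential(scale=1 / lam, size=1_000_000)   # assumed samples of X ~ Exp(lam)

mean = x.mean()
var_shortcut = np.mean(x**2) - mean**2      # E[X^2] - (E[X])^2
var_central = np.mean((x - mean)**2)        # E[(X - E[X])^2], equation [6]
third_central = np.mean((x - mean)**3)      # 3rd central moment, equation [7] with k = 3

print(var_shortcut, var_central, 1 / lam**2)   # all ~0.25
print(third_central, 2 / lam**3)               # ~0.25 for the exponential
```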

The covariance of two random variables $X$ and $Y$ is defined by equation [8].

$Cov(X, Y) = \mathbb{E}[(X-\mathbb{E}[X])(Y-\mathbb{E}[Y])]$

$= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}(x-\mathbb{E}[X])(y-\mathbb{E}[Y])p_{XY}(x, y)dxdy$ [8]

As you can see in equation [8], if $X = Y$, $Cov(X, Y) = Var(X)$.

If the covariance of random variables $X$ and $Y$ is 0, $X$ and $Y$ are said to be uncorrelated. The correlation of $X$ and $Y$ is defined by equation [9].

$Cor(X, Y) = \mathbb{E}[XY]$

$= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}xyp_{XY}(x, y)dxdy$ [9]

If $X$ and $Y$ are independent, the correlation factors into the product of the means (so independent random variables are always uncorrelated):

$Cor(X, Y) = \mathbb{E}[XY] = \mathbb{E}[X] \mathbb{E}[Y]$

If $\mathbb{E}[XY] = 0$, $X$ and $Y$ are said to be orthogonal.
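A Monte Carlo sketch of equations [8] and [9], using a hypothetical construction $Y = X + N$ with $X$ and $N$ independent standard normals, so that $Cov(X, Y) = Var(X) = 1$ and, because both variables have zero mean, $Cor(X, Y) = Cov(X, Y)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)
y = x + rng.standard_normal(n)   # Y built from X, so Cov(X, Y) = Var(X) = 1

cov = np.mean((x - x.mean()) * (y - y.mean()))   # equation [8]
cor = np.mean(x * y)                             # equation [9]
print(cov, cor)                                  # both ~1.0

# An independent Z is both uncorrelated with X (Cov ~ 0) and orthogonal to X
# (E[XZ] ~ E[X]E[Z] = 0), since X and Z have zero mean.
z = rng.standard_normal(n)
print(np.mean((x - x.mean()) * (z - z.mean())), np.mean(x * z))   # both ~0.0
```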





Conditional Expectation and Variance



Conditional Expectation

The conditional expectation of $X$ given that the random variable $Y$ takes the value $y$ is defined by equation [10].

$\mathbb{E}[X \mid Y = y] = \int_{-\infty}^{\infty}xp_{X \mid Y}(x \mid y) dx$ [10]

The conditional expectation of $X$ given the random variable $Y$ is defined by equation [11].

$\mathbb{E}[X \mid Y] = \int_{-\infty}^{\infty}xp_{X \mid Y}(x \mid Y) dx$ [11]

Note that $\mathbb{E}[X \mid Y = y]$ is a real number as a function of the real number $y$, but $\mathbb{E}[X \mid Y]$ is a random variable as a function of the random variable $Y$.
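The distinction can be seen numerically. The sketch below assumes a hypothetical model $Y \sim N(0, 1)$ and $X \mid Y = y \sim N(y, 1)$, so that $\mathbb{E}[X \mid Y = y] = y$ is a number while $\mathbb{E}[X \mid Y] = Y$ is a random variable:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000
y = rng.standard_normal(n)           # Y ~ N(0, 1)
x = y + rng.standard_normal(n)       # X | Y = y ~ N(y, 1), so E[X | Y = y] = y

# E[X | Y = y] for one fixed y: a plain number, estimated from samples with Y near y.
y0 = 0.5
near = np.abs(y - y0) < 0.01
print(x[near].mean(), y0)            # ~0.5

# E[X | Y] is the random variable Y itself in this model; the tower property
# E[E[X | Y]] = E[X] shows up as the two sample means agreeing.
print(y.mean(), x.mean())            # both ~0.0
```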



Similarly, the conditional expectation of a function $g(X)$ given $Y = y$ and the conditional expectation of $g(X)$ given the random variable $Y$ are defined by equations [12] and [13], respectively.

$\mathbb{E}[g(X) \mid Y=y] = \int_{-\infty}^{\infty}g(x)p_{X \mid Y}(x \mid y) dx$ [12]

$\mathbb{E}[g(X) \mid Y] = \int_{-\infty}^{\infty}g(x)p_{X \mid Y}(x \mid Y) dx$ [13]

Since $Var(X \mid Y)$ (defined below in equation [16]) is also a random variable, its expectation can be calculated by equation [14], where the second equality uses the tower property $\mathbb{E}[\mathbb{E}[X^{2} \mid Y]] = \mathbb{E}[X^{2}]$.

$\mathbb{E}[Var(X \mid Y)] = \mathbb{E}[\mathbb{E}[X^{2} \mid Y] - (\mathbb{E}[X \mid Y])^{2}]$

$= \mathbb{E}[X^{2}] - \mathbb{E}[(\mathbb{E}[X \mid Y])^{2}]$ [14]



Conditional Variance

The conditional variance of $X$ given that the random variable $Y$ takes the value $y$ and the conditional variance of $X$ given the random variable $Y$ are defined by equations [15] and [16], respectively.

$Var(X \mid Y = y) = \mathbb{E}[(X-\mathbb{E}[X \mid Y=y])^{2} \mid Y=y]$

$= \mathbb{E}[X^{2} \mid Y=y] - (\mathbb{E}[X \mid Y = y])^{2}$ [15]

$Var(X \mid Y) = \mathbb{E}[(X - \mathbb{E}[X \mid Y])^{2} \mid Y]$

$= \mathbb{E}[X^{2} \mid Y] - (\mathbb{E}[X \mid Y])^{2}$ [16]

As with conditional expectation, note that $Var(X \mid Y = y)$ is a real number as a function of the real number $y$, but $Var(X \mid Y)$ is a random variable as a function of the random variable $Y$.
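A short Monte Carlo sketch of this distinction, under the hypothetical model $Y \sim N(0, 1)$ and $X = Y + YZ$ with $Z$ an independent standard normal, so that $Var(X \mid Y = y) = y^{2}$ and $Var(X \mid Y) = Y^{2}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000
y = rng.standard_normal(n)
x = y + y * rng.standard_normal(n)   # X = Y + Y*Z, so Var(X | Y = y) = y^2

# Var(X | Y = y) for one fixed y: a plain number, estimated from samples with Y near y.
y0 = 0.5
near = np.abs(y - y0) < 0.01
print(x[near].var(), y0**2)          # ~0.25

# Var(X | Y) is the random variable Y^2 in this model, so E[Var(X | Y)] = E[Y^2] = 1.
print(np.mean(y**2))                 # ~1.0
```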

Additionally, since $\mathbb{E}[X \mid Y]$ is also a random variable, its variance can be calculated by equation [17].

$Var(\mathbb{E}[X \mid Y]) = \mathbb{E}[(\mathbb{E}[X \mid Y] - \mathbb{E}[\mathbb{E}[X \mid Y]])^{2}]$

$= \mathbb{E}[(\mathbb{E}[X \mid Y])^{2}] - (\mathbb{E}[X])^{2}$ [17]

Adding equations [14] and [17], the $\mathbb{E}[(\mathbb{E}[X \mid Y])^{2}]$ terms cancel and what remains is $\mathbb{E}[X^{2}] - (\mathbb{E}[X])^{2} = Var(X)$, which gives the law of total variance:

$Var(X) = \mathbb{E}[Var(X \mid Y)] + Var(\mathbb{E}[X \mid Y])$
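Under the same hypothetical model as in the previous sketch ($Y \sim N(0, 1)$, $X = Y + YZ$), all three pieces can be checked by Monte Carlo: equation [14] gives $\mathbb{E}[Var(X \mid Y)] = 1$, equation [17] gives $Var(\mathbb{E}[X \mid Y]) = 1$, and their sum matches $Var(X) = 2$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000
y = rng.standard_normal(n)
x = y + y * rng.standard_normal(n)   # E[X | Y] = Y and Var(X | Y) = Y^2 in this model

e_var = np.mean(y**2)                          # E[Var(X | Y)]
var_e = np.var(y)                              # Var(E[X | Y])
print(np.mean(x**2) - np.mean(y**2), e_var)    # equation [14]: both ~1.0
print(np.mean(y**2) - np.mean(x)**2, var_e)    # equation [17]: both ~1.0
print(e_var + var_e, np.var(x))                # law of total variance: both ~2.0
```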