Random Vector
A random vector is a vector consisting of random variables. When the elements of the vector $X$ are $X_{1}, X_{2}, \ldots, X_{n}$, the probability distribution function is defined by equation [1].
$F_{X_{1}, \ldots ,X_{n}}(x_{1}, \ldots , x_{n}) = P (X_{1} \leq x_{1}, \ldots , X_{n} \leq x_{n})$ [1]
In vector notation, this joint probability distribution function is written as in equation [2].
$F_{X}(x) = F_{X_{1}, \ldots ,X_{n}}(x_{1}, \ldots , x_{n})$ [2]
where $x = [x_{1}, x_{2}, \ldots , x_{n}]^{T}$.
The probability density function of the random vector $X$, denoted $p_{X}(x)$, is the joint probability density function of its elements and is related to the distribution function by equation [3].
$F_{X}(x) = \int_{-\infty}^{x}p_{X}(u)du$ [3]
where the integral is an $n$-fold integral over $u = [u_{1}, \ldots , u_{n}]^{T}$, and $p_{X}(x) = p_{X_{1}, \ldots , X_{n}}(x_{1}, \ldots , x_{n})$ is a multivariate function.
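As a numerical sketch of these definitions, the joint distribution function can be estimated by the fraction of samples falling below a point $x$ componentwise. The example below (an illustration, not from the text) uses two independent standard normal components, so the true joint CDF factors into a product of one-dimensional normal CDFs.

```python
import numpy as np
from math import erf, sqrt

# Monte Carlo sketch: estimate F_X(x) = P(X_1 <= x_1, X_2 <= x_2)
# by the fraction of samples that satisfy both inequalities.
rng = np.random.default_rng(0)
n_samples = 200_000

# Two independent standard normal components, so the true joint CDF
# factors: F(x1, x2) = Phi(x1) * Phi(x2).
samples = rng.standard_normal((n_samples, 2))

x = np.array([0.5, -0.3])
empirical_cdf = np.mean(np.all(samples <= x, axis=1))

# Closed form for comparison (standard normal CDF via erf).
def phi(t):
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

true_cdf = phi(x[0]) * phi(x[1])
print(empirical_cdf, true_cdf)
```

With enough samples the two numbers agree to a few decimal places, which is the defining property $F_{X}(x) = P(X_{1} \leq x_{1}, X_{2} \leq x_{2})$ in action.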
The conditional probability density function of the random vector $X$ given the random vector $Y$ with $Y = y$ is defined through equation [4].
$P(X \leq x \mid Y = y)= \int_{-\infty}^{x}p_{X \mid Y}(u \mid y)du$ [4]
where $p_{X \mid Y}(x \mid y)$ is a multivariate function.
Expectation and Covariance
Expectation
The expectation or mean of the random vector $X = [X_{1}, X_{2}, \ldots, X_{n}]^{T}$ is defined as the expectation of each element of the random vector, as shown in equation [5].
$\mathbb{E}[X] = \int_{-\infty}^{\infty}xp_{X}(x)dx$ [5]
where $x = [x_{1}, x_{2}, \ldots , x_{n}]^{T}$.
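The elementwise nature of the mean vector is easy to verify by simulation. In this sketch (the mean vector is an illustrative choice) the sample mean along each coordinate converges to the corresponding element of $\mathbb{E}[X]$.

```python
import numpy as np

# Sketch: the expectation of a random vector is taken elementwise.
# Draw samples with a known mean vector and check that the sample
# mean converges to it.
rng = np.random.default_rng(1)
true_mean = np.array([1.0, -2.0, 0.5])

# 100k draws of a 3-D Gaussian vector centered at true_mean.
samples = true_mean + rng.standard_normal((100_000, 3))
sample_mean = samples.mean(axis=0)  # elementwise estimate of E[X_i]

print(sample_mean)
```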
The expectation of a function $g(X)$ of the random vector $X$ is defined as:
$\mathbb{E}[g(X)] = \int_{-\infty}^{\infty}g(x)p_{X}(x)dx$
Covariance
The covariance matrix $Cov(X)$ of the random vector $X=[X_{1}, X_{2}, \ldots, X_{n}]^{T}$ is defined as a symmetric matrix as shown in equation [6].
$Cov(X) = \mathbb{E}[(X - \mathbb{E}[X])(X - \mathbb{E}[X])^{T}] = \begin{bmatrix} \sigma_{11}& \sigma_{12}& \cdots & \sigma_{1n}\\ \sigma_{21}& \sigma_{22}& \cdots & \sigma_{2n}\\ \vdots & \vdots & \ddots & \vdots \\ \sigma_{n1}& \sigma_{n2}& \cdots & \sigma_{nn} \end{bmatrix}$ [6]
Where $\sigma_{ij} = \sigma_{ji} = \mathbb{E}[(X_{i} - \mathbb{E}[X_{i}])(X_{j}-\mathbb{E}[X_{j}])]$.
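The definition of $\sigma_{ij}$ translates directly into code. The sketch below (with an illustrative covariance matrix) builds the matrix of equation [6] from centered samples and checks that it is symmetric and matches NumPy's built-in estimator.

```python
import numpy as np

# Sketch: build the covariance matrix from its definition
# sigma_ij = E[(X_i - E[X_i])(X_j - E[X_j])] and compare with np.cov.
rng = np.random.default_rng(2)
n = 50_000

# Correlated 3-D Gaussian with a known (symmetric, positive definite)
# covariance, sampled via a Cholesky factor.
true_cov = np.array([[2.0, 0.6, 0.0],
                     [0.6, 1.0, 0.3],
                     [0.0, 0.3, 1.5]])
L = np.linalg.cholesky(true_cov)
X = rng.standard_normal((n, 3)) @ L.T

centered = X - X.mean(axis=0)
cov_by_definition = centered.T @ centered / (n - 1)
cov_numpy = np.cov(X, rowvar=False)

# Symmetry: sigma_ij == sigma_ji.
print(np.allclose(cov_by_definition, cov_by_definition.T))
```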
The correlation matrix of the random vectors $X$ and $Y$ is defined as:
$\mathbb{E}[XY^{T}]$
The cross-covariance matrix of the random vectors $X$ and $Y$ is defined as follows:
$Cov(X, Y) = \mathbb{E}[(X - \mathbb{E}[X])(Y - \mathbb{E}[Y])^{T}]$
If the cross-covariance matrix of $X$ and $Y$ is the zero matrix, $X$ and $Y$ are uncorrelated.
If $\mathbb{E}[X^{T}Y]=0$, $X$ and $Y$ are orthogonal.
If $p_{XY}(x, y) = p_{X}(x)p_{Y}(y)$, $X$ and $Y$ are independent.
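Independence implies uncorrelatedness, and this is easy to observe numerically: for two independently drawn vectors, every entry of the estimated cross-covariance matrix is close to zero. The setup below is an illustrative sketch.

```python
import numpy as np

# Sketch: for independent X and Y, the cross-covariance matrix
# E[(X - E[X])(Y - E[Y])^T] should be approximately zero, i.e. the
# vectors are uncorrelated (independence is the stronger property).
rng = np.random.default_rng(3)
n = 200_000

X = rng.standard_normal((n, 2))
Y = rng.standard_normal((n, 2))   # drawn independently of X

Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
cross_cov = Xc.T @ Yc / (n - 1)   # 2x2 cross-covariance estimate

print(np.abs(cross_cov).max())
```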
The conditional expectation of $X$ given the random vector $Y$ with $Y = y$ is defined by equation [7].
$\mathbb{E}[X \mid Y = y] = \int_{-\infty}^{\infty}xp_{X \mid Y}(x \mid y) dx$ [7]
The conditional expectation of $X$ given the random vector $Y$ is defined by equation [8].
$\mathbb{E}[X \mid Y] = \int_{-\infty}^{\infty}xp_{X \mid Y}(x \mid Y) dx$ [8]
Note that $\mathbb{E}[X \mid Y = y]$ is a deterministic quantity, a function of the value $y$, whereas $\mathbb{E}[X \mid Y]$ is a random variable, a function of the random variable $Y$.
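The distinction between $\mathbb{E}[X \mid Y = y]$ and $\mathbb{E}[X \mid Y]$ can be illustrated with a simple linear-Gaussian model (the slope and noise level below are illustrative choices, not from the text): with $X = aY + \text{noise}$, we have $\mathbb{E}[X \mid Y = y] = ay$, a number for each fixed $y$, while $\mathbb{E}[X \mid Y] = aY$ is itself random.

```python
import numpy as np

# Sketch: conditional expectation in a linear-Gaussian model.
# X = a*Y + noise, so E[X | Y = y] = a*y.
rng = np.random.default_rng(4)
n = 400_000
a = 0.7

Y = rng.standard_normal(n)
X = a * Y + 0.5 * rng.standard_normal(n)

# Estimate E[X | Y = y] at y = 1.0 by averaging X over samples whose
# Y lands in a narrow window around y.
y = 1.0
window = np.abs(Y - y) < 0.05
cond_mean_at_y = X[window].mean()

print(cond_mean_at_y)   # should be close to a*y = 0.7
```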
The conditional covariance matrix of $X$ given $Y = y$ and the conditional covariance matrix of $X$ given $Y$ are defined by equations [9] and [10], respectively.
$Cov[X \mid Y=y] = \mathbb{E}[(X - \mathbb{E}[X \mid Y=y])(X - \mathbb{E}[X \mid Y=y])^{T} \mid Y=y]$ [9]
$Cov[X \mid Y] = \mathbb{E}[(X - \mathbb{E}[X \mid Y])(X - \mathbb{E} [X \mid Y])^{T} \mid Y]$ [10]
Characteristic Function
The characteristic function of the random vector $X$ is defined by equation [11].
$\Phi_{X}(\omega) = \mathbb{E}[e^{j\omega^{T}X}]$ [11]
where $\omega =[\omega_{1}, \omega_{2}, \ldots, \omega_{n}]^{T}$ is an $n$-dimensional real number vector, and $n$ is the dimension of $X$.
According to the definition of expectation, equation [11] can be written as equation [12].
$\Phi_{X}(\omega) = \int_{-\infty}^{\infty}e^{j\omega^{T}x}p_{X}(x)dx$ [12]
As equation [12] shows, the characteristic function of the random vector $X$ is the multi-dimensional inverse Fourier transform of the probability density function. Thus, the probability density function of $X$ can be recovered by the forward transform:
$p_{X}(x) = \frac{1}{(2\pi)^{n}}\int_{-\infty}^{\infty}\Phi_{X}(\omega)e^{-j\omega^{T}x}d\omega$
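The expectation in equation [11] can be approximated by a sample average. For a standard 2-D Gaussian the characteristic function has the known closed form $\Phi_{X}(\omega) = e^{-\|\omega\|^{2}/2}$, which the sketch below (an illustrative check) compares against the Monte Carlo estimate.

```python
import numpy as np

# Sketch: estimate Phi_X(w) = E[exp(j w^T X)] by a sample average and
# compare with the closed form for a standard 2-D Gaussian.
rng = np.random.default_rng(5)
n = 200_000

X = rng.standard_normal((n, 2))
w = np.array([0.8, -0.4])

phi_empirical = np.mean(np.exp(1j * X @ w))      # sample average of e^{j w^T X}
phi_closed_form = np.exp(-0.5 * np.dot(w, w))    # exp(-||w||^2 / 2)

print(phi_empirical, phi_closed_form)
```

Because the density of a zero-mean Gaussian is symmetric, the imaginary part of the estimate is also close to zero.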