Random variable sets

From COSSAN Wiki

A random variable set is a vector of Random variables \boldsymbol X = \left( X_{1}, ..., X_{n} \right), where X_{1}, ..., X_{n} do not necessarily have the same distribution.

Two random variables

Two events X and Y are independent if:

P(X\cap Y) = P(X) \cdot P(Y)

For continuous random variables, independence is equivalently expressed in terms of the joint density:

f(x,y)=f_{X}(x) \cdot f_{Y}(y)

where f_{X} and f_{Y} are the marginal density functions, defined as:

f_{X}(x) = \int_{-\infty}^{+\infty} f(x,y) dy

f_{Y}(y) = \int_{-\infty}^{+\infty} f(x,y) dx
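As a numerical sketch of the marginalization integrals above, the snippet below builds a joint density that factorizes into two standard normal marginals (an assumed toy example, not from this article), recovers f_X by integrating the joint density over y on a grid, and checks it against the analytical marginal:

```python
import numpy as np

# Toy example: f(x, y) = f_X(x) * f_Y(y) with standard normal marginals,
# so X and Y are independent by construction.
x = np.linspace(-6.0, 6.0, 601)
y = np.linspace(-6.0, 6.0, 601)
dy = y[1] - y[0]
X, Y = np.meshgrid(x, y, indexing="ij")

def std_normal(t):
    # Standard normal probability density function
    return np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)

f_joint = std_normal(X) * std_normal(Y)

# f_X(x) = integral of f(x, y) dy, approximated by a Riemann sum on the grid
f_X_numeric = f_joint.sum(axis=1) * dy

max_err = np.max(np.abs(f_X_numeric - std_normal(x)))
print(max_err < 1e-6)  # marginal recovered from the joint density
```

The truncation of the integration domain to [-6, 6] loses only the negligible Gaussian tail mass, which is why the simple Riemann sum matches the analytical marginal so closely here.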

Let Y be an event with probability greater than zero. The conditional probability of X given that Y has occurred (X and Y need not be independent) is:

P(X| Y) = \frac{P(X\cap Y)}{P(Y)}

By symmetry, P(X\cap Y) = P(Y | X) \cdot P(X), which yields Bayes' theorem:

P(X| Y) = \frac{P(Y | X)\cdot P(X)}{P(Y)}
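The conditional probability formulas above can be checked numerically. The following sketch uses hypothetical numbers (a diagnostic-test scenario invented for illustration, not taken from this article) to compute P(X|Y) via Bayes' theorem:

```python
# Hypothetical numbers for illustration: a test T detects a condition C
# with P(T|C) = 0.99, has false-positive rate P(T|not C) = 0.05, and the
# condition has prevalence P(C) = 0.01.
p_c = 0.01
p_t_given_c = 0.99
p_t_given_not_c = 0.05

# Total probability: P(T) = P(T|C) P(C) + P(T|not C) P(not C)
p_t = p_t_given_c * p_c + p_t_given_not_c * (1 - p_c)

# Bayes' theorem: P(C|T) = P(T|C) P(C) / P(T)
p_c_given_t = p_t_given_c * p_c / p_t
print(round(p_c_given_t, 4))  # → 0.1667
```

Despite the accurate test, the low prevalence makes P(C|T) only about 17%, a standard illustration of why the prior P(C) matters in Bayes' theorem.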

If X and Y are continuous, the conditional density is:

f_{X|Y}(x | y)= \frac{f(x,y)}{f_{Y}(y)}

Several random variables

Let \boldsymbol X= \left( X_{1}, ..., X_{n} \right) be a vector of n random variables. The mean and the variance of the i^{th} component are:

 \mu_{X_{i}} = E(X_{i}) = \int_{-\infty}^{+\infty} ... \int_{-\infty}^{+\infty} x_{i} f_{\boldsymbol X}(\boldsymbol x) dx_{1} ...dx_{n}

\sigma_{X_{i}}^{2} = E\left( (X_{i}- \mu_{X_{i}}) ^{2}\right) = \int_{-\infty}^{+\infty} ... \int_{-\infty}^{+\infty} \left( x_{i} - \mu_{X_{i}} \right)^{2} f_{\boldsymbol X}(\boldsymbol x) dx_{1} ...dx_{n}
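In practice the component means and variances are often estimated from samples rather than by evaluating the n-fold integrals directly. A minimal Monte Carlo sketch, assuming a hypothetical 2-component vector with known means and standard deviations:

```python
import numpy as np

# Assumed toy setup: X = (X1, X2) with independent normal components,
# means (1, -2) and standard deviations (0.5, 2).
rng = np.random.default_rng(0)
N = 200_000
mu_true = np.array([1.0, -2.0])
sigma_true = np.array([0.5, 2.0])

samples = mu_true + sigma_true * rng.standard_normal((N, 2))

mu_hat = samples.mean(axis=0)                       # estimates E(X_i)
var_hat = ((samples - mu_hat) ** 2).mean(axis=0)    # estimates sigma_{X_i}^2

print(np.allclose(mu_hat, mu_true, atol=0.05))
print(np.allclose(var_hat, sigma_true**2, atol=0.2))
```

The sample estimates converge to the exact moments at the usual Monte Carlo rate of O(1/sqrt(N)).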

The covariance of the random variables X_{i} and X_{j} is defined as:

Cov(X_{i},X_{j}) = E\left( (X_{i} - \mu_{X_{i}})(X_{j} - \mu_{X_{j}})\right)

It can be shown that:

Cov(X_{i},X_{j}) = E\left( X_{i} X_{j} \right) - E\left( X_{i} \right) E\left( X_{j} \right)
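The identity above can be verified numerically on sample data. A small sketch with assumed toy data (Y built from X plus noise, so the two are correlated):

```python
import numpy as np

# Assumed toy data: y = x + noise, so Cov(X, Y) is clearly nonzero.
rng = np.random.default_rng(1)
x = rng.standard_normal(100_000)
y = x + 0.5 * rng.standard_normal(100_000)

# Covariance from the definition E((X - mu_X)(Y - mu_Y)) ...
cov_def = np.mean((x - x.mean()) * (y - y.mean()))
# ... and from the identity E(XY) - E(X) E(Y)
cov_alt = np.mean(x * y) - x.mean() * y.mean()

print(np.isclose(cov_def, cov_alt))  # the two expressions agree
```

For sample averages the two expressions are algebraically identical, so they agree up to floating-point rounding.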

The covariance measures the linear relationship between two random variables. If Cov(X_{i},X_{j})>0, X_{i} tends to be above (resp. below) its mean value when X_{j} is above (resp. below) its mean value. If Cov(X_{i},X_{j})<0, X_{i} tends to be below (resp. above) its mean value when X_{j} is above (resp. below) its mean value. Independent random variables are uncorrelated, whereas uncorrelated random variables are not necessarily independent.
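The standard counterexample for "uncorrelated does not imply independent" is X standard normal with Y = X^2: Y is a deterministic function of X, yet Cov(X, Y) = E(X^3) = 0. A quick numerical check of this (assumed) example:

```python
import numpy as np

# X standard normal, Y = X^2: fully dependent, yet uncorrelated,
# since Cov(X, Y) = E(X^3) - E(X) E(X^2) = 0 by symmetry.
rng = np.random.default_rng(2)
x = rng.standard_normal(1_000_000)
y = x**2

cov_xy = np.mean(x * y) - x.mean() * y.mean()
print(abs(cov_xy) < 0.05)  # near zero: uncorrelated

# Dependence shows up in higher moments: E(X^2 Y) = E(X^4) = 3,
# while E(X^2) E(Y) = 1.
print(abs(np.mean(x**2 * y) - np.mean(x**2) * np.mean(y)) > 1)
```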

The covariance matrix \boldsymbol C is defined as:

\boldsymbol C =  \left[ Cov(X_{i},X_{j})\right] _{i,j = 1...n}

The correlation \rho is expressed as:

\rho = Corr(X_{i},X_{j}) = \frac{Cov(X_{i},X_{j})}{\sqrt[]{Cov(X_{i},X_{i})\cdot Cov(X_{j},X_{j})}} = \frac{Cov(X_{i},X_{j})}{\sigma_{X_{i}} \cdot \sigma_{X_{j}} }

The value of the correlation lies in the range [-1, 1]; it is a scale-free measure of the linear relationship between random variables.
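The covariance and correlation matrices defined above can be computed directly with NumPy's `numpy.cov` and `numpy.corrcoef`. A sketch on assumed toy data (x2 constructed so its true correlation with x1 is 0.8):

```python
import numpy as np

# Assumed toy data: x2 = 0.8 x1 + 0.6 e with e independent standard normal,
# so Var(X2) = 1 and Corr(X1, X2) = 0.8 exactly.
rng = np.random.default_rng(3)
x1 = rng.standard_normal(50_000)
x2 = 0.8 * x1 + 0.6 * rng.standard_normal(50_000)

C = np.cov(x1, x2)        # 2x2 covariance matrix [Cov(X_i, X_j)]
R = np.corrcoef(x1, x2)   # 2x2 correlation matrix, entries in [-1, 1]

# Correlation from the formula rho = Cov(X_i, X_j) / (sigma_i sigma_j)
rho_manual = C[0, 1] / np.sqrt(C[0, 0] * C[1, 1])
print(np.isclose(rho_manual, R[0, 1]))
print(np.all(np.abs(R) <= 1))
```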

The probability density function of an n-dimensional Gaussian distribution is determined by the covariance matrix and the mean vector \boldsymbol\mu_{\boldsymbol X } (whose i^{th} component is the mean value of X_{i}):

f(\boldsymbol x ) = \frac{1}{(2\pi)^{\frac{n}{2} }\det(\boldsymbol C )^{\frac{1}{2}}} \exp\left(-\frac{1}{2} (\boldsymbol x  - \boldsymbol\mu_{\boldsymbol X } )^{T} \boldsymbol C ^{-1} (\boldsymbol x  - \boldsymbol\mu_{\boldsymbol X } ) \right)
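The multivariate Gaussian density can be evaluated directly from this formula. A minimal sketch for an assumed 2-dimensional example (mean vector and covariance matrix chosen arbitrarily for illustration):

```python
import numpy as np

def gaussian_pdf(x, mu, C):
    """n-dimensional Gaussian density, evaluated from the closed-form formula."""
    n = len(mu)
    d = x - mu
    # Normalization constant (2 pi)^{n/2} det(C)^{1/2}
    norm = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(C))
    # Quadratic form d^T C^{-1} d, computed via a linear solve
    return np.exp(-0.5 * d @ np.linalg.solve(C, d)) / norm

# Assumed example: n = 2, zero mean, correlated components.
mu = np.array([0.0, 0.0])
C = np.array([[1.0, 0.5],
              [0.5, 2.0]])

# Sanity check at the mean: the exponential term is 1 there, so the density
# equals 1 / ((2 pi)^{n/2} det(C)^{1/2}).
val = gaussian_pdf(mu, mu, C)
expected = 1 / (2 * np.pi * np.sqrt(np.linalg.det(C)))
print(np.isclose(val, expected))
```

Using `np.linalg.solve` instead of explicitly inverting \boldsymbol C is the usual numerically safer way to evaluate the quadratic form.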

See Also

Random variables

Random fields

Random process