Links: Single Continuous Random Variable

When we have multiple random variables $X_1, \dots, X_n$ we consider them as a vector in $\mathbb{R}^n$, e.g. $X = (X_1, \dots, X_n)^T$. We can consider how an individual variable $X_i$ in $X$ is distributed by just considering it as a Single Continuous Random Variable, but many concepts also generalise nicely to all $n$ variables at once.

Joint Probability Density Function

A Joint Probability Density Function is a function $f_X : \mathbb{R}^n \to \mathbb{R}$ with
$$P(X \in A) = \int_A f_X(x)\, dx \quad \text{for all (measurable) } A \subseteq \mathbb{R}^n$$

Where
$$f_X(x) \ge 0 \quad \text{and} \quad \int_{\mathbb{R}^n} f_X(x)\, dx = 1$$
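
As a quick numerical check of these two conditions, here is a minimal sketch assuming the hypothetical joint PDF $f(x, y) = e^{-x-y}$ for $x, y \ge 0$ (two independent unit exponentials, my own choice of example):

```python
import numpy as np

# Hypothetical joint PDF: two independent Exp(1) variables,
# f(x, y) = exp(-x - y) for x, y >= 0.
def joint_pdf(x, y):
    return np.exp(-x - y)

# Midpoint rule over [0, 20]^2, which covers essentially all of the mass.
dx = 0.01
grid = np.arange(0.0, 20.0, dx) + dx / 2       # midpoints of each cell
X, Y = np.meshgrid(grid, grid, indexing="ij")
F = joint_pdf(X, Y)

print((F >= 0).all())      # True: the density is non-negative
print(F.sum() * dx * dx)   # ~1.0: the density integrates to 1
```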

Marginal PDF

Given a joint PDF $f_X$ we can find a marginal PDF, which is just a regular PDF of a subset of the variables. For a subset of indices $S \subseteq \{1, \dots, n\}$ (with complement $S^c$) we get
$$f_{X_S}(x_S) = \int f_X(x_S, x_{S^c})\, dx_{S^c}$$

This is basically saying that for any fixed $x_S$, the marginal PDF is obtained by integrating over everything the other variables $x_{S^c}$ could possibly be. In short, you fix $x_S$ and vary $x_{S^c}$.
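
A small sketch of this fix-and-integrate idea, assuming a hypothetical joint PDF $f(x, y) = x + y$ on the unit square, whose marginal $f_X(x) = x + \tfrac{1}{2}$ is known in closed form for comparison:

```python
import numpy as np

# Hypothetical joint PDF on the unit square: f(x, y) = x + y for 0 <= x, y <= 1.
# Its marginal in x is f_X(x) = int_0^1 (x + y) dy = x + 1/2.
def joint_pdf(x, y):
    return x + y

dy = 0.001
ys = np.arange(0.0, 1.0, dy) + dy / 2     # midpoints in the y direction

def marginal_pdf_x(x):
    # "Fix x, vary y": integrate the joint over everything y could be.
    return np.sum(joint_pdf(x, ys)) * dy

for x in (0.0, 0.25, 0.5, 1.0):
    print(x, marginal_pdf_x(x), x + 0.5)  # numerical vs analytic marginal
```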

Independence

Variables $X_1, \dots, X_n$ are mutually independent iff
$$f_X(x_1, \dots, x_n) = \prod_{i=1}^{n} f_{X_i}(x_i)$$

Pairwise independence is not enough.
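
To see why pairwise independence is not enough, one standard counterexample (an assumption here, not necessarily the one used in the course) takes $X, Y \sim \text{Uniform}(0,1)$ i.i.d. and $Z = (X + Y) \bmod 1$: each pair of variables is independent, but $Z$ is fully determined by $X$ and $Y$. A Monte Carlo sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X, Y ~ iid Uniform(0, 1), Z = (X + Y) mod 1.
# Each pair (X, Y), (X, Z), (Y, Z) is independent, but (X, Y, Z) is not:
# Z is completely determined by X and Y.
X = rng.uniform(size=n)
Y = rng.uniform(size=n)
Z = (X + Y) % 1.0

# Pairwise: P(X < 0.5, Z < 0.25) matches P(X < 0.5) * P(Z < 0.25) = 0.125.
print(np.mean((X < 0.5) & (Z < 0.25)))              # ~0.125

# Mutually: P(X < 0.5, Y < 0.5, Z < 0.25) is NOT 0.5 * 0.5 * 0.25 = 0.0625.
print(np.mean((X < 0.5) & (Y < 0.5) & (Z < 0.25)))  # ~0.031
```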

We can generalise this further to consider disjoint subsets of the variables that are independent of each other but may be dependent on other variables within the same subset. Formally we get

Theorem: Given a random vector $X$ with probabilistic graphical model $G$, let $S_1, \dots, S_k$ be disjoint subsets of $\{1, \dots, n\}$ that are independent of each other, that is, there is no path in $G$ between $S_i$ and $S_j$ for $i \ne j$. Then
$$f_X(x) = \prod_{i=1}^{k} f_{X_{S_i}}(x_{S_i})$$

Note: a probabilistic graphical model here just means a dependency graph where two nodes are connected if the corresponding variables are dependent on each other.

Expected Value

Mean

The mean of a random vector is taken componentwise,
$$\mu = E[X] = \big(E[X_1], \dots, E[X_n]\big)^T, \qquad E[X_i] = \int_{\mathbb{R}} x_i\, f_{X_i}(x_i)\, dx_i$$

Covariance

For two random variables $X$ and $Y$ the covariance is
$$\operatorname{Cov}(X, Y) = E\big[(X - E[X])(Y - E[Y])\big]$$

  1. $\operatorname{Cov}(X, Y) > 0$: when $X$ is above its mean, $Y$ tends to be above its mean too.
  2. $\operatorname{Cov}(X, Y) < 0$: when $X$ is above its mean, $Y$ tends to be below its mean.
  3. $\operatorname{Cov}(X, Y) = 0$: there is no linear relationship between $X$ and $Y$.

Note: $\operatorname{Cov}(X, Y) = 0$ does not mean independence.

Lemma
$$\operatorname{Cov}(X, Y) = E[XY] - E[X]\,E[Y]$$

Proof:
$$\operatorname{Cov}(X, Y) = E\big[XY - X\,E[Y] - Y\,E[X] + E[X]\,E[Y]\big] = E[XY] - E[X]\,E[Y] - E[X]\,E[Y] + E[X]\,E[Y] = E[XY] - E[X]\,E[Y] \qquad \square$$
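
A Monte Carlo sketch of both points, assuming the standard counterexample $X \sim \mathcal{N}(0,1)$, $Y = X^2$ (my own choice of example): the two expressions for the covariance agree, the covariance is zero, and yet $Y$ is completely determined by $X$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000

# X symmetric around 0, Y = X^2: Y is completely determined by X,
# yet Cov(X, Y) = E[X^3] - E[X] E[X^2] = 0.
X = rng.standard_normal(n)
Y = X ** 2

cov_def   = np.mean((X - X.mean()) * (Y - Y.mean()))  # E[(X - EX)(Y - EY)]
cov_lemma = np.mean(X * Y) - X.mean() * Y.mean()      # E[XY] - E[X]E[Y]

print(cov_def, cov_lemma)  # both ~0 up to Monte Carlo noise, yet X and Y are dependent
```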

Covariance Matrix

For $n$ random variables $X_1, \dots, X_n$ we define a covariance matrix, denoted $\Sigma$. It is defined as
$$\Sigma_{ij} = \operatorname{Cov}(X_i, X_j) = E\big[(X_i - E[X_i])(X_j - E[X_j])\big]$$

So we get that
$$\Sigma = E\big[(X - \mu)(X - \mu)^T\big], \qquad \Sigma_{ii} = \operatorname{Var}(X_i), \qquad \Sigma = \Sigma^T$$

Using this we can compute the variance of a linear combination $a^T X$ with
$$\operatorname{Var}(a^T X) = a^T \Sigma\, a$$
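
A sketch checking $\operatorname{Var}(a^T X) = a^T \Sigma a$ on simulated data (the 3-dimensional vector and the weights $a$ below are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500_000

# Samples of a 3-dimensional random vector with some chosen dependence structure.
Z = rng.standard_normal((n, 3))
Xs = np.column_stack([Z[:, 0],
                      Z[:, 0] + Z[:, 1],           # correlated with the first coordinate
                      0.5 * Z[:, 1] + 2 * Z[:, 2]])

Sigma = np.cov(Xs, rowvar=False)                   # sample covariance matrix (3 x 3)

a = np.array([1.0, -2.0, 0.5])
lin = Xs @ a                                        # samples of a^T X

print(a @ Sigma @ a)                                # a^T Sigma a
print(lin.var(ddof=1))                              # ~ the same: Var(a^T X)
```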

Correlation Matrix

The correlation matrix $P$ is the covariance matrix with each entry rescaled by the standard deviations,
$$P_{ij} = \frac{\Sigma_{ij}}{\sqrt{\Sigma_{ii}\,\Sigma_{jj}}} = \frac{\operatorname{Cov}(X_i, X_j)}{\sigma_i\, \sigma_j}$$
so that $P_{ii} = 1$ and $P_{ij} \in [-1, 1]$.
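
Assuming the entrywise definition above, a covariance matrix can be turned into a correlation matrix by dividing out the standard deviations; a small sketch with an arbitrary example matrix:

```python
import numpy as np

# Convert a covariance matrix into a correlation matrix:
# P_ij = Sigma_ij / sqrt(Sigma_ii * Sigma_jj).
def correlation_from_covariance(Sigma):
    d = np.sqrt(np.diag(Sigma))
    return Sigma / np.outer(d, d)

Sigma = np.array([[ 4.0, 1.2, -0.8],
                  [ 1.2, 1.0,  0.3],
                  [-0.8, 0.3,  2.0]])
print(correlation_from_covariance(Sigma))  # unit diagonal, entries in [-1, 1]
```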

Linear Transforms

For $Y = AX + b$ with $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$ we have that
$$E[Y] = A\,E[X] + b$$
and
$$\operatorname{Cov}(Y) = A\, \Sigma\, A^T$$
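
A simulation sketch of both identities, with an arbitrary choice of $\mu$, $\Sigma$, $A$ and $b$ (and a Gaussian $X$ purely for convenience of sampling):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500_000

mu    = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
A = np.array([[1.0,  0.0],
              [2.0, -1.0],
              [0.5,  3.0]])
b = np.array([0.0, 1.0, -1.0])

Xs = rng.multivariate_normal(mu, Sigma, size=n)  # samples of X in R^2
Ys = Xs @ A.T + b                                 # samples of Y = A X + b in R^3

print(Ys.mean(axis=0), A @ mu + b)                # E[Y] vs A mu + b
print(np.cov(Ys, rowvar=False))                   # ~ A Sigma A^T
print(A @ Sigma @ A.T)
```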

Conditionals and Bayes’ Theorem

For two random vectors $X$ and $Y$ we have the conditional PDF
$$f_{X \mid Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}$$
and Bayes' Theorem
$$f_{X \mid Y}(x \mid y) = \frac{f_{Y \mid X}(y \mid x)\, f_X(x)}{f_Y(y)}$$
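
A numerical sketch using the same hypothetical joint $f(x, y) = x + y$ on the unit square as above: the conditional obtained directly from the joint matches the one obtained via Bayes' Theorem, and it integrates to 1 in $x$:

```python
import numpy as np

# Hypothetical joint: f(x, y) = x + y on the unit square.
# Marginals: f_X(x) = x + 1/2 and f_Y(y) = y + 1/2.
def joint(x, y): return x + y
def f_X(x):      return x + 0.5
def f_Y(y):      return y + 0.5

def cond_X_given_Y(x, y):   # f_{X|Y}(x|y) = f(x, y) / f_Y(y)
    return joint(x, y) / f_Y(y)

def cond_Y_given_X(y, x):   # f_{Y|X}(y|x) = f(x, y) / f_X(x)
    return joint(x, y) / f_X(x)

x, y = 0.3, 0.7
lhs = cond_X_given_Y(x, y)
rhs = cond_Y_given_X(y, x) * f_X(x) / f_Y(y)  # Bayes' Theorem for densities
print(lhs, rhs)                                # equal

dx = 0.001
xs = np.arange(0.0, 1.0, dx) + dx / 2
print(np.sum(cond_X_given_Y(xs, y)) * dx)      # ~1: the conditional is a valid PDF in x
```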