Links: Single Continuous Random Variable
When we have multiple random variables $X_1, \dots, X_n$ we consider them as a random vector $X = (X_1, \dots, X_n)^T$ taking values in $\mathbb{R}^n$.
Joint Probability Density Function
A Joint Probability Density Function is a function $f_X : \mathbb{R}^n \to \mathbb{R}$ such that

$$P(X \in A) = \int_A f_X(x)\,dx \quad \text{for any region } A \subseteq \mathbb{R}^n,$$

where $f_X(x) \ge 0$ for all $x$ and $\int_{\mathbb{R}^n} f_X(x)\,dx = 1$.
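As a sanity check, here is a minimal numpy sketch (the grid resolution, the bivariate standard normal density, and the box $A$ are all assumed for illustration) that the density integrates to roughly 1 and that integrating over a region gives a probability:

```python
import numpy as np

def f(x, y):
    # Assumed example density: two independent standard normals
    return np.exp(-(x**2 + y**2) / 2) / (2 * np.pi)

xs = np.linspace(-6, 6, 601)
X, Y = np.meshgrid(xs, xs, indexing="ij")
dx = xs[1] - xs[0]

total = f(X, Y).sum() * dx * dx               # should be close to 1
box = (X > 0) & (X < 1) & (Y > 0) & (Y < 1)   # A = (0,1) x (0,1)
p_box = (f(X, Y) * box).sum() * dx * dx       # P(X in A)
print(total, p_box)
```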
Marginal PDF
Given a joint PDF we can find a marginal PDF, which is just a regular PDF of a subset of the variables. For a subset of indices $S \subseteq \{1, \dots, n\}$, we integrate out the variables not in $S$:

$$f_{X_S}(x_S) = \int f_X(x)\,dx_{S^c}.$$

This is basically saying that for any fixed value of the variables in $S$, we accumulate the density over every possible value of the remaining variables.
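A minimal sketch of marginalisation on a grid, assuming a correlated bivariate Gaussian (the value $\rho = 0.5$ is arbitrary); its marginal should come out standard normal:

```python
import numpy as np

xs = np.linspace(-6, 6, 601)
dx = xs[1] - xs[0]
X1, X2 = np.meshgrid(xs, xs, indexing="ij")

rho = 0.5  # assumed correlation, for illustration
joint = np.exp(-(X1**2 - 2*rho*X1*X2 + X2**2) / (2*(1 - rho**2)))
joint /= 2 * np.pi * np.sqrt(1 - rho**2)

marginal_x1 = joint.sum(axis=1) * dx              # integrate out x2
expected = np.exp(-xs**2 / 2) / np.sqrt(2*np.pi)  # standard normal PDF
print(np.max(np.abs(marginal_x1 - expected)))     # small discretisation error
```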
Independence
Variables $X_1, \dots, X_n$ are independent if the joint PDF factorises into the product of its marginals:

$$f_X(x_1, \dots, x_n) = \prod_{i=1}^{n} f_{X_i}(x_i).$$
Pairwise independence is not enough: every pair $(X_i, X_j)$ being independent does not imply that the whole collection factorises, as the sketch below shows.
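The classic counterexample (stated here with discrete variables, since it is the simplest instance): fair bits $X$, $Y$ and $Z = X \oplus Y$ are pairwise independent but not mutually independent.

```python
import itertools

# The 4 equally likely outcomes (x, y, x XOR y), each with probability 1/4
outcomes = [(x, y, x ^ y) for x, y in itertools.product([0, 1], repeat=2)]

def prob(cond):
    return sum(1 for o in outcomes if cond(o)) / 4

# Pairwise: P(X=1, Z=1) equals P(X=1) * P(Z=1)  -> 0.25 == 0.25
print(prob(lambda o: o[0] == 1 and o[2] == 1),
      prob(lambda o: o[0] == 1) * prob(lambda o: o[2] == 1))

# Mutual: P(X=1, Y=1, Z=1) = 0, but the product of marginals is 1/8
print(prob(lambda o: o == (1, 1, 1)),
      prob(lambda o: o[0] == 1) * prob(lambda o: o[1] == 1) * prob(lambda o: o[2] == 1))
```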
We can generalise this further to consider disjoint subsets of indices $S_1, \dots, S_m$: the subvectors $X_{S_1}, \dots, X_{S_m}$ are independent if the joint PDF factorises as $f_X(x) = \prod_{j=1}^{m} f_{X_{S_j}}(x_{S_j})$.
Note: a probabilistic graphical model is a compact way of representing exactly this kind of independence (and conditional independence) structure between variables.
Expected Value
For a function $g : \mathbb{R}^n \to \mathbb{R}$, the expected value is

$$E[g(X)] = \int_{\mathbb{R}^n} g(x)\, f_X(x)\,dx.$$

Mean
The mean of a random vector is just the vector of component-wise means:

$$\mu = E[X] = (E[X_1], \dots, E[X_n])^T.$$
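A quick numerical sketch of the component-wise mean (the true mean vector $(3, -1)$ and sample size are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Rows are samples, columns are the components X_1, X_2
samples = rng.standard_normal((100_000, 2)) + np.array([3.0, -1.0])
print(samples.mean(axis=0))  # close to the true mean (3, -1)
```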
Covariance
For two random variables $X$ and $Y$, the covariance is

$$\operatorname{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]\,E[Y].$$

$\operatorname{Cov}(X, Y) > 0$: when $X$ is above the mean, $Y$ tends to be above.
$\operatorname{Cov}(X, Y) < 0$: when $X$ is above the mean, $Y$ tends to be below.
$\operatorname{Cov}(X, Y) = 0$: no linear relationship between $X$ and $Y$.
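To see the three cases concretely, a minimal sketch (the noise level and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
noise = rng.standard_normal(100_000)

pos = x + 0.5 * noise    # tends to move with x
neg = -x + 0.5 * noise   # tends to move against x
none = noise             # unrelated to x

for name, y in [("positive", pos), ("negative", neg), ("none", none)]:
    print(name, np.cov(x, y)[0, 1])  # sample covariance of x with y
```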
Note: $\operatorname{Cov}(X, X) = \operatorname{Var}(X)$, and covariance is symmetric: $\operatorname{Cov}(X, Y) = \operatorname{Cov}(Y, X)$.
Lemma: if $X$ and $Y$ are independent then $\operatorname{Cov}(X, Y) = 0$. The converse does not hold.
Proof: if $X$ and $Y$ are independent then $f_{X,Y}(x, y) = f_X(x) f_Y(y)$, so

$$E[XY] = \int\!\!\int xy\, f_X(x) f_Y(y)\,dx\,dy = E[X]\,E[Y],$$

and therefore $\operatorname{Cov}(X, Y) = E[XY] - E[X]E[Y] = 0$.
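A sketch of why the converse fails, using the standard counterexample $Y = X^2$ with $X \sim \mathcal{N}(0, 1)$: here $\operatorname{Cov}(X, Y) = E[X^3] = 0$, yet $Y$ is completely determined by $X$.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = x**2
print(np.cov(x, y)[0, 1])  # near 0 despite total dependence
```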
Covariance Matrix
For a random vector $X \in \mathbb{R}^n$ with mean $\mu = E[X]$, the covariance matrix is

$$\Sigma = E[(X - \mu)(X - \mu)^T], \qquad \Sigma_{ij} = \operatorname{Cov}(X_i, X_j).$$

So we get that $\Sigma$ is symmetric, its diagonal entries are the variances $\operatorname{Var}(X_i)$, and it is positive semi-definite.
Using this we can compute the variance of a linear combination: for any $a \in \mathbb{R}^n$,

$$\operatorname{Var}(a^T X) = a^T \Sigma a \ge 0.$$
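A minimal sketch checking $\operatorname{Var}(a^T X) = a^T \Sigma a$ on simulated data (the mixing matrix, the vector $a$, and the sample size are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# 3 correlated variables: rows are samples, columns are variables
A_mix = rng.standard_normal((3, 3))
data = rng.standard_normal((100_000, 3)) @ A_mix.T

sigma = np.cov(data, rowvar=False)  # 3x3 sample covariance matrix
a = np.array([1.0, -2.0, 0.5])
print(a @ sigma @ a)                # a^T Sigma a
print(np.var(data @ a, ddof=1))     # sample variance of a^T X, should match
```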
Correlation Matrix
The correlation matrix rescales the covariance matrix so that every entry lies in $[-1, 1]$:

$$R_{ij} = \frac{\operatorname{Cov}(X_i, X_j)}{\sqrt{\operatorname{Var}(X_i)\,\operatorname{Var}(X_j)}},$$

with ones on the diagonal.
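The same simulated data as above gives the correlation matrix directly (a sketch; `np.corrcoef` does the rescaling for us):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.standard_normal((100_000, 3)) @ rng.standard_normal((3, 3)).T

corr = np.corrcoef(data, rowvar=False)
print(corr)  # symmetric, ones on the diagonal, entries in [-1, 1]
```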
Linear Transforms
For a linear transform $Y = AX + b$ with $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$, the mean and covariance transform as

$$E[Y] = A\,E[X] + b, \qquad \operatorname{Cov}(Y) = A\,\Sigma\,A^T.$$

Note that the shift $b$ drops out of the covariance.
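A sketch verifying both transformation rules by simulation (the particular $A$, $b$, and mean vector are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100_000, 3)) + np.array([1.0, 2.0, 3.0])  # mean (1,2,3)
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, -1.0]])
b = np.array([5.0, -5.0])

Y = X @ A.T + b
print(Y.mean(axis=0), A @ X.mean(axis=0) + b)  # E[Y] vs A E[X] + b
print(np.cov(Y, rowvar=False))                 # Cov(Y)
print(A @ np.cov(X, rowvar=False) @ A.T)       # A Sigma A^T, should match
```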
Conditionals and Bayes’ Theorem
For two random vectors $X$ and $Y$ with joint PDF $f_{X,Y}$, the conditional PDF of $X$ given $Y = y$ is

$$f_{X \mid Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)},$$

and Bayes' theorem lets us flip the conditioning:

$$f_{X \mid Y}(x \mid y) = \frac{f_{Y \mid X}(y \mid x)\,f_X(x)}{f_Y(y)}, \qquad f_Y(y) = \int f_{Y \mid X}(y \mid x)\,f_X(x)\,dx.$$
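A minimal sketch of Bayes' theorem on a grid, under an assumed toy model: latent $X \sim \mathcal{N}(0, 1)$ and a noisy observation $Y \mid X = x \sim \mathcal{N}(x, 0.5^2)$. With $y = 1.2$ observed, the exact posterior mean works out to $0.96$, which the numerics should reproduce.

```python
import numpy as np

xs = np.linspace(-5, 5, 1001)
dx = xs[1] - xs[0]

prior = np.exp(-xs**2 / 2) / np.sqrt(2 * np.pi)       # f_X(x)
y_obs = 1.2                                           # assumed observation
# f_{Y|X}(y_obs | x); the normalising constant cancels in Bayes' theorem
likelihood = np.exp(-(y_obs - xs)**2 / (2 * 0.5**2))

evidence = (likelihood * prior).sum() * dx            # f_Y(y_obs)
posterior = likelihood * prior / evidence             # f_{X|Y}(x | y_obs)
print((posterior * xs).sum() * dx)                    # posterior mean, ~0.96
```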