Math 311

Class 6

Jointly Distributed Variables

  • Two (or more) random variables \(X\), \(Y\).
  • Joint cumulative distribution function \[F_{X,Y} (a, b) = \operatorname{P}(X \le a \text{ and } Y \le b)\]
  • Marginal cumulative distribution functions \[ F_X(a) = \lim_{b \to \infty} F_{X,Y}(a, b)\qquad \text{ and } \qquad F_Y(b) = \lim_{a \to \infty} F_{X,Y}(a, b)\]
  • Joint probability mass function \[p_{X,Y}(a, b) = \operatorname{P}(X = a \text{ and } Y = b)\]
  • Joint probability density function \(f_{X,Y}\): for any rectangle \([a_1,a_2] \times [b_1, b_2]\), \[\operatorname{P}(a_1 \le X \le a_2\text{ and } b_1 \le Y \le b_2) = \int_{a_1}^{a_2}\int_{b_1}^{b_2} f_{X,Y}(x,y)\;dy\;dx\]
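A small illustrative example (added for concreteness, not from the lecture): toss a fair coin twice, and let \(X\) be the number of heads on the first toss and \(Y\) the total number of heads.

```latex
p_{X,Y}(1,1) = \operatorname{P}(X = 1 \text{ and } Y = 1)
             = \operatorname{P}(HT) = \tfrac{1}{4},
\qquad
p_{X,Y}(0,2) = 0 \quad (X = 0 \text{ forces } Y \le 1).
```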

Marginal Distribution Functions

  • Discrete case: marginal probability mass functions
    • \(\displaystyle p_X(a) = \operatorname{P}(X = a) = \sum_{\text{possible values $y_i$ of $Y$}} p_{X,Y} (a, y_i)\)
    • \(\displaystyle p_Y(b) = \operatorname{P}(Y = b) = \sum_{\text{possible values $x_i$ of $X$}} p_{X,Y} (x_i, b)\)
  • Continuous case: marginal probability density functions
    • \(\displaystyle f_X(a) = \int_{-\infty}^\infty f_{X,Y} (a, y)\;dy\)
    • \(\displaystyle f_Y(b) = \int_{-\infty}^\infty f_{X,Y} (x, b)\;dx\)
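To illustrate the continuous case, take the (hypothetical) joint density \(f_{X,Y}(x,y) = 2e^{-x-2y}\) for \(x, y > 0\) (and \(0\) otherwise). Integrating out the other variable gives the marginals:

```latex
f_X(x) = \int_0^\infty 2e^{-x-2y}\;dy = 2e^{-x}\cdot\tfrac{1}{2} = e^{-x},
\qquad
f_Y(y) = \int_0^\infty 2e^{-x-2y}\;dx = 2e^{-2y}.
```

So \(X\) is exponential with rate \(1\) and \(Y\) is exponential with rate \(2\).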

Independent Random Variables

  • \(X\) and \(Y\) are independent if for all \(a\) and \(b\), the events \(X \le a\) and \(Y \le b\) are independent.
  • \(X\) and \(Y\) are independent if and only if \(F_{X,Y}(a,b) = F_X(a)F_Y(b)\) for all \(a\) and \(b\).
  • For discrete variables: \(X\) and \(Y\) are independent if and only if \(p_{X,Y}(x,y) = p_X(x)p_Y(y)\) for all \(x\) and \(y\).
  • For continuous variables: \(X\) and \(Y\) are independent if and only if \(f_{X,Y}(x,y) = f_X(x)f_Y(y)\) for all \(x\) and \(y\).
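A quick check (example added for illustration) that factorization can fail: suppose \(p_{X,Y}(0,0) = p_{X,Y}(1,1) = \tfrac{1}{2}\). Each marginal is a fair coin, but

```latex
p_{X,Y}(0,1) = 0 \ne \tfrac{1}{4} = p_X(0)\,p_Y(1),
```

so \(X\) and \(Y\) are not independent. (In fact here \(Y = X\), the extreme opposite of independence.)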

Functions of Random Variables

  • Last week: Given a function \(g:\mathbb{R} \to \mathbb{R}\)
    • \(X\) discrete: \(\displaystyle\operatorname{E} g(X) = \sum_{x_i} g(x_i)p(x_i)\)
    • \(X\) continuous: \(\displaystyle\operatorname{E} g(X) = \int_{-\infty}^\infty g(x)f(x)\;dx\)
  • Now: Given a function \(g:\mathbb{R}^2 \to \mathbb{R}\)
    • \(X\), \(Y\) discrete: \(\displaystyle\operatorname{E} g(X,Y) = \sum_{x_i} \sum_{y_j} g(x_i,y_j)p_{X,Y}(x_i,y_j)\)
    • \(X\), \(Y\) continuous: \(\displaystyle\operatorname{E} g(X,Y) = \int_{-\infty}^\infty\int_{-\infty}^\infty g(x,y)f_{X,Y}(x,y)\;dy\;dx\)
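As a worked example (using a hypothetical density, chosen so the integral factors): let \(f_{X,Y}(x,y) = 2e^{-x-2y}\) for \(x, y > 0\), and take \(g(x,y) = xy\). Then

```latex
\operatorname{E}(XY) = \int_0^\infty\!\!\int_0^\infty xy\,2e^{-x-2y}\;dy\;dx
= \left(\int_0^\infty x e^{-x}\;dx\right)\left(\int_0^\infty 2y e^{-2y}\;dy\right)
= 1\cdot\tfrac{1}{2} = \tfrac{1}{2}.
```

The factorization works here because the density is a product of a function of \(x\) and a function of \(y\).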

Conditional Distributions

  • Given jointly distributed variables \(X\) and \(Y\), for each \(y\), the conditional distribution of \(X\) given \(Y = y\) has
    • probability mass function \(\displaystyle p_{X\vert Y}(x\vert y) = \frac{p_{X,Y}(x,y)}{p_Y(y)}\) (defined when \(p_Y(y) > 0\))
    • probability density function \(\displaystyle f_{X\vert Y}(x\vert y) = \frac{f_{X,Y}(x,y)}{f_Y(y)}\) (defined when \(f_Y(y) > 0\))
  • Conditional Expectation:
    • \(X\) discrete: \(\displaystyle\operatorname{E}(X\vert Y = y) = \sum_{x_i} x_ip_{X\vert Y}(x_i \vert y)\)
    • \(X\) continuous: \(\displaystyle\operatorname{E}(X\vert Y = y) = \int_{-\infty}^\infty x f_{X\vert Y}(x \vert y)\;dx\)

Conditional Expectation

  • \(X\) discrete: \(\displaystyle\operatorname{E}(X\vert Y = y) = \sum_{x_i} x_ip_{X\vert Y}(x_i \vert y) = \sum_{x_i} x_i\frac{p_{X, Y}(x_i, y)}{p_Y(y)}\)
  • \(X\) continuous: \(\displaystyle\operatorname{E}(X\vert Y = y) = \int_{-\infty}^\infty x f_{X\vert Y}(x \vert y)\;dx = \int_{-\infty}^\infty x \frac{f_{X, Y}(x, y)}{f_Y(y)}\;dx\)
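A standard worked example (added here for illustration, not from these notes): let \(f_{X,Y}(x,y) = \frac{1}{y}e^{-x/y}e^{-y}\) for \(x, y > 0\). Since \(\int_0^\infty \frac{1}{y}e^{-x/y}\,dx = 1\), the marginal is \(f_Y(y) = e^{-y}\), and

```latex
f_{X\vert Y}(x\vert y) = \frac{f_{X,Y}(x,y)}{f_Y(y)} = \frac{1}{y}\,e^{-x/y},
\qquad
\operatorname{E}(X\vert Y = y) = \int_0^\infty x\,\frac{1}{y}e^{-x/y}\;dx = y.
```

That is, given \(Y = y\), the variable \(X\) is exponential with mean \(y\), so its conditional expectation is simply \(y\).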