Class 5
For any \(\varepsilon > 0\),
\[ \operatorname{P}\left(a - \frac{\varepsilon}{2} \le X \le a + \frac{\varepsilon}{2}\right) = \int_{a - \frac{\varepsilon}{2}}^{a + \frac{\varepsilon}{2}} f(x)\;dx \]
If \(f\) is continuous at \(a\) and \(\varepsilon\) is small, the integral is approximately equal to \[ \int_{a - \frac{\varepsilon}{2}}^{a + \frac{\varepsilon}{2}} f(a)\;dx = \varepsilon f(a) \]
If \(f\) is continuous at \(a\) then for a small \(\varepsilon\)
\[ \operatorname{P}\left(a - \frac{\varepsilon}{2} \le X \le a + \frac{\varepsilon}{2}\right) \approx \varepsilon f(a) \]
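For example, take the (illustrative) density \(f(x) = 3x^2\) on \([0,1]\) and \(a = \frac12\). Then
\[\int_{\frac12 - \frac{\varepsilon}{2}}^{\frac12 + \frac{\varepsilon}{2}} 3x^2\;dx = \left(\tfrac12 + \tfrac{\varepsilon}{2}\right)^3 - \left(\tfrac12 - \tfrac{\varepsilon}{2}\right)^3 = \tfrac34\varepsilon + \tfrac{\varepsilon^3}{4} \approx \tfrac34\varepsilon = \varepsilon f\!\left(\tfrac12\right),\]
so the error in the approximation is only of order \(\varepsilon^3\) here.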
Uniform on \([A,B]\) \(\displaystyle f(x) = \begin{cases} \frac{1}{B-A} & \text{ if } A \le x \le B\\ 0 & \text{ otherwise}\end{cases}\)
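For any subinterval \([c,d]\) of \([A,B]\),
\[\operatorname{P}(c \le X \le d) = \int_c^d \frac{1}{B-A}\;dx = \frac{d-c}{B-A},\]
so the probability of landing in a subinterval depends only on its length.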
Exponential \(\displaystyle f(x) = \begin{cases} \lambda e^{-\lambda x} & \text{ if } x \ge 0\\ 0 & \text{ if } x < 0\end{cases}\)
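Integrating the exponential density gives, for any \(t \ge 0\),
\[\operatorname{P}(X > t) = \int_t^\infty \lambda e^{-\lambda x}\;dx = e^{-\lambda t}, \qquad \operatorname{P}(X \le t) = 1 - e^{-\lambda t}.\]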
Gamma \(\displaystyle f(x) = \begin{cases}\frac{\lambda e^{-\lambda x}(\lambda x)^{\alpha - 1}}{\Gamma(\alpha)} & \text{ if } x \ge 0\\0 & \text{ if } x < 0\end{cases}\)
where \(\Gamma\) is the Gamma function, defined for \(t > 0\) by \(\displaystyle\Gamma(t) = \int_0^\infty e^{-x} x^{t-1}\;dx\)
Note that for a positive integer \(n\), \(\Gamma(n) = (n-1)!\).
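For example, \(\Gamma(1) = \int_0^\infty e^{-x}\;dx = 1\), so taking \(\alpha = 1\) in the Gamma density recovers the exponential density \(\lambda e^{-\lambda x}\).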
Normal \(f(x) = \frac{1}{\sqrt{2\pi}\sigma} e^{-\frac{(x-\mu)^2}{2\sigma^2}}\)
Important property: If \(X\) is normally distributed with parameters \(\mu\) and \(\sigma^2\), then \(Y = aX + b\) is also normally distributed, with parameters \(a\mu + b\) and \(a^2\sigma^2\).
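In particular, taking \(a = \frac{1}{\sigma}\) and \(b = -\frac{\mu}{\sigma}\) shows that
\[Z = \frac{X - \mu}{\sigma}\]
is normally distributed with parameters \(0\) and \(1\), i.e. \(Z\) is standard normal.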
Game:
The mean or expected value of a discrete random variable \(X\):
\[\mu_X = \operatorname{E} X = \sum_{x \text{ possible value}} x\cdot \operatorname{P}(X = x) = \sum_{x \text{ possible value}} x\cdot p(x)\]
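For example, if \(X\) is the outcome of rolling a fair six-sided die, then \(p(x) = \frac16\) for \(x = 1, \dots, 6\) and
\[\operatorname{E} X = \sum_{x=1}^{6} x \cdot \tfrac{1}{6} = \frac{21}{6} = 3.5.\]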
The mean or expected value of a continuous random variable \(X\):
\[\mu_X = \operatorname{E} X = \int_{-\infty}^\infty x f(x)\;dx\]
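For example, if \(X\) is exponential with parameter \(\lambda\), then integration by parts gives
\[\operatorname{E} X = \int_0^\infty x\,\lambda e^{-\lambda x}\;dx = \frac{1}{\lambda}.\]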
If \(X\) and \(Y\) are random variables and \(a\) and \(b\) are real numbers, then \(\operatorname{E}(aX + bY) = a\operatorname{E}X + b\operatorname{E}Y\).
In particular \(\operatorname{E}(aX + b) = a\operatorname{E}X + b\).
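For example, if \(X\) and \(Y\) are the outcomes of two fair dice, then \(\operatorname{E}(X + Y) = \operatorname{E} X + \operatorname{E} Y = 3.5 + 3.5 = 7\); no assumption of independence is needed.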
If \(X\) is a discrete random variable and \(g\colon \mathbb R \to \mathbb R\), then
\[\operatorname{E} g(X) = \sum_{x \text{ possible value}} g(x)\cdot p(x)\]
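For example, for a fair six-sided die and \(g(x) = x^2\),
\[\operatorname{E} X^2 = \sum_{x=1}^{6} x^2 \cdot \tfrac{1}{6} = \frac{91}{6} \approx 15.17,\]
which differs from \((\operatorname{E} X)^2 = 3.5^2 = 12.25\).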
If \(X\) is a continuous random variable and \(g\colon \mathbb R \to \mathbb R\), then
\[\operatorname{E} g(X) = \int_{-\infty}^\infty g(x) f(x)\;dx\]
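For example, if \(X\) is uniform on \([0,1]\) (so \(f(x) = 1\) there) and \(g(x) = x^2\), then
\[\operatorname{E} X^2 = \int_0^1 x^2\;dx = \frac13.\]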