Math 311

Class 8

Stochastic Process

  • A sequence of experiments

  • Simplest case: all mutually independent.

    • rolling a die repeatedly
    • simple weather model: 70% rainy, 30% sunny.
  • A little more sophisticated:

    The probabilities of the outcomes of each experiment depend only on the outcome of the previous experiment.

    Such a process is called a Markov chain.

  • Example (a simulation sketch follows this list):

    • If it is sunny today, the probability of staying sunny is 90%.
    • If it is rainy today, the probability of staying rainy is 60%.
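
A minimal simulation sketch of this two-state weather chain (the state labels 0 = sunny and 1 = rainy, and the names below, are illustrative choices rather than anything fixed in these notes); each new day is sampled using only the previous day's state:

```python
import random

# Transition probabilities from the example above.
P_STAY_SUNNY = 0.9   # P(sunny tomorrow | sunny today)
P_STAY_RAINY = 0.6   # P(rainy tomorrow | rainy today)

def next_day(today):
    """Sample tomorrow's weather given only today's state (0 = sunny, 1 = rainy)."""
    p_stay = P_STAY_SUNNY if today == 0 else P_STAY_RAINY
    return today if random.random() < p_stay else 1 - today

def simulate(n_days, start=0):
    """Return a sample path X_0, X_1, ..., X_{n_days}."""
    path = [start]
    for _ in range(n_days):
        path.append(next_day(path[-1]))
    return path

print(simulate(10))   # a random sequence of 0s (sunny) and 1s (rainy)
```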

Model using random variables

A sequence of random variables \(X_n\) (steps) with possible values 0, 1, 2, … (states), satisfying the Markov property:

\[ \operatorname{P}(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, X_{n-2} = i_{n-2}, \dots, X_0 = i_0) = \operatorname{P}(X_{n+1} = j \mid X_n = i) = P_{ij} \]

If \(X_n = i\) and \(X_{n+1} = j\), we say that the process transitioned from state \(i\) to state \(j\).

For each state \(i\) of the variable \(X_n\), there is a conditional distribution of the variable \(X_{n+1}\):

\[\operatorname{P}(X_{n+1} = j \mid X_n = i) = P_{ij},\]

where \(P_{ij}\) is called the transition probability from state \(i\) to state \(j\).
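
Since these numbers form the conditional distribution of \(X_{n+1}\) given \(X_n = i\), each row of transition probabilities sums to 1:

\[ \sum_{j} P_{ij} = 1 \quad \text{for every state } i. \]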

Transition Matrix

\[ \mathbf{P} = \begin{bmatrix} P_{00} & P_{01} & P_{02} & \cdots \\ P_{10} & P_{11} & P_{12} & \cdots \\ \vdots & \vdots & \vdots & \\ P_{i0} & P_{i1} & P_{i2} & \cdots \\ \vdots & \vdots & \vdots & \\ \end{bmatrix} \]
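
For instance, the weather example above (labeling the states 0 = sunny and 1 = rainy, a choice made here only for illustration) has the \(2 \times 2\) transition matrix

\[ \mathbf{P} = \begin{bmatrix} 0.9 & 0.1 \\ 0.4 & 0.6 \end{bmatrix}, \]

and each row sums to 1.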

Example: Gambling model (transition probabilities are written out after this list):

  • Wins $1 with probability \(p\).
  • Loses $1 with probability \(1-p\).
  • Stops playing when the funds reach 0.
  • Stops playing when the funds reach \(N\).
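
With the gambler's current funds \(i\) (for \(0 \le i \le N\)) as the state, these rules give the transition probabilities

\[ P_{i,\,i+1} = p, \qquad P_{i,\,i-1} = 1 - p \qquad \text{for } 0 < i < N, \]

and one natural way to encode "stops playing" is to make 0 and \(N\) absorbing states:

\[ P_{00} = P_{NN} = 1. \]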