Math 311

Class 9

Markov Chains

In a Markov chain, the probabilities of the outcomes of each experiment depend only on the outcome of the previous experiment (the Markov property):

\[ \operatorname{P}(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, X_{n-2} = i_{n-2}, \dots, X_0 = i_0) = \operatorname{P}(X_{n+1} = j \mid X_n = i) = P_{ij} \]

Transition probabilities

\[P_{ij} = \operatorname{P}(X_{n+1} = j \mid X_n = i)\]

Transition Matrix

\[ \mathbf{P} = \begin{bmatrix} P_{00} & P_{01} & P_{02} & \cdots \\ P_{10} & P_{11} & P_{12} & \cdots \\ \vdots & \vdots & \vdots & \\ P_{i0} & P_{i1} & P_{i2} & \cdots \\ \vdots & \vdots & \vdots & \\ \end{bmatrix} \]
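
To make the setup concrete, here is a minimal sketch of simulating a chain directly from its transition matrix (Python with numpy; the function name and example matrix are illustrative, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_chain(P, start, n_steps):
    """Run a Markov chain for n_steps starting from state `start`."""
    P = np.asarray(P)
    path = [start]
    for _ in range(n_steps):
        # The next state is drawn from the row of P for the current state.
        path.append(rng.choice(P.shape[0], p=P[path[-1]]))
    return path

# Example: the two-state weather chain defined later in these notes.
print(simulate_chain([[0.9, 0.1], [0.4, 0.6]], start=0, n_steps=10))
```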

Multi-step transitions

Transition over \(k\) steps:

\[P^k_{ij} = \operatorname{P}(X_{n+k} = j \mid X_n = i)\]

Chapman-Kolmogorov Equations:

\[P_{ij}^{m+n} = \sum_{k=0}^{\infty} P_{ik}^m P_{kj}^n\]

The \(n\)-step transition matrix is simply \(\mathbf{P}^n\).
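
A quick numerical sanity check of this identity, using the weather matrix from the next example (a sketch, not part of the original notes):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Chapman-Kolmogorov in matrix form: P^(m+n) = P^m P^n.
m, n = 2, 3
lhs = np.linalg.matrix_power(P, m + n)
rhs = np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n)
print(np.allclose(lhs, rhs))  # True
```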

Weather model

  • If it is sunny today, the probability it stays sunny tomorrow is 90%.
  • If it is rainy today, the probability it stays rainy tomorrow is 60%.

\[ \mathbf{P} = \begin{bmatrix} .9 & .1 \\ .4 & .6 \end{bmatrix} \]
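
For instance, the chance of sun two days from now given sun today is the (sunny, sunny) entry of \(\mathbf{P}^2\); a sketch (states ordered sunny, rainy):

```python
import numpy as np

P = np.array([[0.9, 0.1],   # row/column order: (sunny, rainy)
              [0.4, 0.6]])

P2 = np.linalg.matrix_power(P, 2)
print(P2[0, 0])  # 0.9*0.9 + 0.1*0.4 = 0.85
```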

Gambling model

  • Win $1 with probability \(p\).
  • Lose $1 with probability \(1-p\).
  • Play stops when funds reach either 0 or \(N\).

\[ \small \mathbf{P} = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0 \\ 1-p & 0 & p & 0 & 0 & \cdots & 0 & 0 \\ 0 & 1-p & 0 & p & 0 & \cdots & 0 & 0 \\ 0 & 0 & 1-p & 0 & p & \cdots & 0 & 0 \\ 0 & 0 & 0 & 1-p & 0 & \cdots & 0 & 0 \\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & 0 & 0 & 0 & \cdots & 0 & p \\ 0 & 0 & 0 & 0 & 0 & \cdots & 0 & 1 \end{bmatrix} \]
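
A sketch of building this \((N+1)\times(N+1)\) matrix for arbitrary \(N\) and \(p\) (the function name is my own):

```python
import numpy as np

def gamblers_ruin_matrix(N, p):
    """States 0..N hold the gambler's funds; 0 and N are absorbing."""
    P = np.zeros((N + 1, N + 1))
    P[0, 0] = P[N, N] = 1.0       # once at 0 or N, play has stopped
    for i in range(1, N):
        P[i, i + 1] = p           # win $1
        P[i, i - 1] = 1 - p       # lose $1
    return P

# With p = 0.5, the chance of reaching N before 0 from funds i is i/N;
# a large matrix power approximates the absorption probabilities.
P = gamblers_ruin_matrix(5, 0.5)
print(np.linalg.matrix_power(P, 1000)[3, 5])  # ~0.6
```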

Two-day memory weather model

  • If it rained for two days, it will rain the next day with probability 70%
  • If it rained one day but not the previous day, it will rain the next day with probability 50%
  • If it did not rain one day but rained the previous day, it will rain the next day with probability 40%
  • If it did not rain for two days, it will rain the next day with probability 20%
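
Since tomorrow's forecast depends on two days of history, the state must record both days. One encoding, a sketch with states ordered (yesterday, today):

```python
import numpy as np

# State i is the pair (rain yesterday, rain today); tomorrow's state
# is (rain today, rain tomorrow), so each row has two nonzero entries.
# Order: 0 = (R,R), 1 = (N,R), 2 = (R,N), 3 = (N,N).
P = np.array([
    [0.7, 0.0, 0.3, 0.0],  # rained both days
    [0.5, 0.0, 0.5, 0.0],  # rained today but not yesterday
    [0.0, 0.4, 0.0, 0.6],  # rained yesterday but not today
    [0.0, 0.2, 0.0, 0.8],  # dry both days
])
```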

Example 4.10

  • An urn contains two balls, each of which can be red or blue
  • You randomly pick one of the balls and replace it with another one:
    • of the same color, with probability 80%
    • of the opposite color, with probability 20%
  • Initially, both balls are red.
  • What is the probability that the fifth selected ball is red?
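
A sketch of one way to set this up: let the state be the number of red balls in the urn, propagate the distribution through four selection-and-replacement steps, and then average the chance of drawing red.

```python
import numpy as np

# State = number of red balls in the urn (0, 1, or 2).
# The replacement ball keeps the drawn color w.p. 0.8, flips it w.p. 0.2.
P = np.array([
    [0.8, 0.2, 0.0],   # 0 red: drawn ball is blue; flips w.p. 0.2
    [0.1, 0.8, 0.1],   # 1 red: each color is drawn w.p. 1/2
    [0.0, 0.2, 0.8],   # 2 red: drawn ball is red; flips w.p. 0.2
])

dist = np.array([0.0, 0.0, 1.0])            # start with two red balls
dist = dist @ np.linalg.matrix_power(P, 4)  # urn before the 5th draw
print(dist @ np.array([0.0, 0.5, 1.0]))     # P(5th ball red) ~ 0.7048
```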

Example 4.11

  • Balls are successively distributed among 8 urns
  • Each ball is equally likely to be placed in any of the urns
  • What is the probability that there will be exactly 3 non-empty urns after 9 balls are distributed?
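
A sketch: take the state to be the number of non-empty urns, which only ever stays put or increases by one.

```python
import numpy as np

# State = number of non-empty urns (0..8).  A new ball lands in an
# already-occupied urn w.p. i/8, or opens a new urn w.p. (8 - i)/8.
P = np.zeros((9, 9))
for i in range(8):
    P[i, i] = i / 8
    P[i, i + 1] = (8 - i) / 8
P[8, 8] = 1.0

dist = np.zeros(9)
dist[1] = 1.0                               # the 1st ball fills one urn
dist = dist @ np.linalg.matrix_power(P, 8)  # 8 more balls
print(dist[3])                              # P(exactly 3 non-empty urns)
```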

Example 4.12

In a sequence of independent coin flips, let \(N\) denote the number of flips until there are 3 consecutive heads.

  • Find \(\operatorname{P}(N \le 8)\)
  • Find \(\operatorname{P}(N = 8)\)
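
A sketch, assuming a fair coin: track the length of the current run of heads, with the run of 3 made absorbing; then \(\operatorname{P}(N \le k)\) is the \((0, 3)\) entry of \(\mathbf{P}^k\).

```python
import numpy as np

# State = length of the current run of heads; state 3 is absorbing.
# Assumes a fair coin (heads w.p. 1/2).
P = np.array([
    [0.5, 0.5, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.5, 0.0, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
])

P7 = np.linalg.matrix_power(P, 7)
P8 = np.linalg.matrix_power(P, 8)
print(P8[0, 3])             # P(N <= 8)
print(P8[0, 3] - P7[0, 3])  # P(N = 8)
```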

Example 4.13 (modified)

Given a Markov chain with states 1, 2, 3, 4, and 5, where some of the transition probabilities are

  • \(P_{11} = .3\)
  • \(P_{12} = .3\)
  • \(P_{21} = .1\)
  • \(P_{22} = .2\)

Starting at state 1, what is the probability that the first time you enter a state greater than 2 is on step 4?
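
A sketch: restrict attention to states 1 and 2. The chain must stay inside {1, 2} for the first three steps and then leave on the fourth; the probability of leaving from each state is whatever mass is missing from its row.

```python
import numpy as np

# Sub-matrix of transitions that stay inside {1, 2}.
Q = np.array([[0.3, 0.3],
              [0.1, 0.2]])
leave = 1 - Q.sum(axis=1)  # prob of jumping to a state > 2: [0.4, 0.7]

start = np.array([1.0, 0.0])  # start in state 1
# Stay in {1, 2} on steps 1-3, then exit on step 4.
print(start @ np.linalg.matrix_power(Q, 3) @ leave)
```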

Random communication

You have 5 stations; a message sent from any station is forwarded to one of the stations with the following probabilities:

\[\mathbf{P} = \begin{bmatrix} .1 & .2 & .5 & 0 & .2 \\ .3 & 0 & .7 & 0 & 0 \\ .6 & .1 & .3 & 0 & 0 \\ 0 & 0 & 0 & .6 & .4 \\ 0 & 0 & 0 & .3 & .7 \end{bmatrix}\]
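
A sketch of tracing the message: its location after \(k\) forwards is the start distribution times \(\mathbf{P}^k\). Note the block structure: a message can move from stations 1-3 into stations 4-5 (via the .2 entry in row 1), but rows 4 and 5 never send back.

```python
import numpy as np

P = np.array([
    [0.1, 0.2, 0.5, 0.0, 0.2],
    [0.3, 0.0, 0.7, 0.0, 0.0],
    [0.6, 0.1, 0.3, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.6, 0.4],
    [0.0, 0.0, 0.0, 0.3, 0.7],
])

# Location distribution of a message that starts at station 1,
# after 3 forwards.
dist = np.zeros(5)
dist[0] = 1.0
print(dist @ np.linalg.matrix_power(P, 3))
```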