Math 311

Class 22

Conditional Distribution of Arrivals

Suppose that \(N(t) = 1\) for some time \(t\); that is, exactly one event occurred in \((0,t]\), so in particular \(T_1 \le t\). What is the conditional distribution of \(T_1\) in the interval \((0,t]\)?

Suppose that \(N(t) = n\) for some time \(t\); that is, exactly \(n\) events occurred in \((0,t]\), so in particular \(S_n \le t\). What is the (joint) conditional distribution of the arrival times \(S_1, S_2, \ldots, S_n\)?

Their joint conditional density function is

\[f\left(s_1, s_2, \ldots, s_n \mid N(t) = n\right) = \begin{cases} \frac{n!}{t^n} & \text{ if } 0 < s_1 < s_2 < \cdots < s_n < t\\ 0 & \text{ otherwise} \end{cases} \]

Order Statistics

  • \(n\) events happening independently within the interval \((0,t]\),
  • each at time \(\hat S_i\) that is uniformly distributed in \((0,t]\).
  • Each has density function \(\displaystyle f_i(s) = \begin{cases}\frac{1}{t} & \text{ if } 0 < s \le t\\ 0 & \text{ otherwise.}\end{cases}\)
  • The joint density function is \(\displaystyle f(s_1, s_2, \ldots, s_n) = \begin{cases}\frac{1}{t^n} & \text{ if } 0 < s_i \le t \text{ for } i = 1, 2, \ldots, n\\ 0 & \text{ otherwise.}\end{cases}\)
  • Now define \(S_i = {}\) the \(i\)-th smallest of the times \(\hat S_1, \hat S_2, \ldots, \hat S_n\), for \(i = 1, 2, \ldots, n\).
  • What is the joint distribution of the \(S_i\)’s?
  • Conclusion: Given \(N(t) = n\), the arrival times \(S_1, S_2, \ldots, S_n\) are jointly distributed as the order statistics of \(n\) \({}\stackrel{\text{iid}}{\sim} \operatorname{Unif}(0,t)\) random variables.
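This conclusion can be checked by simulation. The sketch below (with illustrative values \(\lambda = 0.5\), \(t = 10\), \(n = 5\), all chosen here just for the example) repeatedly simulates a Poisson process from its exponential interarrival times, keeps only the runs with \(N(t) = n\), and compares the empirical mean of \(S_1\) with \(t/(n+1)\), the mean of the minimum of \(n\) iid \(\operatorname{Unif}(0,t)\) variables.

```python
import random

def arrivals_by(t, lam, rng):
    """Arrival times of a rate-lam Poisson process on (0, t],
    built from iid Exp(lam) interarrival times."""
    times, s = [], 0.0
    while True:
        s += rng.expovariate(lam)
        if s > t:
            return times
        times.append(s)

rng = random.Random(42)
t, lam, n = 10.0, 0.5, 5

first_arrivals = []
while len(first_arrivals) < 10000:
    times = arrivals_by(t, lam, rng)
    if len(times) == n:              # condition on N(t) = n
        first_arrivals.append(times[0])

# If the arrival times behave like order statistics of n iid
# Unif(0, t) draws, then E(S_1 | N(t) = n) = t / (n + 1).
est = sum(first_arrivals) / len(first_arrivals)
print(round(est, 2), round(t / (n + 1), 2))
```

The same comparison works for any of the arrival times, since the \(i\)-th order statistic of \(n\) iid \(\operatorname{Unif}(0,t)\) variables has mean \(i\,t/(n+1)\).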

Sampling a Poisson Process

Consider a Poisson process \(\{N(t), t \ge 0\}\) with rate \(\lambda\). Suppose each event is classified into one of \(k\) different types: an event occurring at time \(s\) is classified as type \(i\) with probability \(P_i(s)\), \(i = 1, 2, \ldots, k\), independently of all other events.

Define \(N_i(t) = {}\) the number of events classified as type \(i\) that occurred by the time \(t\).

Then for each \(t\), the \(N_i(t)\)’s, \(i = 1, 2, \ldots, k\) are independent Poisson distributed random variables, with means (or rates)

\[\operatorname{E} \left(N_i(t)\right) = \lambda \int_0^t P_i(s)\;ds\]
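As a numerical sanity check, here is a simulation sketch using a hypothetical two-type classification \(P_1(s) = e^{-s}\), \(P_2(s) = 1 - e^{-s}\) and illustrative values \(\lambda = 2\), \(t = 5\) (all assumptions made only for this example); the empirical means of \(N_1(t)\) and \(N_2(t)\) should approach \(\lambda \int_0^t P_i(s)\,ds\).

```python
import math
import random

def classified_counts(t, lam, rng):
    """One run: simulate a rate-lam Poisson process on (0, t] and
    classify an event at time s as type 1 with probability exp(-s)
    (a hypothetical choice of P_1), otherwise as type 2."""
    n1 = n2 = 0
    s = 0.0
    while True:
        s += rng.expovariate(lam)
        if s > t:
            return n1, n2
        if rng.random() < math.exp(-s):
            n1 += 1
        else:
            n2 += 1

rng = random.Random(1)
t, lam, runs = 5.0, 2.0, 20000

tot1 = tot2 = 0
for _ in range(runs):
    a, b = classified_counts(t, lam, rng)
    tot1 += a
    tot2 += b
mean1, mean2 = tot1 / runs, tot2 / runs

# E(N_1(t)) = lam * int_0^t e^{-s} ds = lam * (1 - e^{-t}),
# and E(N_2(t)) = lam * t - E(N_1(t)).
theory1 = lam * (1 - math.exp(-t))
theory2 = lam * t - theory1
print(round(mean1, 2), round(theory1, 2))
print(round(mean2, 2), round(theory2, 2))
```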

Sketch of Proof

  • Goal: calculate the joint probability distribution of \(N_i(t)\)’s, \(i = 1, 2, \ldots, k\):
    \(\operatorname{P}(N_i(t) = n_i, i = 1, 2, \ldots, k)\).

  • Let \(n = \sum_{i=1}^k n_i\), and condition on \(N(t) = n\).

  • Each of the \(n\) events happens independently at a time \(S\) that is uniformly distributed on \((0,t]\), and gets classified as type \(i\) with conditional probability \(P_i(s)\), given \(S = s\):

    \[P_i = \operatorname{P}(\text{the given event is classified as type } i) = \frac{1}{t}\int_0^t P_i(s)\;ds\]

    independently of the other events.

  • \(\operatorname{P}(N_i(t) = n_i, i = 1, 2, \ldots, k\mid N(t) = n) = {}\) the probability that exactly \(n_i\) of the \(n\) events are classified as type \(i\), for \(i = 1, 2, \ldots, k\).

  • This is the so-called multinomial distribution: \[\operatorname{P}(N_i(t) = n_i, i = 1, 2, \ldots, k\mid N(t) = n) = \frac{n!}{n_1!n_2!\cdots n_k!}P_1^{n_1}P_2^{n_2}\cdots P_k^{n_k}\]

Sketch of Proof (cont.)

\[\begin{aligned} \operatorname{P}\left({\color{blue}N_i\left(t\right) = n_i, i = 1, 2, \ldots, k}\right) &= \operatorname{P}\left({\color{blue}N_i\left(t\right) = n_i, i = 1, 2, \ldots, k}\mid {\color{orange}N(t) = n}\right)\operatorname{P}({\color{orange}N(t) = n})\\[1.2em] &= {\color{green}\frac{n!}{n_1!n_2!\cdots n_k!}P_1^{n_1}P_2^{n_2}\cdots P_k^{n_k}}\;{\color{orange}e^{-\lambda t}\frac{(\lambda t)^n}{n!}}\\[1.2em] &= \frac{1}{n_1!n_2!\cdots n_k!}P_1^{n_1}P_2^{n_2}\cdots P_k^{n_k}e^{-\lambda t (\overbrace{P_1 + P_2 + \cdots + P_k}^1)}(\lambda t)^{(\overbrace{n_1 + n_2 + \cdots + n_k}^{n})}\\[1.2em] &= \frac{1}{n_1!} (P_1 \lambda t)^{n_1}e^{-\lambda t P_1} \frac{1}{n_2!} (P_2 \lambda t)^{n_2}e^{-\lambda t P_2} \cdots \frac{1}{n_k!} (P_k \lambda t)^{n_k}e^{-\lambda t P_k}\\[1.2em] &= \operatorname{P}({\color{blue}N_1(t)=n_1})\; \operatorname{P}({\color{blue}N_2(t)=n_2})\;\cdots\; \operatorname{P}({\color{blue}N_k(t)=n_k})\\[1.2em] \text{rate of }N_i(t) &= \lambda t P_i = \lambda t \frac{1}{t}\int_0^t P_i(s)\;ds = \lambda \int_0^t P_i(s)\;ds \end{aligned}\]

Example (Replacing Machine Parts)

A machine has a large number of identical parts that fail according to a Poisson process with rate \(\lambda\).

  • Some failures can be immediately fixed.
  • In other cases, the part must be replaced.

Suppose the probability that the part can be fixed is given by the function \(P_1(s) = e^{-rs}\), where \(s\) is the time of the failure.

What is the expected number of parts that must be replaced by the time \(t = 20\)?
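By the sampling result, the replaced parts are the type-2 events of the failure process, with \(P_2(s) = 1 - P_1(s) = 1 - e^{-rs}\), so the expected number of replacements by time \(t\) is \(\lambda \int_0^t (1 - e^{-rs})\,ds = \lambda\left(t - \frac{1 - e^{-rt}}{r}\right)\). The sketch below evaluates this with hypothetical parameter values \(\lambda = 3\), \(r = 0.1\) (chosen only for illustration) and cross-checks the closed form against a midpoint-rule integral.

```python
import math

# Hypothetical parameter values, for illustration only.
lam, r, t = 3.0, 0.1, 20.0

# Closed form: lam * int_0^t (1 - e^{-r s}) ds
#            = lam * (t - (1 - e^{-r t}) / r)
closed_form = lam * (t - (1 - math.exp(-r * t)) / r)

# Numerical check of the integral with the midpoint rule.
steps = 100000
h = t / steps
integral = sum((1 - math.exp(-r * (i + 0.5) * h)) * h for i in range(steps))
numeric = lam * integral

print(round(closed_form, 3), round(numeric, 3))
```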

Example (Infinite Server Queue)

Suppose customers arriving to a service station can be modeled as a Poisson process with rate \(\lambda\).

  • When a customer arrives, they are immediately served by a server.
  • Service times are independent of each other and of the arrival times, distributed according to a cumulative distribution function \(G\).

Define the following two random variables:

  • \(X(t) = {}\) the number of customers that have completed service by time \(t\).
  • \(Y(t) = {}\) the number of customers that are still being served at time \(t\).

How are these distributed?
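One way to see the answer is to classify each arrival by whether its service is finished at time \(t\): a customer arriving at time \(s\) has completed service by \(t\) with probability \(G(t - s)\). The sampling result then suggests that \(X(t)\) and \(Y(t)\) are independent Poisson random variables with means \(\lambda \int_0^t G(t-s)\,ds\) and \(\lambda \int_0^t \bigl(1 - G(t-s)\bigr)\,ds\). The simulation sketch below checks the means, assuming (hypothetically) \(\operatorname{Exp}(\mu)\) service times and illustrative values \(\lambda = 3\), \(\mu = 1\), \(t = 4\).

```python
import math
import random

def simulate(t, lam, mu, rng):
    """One run of the infinite-server queue: Poisson(lam) arrivals on
    (0, t], each served immediately with an Exp(mu) service time
    (a hypothetical choice of G).  Returns (completed, in_service)."""
    done = busy = 0
    s = 0.0
    while True:
        s += rng.expovariate(lam)         # next arrival time
        if s > t:
            return done, busy
        if s + rng.expovariate(mu) <= t:  # service finished by t?
            done += 1
        else:
            busy += 1

rng = random.Random(7)
t, lam, mu, runs = 4.0, 3.0, 1.0, 20000

tx = ty = 0
for _ in range(runs):
    x, y = simulate(t, lam, mu, rng)
    tx += x
    ty += y
mean_x, mean_y = tx / runs, ty / runs

# With G(x) = 1 - e^{-mu x}:
#   E(Y(t)) = lam * int_0^t (1 - G(t - s)) ds = lam * (1 - e^{-mu t}) / mu
#   E(X(t)) = lam * t - E(Y(t))
theory_y = lam * (1 - math.exp(-mu * t)) / mu
theory_x = lam * t - theory_y
print(round(mean_x, 2), round(theory_x, 2))
print(round(mean_y, 2), round(theory_y, 2))
```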