Poisson Arrivals in Continuous Time



11.1.2 Basic Concepts of the Poisson Process

The Poisson process is one of the most widely used counting processes. It is usually used in scenarios where we are counting the occurrences of events that appear to happen at a certain rate, but completely at random (without any deterministic structure). For example, suppose that from historical data, we know that earthquakes occur in a certain area at a rate of $2$ per month. Other than this information, the timings of earthquakes seem to be completely random. Thus, we conclude that the Poisson process might be a good model for earthquakes. In practice, the Poisson process or its extensions have been used to model [24]

$-$ the number of car accidents at a site or in an area;
$-$ the location of users in a wireless network;
$-$ the requests for individual documents on a web server;
$-$ the outbreak of wars;
$-$ photons landing on a photodiode.


Poisson random variable: Here, we briefly review some properties of the Poisson random variable that we have discussed in the previous chapters. Remember that a discrete random variable $X$ is said to be a Poisson random variable with parameter $\mu$, shown as $X \sim Poisson(\mu)$, if its range is $R_X=\{0,1,2,3,...\}$, and its PMF is given by \begin{equation} \nonumber P_X(k) = \left\{ \begin{array}{l l} \frac{e^{-\mu} \mu^k}{k!}& \quad \text{for } k \in R_X\\ 0 & \quad \text{ otherwise} \end{array} \right. \end{equation} Here are some useful facts that we have seen before:

  1. If $X \sim Poisson(\mu)$, then $EX=\mu$, and $\textrm{Var}(X)=\mu$.
  2. If $X_i \sim Poisson(\mu_i)$, for $i=1,2,\cdots, n$, and the $X_i$'s are independent, then \begin{align*} X_1+X_2+\cdots+X_n \sim Poisson(\mu_1+\mu_2+\cdots+\mu_n). \end{align*}
  3. The Poisson distribution can be viewed as the limit of the binomial distribution, as stated in Theorem 11.1 below.


Theorem 11.1
Let $Y_n \sim Binomial\big(n,p=p(n)\big)$. Let $\mu>0$ be a fixed real number, and $\lim_{n \rightarrow \infty} np=\mu$. Then, the PMF of $Y_n$ converges to a $Poisson(\mu)$ PMF, as $n \rightarrow \infty$. That is, for any $k \in \{0,1,2,...\}$, we have \begin{equation}%\label{} \nonumber \lim_{n \rightarrow \infty} P_{Y_n}(k)=\frac{e^{-\mu} \mu^k}{k!}. \end{equation}
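This limit is easy to check numerically. The following is a minimal sketch in Python (an added illustration, not part of the original text; the choices $\mu=3$, $k=2$, and the use of scipy.stats are assumptions made here):

```python
# Illustrative check of the binomial-to-Poisson limit (added example;
# the parameter values mu = 3 and k = 2 are arbitrary choices).
from scipy.stats import binom, poisson

mu, k = 3.0, 2
for n in [10, 100, 1000, 10000]:
    p = mu / n                       # chosen so that n*p = mu for every n
    print(n, binom.pmf(k, n, p))     # P(Y_n = k)
print("limit:", poisson.pmf(k, mu))  # e^{-mu} mu^k / k!
```

As $n$ grows, the printed binomial probabilities approach the Poisson value, as the theorem asserts.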


Poisson Process as the Limit of a Bernoulli Process:

Suppose that we would like to model the arrival of events that happen completely at random at a rate $\lambda$ per unit time. Here is one way to do this. At time $t=0$, we have no arrivals yet, so $N(0)=0$. We now divide the half-line $[0, \infty)$ into tiny subintervals of length $\delta$, as shown in Figure 11.2.

Figure 11.2 - Dividing the half-line $[0, \infty)$ into tiny subintervals of length $\delta$.

Each subinterval corresponds to a time slot of length $\delta$. Thus, the intervals are $(0,\delta]$, $(\delta, 2\delta]$, $(2\delta, 3\delta]$, $\cdots$. More generally, the $k$th interval is $\big((k-1)\delta, k\delta \big]$. We assume that in each time slot, we toss a coin for which $P(H)=p=\lambda \delta$. If the coin lands heads up, we say that we have an arrival in that subinterval. Otherwise, we say that we have no arrival in that interval. Figure 11.3 shows this process. Here, we have an arrival at time $t=k \delta$ if the $k$th coin flip results in heads.

Figure 11.3 - Poisson process as a limit of a Bernoulli process.

Now, let $N(t)$ be defined as the number of arrivals (number of heads) from time $0$ to time $t$. There are $n \approx \frac{t}{\delta}$ time slots in the interval $(0,t]$. Thus, $N(t)$ is the number of heads in $n$ coin flips. We conclude that $N(t) \sim Binomial(n,p)$. Note that here $p=\lambda \delta$, so \begin{align*} np &= n \lambda \delta\\ &=\frac{t}{\delta} \cdot \lambda \delta\\ &=\lambda t. \end{align*}

Thus, by Theorem 11.1, as $\delta \rightarrow 0$, the PMF of $N(t)$ converges to a $Poisson(\lambda t)$ PMF. More generally, we can argue that, as $\delta \rightarrow 0$, the number of arrivals in any interval of length $\tau$ follows a $Poisson(\lambda \tau)$ distribution.

Consider several non-overlapping intervals. The number of arrivals in each interval is determined by the results of the coin flips for that interval. Since different coin flips are independent, we conclude that the above counting process has independent increments.
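Before moving on, here is a minimal simulation sketch of this coin-flip construction (an added illustration; the values of $\lambda$, $t$, and $\delta$ are arbitrary choices, and numpy is an assumed dependency). Since counting heads in $n$ independent flips is exactly a $Binomial(n,p)$ draw, we sample the counts directly and compare their mean and variance to $\lambda t$:

```python
# Simulating N(t) via the Bernoulli approximation (added example).
import numpy as np

rng = np.random.default_rng(0)
lam, t, delta = 2.0, 5.0, 1e-4   # rate, horizon, slot length (illustrative values)
n_slots = int(t / delta)         # number of coin flips in (0, t]
p = lam * delta                  # P(heads) in each slot

# The number of heads in n_slots independent flips is a Binomial(n_slots, p) draw.
counts = rng.binomial(n_slots, p, size=100000)
print("empirical mean:", counts.mean(), "(theory:", lam * t, ")")
print("empirical var :", counts.var(), "(theory:", lam * t, ")")
```

Both the mean and the variance of the simulated counts should be close to $\lambda t = 10$, consistent with a $Poisson(\lambda t)$ distribution.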

Definition of the Poisson Process:

The above construction can be made mathematically rigorous. The resulting random process is called a Poisson process with rate (or intensity) $\lambda$. Here is a formal definition of the Poisson process.

The Poisson Process

Let $\lambda>0$ be fixed. The counting process $\{N(t), t \in [0, \infty)\}$ is called a Poisson process with rate $\lambda$ if all the following conditions hold:

  1. $N(0)=0$;
  2. $N(t)$ has independent increments;
  3. the number of arrivals in any interval of length $\tau>0$ has a $Poisson(\lambda \tau)$ distribution.

Note that from the above definition, we conclude that in a Poisson process, the distribution of the number of arrivals in any interval depends only on the length of the interval, and not on the exact location of the interval on the real line. Therefore, the Poisson process has stationary increments.


Example
The number of customers arriving at a grocery store can be modeled by a Poisson process with intensity $\lambda=10$ customers per hour.

  1. Find the probability that there are $2$ customers between 10:00 and 10:20.
  2. Find the probability that there are $3$ customers between 10:00 and 10:20, and $7$ customers between 10:20 and 11:00.
  • Solution
      1. Here, $\lambda=10$ and the interval between 10:00 and 10:20 has length $\tau=\frac{1}{3}$ hours. Thus, if $X$ is the number of arrivals in that interval, we can write $X \sim Poisson(10/3)$. Therefore, \begin{align*} P(X=2)&=\frac{e^{-\frac{10}{3}} \left(\frac{10}{3}\right)^2}{2!}\\ &\approx 0.2 \end{align*}
      2. Here, we have two non-overlapping intervals $I_1 =$ (10:00 a.m., 10:20 a.m.] and $I_2=$ (10:20 a.m., 11 a.m.]. Thus, since the numbers of arrivals in non-overlapping intervals are independent, we can write

        \begin{align*} P\bigg(\textrm{$3$ arrivals in $I_1$ $\;$ and $\;$ $7$ arrivals in $I_2$}\bigg) = P\bigg(\textrm{$3$ arrivals in $I_1$}\bigg) \cdot P\bigg(\textrm{$7$ arrivals in $I_2$}\bigg). \end{align*} Since the lengths of the intervals are $\tau_1=1/3$ and $\tau_2=2/3$ respectively, we obtain $\lambda \tau_1=10/3$ and $\lambda \tau_2=20/3$. Thus, we have \begin{align*} P\bigg(\textrm{$3$ arrivals in $I_1$ $\;$ and $\;$ $7$ arrivals in $I_2$}\bigg)&=\frac{e^{-\frac{10}{3}} \left(\frac{10}{3}\right)^3}{3!} \cdot \frac{e^{-\frac{20}{3}} \left(\frac{20}{3}\right)^7}{7!} \\ &\approx 0.0325 \end{align*}
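As a sanity check, the numbers in this example can be reproduced in a few lines of Python (an added illustration; scipy is not part of the text):

```python
# Reproducing the example's probabilities (added check).
from scipy.stats import poisson

lam = 10.0                         # customers per hour
p1 = poisson.pmf(2, lam * 1/3)     # part 1: 2 arrivals in a 1/3-hour interval
p2 = poisson.pmf(3, lam * 1/3) * poisson.pmf(7, lam * 2/3)  # part 2
print(round(p1, 4), round(p2, 4))  # approximately 0.1982 and 0.0325
```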


Second Definition of the Poisson Process:

Let $N(t)$ be a Poisson process with rate $\lambda$. Consider a very short interval of length $\Delta$. Then, the number of arrivals in this interval has the same distribution as $N(\Delta)$. In particular, we can write \begin{align*} P(N(\Delta)=0) &=e^{-\lambda \Delta} \\ &=1-\lambda \Delta+ \frac{\lambda^2}{2} \Delta^2-\cdots \; (\textrm{Taylor series}). \end{align*} Note that if $\Delta$ is small, the terms that include second or higher powers of $\Delta$ are negligible compared to $\Delta$. We write this as \begin{align} P(N(\Delta)=0) =1-\lambda \Delta+ o(\Delta) \hspace{30pt} (11.1) \end{align} Here, $o(\Delta)$ denotes a function that is negligible compared to $\Delta$ as $\Delta \rightarrow 0$. More precisely, $g(\Delta)=o(\Delta)$ means that \begin{align*} \lim_{\Delta \rightarrow 0} \frac{g(\Delta)}{\Delta}=0. \end{align*} Now, let us look at the probability of having one arrival in an interval of length $\Delta$: \begin{align*} P(N(\Delta)=1) &=e^{-\lambda \Delta} \lambda \Delta \\ &=\lambda \Delta \left(1-\lambda \Delta+ \frac{\lambda^2}{2} \Delta^2-\cdots \right) \; (\textrm{Taylor series})\\ &=\lambda \Delta+\left(-\lambda^2 \Delta^2+ \frac{\lambda^3}{2} \Delta^3-\cdots \right)\\ &=\lambda \Delta+o(\Delta). \end{align*} We conclude that \begin{align} P(N(\Delta)=1)=\lambda \Delta+o(\Delta) \hspace{30pt} (11.2) \end{align} Similarly, we can show that \begin{align} P(N(\Delta) \geq 2)=o(\Delta) \hspace{30pt} (11.3) \end{align} In fact, Equations 11.1, 11.2, and 11.3 give us another way to define a Poisson process.

The Second Definition of the Poisson Process

Let $\lambda>0$ be fixed. The counting process $\{N(t), t \in [0, \infty)\}$ is called a Poisson process with rate $\lambda$ if all the following conditions hold:

  1. $N(0)=0$;
  2. $N(t)$ has independent and stationary increments;
  3. we have \begin{align*} &P(N(\Delta)=0) =1-\lambda \Delta+ o(\Delta),\\ &P(N(\Delta)=1)=\lambda \Delta+o(\Delta),\\ &P(N(\Delta) \geq 2)=o(\Delta). \end{align*}

We have already shown that any Poisson process satisfies the above definition. To show that the above definition is equivalent to our original definition, we also need to show that any process that satisfies the above definition also satisfies the original definition. A method to show this is outlined in the End of Chapter Problems.
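The three approximations above can also be checked numerically: by definition, $g(\Delta)=o(\Delta)$ means $g(\Delta)/\Delta \rightarrow 0$. The following sketch (an added illustration with an arbitrary rate $\lambda=2$) prints these ratios for shrinking $\Delta$:

```python
# Checking that the error terms in Equations 11.1-11.3 are o(Delta)
# (added example; lam = 2 is an arbitrary choice).
import math

lam = 2.0
for delta in [1e-1, 1e-2, 1e-3, 1e-4]:
    p0 = math.exp(-lam * delta)                # P(N(Delta) = 0)
    p1 = lam * delta * math.exp(-lam * delta)  # P(N(Delta) = 1)
    p2 = 1 - p0 - p1                           # P(N(Delta) >= 2)
    print(delta,
          (p0 - (1 - lam * delta)) / delta,    # -> 0 as delta -> 0
          (p1 - lam * delta) / delta,          # -> 0 as delta -> 0
          p2 / delta)                          # -> 0 as delta -> 0
```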

Arrival and Interarrival Times:

Let $N(t)$ be a Poisson process with rate $\lambda$. Let $X_1$ be the time of the first arrival. Then, \begin{align*} P(X_1>t)&=P\big(\textrm{no arrival in }(0,t] \big) \\ &=e^{-\lambda t}. \end{align*} We conclude that \begin{equation} \nonumber F_{X_1}(t) = \left\{ \begin{array}{l l} 1-e^{-\lambda t} & \quad t>0 \\ 0 & \quad \text{otherwise} \end{array} \right. \end{equation} Therefore, $X_1 \sim Exponential(\lambda)$. Let $X_2$ be the time elapsed between the first and the second arrival (Figure 11.4).

Figure 11.4 - The random variables $X_1$, $X_2$, $\cdots$ are called the interarrival times of the counting process $N(t)$.

Let $s>0$ and $t>0$. Note that the two intervals $(0,s]$ and $(s,s+t]$ are disjoint. We can write \begin{align*} P(X_2>t |X_1=s)&=P\big(\textrm{no arrival in }(s,s+t] | X_1=s\big) \\ &=P\big(\textrm{no arrivals in }(s,s+t]\big) \; (\textrm{independent increments})\\ &=e^{-\lambda t}. \end{align*} We conclude that $X_2 \sim Exponential(\lambda)$, and that $X_1$ and $X_2$ are independent. The random variables $X_1$, $X_2$, $\cdots$ are called the interarrival times of the counting process $N(t)$. Similarly, we can argue that all $X_i$'s are independent and $X_i \sim Exponential(\lambda)$ for $i=1,2,3, \cdots$.

Interarrival Times for Poisson Processes

If $N(t)$ is a Poisson process with rate $\lambda$, then the interarrival times $X_1$, $X_2$, $\cdots$ are independent and \begin{align*} X_i \sim Exponential(\lambda), \; \textrm{ for }i=1,2,3, \cdots. \end{align*}

Remember that if $X$ is exponential with parameter $\lambda>0$, then $X$ is a memoryless random variable, that is, \begin{align}%\label{} \nonumber P(X>x+a |X>a)=P(X>x), \; \textrm{ for }a,x \geq 0. \end{align} Thinking of the Poisson process, the memoryless property of the interarrival times is consistent with the independent increments property of the Poisson process. In some sense, both imply that the numbers of arrivals in non-overlapping intervals are independent. To better understand this issue, let's look at an example.


Example
Let $N(t)$ be a Poisson process with intensity $\lambda=2$, and let $X_1$, $X_2$, $\cdots$ be the corresponding interarrival times.

  1. Find the probability that the first arrival occurs after $t=0.5$, i.e., $P(X_1>0.5)$.
  2. Given that we have had no arrivals before $t=1$, find $P(X_1>3)$.
  3. Given that the third arrival occurred at time $t=2$, find the probability that the fourth arrival occurs after $t=4$.
  4. I start watching the process at time $t=10$. Let $T$ be the time of the first arrival that I see. In other words, $T$ is the first arrival after $t=10$. Find $ET$ and $\textrm{Var}(T)$.
  5. I start watching the process at time $t=10$. Let $T$ be the time of the first arrival that I see. Find the conditional expectation and the conditional variance of $T$ given that I am informed that the last arrival occurred at time $t=9$.
  • Solution
      1. Since $X_1 \sim Exponential(2)$, we can write \begin{align*} P(X_1>0.5) &=e^{-(2 \times 0.5)} \\ &\approx 0.37 \end{align*} Another way to solve this is to note that \begin{align*} P(X_1>0.5) &=P(\textrm{no arrivals in }(0,0.5])=e^{-(2 \times 0.5)}\approx 0.37 \end{align*}
      2. We can write \begin{align*} P(X_1>3|X_1>1) &=P(X_1>2) \; (\textrm{memoryless property})\\ &=e^{-2 \times 2}\\ &\approx 0.0183 \end{align*} Another way to solve this is to note that the number of arrivals in $(1,3]$ is independent of the arrivals before $t=1$. Thus, \begin{align*} P(X_1>3|X_1>1) &=P\big(\textrm{no arrivals in }(1,3] \; | \; \textrm{no arrivals in }(0,1]\big)\\ &=P\big(\textrm{no arrivals in }(1,3]\big)\; (\textrm{independent increments})\\ &=e^{-2 \times 2}\\ &\approx 0.0183 \end{align*}
      3. The time between the third and the fourth arrival is $X_4 \sim Exponential(2)$. Thus, the desired conditional probability is equal to \begin{align*} P(X_4>2|X_1+X_2+X_3=2)&=P(X_4>2) \; (\textrm{independence of the $X_i$'s})\\ &=e^{-2 \times 2}\\ &\approx 0.0183 \end{align*}
      4. When I start watching the process at time $t=10$, I will see a Poisson process. Thus, the time of the first arrival from $t=10$ is $Exponential(2)$. In other words, we can write \begin{align*} T=10+X, \end{align*} where $X \sim Exponential(2)$. Thus, \begin{align*} ET&=10+EX\\ &=10+\frac{1}{2}=\frac{21}{2}, \end{align*} \begin{align*} \textrm{Var}(T)&=\textrm{Var}(X)\\ &=\frac{1}{4}. \end{align*}
      5. Arrivals before $t=10$ are independent of arrivals after $t=10$. Thus, knowing that the last arrival occurred at time $t=9$ does not impact the distribution of the first arrival after $t=10$. Thus, if $A$ is the event that the last arrival occurred at $t=9$, we can write \begin{align*} E[T|A]&=E[T]\\ &=\frac{21}{2}, \end{align*} \begin{align*} \textrm{Var}(T|A)&=\textrm{Var}(T)\\ &=\frac{1}{4}. \end{align*}
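As a numerical companion to part 2 (an added sketch, not from the text), we can estimate $P(X_1>3 \, | \, X_1>1)$ by sampling $X_1 \sim Exponential(2)$ and conditioning on $X_1>1$; the estimate should match $P(X_1>2)=e^{-4} \approx 0.0183$:

```python
# Monte Carlo check of the memoryless property (added example).
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.exponential(scale=1/2, size=10**7)  # Exponential(2): scale = 1/lambda
cond = x1[x1 > 1]                            # keep samples with no arrival before t = 1
print("P(X1 > 3 | X1 > 1) =", (cond > 3).mean())  # close to e^{-4} = 0.0183
```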

Now that we know the distribution of the interarrival times, we can find the distribution of the arrival times \begin{align*} &T_1=X_1,\\ &T_2=X_1+X_2,\\ &T_3=X_1+X_2+X_3,\\ & \hspace{30pt} \vdots \end{align*} More specifically, $T_n$ is the sum of $n$ independent $Exponential(\lambda)$ random variables. In previous chapters we have seen that if $T_n=X_1+X_2+\cdots+X_n$, where the $X_i$'s are independent $Exponential(\lambda)$ random variables, then $T_n \sim Gamma(n, \lambda)$. This has been shown using MGFs. Note that here $n \in \mathbb{N}$. The $Gamma(n, \lambda)$ distribution is also called the Erlang distribution; that is, we can write \begin{align*} T_n \sim Erlang(n,\lambda)=Gamma(n, \lambda), \; \textrm{ for }n=1,2,3, \cdots. \end{align*} The PDF of $T_n$, for $n=1,2,3,\cdots$, is given by \begin{align*} f_{T_n}(t)=\frac{\lambda^n t^{n-1}e^{-\lambda t}}{(n-1)!}, \; \textrm{ for }t>0. \end{align*} Remember that if $X \sim Exponential(\lambda)$, then \begin{align*} &E[X]=\frac{1}{\lambda},\\ &\textrm{Var}(X)=\frac{1}{\lambda^2}. \end{align*} Since $T_n=X_1+X_2+\cdots+X_n$, we conclude that \begin{align*} &E[T_n]=n EX_1=\frac{n}{\lambda},\\ &\textrm{Var}(T_n)=n \textrm{Var}(X_1)=\frac{n}{\lambda^2}. \end{align*} Note that the arrival times are not independent. In particular, we must have $T_1 \leq T_2 \leq T_3 \leq \cdots$.

Arrival Times for Poisson Processes

If $N(t)$ is a Poisson process with rate $\lambda$, then the arrival times satisfy $T_n \sim Gamma(n, \lambda)$. In particular, for $n=1,2,3,\cdots$, we have \begin{align*} E[T_n]=\frac{n}{\lambda}, \; \textrm{and} \; \textrm{Var}(T_n)=\frac{n}{\lambda^2}. \end{align*}

The above discussion suggests a way to simulate (generate) a Poisson process with rate $\lambda$. We first generate i.i.d. random variables $X_1$, $X_2$, $X_3$, $\cdots$, where $X_i \sim Exponential(\lambda)$. Then the arrival times are given by \begin{align*} &T_1=X_1,\\ &T_2=X_1+X_2,\\ &T_3=X_1+X_2+X_3,\\ & \hspace{30pt} \vdots \end{align*}
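Here is a minimal implementation of this recipe (an added sketch; the function name and parameter values are illustrative choices, not from the text):

```python
# Simulating a Poisson process from i.i.d. exponential interarrival times
# (added example; simulate_poisson_arrivals is a made-up helper name).
import numpy as np

def simulate_poisson_arrivals(lam, t_max, rng):
    """Return the arrival times T_1 < T_2 < ... that fall in (0, t_max]."""
    arrivals = []
    t = 0.0
    while True:
        t += rng.exponential(scale=1/lam)  # interarrival X_i ~ Exponential(lam)
        if t > t_max:
            return np.array(arrivals)
        arrivals.append(t)

rng = np.random.default_rng(2)
lam, t_max = 2.0, 10.0
counts = [len(simulate_poisson_arrivals(lam, t_max, rng)) for _ in range(5000)]
print("mean of N(t_max):", np.mean(counts), "(theory:", lam * t_max, ")")
```

The average count over many runs should be close to $\lambda t_{\max} = 20$, matching the $Poisson(\lambda t_{\max})$ distribution of $N(t_{\max})$.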


