Question Set: Poisson Distribution
Question 1
1.1
Given X\sim\text{Exp}(\lambda), please show that P(X>a+b \mid X>a)=P(X > b) for any a, b > 0 (the memoryless property).
\begin{aligned} P(X>a+b|X>a) =& \frac{P(X>a+b, X>a)}{P(X>a)} \\ =& \frac{P(X>a+b)}{P(X>a)} \\ =& \frac{e^{-\lambda(a+b)}}{e^{-\lambda a}} \\ =& e^{-\lambda b} \\ =& P(X>b) \end{aligned}
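The identity can also be checked numerically; the values of \lambda, a, b below are arbitrary illustration choices, not part of the problem.

```python
import math

# Arbitrary illustration values (any positive lam, a, b work).
lam, a, b = 2.0, 1.5, 0.7

def surv(x: float, lam: float) -> float:
    """Survival function P(X > x) for X ~ Exp(lam)."""
    return math.exp(-lam * x)

# P(X > a+b | X > a) from the definition of conditional probability
cond = surv(a + b, lam) / surv(a, lam)
print(cond, surv(b, lam))  # the two values coincide
```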
1.2
Let X be the failure time of a machine. Assume that X follows an exponential distribution with rate \lambda=2 failures per month. What is the probability that the machine can run for over 4 months without any failure?
\begin{aligned} P(X>4) =& e^{-4\lambda} \\ =& e^{-2 \times 4} \\ =& e^{-8}\approx 0.0003 \end{aligned}
1.3
Based on 1.2, what is the probability that the machine can run for 4 more months given that it has already run for 1 year (12 months)?
\begin{aligned} P(X>16|X>12) =& P(X>4) \quad \text{(memorylessness)} \\ =& e^{-8}\approx 0.0003 \end{aligned}
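A quick numerical check of 1.2 and 1.3 (by memorylessness, both reduce to e^{-8}):

```python
import math

lam = 2.0                      # failures per month
p_over_4 = math.exp(-lam * 4)  # 1.2: P(X > 4)
# 1.3: P(X > 16 | X > 12) computed directly from the survival function
p_cond = math.exp(-lam * 16) / math.exp(-lam * 12)
print(round(p_over_4, 4), round(p_cond, 4))  # both approximately 0.0003
```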
Question 2
The number of alpha particle emissions of carbon-14 that are counted by a Geiger counter follows a Poisson distribution with rate 19.7 particles per second. Let W be the waiting time (in seconds) until the fifth count is made.
2.1
What distribution does W follow?
W follows a Gamma distribution with shape parameter k=5 and rate parameter \lambda=19.7. In other words, W \sim \text{Gamma}(5, 19.7).
2.2
Find the probability that it takes less than 0.1 second to observe the fifth count
\begin{aligned} P(W<0.1) =& F_W(0.1) \\ =& \int_0^{0.1} \frac{19.7^5}{\Gamma(5)} w^{5-1} e^{-19.7w}\, dw \\ =& \int_0^{0.1} \frac{19.7^5}{4!} w^{4} e^{-19.7w}\, dw \\ =& 1-\sum_{k=0}^{4}\frac{(1.97)^k e^{-1.97}}{k!} \\ \approx& \ 0.0500 \end{aligned}
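For an integer shape parameter, the Gamma CDF can be evaluated through the equivalent counting statement: W < t exactly when at least 5 counts have occurred by time t, where the number of counts by time t is Poisson with mean 19.7t. A sanity-check sketch:

```python
import math

rate, shape, t = 19.7, 5, 0.1
mu = rate * t  # expected number of counts in t seconds: 1.97

# P(W < t) = P(N >= shape) for N ~ Poisson(mu)
p = 1 - sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(shape))
print(round(p, 4))  # approximately 0.05
```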
Question 3
The number of patients having influenza in Hong Kong follows a Poisson distribution with a mean of 10 thousand per month. For the calculations below, you can simply use units of thousands.
3.1
Determine the probability that more than 12 thousand patients get influenza in a month.
\begin{aligned} P(X>12) =& 1-P(X\leq 12) \\ =& 1-\bigl[P(X=0)+P(X=1)+\dots+P(X=12)\bigr] \\ =& 1-\sum_{k=0}^{12} \frac{10^k e^{-10}}{k!} \\ \approx& \ 0.2084 \end{aligned}
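The tail sum can be verified directly with the standard library:

```python
import math

lam = 10  # mean: 10 thousand patients per month

# P(X > 12) = 1 - P(X <= 12) for X ~ Poisson(10)
p = 1 - sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(13))
print(round(p, 4))  # approximately 0.2084
```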
3.2
Determine the probability that more than 5 thousand patients get influenza in the first two weeks. Let Y be the number of patients (in thousands) in the first two weeks; over half a month, Y\sim\text{Poisson}(5).
\begin{aligned} P(Y>5) =& 1-P(Y\leq 5) \\ =& 1-\bigl[P(Y=0)+P(Y=1)+\dots+P(Y=5)\bigr] \\ =& 1-\sum_{k=0}^{5} \frac{5^k e^{-5}}{k!} \\ \approx& \ 0.3840 \end{aligned}
3.3
It is known that at least 1K patients were reported in the first two weeks. Determine the probability that, in total, more than 5K patients will have influenza in the first two weeks.
\begin{aligned} P(Y > 5 \mid Y \geq 1) &= \frac{P(Y > 5)}{1 - P(Y = 0)} \\ &= \frac{0.3840}{1 - e^{-5}} \\ &\approx \frac{0.3840}{1 - 0.0067} \\ &\approx \frac{0.3840}{0.9933} \\ &= \mathbf{0.3866} \end{aligned}
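The numbers in 3.2 and 3.3 can be reproduced with a short script:

```python
import math

lam = 5  # mean count (in thousands) over half a month

def pois_cdf(n: int, lam: float) -> float:
    """P(Y <= n) for Y ~ Poisson(lam)."""
    return sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(n + 1))

p_gt5 = 1 - pois_cdf(5, lam)             # 3.2: P(Y > 5)
p_cond = p_gt5 / (1 - pois_cdf(0, lam))  # 3.3: P(Y > 5 | Y >= 1)
print(round(p_gt5, 4), round(p_cond, 4))  # approximately 0.3840 and 0.3866
```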
3.4
From the calculation in 3.3, how will the probability change if we have instead observed more than 1K patients in the first two weeks? What is the interpretation of how observing partial information affects the final probability?
The probability will increase. Conditioning on Y \geq 2 rather than Y \geq 1 removes the additional term P(Y=1) from the denominator, giving 1-P(Y=0)-P(Y=1) \approx 0.9596, while the numerator P(Y>5) \approx 0.3840 is unchanged (Y>5 already implies Y\geq 2). The conditional probability therefore rises to 0.3840/0.9596 \approx 0.4002 > 0.3866. In other words, observing more than 1K patients rules out more of the low-count outcomes, making it more likely that the total exceeds 5K.
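The two conditional probabilities can be compared numerically:

```python
import math

lam = 5  # mean count (in thousands) over half a month

def pmf(k: int) -> float:
    return math.exp(-lam) * lam**k / math.factorial(k)

tail = 1 - sum(pmf(k) for k in range(6))   # P(Y > 5), the shared numerator
p_ge1 = tail / (1 - pmf(0))                # condition on Y >= 1 (3.3)
p_ge2 = tail / (1 - pmf(0) - pmf(1))       # condition on Y >= 2 (more than 1K)
print(round(p_ge1, 4), round(p_ge2, 4))    # the second value is larger
```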
Question 4
Please show that the Poisson distribution can be derived as a limit of the Binomial distribution when the number of trials goes to infinity and the probability of success goes to zero, while keeping the expected number of successes constant. In other words, show that if X_n \sim \text{Binomial}(n, \frac{\lambda}{n}), then \lim_{n \to \infty} P(X_n = k) = \frac{\lambda^k e^{-\lambda}}{k!} for k = 0, 1, 2, \dots.
Calculation hints: In the proof, you may find it useful to use the following limit: \lim_{n \to \infty} \left(1 - \frac{\lambda}{n}\right)^n = e^{-\lambda}
\begin{aligned} \lim_{n\rightarrow\infty}P(X=k)= &\lim_{n\rightarrow\infty}\binom{n}{k}p^k(1-p)^{n-k}\\ = &\lim_{n\rightarrow\infty}\frac{n!}{(n-k)!k!}(\frac{\lambda}{n})^k(1-\frac{\lambda}{n})^{n-k}\\ = &\lim_{n\rightarrow\infty}\frac{n!}{n^k(n-k)!}\frac{\lambda^k}{k!}(1-\frac{\lambda}{n})^n(1-\frac{\lambda}{n})^{-k}\\ = &\lim_{n\rightarrow\infty}\frac{n}{n}\frac{n-1}{n}\dots\frac{n-k+1}{n}\frac{\lambda^k}{k!}(1-\frac{\lambda}{n})^n(1-\frac{\lambda}{n})^{-k}\\ = & \frac{\lambda^k}{k!}\exp(-\lambda) \end{aligned}
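The convergence can be observed numerically; \lambda=3 and k=4 below are arbitrary example values.

```python
import math

lam, k = 3.0, 4  # arbitrary example: fixed mean and target count

def binom_pmf(n: int, k: int, p: float) -> float:
    """P(X = k) for X ~ Binomial(n, p)."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

pois = lam**k * math.exp(-lam) / math.factorial(k)
# As n grows with p = lam/n, the Binomial PMF approaches the Poisson PMF
for n in (10, 100, 10_000):
    print(n, round(binom_pmf(n, k, lam / n), 6), round(pois, 6))
```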
Question 5
Please show that the Negative Binomial distribution can be approximated by the Poisson distribution when the number of successes r goes to infinity and the probability of success p goes to 1, while keeping the mean of the Negative Binomial distribution constant. In other words, show that if X \sim \text{NegativeBinomial}(r, p) counts the failures before the r-th success and has mean \lambda = r\frac{1-p}{p}, then \lim_{r \to \infty, p \to 1} P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!} for k = 0, 1, 2, \dots.
Calculation hints: In the proof, you may find it useful to use the following limits: \lim_{r \to \infty} \left(\frac{r}{r + \lambda}\right)^r = e^{-\lambda} and \frac{(k+r-1)!}{k!(r-1)!} \sim \frac{r^k}{k!} \text{ as } r \to \infty.
We first define \lambda=r\frac{1-p}{p} (i.e., the mean of the NB). Consequently, p can be expressed as \frac{r}{r+\lambda}, and 1-p as \frac{\lambda}{r+\lambda}, so that \small \begin{aligned} \lim_{r\rightarrow\infty,p\rightarrow 1}P(X=k) = \lim_{r\rightarrow\infty,p\rightarrow 1}\frac{(k+r-1)!}{k!(r-1)!}\Bigl(\frac{r}{r+\lambda}\Bigr)^r\Bigl(\frac{\lambda}{r+\lambda}\Bigr)^k \end{aligned} We evaluate the limit of each factor in turn.
Binomial coefficient term
\small \frac{(k+r-1)!}{k!(r-1)!} \sim \frac{r^k}{k!} \text{ as } r\rightarrow\infty
Power terms
\begin{aligned} \small &\lim_{r\rightarrow\infty}\Bigl(\frac{r}{r+\lambda}\Bigr)^r = \lim_{r\rightarrow\infty}\Bigl(\frac{1}{1+\frac{\lambda}{r}}\Bigr)^r = \lim_{r\rightarrow\infty}\Bigl(1+\frac{\lambda}{r}\Bigr)^{-r} =\exp(-\lambda) \end{aligned}
\begin{aligned} \small \Bigl(\frac{\lambda}{r+\lambda}\Bigr)^k = \frac{\lambda^k}{(r+\lambda)^k} = & \frac{\lambda^k}{r^k}\Bigl(1+\frac{\lambda}{r}\Bigr)^{-k} \\ \sim & \frac{\lambda^k}{r^k} \text{ as } r\rightarrow\infty \end{aligned}
Combining the three factors, \lim_{r\rightarrow\infty,p\rightarrow 1}P(X=k) = \frac{r^k}{k!}\cdot e^{-\lambda}\cdot\frac{\lambda^k}{r^k} = \frac{\lambda^k e^{-\lambda}}{k!}; that is, \text{NB}(r,p) \rightarrow \text{Poisson}(\lambda) with \lambda = r\frac{1-p}{p} held constant.
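As with Question 4, the approximation can be inspected numerically; \lambda=3 and k=4 are arbitrary example values, and the PMF below uses the failures-before-the-r-th-success convention from the derivation.

```python
import math

lam, k = 3.0, 4  # arbitrary example: fixed mean and target count

def nb_pmf(k: int, r: int, p: float) -> float:
    """P(X = k): k failures before the r-th success, success prob p."""
    return math.comb(k + r - 1, k) * p**r * (1 - p) ** k

pois = lam**k * math.exp(-lam) / math.factorial(k)
for r in (10, 100, 10_000):
    p = r / (r + lam)  # keeps the mean r(1-p)/p equal to lam while p -> 1
    print(r, round(nb_pmf(k, r, p), 6), round(pois, 6))
```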
Question 6
Please show that the sum of two independent Poisson random variables with parameters \lambda_1 and \lambda_2 is also a Poisson random variable with parameter \lambda_1 + \lambda_2. In other words, if X \sim \text{Poisson}(\lambda_1) and Y \sim \text{Poisson}(\lambda_2) are independent, then Z = X + Y \sim \text{Poisson}(\lambda_1 + \lambda_2).
\begin{aligned} P(Z=k) &= P(X+Y=k) \\ &= \sum_{i=0}^k P(X=i, Y=k-i) \\ &= \sum_{i=0}^k P(X=i)P(Y=k-i) \\ &= \sum_{i=0}^k \frac{\lambda_1^i}{i!} \frac{\lambda_2^{k-i}}{(k-i)!} e^{-\lambda_1} e^{-\lambda_2} \\ &= \frac{e^{-(\lambda_1+\lambda_2)}}{k!}\sum_{i=0}^k \binom{k}{i}\lambda_1^i \lambda_2^{k-i} \\ &= \frac{(\lambda_1 + \lambda_2)^k}{k!} e^{-(\lambda_1+\lambda_2)} \end{aligned} where the last step applies the binomial theorem.
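The convolution identity can be checked for particular values; \lambda_1=2, \lambda_2=3.5, and k=6 below are arbitrary choices.

```python
import math

l1, l2, k = 2.0, 3.5, 6  # arbitrary rates and target value

def pois_pmf(k: int, lam: float) -> float:
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Discrete convolution of the two PMFs at k
conv = sum(pois_pmf(i, l1) * pois_pmf(k - i, l2) for i in range(k + 1))
print(conv, pois_pmf(k, l1 + l2))  # the two values coincide
```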
Question 7
Please show that the negative binomial distribution can be derived as a Poisson distribution whose rate \lambda has a Gamma prior.
You can start from the conclusion we had in class: if X \mid \lambda \sim \text{Poisson}(\lambda) and \lambda \sim \text{Gamma}(\alpha, \beta), the posterior distribution of \lambda is \text{Gamma}(\alpha + \sum_{i=1}^n X_i, \beta + n).
Calculation hints: In the proof, you may find it useful to use the following properties:
\begin{aligned} \int_0^\infty x^{A-1}e^{-Bx}\,dx =& \frac{\Gamma(A)}{B^A}, \\ \text{and } \frac{\Gamma(a+b)}{a!\,\Gamma(b)} =& \binom{a+b - 1}{a} \end{aligned}
\begin{aligned} p(\tilde{x})= &\int_0^\infty\bigl[ \frac{\lambda^{\tilde{x}}e^{-\lambda}}{\tilde{x}!} \bigr]\bigl[ \frac{\beta^\alpha}{\Gamma(\alpha)}\lambda^{\alpha-1}e^{-\beta\lambda} \bigr]d\lambda \\ =& \frac{\beta^\alpha}{\tilde{x}!\Gamma(\alpha)}\int_0^\infty \lambda^{\tilde{x}+\alpha-1}e^{-(1+\beta)\lambda}d\lambda \\ =& \frac{\beta^\alpha}{\tilde{x}!\Gamma(\alpha)}\frac{\Gamma(\tilde{x}+\alpha)}{(1+\beta)^{\tilde{x}+\alpha}} \\ =& \binom{\tilde{x}+\alpha - 1}{\tilde{x}}\Bigl(\frac{\beta}{1+\beta}\Bigr)^{\alpha}\Bigl(\frac{1}{1+\beta}\Bigr)^{\tilde{x}} \end{aligned}
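The closed form can be compared against a direct numerical integration of the mixture; \alpha=3, \beta=2, and \tilde{x}=4 below are arbitrary example values.

```python
import math

alpha, beta, x = 3, 2.0, 4  # arbitrary example values

# Closed form from the derivation above: the NB probability of x
nb = (math.comb(x + alpha - 1, x)
      * (beta / (1 + beta)) ** alpha * (1 / (1 + beta)) ** x)

def integrand(lam: float) -> float:
    """Poisson(x | lam) times the Gamma(alpha, beta) density at lam."""
    pois = lam**x * math.exp(-lam) / math.factorial(x)
    gam = beta**alpha / math.gamma(alpha) * lam ** (alpha - 1) * math.exp(-beta * lam)
    return pois * gam

# Midpoint-rule integration of the mixture over lam in [0, 40]
h, n = 0.001, 40_000
mix = sum(integrand((i + 0.5) * h) for i in range(n)) * h
print(round(nb, 6), round(mix, 6))  # the two values agree
```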