Poisson Processes and Continuous-Time Markov Chains-1
Chapter 5
5.3 X is exponentially distributed and therefore memoryless. This implies that for all
t, s > 0, we have
P(X > t + s|X > t) = P(X > s).
In particular, this implies that for any continuous function f : R+ → R+ (or any other nice enough function), we have
E[f(X) | X > t] = E[f(X + t)].
So,
E[X^2 | X > 1] = E[(X + 1)^2] = E[X^2] + 2E[X] + 1,
which is only equal to E[X^2] + 1 if E[X] = 0 (which is never the case for an exponential distribution),
and equal to
(1 + E[X])^2 = (E[X])^2 + 2E[X] + 1
if E[X^2] = (E[X])^2 (i.e. if Var(X) = 0), which also never happens. So only 3(a) is true.
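As a quick Monte Carlo sanity check of the identity E[X^2 | X > 1] = E[X^2] + 2E[X] + 1 (not part of the formal solution; the rate value below is an arbitrary illustrative choice):

```python
import numpy as np

# Sanity check for 5.3 with X ~ Exp(lam); lam = 2 is an arbitrary choice.
rng = np.random.default_rng(0)
lam = 2.0
x = rng.exponential(scale=1.0 / lam, size=2_000_000)

# Left-hand side: E[X^2 | X > 1], estimated from the samples with X > 1.
lhs = np.mean(x[x > 1.0] ** 2)

# Right-hand side: E[X^2] + 2E[X] + 1, using the exact exponential
# moments E[X] = 1/lam and E[X^2] = 2/lam^2.
rhs = 2.0 / lam**2 + 2.0 / lam + 1.0

print(lhs, rhs)  # should agree up to Monte Carlo error
```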
5.8 Let X have density function fX (t) = λe−λt and Y have density function fY (t) = µe−µt ,
both for t ≥ 0. Furthermore X and Y are independent.
We compute
P(X > t | X ≤ Y) = P(X > t, X ≤ Y) / P(X ≤ Y)
= [∫_t^∞ ∫_x^∞ f_Y(y) f_X(x) dy dx] / [∫_0^∞ ∫_x^∞ f_Y(y) f_X(x) dy dx]
= [∫_t^∞ ∫_x^∞ µe^{-µy} λe^{-λx} dy dx] / [∫_0^∞ ∫_x^∞ µe^{-µy} λe^{-λx} dy dx]
= [∫_t^∞ λe^{-λx} e^{-µx} dx] / [∫_0^∞ λe^{-λx} e^{-µx} dx]
= [λ/(λ + µ) e^{-(λ+µ)t}] / [λ/(λ + µ)]
= e^{-(λ+µ)t}.
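A quick simulation check of the result P(X > t | X ≤ Y) = e^{-(λ+µ)t} (the parameter values below are arbitrary illustrative choices):

```python
import numpy as np

# Sanity check for 5.8; lam = 1, mu = 2, t = 0.5 are arbitrary choices.
rng = np.random.default_rng(0)
lam, mu, t = 1.0, 2.0, 0.5
x = rng.exponential(scale=1.0 / lam, size=2_000_000)
y = rng.exponential(scale=1.0 / mu, size=2_000_000)

event = x <= y                      # condition on the event {X <= Y}
estimate = np.mean(x[event] > t)    # empirical P(X > t | X <= Y)
exact = np.exp(-(lam + mu) * t)     # e^{-(lam+mu) t}

print(estimate, exact)
```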
5.36 Let {N (t), t ≥ 0} be a homogeneous Poisson Process with rate λ and let for
i = 1, 2, · · · the random variables Xi be independent identically distributed exponen-
tial random variables with mean 1/µ, which are independent of {N (t), t ≥ 0}. Define
S(t) = s ∏_{i=1}^{N(t)} X_i. Then, using the “telescoping property of expectations”,
E[S(t)] = s E[E[∏_{i=1}^{N(t)} X_i | N(t)]] = s E[∏_{i=1}^{N(t)} E[X_i]] = s E[(1/µ)^{N(t)}],
where we used the independence of the Xi’s for the third identity. We may now use that
N (t) is Poisson distributed with expectation λt and thus that
E[(1/µ)^{N(t)}] = Σ_{k=0}^∞ P(N(t) = k) (1/µ)^k = Σ_{k=0}^∞ (λt)^k/k! e^{-λt} (1/µ)^k = e^{-λ(1-1/µ)t},
and hence E[S(t)] = s e^{-λ(1-1/µ)t}.
Similarly, using that E[(Xi)^2] = Var(Xi) + (E[Xi])^2 = 1/µ^2 + 1/µ^2 = 2/µ^2, we obtain
E[(S(t))^2] = E[(s ∏_{i=1}^{N(t)} Xi)^2] = s^2 E[E[∏_{i=1}^{N(t)} (Xi)^2 | N(t)]] = s^2 E[∏_{i=1}^{N(t)} (2/µ^2)] = s^2 E[(2/µ^2)^{N(t)}] = s^2 e^{-λ(1-2/µ^2)t}.
where we have used that the product of two functions which are linear in h is o(h). This finishes the proof.
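As a numerical sanity check of the mean formula E[S(t)] = s e^{-λ(1-1/µ)t} (not part of the solution; all parameter values below are arbitrary illustrative choices):

```python
import numpy as np

# Sanity check for 5.36; lam = 1, mu = 2, t = 1, s = 1 are arbitrary.
rng = np.random.default_rng(0)
lam, mu, t, s = 1.0, 2.0, 1.0, 1.0
reps = 200_000

n = rng.poisson(lam * t, size=reps)           # N(t) for each replicate
xs = rng.exponential(1.0 / mu, size=n.sum())  # all X_i's, concatenated
# Form per-replicate products via differences of cumulative sums of logs.
log_cum = np.concatenate(([0.0], np.cumsum(np.log(xs))))
ends = np.cumsum(n)
sim_mean = s * np.mean(np.exp(log_cum[ends] - log_cum[ends - n]))

exact_mean = s * np.exp(-lam * (1.0 - 1.0 / mu) * t)
print(sim_mean, exact_mean)
```

The log-cumsum trick handles the variable number of factors per replicate; an empty product (N(t) = 0) correctly gives 1.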
5.45 Let {N(t), t ≥ 0} be a homogeneous Poisson Process with rate λ, which is independent of T (≥ 0), which has mean µ and variance σ^2. Note that
Cov(T, N(T)) = E[T N(T)] − E[T] E[N(T)].
Then observe
E[T N(T)] = E[E[T N(T) | T]] = E[T · λT] = λE[T^2] = λ(Var(T) + (E[T])^2) = λ(σ^2 + µ^2)
and
E[N(T)] = E[E[N(T) | T]] = E[λT] = λE[T] = λµ.
So,
Cov(T, N(T)) = λ(σ^2 + µ^2) − µ · λµ = λσ^2.
The variance of N(T) can be computed similarly: Var(N(T)) = E[(N(T))^2] − (E[N(T)])^2, where
E[(N(T))^2] = E[E[(N(T))^2 | T]] = E[λT + (λT)^2] = λµ + λ^2(σ^2 + µ^2),
since, given T, N(T) is Poisson distributed with mean λT. So,
Var(N(T)) = λµ + λ^2(σ^2 + µ^2) − (λµ)^2 = λµ + λ^2σ^2.
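A simulation check of Cov(T, N(T)) = λσ^2 and Var(N(T)) = λµ + λ^2σ^2 (the distribution of T and the rate below are arbitrary illustrative choices):

```python
import numpy as np

# Sanity check for 5.45; we take T ~ Gamma(shape=2, scale=1), so
# mu = 2 and sigma^2 = 2, and lam = 3 -- all arbitrary choices.
rng = np.random.default_rng(0)
lam = 3.0
T = rng.gamma(shape=2.0, scale=1.0, size=500_000)
N = rng.poisson(lam * T)                 # given T, N(T) is Poisson(lam*T)

sim_cov = np.mean(T * N) - np.mean(T) * np.mean(N)
sim_var = np.var(N)

exact_cov = lam * 2.0                    # lam * sigma^2
exact_var = lam * 2.0 + lam**2 * 2.0     # lam*mu + lam^2*sigma^2
print(sim_cov, exact_cov, sim_var, exact_var)
```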
5.49 A translation of this problem is to compute P(N(T) − N(s) = 1), where {N(t), t ≥ 0} is a homogeneous Poisson Process with rate λ and 0 ≤ s < T. Indeed, if N(T) − N(s) > 1, then the first arrival after s is not the last one before T, while if N(T) − N(s) = 0, the first arrival after s is after T. By definition, N(T) − N(s) is Poisson distributed with mean λ(T − s), so
P(N(T) − N(s) = 1) = λ(T − s) e^{-λ(T−s)}.
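A simulation check that exactly one arrival falls in (s, T] with probability λ(T − s)e^{-λ(T−s)}, generating the process from its exponential interarrival times (all parameter values below are arbitrary illustrative choices):

```python
import numpy as np

# Sanity check for 5.49; lam = 2, s = 1, T = 1.75 are arbitrary choices.
rng = np.random.default_rng(0)
lam, s, T = 2.0, 1.0, 1.75

# Build arrival times as cumulative sums of Exp(lam) gaps; 20 gaps per
# replicate is ample to cover [0, T] with overwhelming probability.
gaps = rng.exponential(1.0 / lam, size=(300_000, 20))
arrivals = np.cumsum(gaps, axis=1)
counts = np.sum((arrivals > s) & (arrivals <= T), axis=1)

estimate = np.mean(counts == 1)
exact = lam * (T - s) * np.exp(-lam * (T - s))
print(estimate, exact)
```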
5.46 Let {N (t), t ≥ 0} be a homogeneous Poisson Process with rate λ and let for i =
1, 2, · · · the random variables Xi be independent identically distributed random variables
with mean µ, which are independent of {N(t), t ≥ 0}. Then, conditioning on N(t) (Wald's identity),
E[Σ_{i=1}^{N(t)} Xi] = E[E[Σ_{i=1}^{N(t)} Xi | N(t)]] = E[N(t) µ] = λtµ.
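A simulation check of Wald's identity E[Σ_{i=1}^{N(t)} Xi] = λtµ for this compound Poisson sum (the rate, horizon, and distribution of the Xi below are arbitrary illustrative choices):

```python
import numpy as np

# Sanity check for 5.46; lam = 2, t = 3 and Xi ~ Uniform(0, 3), so
# mu = 1.5 -- all arbitrary choices.
rng = np.random.default_rng(0)
lam, t, mu = 2.0, 3.0, 1.5
reps = 300_000

n = rng.poisson(lam * t, size=reps)            # N(t) per replicate
xs = rng.uniform(0.0, 2.0 * mu, size=n.sum())  # all Xi's, concatenated
ends = np.cumsum(n)
cum = np.concatenate(([0.0], np.cumsum(xs)))
sums = cum[ends] - cum[ends - n]               # per-replicate compound sums

sim = np.mean(sums)
exact = lam * t * mu
print(sim, exact)
```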
5.78 Consider an inhomogeneous Poisson process between time 0 and 9, where t is the time (in hours) since 8AM: λ(t) = 4 for t ∈ (0, 2]; λ(t) = 8 for t ∈ (2, 4]; λ(t) = 8 + (t − 4) for t ∈ (4, 6]; and λ(t) = 10 − 2(t − 6) for t ∈ (6, 9]. From the theory on inhomogeneous Poisson processes we know that the total number of arrivals of this process is Poisson distributed with expectation
∫_0^9 λ(t) dt = 8 + 16 + 18 + 21 = 63.
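The piecewise integral above can be verified numerically (a quick check, not part of the solution):

```python
# Numerical check for 5.78: integrate the piecewise rate over [0, 9]
# and compare with 8 + 16 + 18 + 21 = 63.
def rate(t):
    if t <= 2:
        return 4.0
    if t <= 4:
        return 8.0
    if t <= 6:
        return 8.0 + (t - 4.0)
    return 10.0 - 2.0 * (t - 6.0)

# Midpoint rule on a grid aligned with the breakpoints 2, 4, 6; this is
# exact for a piecewise-linear rate up to floating-point rounding.
n = 90_000
h = 9.0 / n
total = sum(rate((k + 0.5) * h) for k in range(n)) * h
print(total)
```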
5.81b Use part (a), but define G(x) = m(x)/m(t) for x ≤ t and G(x) = 1 for x > t. Assume that there are N(t) workers injured before time t. Note that this number is Poisson distributed with expectation m(t). By part (a), the times of injury are independent and have distribution G(x), with density g(x) = m'(x)/m(t) for x ≤ t. So, the probability that a worker is still injured at time t is given by ∫_0^t g(x)(1 − F(t − x)) dx. The expected number of workers still injured at time t is then given by this probability times the expected number of workers injured before time t. This product is given by ∫_0^t m'(x)(1 − F(t − x)) dx.
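A simulation check of this formula in a special case (not part of the solution): with homogeneous arrivals m(x) = λx and exponential injury durations with rate θ, the expected number still injured at t is ∫_0^t λ e^{-θ(t−x)} dx = (λ/θ)(1 − e^{-θt}). The values λ = 2, θ = 1, t = 3 below are arbitrary choices.

```python
import numpy as np

# Sanity check for 5.81b with m(x) = lam*x and F = Exp(th).
rng = np.random.default_rng(0)
lam, th, t = 2.0, 1.0, 3.0
reps = 400_000

n = rng.poisson(lam * t, size=reps)    # injuries before t, per replicate
total = n.sum()
# By part (a), given the counts, injury times are i.i.d. Uniform(0, t).
inj_times = rng.uniform(0.0, t, size=total)
durations = rng.exponential(1.0 / th, size=total)  # durations ~ F
still = inj_times + durations > t                  # still injured at t

sim = np.count_nonzero(still) / reps               # mean count per replicate
exact = (lam / th) * (1.0 - np.exp(-th * t))
print(sim, exact)
```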
5.95 Let {N (t), t ≥ 0} be a mixed Poisson Process with random rate L. We first want to
compute E[L|N (t) = n]. By the definition of conditional expectation we have
E[L | N(t) = n] = E[L · 1(N(t) = n)] / P(N(t) = n).
The latter is equal to
E[E[L · 1(N(t) = n) | L]] / E[P(N(t) = n | L)] = E[L · (Lt)^n/n! · e^{-Lt}] / E[(Lt)^n/n! · e^{-Lt}] = E[L^{n+1} e^{-Lt}] / E[L^n e^{-Lt}].
Next, for s ≤ t,
E[N (s)|N (t) = n] = E[E[N (s)|N (t) = n, L]|N (t) = n] = E[ns/t|N (t) = n] = ns/t,
where we have used the order statistic property for computing E[N (s)|N (t) = n, L].
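A simulation check of E[L | N(t) = n] in a special case (not part of the solution): for L ~ Gamma with shape α and rate β, the ratio E[L^{n+1} e^{-Lt}] / E[L^n e^{-Lt}] evaluates to (α + n)/(β + t). The values α = 3, β = 2, t = 1, n = 2 below are arbitrary choices.

```python
import numpy as np

# Sanity check for 5.95 with L ~ Gamma(shape=alpha, rate=beta).
rng = np.random.default_rng(0)
alpha, beta, t, n = 3.0, 2.0, 1.0, 2
reps = 500_000

# numpy parameterizes the gamma by scale = 1/rate.
L = rng.gamma(shape=alpha, scale=1.0 / beta, size=reps)
N = rng.poisson(L * t)              # given L, N(t) is Poisson(L*t)

sim = L[N == n].mean()              # empirical E[L | N(t) = n]
exact = (alpha + n) / (beta + t)    # closed form for the gamma mixing law
print(sim, exact)
```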