In applied probability, a Markov additive process (MAP) is a bivariate Markov process in which the future of the process depends only on one of the two components.

Definition

Finite or countable state space for J(t)

The process $\{(X(t),J(t)):t\geq 0\}$ is a Markov additive process with continuous time parameter $t$ if

  1. $\{(X(t),J(t));t\geq 0\}$ is a Markov process
  2. the conditional distribution of $(X(t+s)-X(t),J(t+s))$ given $(X(t),J(t))$ depends only on $J(t)$.

The state space of the process is $\mathbb{R}\times S$, where $X(t)$ takes real values and $J(t)$ takes values in some countable set $S$.
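
To make the definition concrete, the sketch below simulates a simple MAP in which $J(t)$ is a continuous-time Markov chain on two states and, while $J(t)=i$, the additive component $X(t)$ evolves as a Brownian motion with state-dependent drift and volatility, so that the distribution of future increments of $X$ depends only on the current state of $J$. The generator and the drift and volatility parameters are illustrative assumptions, not part of the definition.

```python
# A minimal simulation sketch: J(t) is a continuous-time Markov chain on
# S = {0, 1} and, while J(t) = i, X(t) moves as a Brownian motion with
# state-dependent drift mu[i] and volatility sigma[i].  The generator Q and
# the parameters mu, sigma are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

Q = np.array([[-1.0,  1.0],      # generator (rate matrix) of J(t)
              [ 2.0, -2.0]])
mu = np.array([0.5, -1.0])       # drift of X while J is in state i
sigma = np.array([1.0, 0.3])     # volatility of X while J is in state i

def simulate_map(T, dt=1e-3, j0=0, x0=0.0):
    """Euler-type simulation of (X(t), J(t)) on [0, T] with step size dt."""
    n = int(T / dt)
    X = np.empty(n + 1)
    J = np.empty(n + 1, dtype=int)
    X[0], J[0] = x0, j0
    for k in range(n):
        j = J[k]
        # J jumps in (t, t + dt] with probability approximately -Q[j, j] * dt
        if rng.random() < -Q[j, j] * dt:
            probs = np.maximum(Q[j], 0.0)            # off-diagonal jump rates
            J[k + 1] = rng.choice(len(Q), p=probs / probs.sum())
        else:
            J[k + 1] = j
        # the increment of X over (t, t + dt] depends only on the state of J
        X[k + 1] = X[k] + mu[j] * dt + sigma[j] * np.sqrt(dt) * rng.normal()
    return X, J

X, J = simulate_map(T=10.0)
print("X(10) =", X[-1], " J(10) =", J[-1])
```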

General state space for J(t)

For the case where $J(t)$ takes values in a more general state space, the evolution of $X(t)$ is governed by $J(t)$ in the sense that for any bounded measurable functions $f$ and $g$ we require

$\mathbb{E}[f(X_{t+s}-X_{t})\,g(J_{t+s})\mid{\mathcal{F}}_{t}]=\mathbb{E}_{J_{t},0}[f(X_{s})\,g(J_{s})]$.
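
For instance, if $S$ consists of a single point there is nothing to modulate, and the condition reduces to

$\mathbb{E}[f(X_{t+s}-X_{t})\mid{\mathcal{F}}_{t}]=\mathbb{E}_{0}[f(X_{s})]$,

so that $X$ has stationary and independent increments; a MAP with a trivial modulating component is therefore simply a Lévy process.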

Example

A fluid queue is a Markov additive process where $J(t)$ is a continuous-time Markov chain.
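
The example can be illustrated with a short simulation; the generator and the net input rates below are assumed for illustration. While $J(t)=i$ the buffer fills or drains at constant rate $r_i$, so the free level $\int_0^t r(J(u))\,du$ is the additive component of a MAP and the buffer content is its reflection at zero.

```python
# A fluid-queue sketch: while J(t) = i the buffer gains fluid at net rate r[i]
# (negative rates drain it), and the buffer content is the free MAP component
# reflected at zero.  The generator Q and the rates r are assumed for
# illustration only.
import numpy as np

rng = np.random.default_rng(1)

Q = np.array([[-0.5,  0.5],      # generator of the modulating chain J(t)
              [ 1.0, -1.0]])
r = np.array([1.0, -2.0])        # net input rate while J is in state i

def simulate_fluid_queue(T, j0=0):
    """Exact simulation: exponential holding times for J, piecewise-linear fluid level."""
    t, j, x, buffer = 0.0, j0, 0.0, 0.0
    times, levels = [0.0], [0.0]
    while t < T:
        hold = min(rng.exponential(1.0 / -Q[j, j]), T - t)   # time spent in state j
        x += r[j] * hold                                     # free (unreflected) level
        buffer = max(buffer + r[j] * hold, 0.0)              # buffer content, reflected at 0
        t += hold
        times.append(t)
        levels.append(buffer)
        probs = np.maximum(Q[j], 0.0)                        # jump to the next state of J
        j = rng.choice(len(Q), p=probs / probs.sum())
    return np.array(times), np.array(levels), x

times, levels, x_T = simulate_fluid_queue(T=50.0)
print("free process X(T) =", x_T, " final buffer content =", levels[-1])
```

Because $J(t)$ has finitely many states, the holding times can be drawn exactly as exponential random variables, so the buffer level is piecewise linear between jumps of $J$ and no time discretization is needed.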

Applications

Çinlar uses the unique structure of the MAP to prove that, given a gamma process with a shape parameter that is a function of Brownian motion, the resulting lifetime is distributed according to the Weibull distribution.

Kharoufeh presents a compact transform expression for the failure distribution of a component that degrades according to a Markovian environment inducing state-dependent continuous linear wear. The result uses the properties of a MAP, assuming that the wear process is temporally homogeneous and that the environmental process has a finite state space.
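
The transform expression itself is not reproduced here, but the setting can be illustrated by simulation. In the sketch below the generator, the state-dependent wear rates and the failure threshold are assumed values; the failure time is the first time the cumulative wear $X(t)=\int_0^t r(J(u))\,du$ crosses the threshold.

```python
# A Monte Carlo sketch of the Markov-modulated wear model: the component
# accumulates wear at rate r[i] > 0 while the environment J(t) is in state i,
# and fails when the cumulative wear X(t) first reaches a fixed threshold.
# The generator Q, the wear rates r and the threshold are assumed values.
import numpy as np

rng = np.random.default_rng(2)

Q = np.array([[-1.0,  1.0,  0.0],    # generator of the environment J(t)
              [ 0.5, -1.5,  1.0],
              [ 1.0,  1.0, -2.0]])
r = np.array([0.2, 1.0, 3.0])        # linear wear rate in each environmental state
threshold = 10.0                     # wear level at which the component fails

def failure_time(j0=0):
    """First time the cumulative wear X(t) reaches the threshold."""
    t, j, x = 0.0, j0, 0.0
    while True:
        hold = rng.exponential(1.0 / -Q[j, j])       # holding time in state j
        if x + r[j] * hold >= threshold:
            return t + (threshold - x) / r[j]        # threshold crossed during this period
        t += hold
        x += r[j] * hold
        probs = np.maximum(Q[j], 0.0)                # jump to the next environmental state
        j = rng.choice(len(Q), p=probs / probs.sum())

samples = np.array([failure_time() for _ in range(5000)])
print("estimated P(T <= 10) =", np.mean(samples <= 10.0))
print("estimated mean time to failure =", samples.mean())
```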

Notes


