Markov Chains and Random Walks

A Markov chain is a discrete-time stochastic process whose future evolution depends only on the present state and not on the past. Many phenomena can be modelled with Markov chains, which is why they are used throughout applied probability, including queueing theory and insurance mathematics. An important example of a Markov chain is the random walk, a discrete-time analogue of Brownian motion and Lévy processes. The model is easy to describe and has numerous applications across physics and mathematics.
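To make the Markov property concrete, here is a minimal sketch (not taken from the text above; the function name, step distribution, and parameters are illustrative) of a simple random walk, one of the simplest Markov chains:

```python
import random

def simple_random_walk(n_steps, p=0.5, seed=None):
    """Simulate S_k = X_1 + ... + X_k with i.i.d. steps X_i = +1
    with probability p and -1 otherwise; returns [S_0, ..., S_n]."""
    rng = random.Random(seed)
    path = [0]
    for _ in range(n_steps):
        # Markov property in action: each new position is computed from
        # the current position alone, with no reference to earlier history.
        path.append(path[-1] + (1 if rng.random() < p else -1))
    return path

print(simple_random_walk(10, seed=42))
```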

The problems I am working on include estimates for the exit-time probabilities of Markov chains and random walks. The exit time is the first time a Markov chain leaves a domain. Exit times appear naturally in many situations: for example, if the amount of work remaining in a queueing system is described as a Markov chain, then the exit time from the domain $(0,\infty)$ is the time at which the system first becomes empty. A closely related problem is obtaining estimates for stationary probabilities. Many Markov chains are stable in the sense that, after a long period of time, the probability of being in a state $i$ converges to a number $\pi_i$. The probabilities $\pi_i$ define the stationary distribution and are one of the main characteristics of a stable Markov chain. Exact expressions for exit-time probabilities or stationary probabilities are rarely available, hence it is important to have estimates for them.
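Both quantities can be approximated numerically when no exact expression is available. The sketch below is illustrative only (the chain, drift parameter $p = 0.4$, and helper names `exit_time` and `stationary` are assumptions, not part of the research described above): a Monte Carlo estimate of an exit-time tail probability for a random walk with negative drift, and a power-iteration approximation of the stationary distribution of a small reflecting walk.

```python
import random

def exit_time(x0, p, cap, rng):
    """First step k at which a walk started at x0 > 0 leaves (0, inf),
    i.e. the first k with S_k <= 0; steps are +1 w.p. p, else -1.
    Returns cap if the walk has not exited within cap steps."""
    s = x0
    for k in range(1, cap + 1):
        s += 1 if rng.random() < p else -1
        if s <= 0:
            return k
    return cap

# Monte Carlo estimate of P(exit time > n) for a walk with negative
# drift (p < 1/2), so the exit from (0, inf) tends to happen quickly.
rng = random.Random(0)
n, trials = 100, 20_000
tail = sum(exit_time(5, 0.4, n + 1, rng) > n for _ in range(trials)) / trials
print(f"estimated P(exit time > {n}): {tail:.4f}")

def stationary(P, iters=5_000):
    """Approximate the stationary distribution pi of a finite ergodic
    chain with transition matrix P by power iteration: pi <- pi P."""
    m = len(P)
    pi = [1.0 / m] * m
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(m)) for j in range(m)]
    return pi

# Reflecting random walk on {0, 1, 2, 3}: up w.p. 0.4, down w.p. 0.6,
# held at the boundaries.  Its stationary probabilities pi_i decay
# geometrically in i, mirroring the pi_i discussed above.
p, q = 0.4, 0.6
P = [[q, p, 0, 0],
     [q, 0, p, 0],
     [0, q, 0, p],
     [0, 0, q, p]]
print("stationary distribution:", [round(x, 4) for x in stationary(P)])
```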

Academic contact

Dr Denis Denisov, Tel: +44 (0)161 306 3678, E-mail: Denis.Denisov@manchester.ac.uk
