At a certain level of abstraction, computing and communication systems, as well as banking, manufacturing and transport systems, can be described in terms of ‘jobs’ and ‘servers’, i.e. requests for service and devices that provide service. The jobs may be computing tasks, input/output commands, telephone calls or data packets. The servers may be processors, storage devices, communication channels or software modules. A model aimed at evaluating and predicting the performance of such a system has to capture the following essential aspects of its behaviour:
(a) The pattern of demand, i.e. the manner in which jobs arrive into the system and the nature of services that they require.
(b) The competition for service, i.e. the effect of admission, queueing and routing policies on performance.
This chapter is devoted to (a). It introduces tools and results that are used when modelling the arrivals and services of jobs.
Renewal processes
Consider a phenomenon which takes place first at time 0 and thereafter keeps occurring, at random intervals, ad infinitum. Denote the consecutive instants of occurrence by Tn (n = 0, 1, …; T0 = 0), and let Sn = Tn − Tn−1 (n = 1, 2, …) be the intervals between them. Assume that the random variables Sn are independent and identically distributed.
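The definition above can be sketched in a few lines of code. The following is a minimal simulation of the renewal instants Tn; the choice of exponentially distributed intervals (which makes the process a Poisson process) is an illustrative assumption, since the definition only requires the Sn to be i.i.d.

```python
import random

def renewal_times(mean_interval, horizon, seed=0):
    """Simulate renewal instants T0 = 0, T1, T2, ... up to a time horizon.

    The intervals Sn = Tn - Tn-1 are i.i.d.; here they are assumed
    exponential with the given mean, making this a Poisson process.
    """
    rng = random.Random(seed)
    times = [0.0]                                    # T0 = 0
    while times[-1] < horizon:
        s = rng.expovariate(1.0 / mean_interval)     # Sn = Tn - Tn-1
        times.append(times[-1] + s)
    return times[:-1]                                # drop the overshoot

times = renewal_times(mean_interval=2.0, horizon=100.0)
# By the law of large numbers the long-run rate of occurrences
# approaches 1 / mean_interval.
rate = (len(times) - 1) / times[-1]
```

Any other i.i.d. interval distribution (deterministic, uniform, heavy-tailed) could be substituted for `expovariate` without changing the renewal structure.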
Markov chains are the simplest mathematical models for random phenomena evolving in time. Their simple structure makes it possible to say a great deal about their behaviour. At the same time, the class of Markov chains is rich enough to serve in many applications. This makes Markov chains the first and most important examples of random processes. Indeed, the whole of the mathematical study of random processes can be regarded as a generalization in one way or another of the theory of Markov chains.
This book is an account of the elementary theory of Markov chains, with applications. It was conceived as a text for advanced undergraduates or master's level students, and is developed from a course taught to undergraduates for several years. There are no strict prerequisites but it is envisaged that the reader will have taken a course in elementary probability. In particular, measure theory is not a prerequisite.
The first half of the book is based on lecture notes for the undergraduate course. Illustrative examples introduce many of the key ideas. Careful proofs are given throughout. There is a selection of exercises, which forms the basis of classwork done by the students, and which has been tested over several years. Chapter 1 deals with the theory of discrete-time Markov chains, and is the basis of all that follows. You must begin here. The material is quite straightforward and the ideas introduced permeate the whole book.
In the first three chapters we have given an account of the elementary theory of Markov chains. This already covers a great many applications, but is just the beginning of the theory of Markov processes. The further theory inevitably involves more sophisticated techniques which, although having their own interest, can obscure the overall structure. On the other hand, the overall structure is, to a large extent, already present in the elementary theory. We therefore thought it worth while to discuss some features of the further theory in the context of simple Markov chains, namely, martingales, potential theory, electrical networks and Brownian motion. The idea is that the Markov chain case serves as a guiding metaphor for more complicated processes. So the reader familiar with Markov chains may find this chapter helpful alongside more general higher-level texts. At the same time, further insight is gained into Markov chains themselves.
Martingales
A martingale is a process whose average value remains constant in a particular strong sense, which we shall make precise shortly. This is a sort of balancing property. Often, the identification of martingales is a crucial step in understanding the evolution of a stochastic process.
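The balancing property can be seen in the simplest example of a martingale, the symmetric simple random walk: each step is +1 or −1 with equal probability, so the conditional expected value of the next position is the current position. A short Monte Carlo sketch (the sample sizes are arbitrary choices for illustration):

```python
import random

def random_walk_paths(n_paths, n_steps, seed=0):
    """Simulate symmetric simple random walks started at 0.

    Each increment is +1 or -1 with probability 1/2, so the walk is a
    martingale: its expected value stays constant (here, zero).
    """
    rng = random.Random(seed)
    paths = []
    for _ in range(n_paths):
        x, path = 0, [0]
        for _ in range(n_steps):
            x += rng.choice((-1, 1))
            path.append(x)
        paths.append(path)
    return paths

paths = random_walk_paths(n_paths=5000, n_steps=20)
# The sample mean at every step should stay close to the starting value 0.
means = [sum(p[k] for p in paths) / len(paths) for k in range(21)]
```

The spread of the walk grows with time, but its average does not; that distinction is precisely the martingale property made precise later in the chapter.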
Applications of Markov chains arise in many different areas. Some have already appeared to illustrate the theory, from games of chance to the evolution of populations, from calculating the fair price for a random reward to calculating the probability that an absent-minded professor is caught without an umbrella. In a real-world problem involving random processes you should always look for Markov chains. They are often easy to spot. Once a Markov chain is identified, there is a qualitative theory which limits the sorts of behaviour that can occur – we know, for example, that every state is either recurrent or transient. There are also good computational methods – for hitting probabilities and expected rewards, and for long-run behaviour via invariant distributions.
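The computational methods mentioned above reduce to linear algebra. As a small sketch, an invariant distribution pi satisfies pi P = pi together with the normalization that its entries sum to 1; for a toy three-state chain (the transition matrix below is a hypothetical example, not taken from the text) this can be solved directly:

```python
import numpy as np

# A hypothetical three-state birth-and-death chain for illustration.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

# Invariant distribution: pi P = pi, i.e. pi is a left null vector of
# (P - I), together with the normalization sum(pi) = 1.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
```

For this chain detailed balance gives pi = (1/4, 1/2, 1/4), which the least-squares solve recovers; the same linear-system viewpoint underlies the hitting-probability calculations.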
In this chapter we shall look at five areas of application in detail: biological models, queueing models, resource management models, Markov decision processes and Markov chain Monte Carlo. In each case our aim is to provide an introduction rather than a systematic account or survey of the field. References to books for further reading are given in each section.
Markov chains in biology
Randomness is often an appropriate model for systems of high complexity, such as are often found in biology. We have already illustrated some aspects of the theory by simple models with a biological interpretation. See Example 1.1.5 (virus), Exercise 1.1.6 (octopus), Example 1.3.4 (birth-and-death chain) and Exercise 2.5.1 (bacteria).
The material on continuous-time Markov chains is divided between this chapter and the next. The theory takes some time to set up, but once up and running it follows a very similar pattern to the discrete-time case. To emphasise this we have put the setting-up in this chapter and the rest in the next. If you wish, you can begin with Chapter 3, provided you take certain basic properties on trust, which are reviewed in Section 3.1. The first three sections of Chapter 2 fill in some necessary background information and are independent of each other. Section 2.4 on the Poisson process and Section 2.5 on birth processes provide a gentle warm-up for general continuous-time Markov chains. These processes are simple and particularly important examples of continuous-time chains. Sections 2.6–2.8, especially 2.8, deal with the heart of the continuous-time theory. There is an irreducible level of difficulty at this point, so we advise that Sections 2.7 and 2.8 are read selectively at first. Some examples of more general processes are given in Section 2.9. As in Chapter 1 the exercises form an important part of the text.
Q-matrices and their exponentials
In this section we shall discuss some of the basic properties of Q-matrices and explain their connection with continuous-time Markov chains.
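The connection rests on the fact that the transition probabilities of a continuous-time chain are given by the matrix exponential P(t) = exp(tQ). A minimal numerical sketch, using a truncated Taylor series (a standard library routine such as scipy.linalg.expm would be the usual choice; the series keeps this example dependent on numpy alone, and the two-state rates below are illustrative assumptions):

```python
import numpy as np

def transition_matrix(Q, t, terms=30):
    """Approximate P(t) = exp(tQ) by a truncated Taylor series.

    Q is a Q-matrix: non-negative off-diagonal entries, rows summing
    to zero. Then each P(t) is a stochastic matrix.
    """
    n = Q.shape[0]
    P, term = np.eye(n), np.eye(n)
    for k in range(1, terms):
        term = term @ (t * Q) / k   # accumulates (tQ)^k / k!
        P = P + term
    return P

# A two-state Q-matrix with jump rates 2 (state 0 -> 1) and 1 (state 1 -> 0).
Q = np.array([[-2.0, 2.0],
              [1.0, -1.0]])
P1 = transition_matrix(Q, t=1.0)
```

For the two-state case the exponential has the closed form P(t)[0, 0] = 1/3 + (2/3)e^(−3t), which the series reproduces; note also that the rows of P1 sum to 1, as they must for a stochastic matrix.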