GI/M/1-type Markov chains make up a class of two-dimensional Markov chains. One dimension is usually called the level, and the other is often called the phase. Transitions from states in level k are restricted to states in levels less than or equal to k+1. For given transition probabilities in the interior of the state space, we show that it is always possible to define the boundary transition probabilities in such a way that the level and phase are independent under the stationary distribution. We motivate our analysis by first considering the quasi-birth-and-death process special case in which transitions from any state are restricted to states in the same, or adjacent, levels.
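To make the level/phase structure concrete, here is a minimal numerical sketch of the quasi-birth-and-death special case mentioned above; it is not taken from the paper, and the phase blocks are illustrative assumptions. In a discrete-time QBD, transitions from an interior level go up one level (block A0), stay in the same level (A1), or go down one level (A2), and the interior stationary vector is matrix-geometric: π_{k+1} = π_k R, where R is the minimal nonnegative solution of R = A0 + R A1 + R² A2.

```python
import numpy as np

# Illustrative phase blocks for a discrete-time QBD (assumed example, two phases).
# From an interior level: A0 moves up one level, A1 stays, A2 moves down one level;
# the rows of A0 + A1 + A2 sum to 1.
A0 = np.array([[0.10, 0.05],
               [0.05, 0.10]])
A1 = np.array([[0.30, 0.10],
               [0.10, 0.30]])
A2 = np.array([[0.30, 0.15],
               [0.15, 0.30]])

# Minimal nonnegative solution of R = A0 + R A1 + R^2 A2 by fixed-point iteration.
R = np.zeros_like(A0)
for _ in range(1000):
    R_next = A0 + R @ A1 + R @ R @ A2
    if np.max(np.abs(R_next - R)) < 1e-14:
        R = R_next
        break
    R = R_next

# In the interior the stationary vector is matrix-geometric: pi_{k+1} = pi_k R,
# so the level marginal decays geometrically at the spectral radius of R.
# Level/phase independence would mean every pi_k is proportional to one phase vector.
print("fixed-point residual:", np.max(np.abs(R - (A0 + R @ A1 + R @ R @ A2))))
print("spectral radius of R (per-level decay factor):", max(abs(np.linalg.eigvals(R))))
```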
We formulate and verify an almost-sure lattice renewal theorem for branching random walks, whose non-lattice analogue is originally due to Nerman. We also identify the limit in these renewal theorems (both lattice and non-lattice) as the limit of Kingman's well-known martingale multiplied by a deterministic factor.
Certain types of Markov jump processes x(t) with continuous state space and one or more absorbing states are studied. Cases where the transition rate in state x is of the form λ(x) = |x|^δ in a neighbourhood of the origin in ℝ^d are considered, in particular. This type of problem arises from quantum physics in the study of laser cooling of atoms, and the present paper connects to recent work in the physics literature. The main question addressed is that of the asymptotic behaviour of x(t) near the origin for large t. The study involves solution of a renewal equation problem in continuous state space.
Consider a population of fixed size consisting of N haploid individuals. Assume that this population evolves according to the two-allele neutral Moran model in mathematical genetics. Denote the two alleles by A1 and A2. Allow mutation from one type to another and let 0 < γ < 1 be the sum of mutation probabilities. All the information about the population is recorded by the Markov chain X = (X(t))_{t≥0} which counts the number of individuals of type A1. In this paper we study the time taken for the population to ‘reach’ stationarity (in the sense of separation and total variation distances) when initially all individuals are of one type. We show that after t∗ = Nγ^{-1}log N + cN the separation distance between the law of X(t∗) and its stationary distribution converges to 1 - exp(-γe^{-γc}) as N → ∞. For the total variation distance an asymptotic upper bound is obtained. The results depend on a particular duality, and couplings, between X and a genealogical process known as the lines of descent process.
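For readers who want to experiment with the chain X described above, the following is a minimal simulation sketch of the embedded jump chain of the two-allele neutral Moran model with mutation. The mutation probabilities u1, u2 (with γ = u1 + u2), the population size and the run length are illustrative assumptions, not values from the paper.

```python
import random

def moran_step(x, N, u1, u2):
    """One event of the two-allele neutral Moran model with mutation.

    x  : current number of type-A1 individuals (0..N)
    u1 : probability an A1 offspring mutates to A2
    u2 : probability an A2 offspring mutates to A1
    """
    parent_is_A1 = random.random() < x / N        # parent chosen uniformly
    if parent_is_A1:                              # offspring type after possible mutation
        child_is_A1 = random.random() >= u1
    else:
        child_is_A1 = random.random() < u2
    dying_is_A1 = random.random() < x / N         # individual to die, chosen uniformly
    return x + int(child_is_A1) - int(dying_is_A1)

# Illustrative parameters (assumed, not from the paper): gamma = u1 + u2 = 0.01.
N, u1, u2 = 200, 0.005, 0.005
x = N                                             # start with all individuals of type A1
steps = 2 * 10**5
for _ in range(steps):
    x = moran_step(x, N, u1, u2)
print("number of type-A1 individuals after", steps, "events:", x)
```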
Let ε be a (single or composite) pattern defined over a sequence of Bernoulli trials. This article presents a unified approach for the study of the joint distribution of the number Sn of successes (and Fn of failures) and the number Xn of occurrences of ε in a fixed number of trials, as well as the joint distribution of the waiting time Tr till the rth occurrence of the pattern and the number STr of successes (and FTr of failures) observed at that time. General formulae are developed for the joint probability mass functions and generating functions of (Xn,Sn) and (Tr,STr) (and of (Xn,Sn,Fn) and (Tr,STr,FTr)) when Xn belongs to the family of Markov chain imbeddable variables of binomial type. Specializing to certain success runs, scans and pattern problems, several well-known results are delivered as special cases of the general theory, along with some new results that have not appeared in the statistical literature before.
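As a concrete, deliberately brute-force illustration of the kind of joint distribution studied here, the sketch below enumerates all Bernoulli sequences of a small fixed length and tabulates the joint probability mass function of the number of successes Sn and the number of (overlapping) occurrences Xn of a fixed pattern. The pattern, the overlap convention and the parameter values are assumptions made for the example, not choices from the article.

```python
from itertools import product
from collections import defaultdict

def joint_pmf(n, p, pattern):
    """Joint pmf of (X_n, S_n): X_n = number of overlapping occurrences of
    `pattern` in n Bernoulli(p) trials, S_n = number of successes."""
    pmf = defaultdict(float)
    m = len(pattern)
    for seq in product((0, 1), repeat=n):
        s = sum(seq)
        x = sum(1 for i in range(n - m + 1) if seq[i:i + m] == pattern)
        pmf[(x, s)] += p**s * (1 - p)**(n - s)
    return dict(pmf)

# Assumed example: the pattern 'success, success' in n = 8 trials with p = 0.4.
pmf = joint_pmf(n=8, p=0.4, pattern=(1, 1))
for (x, s), prob in sorted(pmf.items()):
    print(f"P(X_8 = {x}, S_8 = {s}) = {prob:.5f}")
```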
Classic works of Karlin and McGregor and Jones and Magnus have established a general correspondence between continuous-time birth-and-death processes and continued fractions of the Stieltjes-Jacobi type together with their associated orthogonal polynomials. This fundamental correspondence is revisited here in the light of the basic relation between weighted lattice paths and continued fractions otherwise known from combinatorial theory. Given that sample paths of the embedded Markov chain of a birth-and-death process are lattice paths, Laplace transforms of a number of transient characteristics can be obtained systematically in terms of a fundamental continued fraction and its family of convergent polynomials. Applications include the analysis of evolutions in a strip, upcrossing and downcrossing times under flooring and ceiling conditions, as well as time, area, or number of transitions while a geometric condition is satisfied.
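To see the correspondence numerically, here is a small sketch, assuming the standard Stieltjes-type continued fraction for the Laplace transform of p00(t) of a birth-death process with birth rates λn and death rates μn, namely 1/(s + λ0 - λ0μ1/(s + λ1 + μ1 - λ1μ2/(s + λ2 + μ2 - ...))). It evaluates a truncation of this fraction and compares it with the (0,0) entry of the resolvent (sI - Q)^{-1} of a truncated generator. The rates (those of a stable M/M/1 queue) are purely illustrative and the code is not from the paper.

```python
import numpy as np

lam, mu = 1.0, 2.0            # illustrative birth (arrival) and death (service) rates
birth = lambda n: lam         # lambda_n
death = lambda n: mu          # mu_n, for n >= 1

def cf_laplace_p00(s, depth=200):
    """Truncated continued fraction for the Laplace transform of p_00(t)."""
    tail = 0.0
    for n in range(depth, 0, -1):
        tail = birth(n - 1) * death(n) / (s + birth(n) + death(n) - tail)
    return 1.0 / (s + birth(0) - tail)

def resolvent_p00(s, size=400):
    """Same quantity from the resolvent (sI - Q)^{-1} of a truncated generator."""
    Q = np.zeros((size, size))
    for n in range(size):
        if n + 1 < size:
            Q[n, n + 1] = birth(n)
        if n > 0:
            Q[n, n - 1] = death(n)
        Q[n, n] = -Q[n].sum()
    return np.linalg.solve(s * np.eye(size) - Q, np.eye(size)[:, 0])[0]

for s in (0.5, 1.0, 2.0):
    print(s, cf_laplace_p00(s), resolvent_p00(s))
```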
A curious connection exists between the theory of optimal stopping for independent random variables, and branching processes. In particular, for the branching process Zn with offspring distribution Y, there exists a random variable X such that the probability P(Zn = 0) of extinction of the nth generation in the branching process equals the value obtained by optimally stopping the sequence X1,…, Xn, where these variables are i.i.d. distributed as X. Generalizations to the inhomogeneous and infinite horizon cases are also considered. This correspondence furnishes a simple ‘stopping rule’ method for computing various characteristics of branching processes, including rates of convergence of the nth generation's extinction probability to the eventual extinction probability, for the supercritical, critical and subcritical Galton-Watson process. Examples, bounds, further generalizations and a connection to classical prophet inequalities are presented. Throughout, the aim is to show how this unexpected connection can be used to translate methods from one area of applied probability to another, rather than to provide the most general results.
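The flavour of this correspondence can be checked numerically on a small subcritical example. In the sketch below the offspring law is an arbitrary choice, and X is taken to be the mixed random variable whose distribution function on [0,1) equals the derivative of the offspring generating function f, which makes E[max(X, v)] = f(v) on [0,1]; this particular choice of X is an assumption made for the illustration and is not quoted from the paper. The branching side iterates the generating function at 0 to obtain P(Zn = 0), and the stopping side runs the backward recursion for the optimal stopping value.

```python
# Offspring law (assumed example, subcritical): P(Y=0)=0.5, P(Y=1)=0.3, P(Y=2)=0.2.
p = {0: 0.5, 1: 0.3, 2: 0.2}
f = lambda v: sum(pk * v**k for k, pk in p.items())   # offspring p.g.f.

def extinction_prob(n):
    """Branching side: P(Z_n = 0) is the n-fold iterate of f at 0."""
    q = 0.0
    for _ in range(n):
        q = f(q)
    return q

# Stopping side: X (an assumption for this illustration) is the mixed variable with
# an atom of 0.3 at 0, density 0.4 on (0,1) and an atom of 0.3 at 1, so that its
# distribution function on [0,1) is f'(v) = 0.3 + 0.4 v.
def expected_max(v):
    """E[max(X, v)] for 0 <= v <= 1, in closed form for this particular X."""
    return 0.3 * v + 0.4 * (v * v + (1 - v * v) / 2) + 0.3

def optimal_stopping_value(n):
    """Backward recursion: V_1 = E[X], V_k = E[max(X, V_{k-1})]."""
    v = expected_max(0.0)
    for _ in range(n - 1):
        v = expected_max(v)
    return v

for n in (1, 2, 5, 10, 25):
    print(n, extinction_prob(n), optimal_stopping_value(n))
```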
Miyazawa and Taylor (1997) introduced a class of assemble-transfer batch service queueing networks which do not have a tractable stationary distribution. However, by assuming a certain additional arrival process at each node when it is empty, they obtained a geometric product-form stationary distribution which is a stochastic upper bound for the stationary distribution of the original network. In this paper we develop a stochastic lower bound for the original network by introducing an additional departure process at each node which tends to remove all the customers present in it. This model, in combination with the aforementioned upper bound model, gives a better sense of the properties of the original network.
Consider the following birth-growth model in ℝ. Seeds are born randomly according to an inhomogeneous space-time Poisson process. A newly formed point immediately initiates a bi-directional coverage by sending out a growing branch. Each frontier of a branch moves at a constant speed until it meets an opposing one. New seeds continue to form on the uncovered parts of the line. We are interested in the time until a bounded interval is completely covered. The exact and limiting distributions, as the length of the interval tends to infinity, are obtained for this completion time by considering a related Markov process. Moreover, some strong limit results are also established.
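A crude Monte Carlo sketch of this birth-growth model on a bounded interval is given below; the homogeneous seed intensity, the growth speed, and the grid-based evaluation of the completion time are simplifying assumptions made for illustration only and do not reproduce the inhomogeneous setting of the paper. Candidate seeds are generated in time order and kept only if their location is still uncovered at their birth time; the completion time is then the largest first-coverage time over a grid of points.

```python
import numpy as np

def completion_time(L=10.0, rate=1.0, speed=1.0, horizon=50.0, grid=2000, rng=None):
    """One realisation of the time to cover [0, L] (assumed homogeneous intensity)."""
    rng = rng or np.random.default_rng()
    n = rng.poisson(rate * L * horizon)             # space-time Poisson seed candidates
    t = rng.uniform(0.0, horizon, size=n)
    x = rng.uniform(0.0, L, size=n)
    order = np.argsort(t)
    kept = []                                       # accepted seeds (birth time, position)
    for s, y in zip(t[order], x[order]):
        # Keep the seed only if its location is still uncovered at its birth time.
        if all(si + abs(y - xi) / speed > s for si, xi in kept):
            kept.append((s, y))
    # First-coverage time of each grid point; the completion time is the maximum.
    ys = np.linspace(0.0, L, grid)
    cover = np.full(grid, np.inf)
    for si, xi in kept:
        cover = np.minimum(cover, si + np.abs(ys - xi) / speed)
    return cover.max()

samples = [completion_time() for _ in range(200)]
print("mean completion time over 200 runs:", np.mean(samples))
```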
This paper presents bounds on convergence rates of Markov chains in terms of quantities calculable directly from chain transition operators. Bounds are constructed by creating a probability distribution that minorizes the transition kernel over some region, and by examining bounds on an expectation conditional on lying within and without this region. These are shown to be sharper in most cases than previous similar results. These bounds are applied to a Markov chain useful in frequentist conditional inference in canonical generalized linear models.
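As a toy version of the minorization idea, the sketch below computes a Doeblin coefficient for a small finite-state transition matrix and the resulting geometric bound on the total variation distance to stationarity. The chain itself is an arbitrary assumed example, and this elementary whole-space bound is only a caricature of the sharper region-based bounds developed in the paper.

```python
import numpy as np

# Assumed example transition matrix (rows sum to 1).
P = np.array([[0.50, 0.30, 0.20],
              [0.20, 0.60, 0.20],
              [0.10, 0.30, 0.60]])

# Minorization over the whole space: P(x, .) >= eps * nu(.), with nu proportional
# to the columnwise minima of P and eps their total mass (Doeblin's condition).
eps = P.min(axis=0).sum()
print("minorization constant eps =", eps)

# Classical consequence: sup_x || P^n(x, .) - pi ||_TV <= (1 - eps)^n.
pi = np.linalg.matrix_power(P, 200)[0]        # stationary distribution, numerically
for n in (1, 2, 5, 10):
    Pn = np.linalg.matrix_power(P, n)
    tv = 0.5 * np.abs(Pn - pi).sum(axis=1).max()
    print(n, "TV distance:", tv, " bound:", (1 - eps) ** n)
```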
Using a representation in terms of a two-type branching particle system, we prove that positive solutions of the system remain bounded for suitable bounded initial conditions, provided A and B generate processes with independent increments and one of the processes is transient with a uniform power decay of its semigroup. For the case of symmetric stable processes on ℝ^1, this answers a question raised in [4].
We study the variance-to-mean ratio of the distributions of parasites among hosts for some models of parasite infection, using the cohort approach. We consider a model with density dependence in parasite mortality, and two different formulations of disease-induced host mortality. We show that the distributions of parasites, conditional on host survival, converge to quasi-stationary distributions as host age increases. When there is density dependence in parasite mortality, the limiting variance-to-mean ratio is less than 1 (an ‘underdispersed’ distribution). In contrast, the two modes of disease-induced host mortality show that either over- or underdispersed distributions may result.
It is shown that an Ornstein-Uhlenbeck type process associated with a spectrally positive Lévy process can be obtained as the fluctuation limits of both discrete-state and continuous-state branching processes with immigration.
We derive some asymptotic results for the rate of convergence to equilibrium for the number of busy servers in an M/M/N/N queue with input rate λN and service rate 1 as N → ∞, in the ‘subcritical’ case λ ∈ ]0, 1[. These results improve recent contributions of Fricker, Robert and Tibi.
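The convergence rate in question is governed by the spectral gap of the birth-death generator of the number of busy servers; a small numerical sketch of that gap, for illustrative values of N and λ (assumptions, not values from the paper), is given below.

```python
import numpy as np

def spectral_gap(N, lam):
    """Spectral gap of the M/M/N/N generator: birth rate lam*N in states 0..N-1,
    death rate i in state i (service rate 1 per busy server)."""
    Q = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        if i < N:
            Q[i, i + 1] = lam * N       # arrival accepted while a server is free
        if i > 0:
            Q[i, i - 1] = i             # departure of one of the i busy servers
        Q[i, i] = -Q[i].sum()
    eig = np.sort(np.linalg.eigvals(Q).real)   # reversible birth-death: real spectrum
    return -eig[-2]                            # largest eigenvalue is 0; next gives the gap

for N in (10, 50, 200):
    print(N, spectral_gap(N, lam=0.5))
```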
Two features are desired in designing a sequential clinical trial: randomness and balance. The former provides the ground for valid statistical inferences, and the latter strengthens the efficiency of inference procedures. Unfortunately, randomness and balance can be in conflict with one another, and clinicians may be caught between the need for both. This paper raises an interesting question: can one design consistently achieve more balance than another when both designs possess the same degree of randomness? The Ehrenfest urn design is proposed for allocating two treatments in a sequential clinical trial, and its balance and randomness properties are investigated. The design is compared with the biased coin design with imbalance tolerance.
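To give a feel for the balancing behaviour being compared, here is a simulation sketch of one common formulation of an Ehrenfest-type urn design (draw a ball at random, assign the treatment it names, and return a ball of the opposite label) alongside Efron's biased coin design. The urn size, the bias parameter and the trial length are illustrative assumptions, and this formulation is not necessarily the exact design or the comparison design analysed in the paper.

```python
import random

def ehrenfest_urn_imbalance(n_patients, w=10):
    """Assumed rule: urn starts with w balls of each label; each draw assigns that
    treatment and the drawn ball is replaced by a ball of the opposite label."""
    a_balls = w                           # balls labelled treatment A (out of 2w)
    n_a = 0
    for _ in range(n_patients):
        if random.random() < a_balls / (2 * w):
            n_a += 1
            a_balls -= 1                  # replace the drawn A ball by a B ball
        else:
            a_balls += 1                  # replace the drawn B ball by an A ball
    return abs(2 * n_a - n_patients)      # |#A - #B| at the end of the trial

def biased_coin_imbalance(n_patients, p=2 / 3):
    """Efron's biased coin: favour the under-represented arm with probability p."""
    n_a = 0
    for i in range(n_patients):
        n_b = i - n_a
        if n_a < n_b:
            assign_a = random.random() < p
        elif n_a > n_b:
            assign_a = random.random() < 1 - p
        else:
            assign_a = random.random() < 0.5
        n_a += assign_a
    return abs(2 * n_a - n_patients)

runs, n = 2000, 100
print("mean |imbalance|, Ehrenfest urn:", sum(ehrenfest_urn_imbalance(n) for _ in range(runs)) / runs)
print("mean |imbalance|, biased coin  :", sum(biased_coin_imbalance(n) for _ in range(runs)) / runs)
```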
We consider birth-death processes taking values in ℕ0 := {0, 1, 2, …} but allow the death rate in state 0 to be positive, so that escape from ℕ0 is possible. Two such processes with transition functions {p_ij(t)} and {p̃_ij(t)} are said to be similar if, for all i and j, there are constants c_ij such that p̃_ij(t) = c_ij p_ij(t) for all t ≥ 0. We determine conditions on the birth and death rates of a birth-death process for the process to be a member of a family of similar processes, and we identify the members of such a family. These issues are also resolved in the more general setting in which the two processes are called similar if there are constants c_ij and ν such that p̃_ij(t) = c_ij e^{νt} p_ij(t) for all t ≥ 0.
We consider a probabilistic model of a heterogeneous population P subdivided into homogeneous sub-cohorts. A main assumption is that the frailties give rise to a discrete, exchangeable random vector. We place ourselves in the framework of stochastic filtering to derive the conditional distribution of the residual lifetimes of surviving individuals, given an observed history of failures and survivals. As a main feature of our approach, this study is based on an analysis of the behaviour of the vector of ‘occupation numbers’.
A duality is presented for continuous-time, real-valued, monotone, stochastic recursions driven by processes with stationary increments. A given recursion defines the time evolution of a content process (such as a dam or queue), and it is shown that the existence of the content process implies the existence of a corresponding dual risk process that satisfies a dual recursion. The one-point probabilities for the content process are then shown to be related to the one-point probabilities of the risk process. In particular, it is shown that the steady-state probabilities for the content process are equivalent to the first passage time probabilities for the risk process. A number of applications are presented that flesh out the general theory. Examples include regulated processes with one or two barriers, storage models with general release rate, and jump and diffusion processes.
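The flavour of the duality can be checked by simulation in the classical M/M/1 special case, where the stationary workload exceedance probability of the queue coincides with the ruin probability of the corresponding compound Poisson risk process with unit premium rate. The sketch below is only a heuristic Monte Carlo illustration under assumed parameters, not the construction used in the paper; the closed-form ruin probability quoted is the standard one for exponential claims.

```python
import random
import math

lam, mu = 0.5, 1.0                 # illustrative arrival and service rates (lam < mu)

def workload_tail(u, customers=200_000, burn_in=10_000):
    """Monte Carlo estimate of P(stationary workload > u) for the M/M/1 queue,
    via the Lindley recursion for the waiting time (equal in law, by PASTA,
    to the stationary workload)."""
    w, hits, count = 0.0, 0, 0
    for n in range(customers):
        w = max(0.0, w + random.expovariate(mu) - random.expovariate(lam))
        if n >= burn_in:
            count += 1
            hits += (w > u)
    return hits / count

u = 2.0
ruin_prob = (lam / mu) * math.exp(-(mu - lam) * u)   # exponential-claim ruin probability
print("simulated P(workload > u):", workload_tail(u))
print("dual risk psi(u)         :", ruin_prob)
```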
A martingale is used to study extinction probabilities of the Galton-Watson process using a stopping time argument. This same martingale defines a martingale function in its argument s; consequently, its derivative is also a martingale. The argument s can be classified as regular or irregular, and this classification dictates very different behavior of the Galton-Watson process. For example, it is shown that irregularity of a point s is equivalent to the derivative martingale sequence at s being closable (i.e. having a limit which, when appended to the original sequence, preserves the martingale structure). It is also shown that for irregular points the limit of the derivative is the derivative of the limit, and that two different types of norming constants for the asymptotics of the Galton-Watson process are asymptotically equivalent only for irregular points.
In this paper we give bounds on the total variation distance from convergence of a continuous time positive recurrent Markov process on an arbitrary state space, based on Foster-Lyapunov drift and minorisation conditions. Considerably improved bounds are given in the stochastically monotone case, for both discrete and continuous time models, even in the absence of a reachable minimal element. These results are applied to storage models and to diffusion processes.