We study the structure of genealogical trees of reduced subcritical Galton–Watson processes in a random environment, assuming that all (randomly varying in time) offspring generating functions are fractional linear. We show that this structure may differ significantly from that of the ‘classical’ reduced subcritical Galton–Watson processes. In particular, it may look like a complex ‘hybrid’ of classical reduced supercritical and subcritical processes. Some relations with random walks in a random environment are discussed.
We present a general recurrence model which provides a conceptual framework for well-known problems such as ascents, peaks, turning points, Bernstein's urn model, the Eggenberger–Pólya urn model and the hypergeometric distribution. Moreover, we show that the Frobenius–Harper technique, based on real roots of a generating function, can be applied to this general recurrence model (under simple conditions), and so a Berry–Esséen bound and local limit theorems can be found. This provides a simple and unified approach to asymptotic theory for diverse problems hitherto treated separately.
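As a toy instance of this framework (not from the paper), the number of ascents of a uniformly random permutation follows the Eulerian distribution, whose generating function has only real roots; its mean is (n−1)/2 and it is asymptotically normal. A minimal Python simulation illustrating the mean:

```python
import random

def ascents(perm):
    """Number of positions i with perm[i] < perm[i+1]."""
    return sum(1 for a, b in zip(perm, perm[1:]) if a < b)

rng = random.Random(0)
n = 50
samples = []
for _ in range(5000):
    p = list(range(n))
    rng.shuffle(p)
    samples.append(ascents(p))

mean = sum(samples) / len(samples)
print(mean)  # expected value is (n - 1) / 2 = 24.5
```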
A multitype chain-binomial epidemic process is defined for a closed finite population by sampling a simple multidimensional counting process at certain points. The final size of the epidemic is then characterized, given the counting process, as the smallest root of a non-linear system of equations. By letting the population grow, this characterization is used, in combination with a branching process approximation and a weak convergence result for the counting process, to derive the asymptotic distribution of the final size. This is done for processes with an irreducible contact structure both when the initial infection increases at the same rate as the population and when it stays fixed.
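For intuition, the classical single-type chain-binomial (Reed–Frost) special case can be simulated directly; the sketch below is a hypothetical illustration of the final-size quantity, not the multitype counting-process construction of the paper:

```python
import random

def chain_binomial_final_size(n, m, p, rng=random):
    """Simulate a single-type Reed-Frost chain-binomial epidemic.

    n: initial susceptibles, m: initial infectives,
    p: per-pair transmission probability in one generation.
    Returns the final size (total ever infected among the n susceptibles)."""
    susceptible, infective, total = n, m, 0
    while infective > 0 and susceptible > 0:
        # Each susceptible escapes all current infectives independently.
        escape = (1 - p) ** infective
        new_cases = sum(1 for _ in range(susceptible) if rng.random() > escape)
        susceptible -= new_cases
        total += new_cases
        infective = new_cases
    return total

rng = random.Random(1)
sizes = [chain_binomial_final_size(100, 1, 0.03, rng) for _ in range(2000)]
print(sum(sizes) / len(sizes))  # empirical mean final size
```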
We define a stochastic process {Xn} based on partial sums of a sequence of integer-valued random variables (K0,K1,…). The process can be represented as an urn model, which is a natural generalization of a gambling model used in the first published exposition of the criticality theorem of the classical branching process. A special case of the process is also of interest in the context of a self-annihilating branching process. Our main result is that when (K1,K2,…) are independent and identically distributed, with mean a ∊ (1,∞), there exist constants {cn} with cn+1/cn → a as n → ∞ such that Xn/cn converges almost surely to a finite random variable which is positive on the event {Xn ↛ 0}. The result is extended to the case of exchangeable summands.
The study of the distribution of the distance between words in a random sequence of letters is interesting in view of applications in genome sequence analysis. In this paper we give the exact probability distribution and cumulative distribution function of the distances between two successive occurrences of a given word, and between the nth and the (n+m)th occurrences, under three models of generation of the letters: i.i.d. with the same probability for each letter, i.i.d. with different probabilities, and a Markov process. The generating function and the first two moments are also given. The point of studying the distances instead of the counting process is that we gain knowledge not only about the frequency of a word but also about its longitudinal distribution along the sequence.
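The simplest of the three models (i.i.d. uniform letters) is easy to explore empirically; in the hypothetical Python sketch below, the mean distance between successive occurrences of a word comes out close to 1/P(word):

```python
import random

def occurrence_distances(seq, word):
    """Start-to-start distances between successive (possibly overlapping) occurrences."""
    positions = [i for i in range(len(seq) - len(word) + 1)
                 if seq[i:i + len(word)] == word]
    return [b - a for a, b in zip(positions, positions[1:])]

rng = random.Random(0)
seq = ''.join(rng.choices('ACGT', k=200_000))
d = occurrence_distances(seq, 'ATG')
mean_d = sum(d) / len(d)
print(mean_d)  # mean gap is about 1 / P(word) = 4**3 = 64 for i.i.d. uniform letters
```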
The mathematical model we consider here is the classical Bienaymé–Galton–Watson branching process modified with immigration in the state zero.
We study properties of the waiting time to explosion of the supercritical modified process, i.e. the time until all initial cycles that die out have disappeared. We then derive the expected total progeny of a cycle and show how higher moments can be computed. With a view to applications, the main goal is to show that any statistical inference on the fertility rate of the process from observed cycle lengths or estimates of total progeny must be treated with care. As an example we discuss population experiments with trout.
This paper is concerned with submultiplicative moments for the stationary distributions π of some Markov chains taking values in ℝ+ or ℝ which are closely related to the random walks generated by sequences of independent identically distributed random variables. Necessary and sufficient conditions are given for ∫φ(x)π(dx) < ∞, where φ(x) is a submultiplicative function, i.e. φ(0) = 1 and φ(x+y) ≤ φ(x)φ(y) for all x, y.
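The submultiplicativity condition is easy to check numerically on a grid; the sketch below (illustrative only, with a small tolerance for floating-point rounding) tests two standard examples, φ(x) = e^{cx} and a polynomial:

```python
import itertools
import math

def is_submultiplicative(phi, xs):
    """Check phi(0) == 1 and phi(x+y) <= phi(x)*phi(y) on a grid (up to rounding)."""
    if not math.isclose(phi(0.0), 1.0):
        return False
    return all(phi(x + y) <= phi(x) * phi(y) * (1 + 1e-12)
               for x, y in itertools.product(xs, repeat=2))

xs = [0.1 * k for k in range(30)]
print(is_submultiplicative(lambda x: math.exp(0.5 * x), xs))  # e^{cx}: equality holds
print(is_submultiplicative(lambda x: (1 + x) ** 2, xs))       # polynomial: also submultiplicative
```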
In this paper, given personnel distributions that are not attainable, we introduce the grade of attainability in order to measure the degree to which there exists a similar distribution that is attainable. For constant size systems controlled by recruitment, properties of the most similar distribution to a given distribution are formulated.
Interest has been shown in Markovian sequences of geometric shapes. Mostly, the equations for invariant probability measures over shape space are extremely complicated and multidimensional. This paper deals with rectangles, which have a simple one-dimensional shape descriptor. We explore the invariant distributions of shape under a variety of randomised rules for splitting the rectangle into two sub-rectangles, with numerous methods for selecting the next shape in sequence. Many explicit results emerge. These help to fill a vacant niche in shape theory, whilst at the same time contributing new distributions on [0,1] and interesting examples of Markov processes or, in the language of another discipline, of stochastic dynamical systems.
We study the present value Z∞ = ∫₀^∞ e^{−X_{t−}} dY_t, where (X,Y) is an integrable Lévy process. This random variable appears in various applications, and several examples are known where the distribution of Z∞ is calculated explicitly. Here sufficient conditions for Z∞ to exist are given, and the possibility of finding the distribution of Z∞ by Markov chain Monte Carlo simulation is investigated in detail. The same ideas are then applied to the present value Z̄∞ = ∫₀^∞ exp{−∫₀^t R_s ds} dY_t, where Y is an integrable Lévy process and R is an ergodic strong Markov process. Numerical examples are given in both cases to show the efficiency of the Monte Carlo methods.
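As a hedged illustration of the plain Monte Carlo route (not the MCMC scheme of the paper), take the special case X_t = μt + σB_t and Y_t = t, for which Fubini gives E[Z∞] = 1/(μ − σ²/2) when μ > σ²/2:

```python
import math
import random

def present_value(mu, sigma, T=50.0, dt=0.02, rng=random):
    """One draw of Z ~ integral_0^T e^{-X_t} dt for X_t = mu*t + sigma*B_t (Euler scheme)."""
    x, z = 0.0, 0.0
    sqdt = math.sqrt(dt)
    for _ in range(int(T / dt)):
        z += math.exp(-x) * dt                      # left-endpoint Riemann sum
        x += mu * dt + sigma * sqdt * rng.gauss(0.0, 1.0)
    return z

rng = random.Random(42)
mu, sigma = 1.0, 0.5
draws = [present_value(mu, sigma, rng=rng) for _ in range(500)]
est = sum(draws) / len(draws)
print(est)  # theory: E[Z_inf] = 1/(mu - sigma^2/2), about 1.143 here
```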
We provide a probabilistic proof of Stein's factors based on properties of birth and death Markov chains, solving a tantalizing puzzle: how to use Markov chain theory to understand the celebrated Stein–Chen method for Poisson approximations. This work complements that of Barbour (1988) for the case of Poisson random variable approximation.
We generalize a population-size-dependent branching process to a more general branching model called the population-size-dependent branching process in random environments. For the model in which {Zn}n≥0 is associated with the stationary environment ξ̄ = {ξn}n≥0, let B = {ω : Zn(ω) = 0 for some n} and q(ξ̄) = P(B | ξ̄, Z0 = 1). The result is that P(q(ξ̄) = 1) is either 1 or 0, and sufficient conditions for certain extinction (i.e. P(q(ξ̄) = 1) = 1) and for non-certain extinction (i.e. P(q(ξ̄) < 1) = 1) are obtained for the model.
Recently Propp and Wilson [14] have proposed an algorithm, called coupling from the past (CFTP), which allows not only an approximate but perfect (i.e. exact) simulation of the stationary distribution of certain finite state space Markov chains. Perfect sampling using CFTP has been successfully extended to the context of point processes by, amongst other authors, Häggström et al. [5]. In [5] Gibbs sampling is applied to a bivariate point process, the penetrable spheres mixture model [19]. However, in general the running time of CFTP in terms of number of transitions is not independent of the state sampled. Thus an impatient user who aborts long runs may introduce a subtle bias, the user impatience bias. Fill [3] introduced an exact sampling algorithm for finite state space Markov chains which, in contrast to CFTP, is unbiased for user impatience. Fill's algorithm is a form of rejection sampling and similarly to CFTP requires sufficient monotonicity properties of the transition kernel used. We show how Fill's version of rejection sampling can be extended to an infinite state space context to produce an exact sample of the penetrable spheres mixture process and related models. Following [5] we use Gibbs sampling and make use of the partial order of the mixture model state space. Thus we construct an algorithm which protects against bias caused by user impatience and which delivers samples not only of the mixture model but also of the attractive area-interaction and the continuum random-cluster process.
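For readers unfamiliar with CFTP, a minimal sketch on a toy monotone chain (a lazy-at-the-boundaries random walk on {0, …, N}, not the point-process setting of the paper) shows the two essential ingredients: reuse of past randomness across restarts, and the monotone sandwich between the bottom and top states:

```python
import random

def cftp_monotone(N, p, rng=random):
    """Coupling-from-the-past for a monotone birth-death chain on {0, ..., N}.

    One step with uniform u: move up if u < p (capped at N), else down
    (floored at 0). Using the same u in every state gives a monotone
    coupling, so it suffices to track the trajectories from 0 and N."""
    us = []          # shared randomness, reused on every restart (essential for CFTP)
    T = 1
    while True:
        while len(us) < T:
            us.append(rng.random())
        lo, hi = 0, N
        for u in reversed(us[:T]):   # run from time -T up to time 0
            step = 1 if u < p else -1
            lo = min(max(lo + step, 0), N)
            hi = min(max(hi + step, 0), N)
        if lo == hi:
            return lo                # coalesced: an exact stationary sample
        T *= 2                       # go further into the past and retry

rng = random.Random(7)
samples = [cftp_monotone(10, 0.5, rng) for _ in range(2000)]
print(sum(samples) / len(samples))  # stationary law is uniform for p = 0.5, mean 5
```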
Recently, Elmes et al. (see [2]) proposed a definition of a quasistationary distribution to accommodate absorbing Markov chains for which absorption occurs with probability less than 1. We will show that the probabilistic interpretation pertaining to cases where absorption is certain (see [13]) does not hold in the present context. We prove that the state probabilities at time t conditional on absorption taking place after t, generally depend on t. Conditions are derived under which there is no initial distribution such that the conditional state probabilities are stationary.
In an attempt to investigate the adequacy of the normal approximation for the number of nuclei in certain growth/coverage models, we consider a Markov chain which has properties in common with related continuous-time Markov processes (as well as being of interest in its own right). We establish that the rate of convergence to normality for the number of ‘drops’ during times 1, 2, …, n is of the optimal ‘Berry–Esséen’ form as n → ∞. We also establish a law of the iterated logarithm and a functional central limit theorem.
Queueing networks have hitherto been rather restricted in order to have product form distributions for network states. Recently, several new models have appeared and enlarged this class of product form networks. In this paper, we consider another new type of queueing network with concurrent batch movements in terms of such product form results. The joint distribution of the requested batch sizes for departures and the batch sizes of the corresponding arrivals may be arbitrary. Under a certain modification of the network and mild regularity conditions, we give necessary and sufficient conditions for the network state to have a product form distribution, which is shown to provide an upper bound for the stationary distribution of the original network. Two special settings are shown to satisfy these conditions. Algorithms to calculate their stationary distributions are considered, together with numerical examples.
By lower estimates of the functionals 𝔼[e^{S_t}K_t^{N_t}], where S_t and N_t denote the total length up to time t and the number of individuals at time t in a Galton–Watson tree, we obtain sufficient criteria for the blow-up of semilinear equations and systems of the type ∂w_t/∂t = Aw_t + Vw_t^β. Roughly speaking, the growth of the tree length has to win against the ‘mobility’ of the motion belonging to the generator A, since, in the probabilistic representation of the equations, the latter results in small K_t as t → ∞. In the single-type situation, this gives a re-interpretation of classical results of Nagasawa and Sirao (1969); in the multitype scenario, part of the results obtained through analytic methods in Escobedo and Herrero (1991), (1995) are re-proved and extended from the case A = Δ to the case of α-Laplacians.
This paper considers continuous-time Markov chains whose state space consists of an irreducible class, 𝒞, and an absorbing state which is accessible from 𝒞. The purpose is to provide a way to determine the expected time to absorption conditional on such time being finite, in the case where absorption occurs with probability less than 1. The results are illustrated by applications to the general birth and death process and the linear birth, death and catastrophe process.
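A simulation sketch of the quantity in question for the linear birth-and-death process (a hypothetical illustration, with escape to infinity approximated by hitting a large cap):

```python
import random

def absorption_time(lam, mu, cap=500, rng=random):
    """Simulate a linear birth-death process from state 1 (Gillespie style).

    Returns the absorption (extinction) time, or None if the population
    reaches `cap` first; extinction from there has probability
    (mu/lam)**cap, which is negligible for lam > mu."""
    n, t = 1, 0.0
    while 0 < n < cap:
        t += rng.expovariate(n * (lam + mu))       # time to next event
        n += 1 if rng.random() < lam / (lam + mu) else -1
    return t if n == 0 else None

rng = random.Random(3)
times = [absorption_time(2.0, 1.0, rng=rng) for _ in range(2000)]
absorbed = [t for t in times if t is not None]
frac = len(absorbed) / len(times)
cond_mean = sum(absorbed) / len(absorbed)
print(frac)       # extinction probability: mu/lam = 0.5
print(cond_mean)  # estimate of E[T | absorption]; duality suggests about ln 2
```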
We study a class of simulated annealing type algorithms for global minimization with general acceptance probabilities. This paper presents simple conditions, easy to verify in practice, which ensure the convergence of the algorithm to the global minimum with probability 1.
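A minimal sketch of such an algorithm (hypothetical objective and a logarithmic cooling schedule with the Metropolis acceptance rule, not the paper's general acceptance probabilities):

```python
import math
import random

def anneal(f, x0, step=0.5, iters=20000, c=1.0, rng=random):
    """Minimal simulated annealing on the real line.

    Cooling schedule T_k = c / log(k + 2); a worse move is accepted with
    the Metropolis probability exp(-(f(y) - f(x)) / T_k)."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for k in range(iters):
        T = c / math.log(k + 2)
        y = x + rng.uniform(-step, step)
        fy = f(y)
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / T):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

rng = random.Random(0)
# Hypothetical double-well objective: local minimum near x = 1.7,
# global minimum near x = -2.2 (f there is about -1.06).
f = lambda x: (x * x - 4) ** 2 / 16 + x / 2
xmin, fmin = anneal(f, x0=2.0, rng=rng)
print(round(xmin, 1))
```

Starting in the shallow right-hand well, the chain escapes over the barrier while the temperature is high and then settles in the global well as the temperature decays.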
Modelling malaria consistently necessitates the introduction of at least two families of interconnected processes. Even in a Markovian context, the simplest fully stochastic model is intractable and is usually transformed into a hybrid model by supposing that these two families are stochastically independent and linked only through two deterministic connections. A model closer to the fully stochastic model is presented here, in which one of the two families is subordinated to the other and only a single deterministic connection is required. For this model a threshold theorem can be proved, but the threshold level is not the one obtained in a hybrid model. The difference disappears only when the human population size tends to infinity.