We present a law of large numbers and a central limit theorem for the time to absorption of Λ-coalescents with dust started from n blocks, as n→∞. The proofs rely on an approximation of the logarithm of the block-counting process by means of a drifted subordinator.
We study the asymptotic behavior of the survival probability of a multi-type branching process in a random environment. In the one-dimensional situation, the class of processes considered corresponds to the strongly subcritical case. We also prove a conditional limit theorem describing the distribution of the number of particles in the process given its survival for a long time.
We introduce a modified Galton‒Watson process using the framework of an infinite system of particles labelled by (x,t), where x is the rank of the particle born at time t. The key assumption concerning the offspring numbers of different particles is that they are independent, but their distributions may depend on the particle label (x,t). For the associated system of coupled monotone Markov chains, we address the issue of pathwise duality elucidated by a remarkable graphical representation in which the trajectories of the primary Markov chains and their duals coalesce to form forest graphs on a two-dimensional grid.
Bizarrely shaped voting districts are frequently lambasted as likely instances of gerrymandering. In order to systematically identify such instances, researchers have devised several tests for so-called geographic compactness (i.e. shape niceness). We demonstrate that under certain conditions, a party can gerrymander a competitive state into geographically compact districts to win an average of over 70% of the districts. Our results suggest that geometric features alone may fail to adequately combat partisan gerrymandering.
In this paper we consider the degree-wise effect of a second step for a random walk on a graph. We prove that under the configuration model, for any fixed degree sequence the probability of exceeding a given degree threshold is smaller after two steps than after one. This builds on recent work of Kramer et al. (2016) regarding the friendship paradox under random walks.
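As a concrete illustration of the kind of comparison involved (our own sketch, not the authors' argument), the snippet below builds a configuration-model multigraph for a fixed degree sequence and estimates the probability that the degree of the current vertex exceeds a threshold after one versus two steps of a simple random walk; the degree sequence, threshold, and sample sizes are arbitrary choices.

    # Illustrative only: compare the degree seen after one and two random-walk
    # steps on a configuration-model multigraph (self-loops/multi-edges kept).
    import random

    def configuration_model(degrees, rng):
        # Pair up half-edges (stubs) uniformly at random.
        stubs = [v for v, d in enumerate(degrees) for _ in range(d)]
        rng.shuffle(stubs)
        adj = [[] for _ in degrees]
        for i in range(0, len(stubs) - 1, 2):
            u, w = stubs[i], stubs[i + 1]
            adj[u].append(w)
            adj[w].append(u)
        return adj

    rng = random.Random(0)
    degrees = [rng.choice([1, 2, 3, 10]) for _ in range(2000)]  # arbitrary degree sequence
    if sum(degrees) % 2:                 # need an even number of half-edges
        degrees[0] += 1
    adj = configuration_model(degrees, rng)

    threshold, trials = 5, 20000
    exceed_one = exceed_two = 0
    for _ in range(trials):
        v0 = rng.randrange(len(degrees))        # walk starts at a uniform vertex
        v1 = rng.choice(adj[v0])                # one step
        v2 = rng.choice(adj[v1])                # two steps
        exceed_one += len(adj[v1]) > threshold
        exceed_two += len(adj[v2]) > threshold

    print("P(degree > 5) after one step :", exceed_one / trials)
    print("P(degree > 5) after two steps:", exceed_two / trials)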
In this paper we are concerned with modelling the reliability of a system subject to external shocks. In a run shock model, the system fails when a sequence of shocks above a threshold arrive in succession. However, measuring the severity of a shock against a single threshold is often too restrictive in practice. To this end, we develop a generalized run shock model with two thresholds. We employ phase-type distributions, which are highly versatile and can capture many quantitative features of random phenomena, to model the damage size and the inter-arrival time of shocks. Furthermore, we use the Markovian property to construct a multi-state system which degrades with the arrival of shocks. We also provide a numerical example to illustrate our results.
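As a rough illustration of the two-threshold idea (not the paper's phase-type construction), the sketch below assumes exponential inter-arrival times and exponential damage sizes: the system fails when a single shock exceeds the larger threshold or when k consecutive shocks exceed the smaller one; all parameters are arbitrary.

    # Illustrative two-threshold run shock model (exponential damages assumed,
    # not the phase-type set-up of the paper).
    import random

    def time_to_failure(rate, d_small, d_large, k, rng):
        """System fails when one shock exceeds d_large, or when k shocks in a
        row exceed d_small. Returns the failure time."""
        t, run = 0.0, 0
        while True:
            t += rng.expovariate(rate)        # inter-arrival time of the next shock
            damage = rng.expovariate(1.0)     # illustrative damage distribution
            if damage > d_large:
                return t
            run = run + 1 if damage > d_small else 0
            if run >= k:
                return t

    rng = random.Random(1)
    samples = [time_to_failure(rate=1.0, d_small=1.0, d_large=4.0, k=3, rng=rng)
               for _ in range(10000)]
    print("estimated mean time to failure:", sum(samples) / len(samples))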
In this paper we introduce multitype branching processes with inhomogeneous Poisson immigration, and consider in detail the critical Markov case when the local intensity r(t) of the Poisson random measure is a regularly varying function. Various multitype limit distributions (conditional and unconditional) are obtained depending on the rate at which r(t) changes with time. The asymptotic behaviour of the first and second moments, and the probability of nonextinction are investigated.
In this paper we give a new flavour to what Peter Jagers and his co-authors call `the path to extinction'. In a neutral population of constant size N, assume that each individual at time 0 carries a distinct type, or allele. Consider the joint dynamics of these N alleles, for example the dynamics of their respective frequencies and, more simply, the nonincreasing process counting the number of alleles remaining by time t. Call this process the extinction process. We show that in the Moran model, the extinction process is distributed as the process counting (in backward time) the number of common ancestors to the whole population, also known as the block-counting process of the N-Kingman coalescent. Stimulated by this result, we investigate whether it extends (i) to an identity between the frequencies of blocks in the Kingman coalescent and the frequencies of alleles in the extinction process, both evaluated at jump times, and (ii) to the general case of Λ-Fleming‒Viot processes.
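To make the extinction process concrete (an illustrative sketch of ours, not the paper's argument), the following simulates a discrete-time Moran model of size N in which every individual initially carries its own allele and, at each step, a uniformly chosen individual places a copy of its allele onto a uniformly chosen other individual; the nonincreasing count of surviving alleles is the extinction process.

    # Illustrative Moran-model simulation of the extinction process:
    # N individuals, each starting with its own allele.
    import random

    def extinction_process(N, rng):
        """Number of distinct alleles after each Moran step, until fixation."""
        pop = list(range(N))              # individual i initially carries allele i
        counts = [N]
        while counts[-1] > 1:
            parent = rng.randrange(N)     # reproducing individual
            child = rng.randrange(N - 1)  # individual to be replaced, != parent
            if child >= parent:
                child += 1
            pop[child] = pop[parent]
            counts.append(len(set(pop)))
        return counts

    rng = random.Random(2)
    traj = extinction_process(N=20, rng=rng)
    print("steps until a single allele remains:", len(traj) - 1)
    print("first few allele counts:", traj[:10])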
We examine a system of interacting random walks with leftward drift on ℤ, which begins with a single active particle at the origin and some distribution of inactive particles on the positive integers. Inactive particles become activated when landed on by other particles, and all particles beginning at the same point possess equal leftward drift. Once activated, the trajectories of distinct particles are independent. This system belongs to a broader class of problems involving interacting random walks on rooted graphs, referred to collectively as the frog model. Additional conditions that we impose on our model include that the number of frogs (i.e. particles) at positive integer points is a sequence of independent random variables which is increasing in terms of the standard stochastic order, and that the sequence of leftward drifts associated with frogs originating at these points is decreasing. Our results include sharp conditions with respect to the sequence of random variables and the sequence of drifts that determine whether the model is transient (meaning the probability that infinitely many frogs return to the origin is 0) or nontransient. We also consider several more specific versions of the model and establish a cleaner, simpler set of sharp conditions for each case.
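A toy discrete-time rendition of the model (our simplification: a single common rightward-step probability p < 1/2 rather than site-dependent drifts) in which sleeping frogs sit on the positive integers and an awake frog wakes every sleeper at any site it lands on; it counts how many frogs reach the origin within a finite horizon.

    # Toy frog model on the integers with a common leftward drift
    # (p_right < 1/2); illustrative simplification of the model described above.
    import random

    def simulate(p_right, sleepers, steps, rng):
        """sleepers[x-1] = number of sleeping frogs at site x >= 1. Returns the
        number of awake frogs that reach a site <= 0 within the time horizon."""
        active = [0]                          # one awake frog starts at the origin
        asleep = dict(enumerate(sleepers, start=1))
        reached_origin = 0
        for _ in range(steps):
            moved = []
            for x in active:
                x += 1 if rng.random() < p_right else -1
                if x <= 0:
                    reached_origin += 1       # retire frogs once they pass the origin
                    continue
                if x in asleep:               # wake every sleeper at the landing site
                    moved.extend([x] * asleep.pop(x))
                moved.append(x)
            active = moved
            if not active:
                break
        return reached_origin

    rng = random.Random(3)
    sleepers = [rng.choice([0, 1, 2]) for _ in range(200)]   # arbitrary sleeper counts
    print("frogs reaching the origin:", simulate(0.4, sleepers, 2000, rng))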
Different Markov chains can be used for approximate sampling of a distribution given by an unnormalized density function with respect to the Lebesgue measure. The hit-and-run, (hybrid) slice sampler, and random walk Metropolis algorithm are popular tools to simulate such Markov chains. We develop a general approach to compare the efficiency of these sampling procedures by the use of a partial ordering of their Markov operators, the covariance ordering. In particular, we show that the hit-and-run and the simple slice sampler are more efficient than a hybrid slice sampler based on hit-and-run, which, itself, is more efficient than a (lazy) random walk Metropolis algorithm.
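For readers less familiar with the samplers being compared, here is a minimal one-dimensional sketch of two of them, a random walk Metropolis chain and a simple slice sampler, for the unnormalized density f(x) = exp(−x²/2), chosen because its level sets are explicit intervals; this is an illustration, not the paper's construction, and the tuning constants are arbitrary.

    # Minimal sketches of two of the compared samplers, for the unnormalized
    # one-dimensional target f(x) = exp(-x**2 / 2); illustrative only.
    import math, random

    def f(x):
        return math.exp(-x * x / 2.0)

    def rw_metropolis(x, n, step, rng):
        out = []
        for _ in range(n):
            y = x + rng.uniform(-step, step)           # symmetric proposal
            if rng.random() < min(1.0, f(y) / f(x)):   # Metropolis acceptance
                x = y
            out.append(x)
        return out

    def simple_slice(x, n, rng):
        out = []
        for _ in range(n):
            u = f(x) * (1.0 - rng.random())            # uniform level in (0, f(x)]
            r = math.sqrt(-2.0 * math.log(u))          # {y : f(y) >= u} = [-r, r]
            x = rng.uniform(-r, r)                     # uniform draw from the slice
            out.append(x)
        return out

    rng = random.Random(4)
    m = rw_metropolis(0.0, 50000, step=1.0, rng=rng)
    s = simple_slice(0.0, 50000, rng=rng)
    print("RWM   sample mean / second moment:", sum(m) / len(m), sum(x * x for x in m) / len(m))
    print("slice sample mean / second moment:", sum(s) / len(s), sum(x * x for x in s) / len(s))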
The effect of small noise in a smooth dynamical system is negligible on any finite time interval; in this paper we study situations where the effect persists on intervals increasing to ∞. Such an asymptotic regime occurs when the system starts from an initial condition that is sufficiently close to an unstable fixed point. In this case, under appropriate scaling, the trajectory converges to a solution of the unperturbed system started from a certain random initial condition. In this paper we consider the case of one-dimensional diffusions on the positive half-line; this case often arises as a scaling limit in population dynamics.
In their 1960 book on finite Markov chains, Kemeny and Snell established that a certain sum, the mean time ∑_j π_j 𝔼_i[T_j] to reach a target state chosen according to the stationary distribution π, does not depend on the starting state i. The value of this sum has become known as Kemeny's constant. Various proofs have been given over time, some more technical than others. We give here a very simple physical justification, which extends without a hitch to continuous-time Markov chains on a finite state space. For Markov chains with a denumerably infinite state space, the constant may be infinite and, even if it is finite, there is no guarantee that the physical argument will hold. We show that the physical interpretation does go through for the special case of a birth-and-death process with a finite value of Kemeny's constant.
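As a quick numerical illustration of the invariance (ours, not the paper's argument), the sketch below computes ∑_j π_j 𝔼_i[T_j] for every starting state i of a small, arbitrary three-state chain and shows that the entries coincide.

    # Illustrative check that sum_j pi_j * E_i[T_j] is the same for every
    # starting state i (Kemeny's constant), on an arbitrary 3-state chain.
    import numpy as np

    P = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.2, 0.6]])
    n = P.shape[0]

    # Stationary distribution: normalized left eigenvector of P for eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])
    pi = pi / pi.sum()

    # Mean hitting times E_i[T_j], with E_j[T_j] = 0, solved one target at a time
    # from h_i = 1 + sum_{k != j} P_ik h_k.
    M = np.zeros((n, n))
    for j in range(n):
        others = [i for i in range(n) if i != j]
        A = np.eye(n - 1) - P[np.ix_(others, others)]
        M[others, j] = np.linalg.solve(A, np.ones(n - 1))

    print("sum_j pi_j E_i[T_j] for each start i:", M @ pi)   # all entries equal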
Genealogical constructions of population processes provide models which simultaneously record the forward-in-time evolution of the population size (and distribution of locations and types for models that include them) and the backward-in-time genealogies of the individuals in the population at each time t. A genealogical construction for continuous-time Markov branching processes from Kurtz and Rodrigues (2011) is described and exploited to give the normalized limit in the supercritical case. A Seneta‒Heyde norming is identified as a solution of an ordinary differential equation. The analogous results are given for continuous-state branching processes, including proofs of the normalized limits of Grey (1974) in both the supercritical and critical/subcritical cases.
Let X_n(k) be the number of vertices at level k in a random recursive tree with n+1 vertices. We are interested in the asymptotic behavior of X_n(k) for intermediate levels k = k_n satisfying k_n → ∞ and k_n = o(log n) as n → ∞. In particular, we prove weak convergence of finite-dimensional distributions for the process (X_n([k_n u]))_{u>0}, properly normalized and centered, as n → ∞. The limit is a centered Gaussian process with covariance (u,v) ↦ (u+v)^{-1}. One-dimensional distributional convergence of X_n(k_n), properly normalized and centered, was obtained with the help of analytic tools by Fuchs et al. (2006). In contrast, our proofs, which are probabilistic in nature, exploit a connection of our model with certain Crump–Mode–Jagers branching processes.
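To make the object concrete (an illustration of ours, not the paper's probabilistic argument), recall that a random recursive tree on n+1 vertices is grown by attaching each new vertex to a uniformly chosen existing vertex; the sketch below grows one such tree and records the level counts X_n(k).

    # Illustrative simulation of the profile of a random recursive tree:
    # vertex i (i = 1..n) attaches to a uniformly random vertex among 0..i-1.
    import random
    from collections import Counter

    def level_counts(n, rng):
        depth = [0]                          # depth of the root (vertex 0)
        for i in range(1, n + 1):
            parent = rng.randrange(i)        # uniform attachment
            depth.append(depth[parent] + 1)
        return Counter(depth)                # X_n(k) for each level k

    rng = random.Random(5)
    X = level_counts(100000, rng)
    for k in range(1, 8):
        print("level", k, "->", X[k], "vertices")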
Khintchine's (necessary and sufficient) slowly varying function condition for the weak law of large numbers (WLLN) for the sum of n nonnegative, independent and identically distributed random variables is used as an overarching (sufficient) condition for the case that the number of summands is more generally [c_n], c_n → ∞. Either the norming sequence {a_n}, a_n → ∞, or the number of summands sequence {c_n}, can be chosen arbitrarily. This theorem generalizes results from a motivating branching process setting in which Khintchine's sufficient condition is automatically satisfied. A second theorem shows that Khintchine's condition is necessary for the generalized WLLN when it holds with c_n → ∞ and a_n → ∞. Theorem 3, which is known, gives a necessary and sufficient condition for Khintchine's WLLN to hold with c_n = n and a_n a specific function of n; it is extended to general c_n subject to a growth restriction in Theorem 4. Section 6 returns to the branching process setting.
Let (M_n, S_n)_{n≥0} be a Markov random walk with positive recurrent driving chain (M_n)_{n≥0} on the countable state space 𝒮 with stationary distribution π. Suppose also that lim sup_{n→∞} S_n = ∞ almost surely, so that the walk has almost surely finite strictly ascending ladder epochs σ_n^>. Recurrence properties of the ladder chain (M_{σ_n^>})_{n≥0} and a closely related excursion chain are studied. We give a necessary and sufficient condition for the recurrence of (M_{σ_n^>})_{n≥0}, and further show that this chain is positive recurrent with stationary distribution π^> and 𝔼_{π^>} σ_1^> < ∞ if and only if an associated Markov random walk (M̂_n, Ŝ_n)_{n≥0}, obtained by time reversal and called the dual of (M_n, S_n)_{n≥0}, is positive divergent, i.e. Ŝ_n → ∞ almost surely. Simple expressions for π^> are also provided. Our arguments make use of coupling, Palm duality theory, and Wiener‒Hopf factorization for Markov random walks with discrete driving chain.
Recent progress in microdissection and in DNA sequencing has facilitated the subsampling of multi-focal cancers in organs such as the liver in several hundred spots, helping to determine the pattern of mutations in each of these spots. This has led to the construction of genealogies of the primary, secondary, tertiary, and so forth, foci of the tumor. These studies have led to diverse conclusions concerning Darwinian (selective) versus neutral evolution in cancer. Mathematical models of the development of multi-focal tumors have been devised to support these claims. We offer a model for the development of a multi-focal tumor: it is a mathematically rigorous refinement of a model of Ling et al. (2015). Guided by numerical studies and simulations, we show that the rigorous model, in the form of an infinite-type branching process, displays distributions of tumor size which have heavy tails and moments that become infinite in finite time. To demonstrate these points, we obtain bounds on the tails of the distributions of the process and an infinite series expansion for the first moments. In addition to its inherent mathematical interest, the model is corroborated by recent literature on apparent super-exponential growth in cancer metastases.
We consider the evolution of the ancestral structure of a classical branching process in space and its diffusion limit. We also indicate how the conditional structure of the past can be described asymptotically in terms of suitable uniform Brownian trees.
Tail asymptotics of the solution R to a fixed-point problem of the type R =_D Q + ∑_{m=1}^{N} R_m are derived under heavy-tailed conditions allowing both dependence between Q and N and the tails to be of the same order of magnitude. Similar results are derived for a K-class version with applications to multi-type branching processes and busy periods in multi-class queues.
Let (M_t : t > 0) be a Markov process of tessellations of ℝ^ℓ, and let (𝒞_t : t > 0) be the process of their zero cells (zero polytopes), which has the same distribution as the corresponding process for Poisson hyperplane tessellations. In the present paper we describe the stationary zero cell process (a^t 𝒞_{a^t} : t ∈ ℝ), a > 1, in terms of some regenerative structure, and we show that it is a Bernoulli flow. An important application is to STIT tessellation processes.