Given a finite strongly connected directed graph
$G=(V, E)$
, we study a Markov chain taking values on the space of probability measures on V. The chain, motivated by biological applications in the context of stochastic population dynamics, is characterized by transitions between states that respect the structure superimposed by E: mass (probability) can only be moved between neighbors in G. We provide conditions for the ergodicity of the chain. In a simple, symmetric case, we fully characterize the invariant probability.
We present a recurrence–transience classification for discrete-time Markov chains on manifolds with negative curvature. Our classification depends only on geometric quantities associated to the increments of the chain, defined via the Riemannian exponential map. We deduce that a recurrent chain that has zero average drift at every point cannot be uniformly elliptic, unlike in the Euclidean case. We also give natural examples of zero-drift recurrent chains on negatively curved manifolds, including on a stochastically incomplete manifold.
Consider a topologically transitive countable Markov shift $\Sigma $ and a summable locally constant potential $\phi $ with finite Gurevich pressure and $\mathrm {Var}_1(\phi ) < \infty $. We prove the existence of the limit $\lim _{t \to \infty } \mu _t$ in the weak$^\star $ topology, where $\mu _t$ is the unique equilibrium state associated to the potential $t\phi $. In addition, we present examples where the limit at zero temperature exists for potentials satisfying more general conditions.
Let f be the density function associated to a matrix-exponential distribution of parameters
$(\boldsymbol{\alpha}, T,\boldsymbol{{s}})$
. By exponentially tilting f, we find a probabilistic interpretation which generalizes the one associated to phase-type distributions. More specifically, we show that for any sufficiently large
$\lambda\ge 0$
, the function
$x\mapsto \left(\int_0^\infty e^{-\lambda s}f(s)\textrm{d} s\right)^{-1}e^{-\lambda x}f(x)$
can be described in terms of a finite-state Markov jump process whose generator is tied to T. Finally, we show how to revert the exponential tilting in order to assign a probabilistic interpretation to f itself.
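As a concrete numerical check of the tilting formula above, the following sketch takes f to be the Erlang(2, 1) density (a phase-type, hence matrix-exponential, distribution; an illustrative choice, not an example from the paper) and verifies that the tilted function matches its known closed form, the Erlang(2, 1+λ) density.

```python
import math

def f(x):
    """Erlang(2, 1) density x * exp(-x): a phase-type (hence
    matrix-exponential) example chosen purely for illustration."""
    return x * math.exp(-x)

def tilted_density(lam, x, grid_max=60.0, n=200_000):
    """Evaluate (int_0^inf exp(-lam*s) f(s) ds)^{-1} * exp(-lam*x) * f(x),
    with the normalizer computed by the trapezoidal rule."""
    h = grid_max / n
    vals = [math.exp(-lam * i * h) * f(i * h) for i in range(n + 1)]
    normalizer = h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))
    return math.exp(-lam * x) * f(x) / normalizer

lam, x = 0.7, 1.3
# For Erlang(2, 1) the tilt is again Erlang, now with rate 1 + lam:
exact = (1 + lam) ** 2 * x * math.exp(-(1 + lam) * x)
approx = tilted_density(lam, x)
print(f"{approx:.6f} vs {exact:.6f}")
```

The agreement illustrates why the tilt is well defined: the normalizer is the Laplace transform of f at λ, which is finite for all sufficiently large λ.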
It was recently proven that the correlation function of the stationary version of a reflected Lévy process is nonnegative, nonincreasing, and convex. In another branch of the literature it was established that the mean value of the reflected process starting from zero is nondecreasing and concave. In the present paper it is shown, by putting these results in a common framework, that they extend to substantially more general settings. Indeed, instead of reflected Lévy processes, we consider a class of more general stochastically monotone Markov processes. In this setup we show monotonicity results associated with a supermodular function of two coordinates of our Markov process, from which the above-mentioned monotonicity and convexity/concavity results directly follow, but now for the class of Markov processes considered rather than just reflected Lévy processes. In addition, various results for the transient case (when the Markov process is not in stationarity) are provided. The conditions imposed are natural, in that they are satisfied by various frequently used Markovian models, as illustrated by a series of examples.
We present a study of the joint distribution of both the state of a level-dependent quasi-birth–death (QBD) process and its associated running maximum level, at a fixed time t: more specifically, we derive expressions for the Laplace transforms of transition functions that contain this information, and the expressions we derive contain familiar constructs from the classical theory of QBD processes. Indeed, one important takeaway from our results is that the distribution of the running maximum level of a level-dependent QBD process can be studied using results that are highly analogous to the more well-established theory of level-dependent QBD processes that focuses primarily on the joint distribution of the level and phase. We also explain how our methods naturally extend to the study of level-dependent Markov processes of M/G/1 type, if we keep track of the running minimum level instead of the running maximum level.
We consider a sequence of Poisson cluster point processes on
$\mathbb{R}^d$
: at step
$n\in\mathbb{N}_0$
of the construction, the cluster centers have intensity
$c/(n+1)$
for some
$c>0$
, and each cluster consists of the particles of a branching random walk up to generation n—generated by a point process with mean 1. We show that this ‘critical cluster cascade’ converges weakly, and that either the limit point process equals the void process (extinction), or it has the same intensity c as the critical cluster cascade (persistence). We obtain persistence if and only if the Palm version of the outgrown critical branching random walk is locally almost surely finite. This result allows us to give numerous examples for persistent critical cluster cascades.
This paper provides a full classification of the dynamics for continuous-time Markov chains (CTMCs) on the nonnegative integers with polynomial transition rate functions and without arbitrarily large backward jumps. Such stochastic processes are abundant in applications, in particular in biology. More precisely, for CTMCs of bounded jumps, we provide necessary and sufficient conditions in terms of calculable parameters for explosivity, recurrence versus transience, positive recurrence versus null recurrence, certain absorption, and implosivity. Simple sufficient conditions for exponential ergodicity of stationary distributions and quasi-stationary distributions as well as existence and nonexistence of moments of hitting times are also obtained. Similar simple sufficient conditions for the aforementioned dynamics together with their opposite dynamics are established for CTMCs with unbounded forward jumps. Finally, we apply our results to stochastic reaction networks, an extended class of branching processes, a general bursty single-cell stochastic gene expression model, and population processes, none of which are birth–death processes. The approach is based on a mixture of Lyapunov–Foster-type results, the classical semimartingale approach, and estimates of stationary measures.
Matryoshka dolls, the traditional Russian nesting figurines, are known worldwide for each doll’s encapsulation of a sequence of smaller dolls. In this paper, we exploit the structure of a new sequence of nested matrices we call matryoshkan matrices in order to compute the moments of one-dimensional polynomial processes, a large class of Markov processes. We characterize the salient properties of matryoshkan matrices that allow us to compute these moments in closed form at a specific time without computing the entire path of the process. This simplifies the computation of the polynomial process moments significantly. Through our method, we derive explicit expressions for both transient and steady-state moments of this class of Markov processes. We demonstrate the applicability of this method through explicit examples such as shot noise processes, growth–collapse processes, ephemerally self-exciting processes, and affine stochastic differential equations from the finance literature. We also show that we can derive explicit expressions for the self-exciting Hawkes process, for which finding closed-form moment expressions has been an open problem since their introduction in 1971. In general, our techniques can be used for any Markov process for which the infinitesimal generator of an arbitrary polynomial is itself a polynomial of equal or lower order.
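To illustrate the nested, lower-triangular structure being exploited, here is a minimal sketch (our own example, not the paper's general construction) for the M/M/∞ queue: applying the generator to n and n² yields polynomials of degree at most 1 and 2, so the moment equations can be solved one level at a time, like nested dolls.

```python
# Moment ODEs for an M/M/infinity queue, a simple polynomial process:
# the equation for the k-th moment involves only moments of order <= k
# (the lower-triangular "nesting" that matryoshkan matrices encode).
lam, mu = 2.0, 1.0            # arrival rate and per-customer service rate

def moments(t_end, dt=1e-3):
    """Euler integration of dm1/dt = lam - mu*m1 and
    dm2/dt = lam + (2*lam + mu)*m1 - 2*mu*m2, from the empty system."""
    m1 = m2 = 0.0
    for _ in range(int(t_end / dt)):
        dm1 = lam - mu * m1
        dm2 = lam + (2 * lam + mu) * m1 - 2 * mu * m2
        m1 += dt * dm1
        m2 += dt * dm2
    return m1, m2

m1, m2 = moments(20.0)
# Stationary law is Poisson(lam/mu = 2): mean 2, second moment 2 + 4 = 6
print(round(m1, 2), round(m2, 2))  # 2.0 6.0
```

Because the system is triangular, m1 can be solved on its own and then substituted into the equation for m2; the closed-form matrix-exponential solutions in the paper exploit exactly this ordering.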
We present a Markov chain on the n-dimensional hypercube
$\{0,1\}^n$
which satisfies
$t_{{\rm mix}}^{(n)}(\varepsilon) = n[1 + o(1)]$
. This Markov chain alternates between random and deterministic moves, and we prove that the chain has a cutoff with a window of size at most
$O(n^{0.5+\delta})$
for any
$\delta>0$
. The deterministic moves correspond to a linear shift register.
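A minimal sketch of the deterministic ingredient: a Fibonacci linear shift register on $\{0,1\}^n$. The random move below (XORing a fair bit into one coordinate) is only one plausible reading of "alternating random and deterministic moves", not necessarily the paper's exact chain; the deterministic check of the register's full period, however, is exact.

```python
import random

def lfsr_step(state, taps):
    """One deterministic move: shift left and feed back the XOR
    of the tapped bits (a Fibonacci linear shift register)."""
    fb = 0
    for t in taps:
        fb ^= state[t]
    return state[1:] + [fb]

def chain_step(state, taps, rng):
    """One step of an alternating chain: a random bit-flip move
    followed by the deterministic shift-register move (illustrative)."""
    state = state[:-1] + [state[-1] ^ rng.randint(0, 1)]
    return lfsr_step(state, taps)

# Sanity check: taps [0, 1] realize the primitive polynomial
# x^4 + x + 1, so the pure register cycles through all 2^4 - 1
# nonzero states of {0,1}^4 before returning to the start.
state = [1, 0, 0, 0]
seen = set()
for _ in range(15):
    seen.add(tuple(state))
    state = lfsr_step(state, taps=[0, 1])
print(len(seen), state)  # 15 [1, 0, 0, 0]
```

A register built from a primitive polynomial mixes all coordinates deterministically, which is what lets the injected random bits spread over the whole hypercube in roughly n steps.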
Let
$(\xi_k,\eta_k)_{k\in\mathbb{N}}$
be independent, identically distributed random vectors with arbitrarily dependent positive components. A (globally) perturbed random walk is the random sequence
$T\,{:\!=}\, (T_k)_{k\in\mathbb{N}}$
defined by
$T_k\,{:\!=}\, \xi_1+\cdots+\xi_{k-1}+\eta_k$
for
$k\in\mathbb{N}$
. Consider a general branching process generated by T and let
$N_j(t)$
denote the number of jth-generation individuals with birth times
$\leq t$
. We treat early generations, that is, fixed generations j which do not depend on t. In this setting we prove counterparts for
$\mathbb{E}N_j$
of the Blackwell theorem and the key renewal theorem, prove a strong law of large numbers for
$N_j$
, and find the first-order asymptotics for the variance of
$N_j$
. Also, we prove a functional limit theorem for the vector-valued process
$(N_1(ut),\ldots, N_j(ut))_{u\geq0}$
, properly normalized and centered, as
$t\to\infty$
. The limit is a vector-valued Gaussian process whose components are integrated Brownian motions.
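A quick Monte Carlo sketch of the first generation, under the hypothetical choice of standard exponential ξ and η (not a case singled out by the paper). This choice is convenient because then $T_k$ is Gamma(k, 1)-distributed, so $\mathbb{E}N_1(t) = t$ exactly, giving a clean target for the empirical mean.

```python
import random

def n1(t, rng):
    """Count first-generation birth times T_k = xi_1+...+xi_{k-1}+eta_k
    that fall in [0, t], with Exp(1) increments and perturbations."""
    count, s = 0, 0.0                 # s tracks xi_1 + ... + xi_{k-1}
    while s <= t:                     # once s > t, every later T_k > t
        if s + rng.expovariate(1.0) <= t:   # draw eta_k
            count += 1
        s += rng.expovariate(1.0)           # append xi_k
    return count

rng = random.Random(42)
t, reps = 200.0, 400
avg = sum(n1(t, rng) for _ in range(reps)) / reps
print(round(avg / t, 2))  # empirical mean of N_1(t)/t, close to 1
```

The simulation stops as soon as the underlying walk exceeds t, since the positive perturbations η can only push birth times further to the right.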
The standard coalescent is widely used in evolutionary biology and population genetics to model the ancestral history of a sample of molecular sequences as a rooted and ranked binary tree. In this paper we present a representation of the space of ranked trees as a space of constrained ordered matched pairs. We use this representation to define ergodic Markov chains on labeled and unlabeled ranked tree shapes analogously to transposition chains on the space of permutations. We show that an adjacent-swap chain on labeled and unlabeled ranked tree shapes has a mixing time at least of order
$n^3$
, and at most of order
$n^{4}$
. Bayesian inference methods rely on Markov chain Monte Carlo methods on the space of trees. Thus it is important to define good Markov chains which are easy to simulate and for which rates of convergence can be studied.
We obtain series expansions of the q-scale functions of arbitrary spectrally negative Lévy processes, including processes with infinite jump activity, and use these to derive various new examples of explicit q-scale functions. Moreover, we study smoothness properties of the q-scale functions of spectrally negative Lévy processes with infinite jump activity. This complements previous results of Chan et al. (Prob. Theory Relat. Fields 150, 2011) for spectrally negative Lévy processes with Gaussian component or bounded variation.
Motivated by recent studies of big samples, this work aims to construct a parametric model which is characterized by the following features: (i) a ‘local’ reinforcement, i.e. a reinforcement mechanism mainly based on the last observations, (ii) a random persistent fluctuation of the predictive mean, and (iii) a long-term almost sure convergence of the empirical mean to a deterministic limit, together with a chi-squared goodness-of-fit result for the limit probabilities. This triple purpose is achieved by the introduction of a new variant of the Eggenberger–Pólya urn, which we call the rescaled Pólya urn. We provide a complete asymptotic characterization of this model, pointing out that, for a certain choice of the parameters, it has properties different from the ones typically exhibited by the other urn models in the literature. Therefore, beyond the possible statistical application, this work could be interesting for those who are concerned with stochastic processes with reinforcement.
We prove a surprising symmetry between the law of the size
$G_n$
of the greedy independent set on a uniform Cayley tree
$ \mathcal{T}_n$
of size n and that of its complement. We show that
$G_n$
has the same law as the number of vertices at even height in
$ \mathcal{T}_n$
rooted at a uniform vertex. This enables us to compute the exact law of
$G_n$
. We also give a Markovian construction of the greedy independent set, which highlights the symmetry of
$G_n$
and whose proof uses a new Markovian exploration of rooted Cayley trees that is of independent interest.
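The symmetry implies in particular that $\mathbb{E}G_n = n/2$, which is easy to probe numerically. The sketch below (our own illustration, not the paper's construction) draws a uniform Cayley tree via a Prüfer sequence and runs the greedy algorithm in a uniform random vertex order:

```python
import heapq
import random

def cayley_tree(n, rng):
    """Uniform labelled tree on {0, ..., n-1}, decoded from a
    uniformly random Pruefer sequence."""
    prufer = [rng.randrange(n) for _ in range(n - 2)]
    degree = [1] * n
    for v in prufer:
        degree[v] += 1
    leaves = [v for v in range(n) if degree[v] == 1]
    heapq.heapify(leaves)
    edges = []
    for v in prufer:
        leaf = heapq.heappop(leaves)
        edges.append((leaf, v))
        degree[v] -= 1
        if degree[v] == 1:
            heapq.heappush(leaves, v)
    edges.append((heapq.heappop(leaves), heapq.heappop(leaves)))
    return edges

def greedy_size(n, edges, rng):
    """Size of the greedy independent set in a uniform vertex order:
    take each vertex unless a neighbour was already taken."""
    adj = [[] for _ in range(n)]
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    order = list(range(n))
    rng.shuffle(order)
    chosen = [False] * n
    for v in order:
        if not any(chosen[u] for u in adj[v]):
            chosen[v] = True
    return sum(chosen)

rng = random.Random(7)
n, reps = 100, 300
mean = sum(greedy_size(n, cayley_tree(n, rng), rng) for _ in range(reps)) / reps
print(round(mean, 1))  # close to n/2 = 50
```

The empirical mean hovering at n/2 is consistent with the distributional identity between $G_n$ and $n - G_n$ stated above.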
In this paper we introduce new birth-and-death processes with partial catastrophe and study some of their properties. In particular, we obtain some estimates for the mean catastrophe time, and the first and second moments of the distribution of the process at a fixed time t. This is completed by some asymptotic results.
We study the quasi-stationary behavior of the birth–death process with an entrance boundary at infinity. Using the h-transform, we give an alternative and simpler proof of the exponential convergence of conditioned distributions to a unique quasi-stationary distribution in the total variation norm. In addition, we show that, starting from any initial distribution, the conditional probability converges to the unique quasi-stationary distribution exponentially fast in the
$\psi$
-norm.
We analyse features of the patterns formed from a simple model for a martensitic phase transition that fragments the unit square into rectangles. This is a fragmentation model that can be encoded by a general branching random walk. An important quantity is the distribution of the lengths of the interfaces in the pattern, and we establish limit theorems for some of the asymptotics of the interface profile. In particular, we are able to use a general branching process to show almost sure power law decay of the number of interfaces of at least a certain size and a general branching random walk to examine the numbers of rectangles of a certain aspect ratio. In doing so we extend a theorem on the growth of the general branching random walk and develop results on the tail behaviour of the limiting random variable in our general branching process.
We study convergence to non-minimal quasi-stationary distributions for one-dimensional diffusions. We give a method for reducing the convergence to the tail behavior of the lifetime via a property we call the first hitting uniqueness. We apply the results to Kummer diffusions with negative drift and give a class of initial distributions converging to each non-minimal quasi-stationary distribution.
Branching-stable processes have recently appeared as counterparts of stable subordinators, when addition of real variables is replaced by branching mechanisms for point processes. Here we are interested in their domains of attraction and describe explicit conditions for a branching random walk to converge after a proper magnification to a branching-stable process. This contrasts with deep results obtained during the past decade on the asymptotic behavior of branching random walks and which involve either shifting without rescaling, or demagnification.