A continuous-state population-size-dependent branching process {Xt} is a modification of the Jiřina process. We prove that such a process arises as the limit of a sequence of suitably scaled population-size-dependent branching processes with discrete states. The extinction problem for the population Xt is discussed, and the limit distribution of Xt / t is obtained when Xt tends to infinity.
The problem of computing the moment generating function of the first passage time T to a > 0 or −b < 0 for a one-dimensional Wiener process {X(t), t ≥ 0} is generalized by assuming that the infinitesimal parameters of the process may depend on the sign of X(t). The probability that the process is absorbed at a is also computed explicitly, as is the expected value of T.
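The paper derives these quantities in closed form; purely as an illustrative sketch (function name, discretization step, and sample size are my own choices), the absorption probability at a can also be approximated by Euler–Maruyama simulation with sign-dependent infinitesimal parameters. In the symmetric driftless case the estimate should be close to the classical value b/(a + b).

```python
import numpy as np

rng = np.random.default_rng(1)

def absorption_at_a(a, b, mu_plus, mu_minus, sig_plus, sig_minus,
                    dt=1e-3, nsim=10_000, rng=rng):
    """Monte Carlo estimate of P(X hits a before -b) for a Wiener-type
    process whose drift and variance depend on sign(X(t))."""
    x = np.zeros(nsim)
    alive = np.ones(nsim, dtype=bool)
    hit_a = np.zeros(nsim, dtype=bool)
    while alive.any():
        xa = x[alive]
        # pick the infinitesimal parameters according to the sign of X(t)
        mu = np.where(xa >= 0.0, mu_plus, mu_minus)
        sig = np.where(xa >= 0.0, sig_plus, sig_minus)
        xa = xa + mu * dt + sig * np.sqrt(dt) * rng.standard_normal(xa.size)
        x[alive] = xa
        idx = np.flatnonzero(alive)
        hit_a[idx[xa >= a]] = True
        alive[idx[(xa >= a) | (xa <= -b)]] = False
    return hit_a.mean()
```

With mu_plus = mu_minus = 0 and sig_plus = sig_minus = 1 the process is a standard Wiener process, so for a = b = 1 the estimate should be near 1/2.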
We give conditions under which the number of events which occur in a sequence of m-dependent events is stochastically smaller than a suitably defined compound Poisson random variable. The results are applied to counts of sequence pattern appearances and to system reliability. We also provide a numerical example.
We consider growth-collapse processes (GCPs) that grow linearly between random partial collapse times, at which they jump down according to some distribution depending on their current level. The jump occurrences are governed by a state-dependent rate function r(x). We deal with the stationary distribution of such a GCP, (Xt)t≥0, and the distributions of the hitting times Ta = inf{t ≥ 0 : Xt = a}, a > 0. After presenting the general theory of these GCPs, several important special cases are studied. We also take a brief look at the Markov-modulated case. In particular, we present a method of computing the distribution of min[Ta, σ] in this case (where σ is the time of the first jump), and apply it to determine the long-run average cost of running a certain Markov-modulated disaster-ridden system.
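For a concrete feel of such a GCP (an illustrative special case of my own choosing, not the paper's general r(x)): with unit growth rate, constant jump rate λ, and collapses X → UX with U uniform on (0, 1) independent of the level, balancing the upward drift 1 against the average collapse λE[(1 − U)X] = λE[X]/2 gives a long-run average of 2/λ, which an event-driven simulation can confirm.

```python
import numpy as np

rng = np.random.default_rng(2)

def gcp_time_average(lam=1.0, T=20_000.0, rng=rng):
    """Long-run time average of a growth-collapse process with linear
    growth at rate 1 and collapses X -> U*X at Poisson(lam) times."""
    x = t = area = 0.0
    while t < T:
        tau = min(rng.exponential(1.0 / lam), T - t)
        area += x * tau + 0.5 * tau * tau  # integral of the linear segment
        x += tau
        t += tau
        if t < T:
            x *= rng.uniform()  # partial collapse to a uniform fraction
    return area / T
```

For lam = 1 the time average should settle near 2.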
In this paper we study polynomial and geometric (exponential) ergodicity for M/G/1-type Markov chains and Markov processes. First, practical criteria for M/G/1-type Markov chains are obtained by analyzing the generating function of the first return probability to level 0. Then the corresponding criteria for M/G/1-type Markov processes are given, using their h-approximation chains. Our method yields the radius of convergence of the generating function of the first return probability, which is very important in obtaining explicit bounds on geometric (exponential) convergence rates. Our results are illustrated, in the final section, in some examples.
We consider the optimal stopping problem for g(Zn), where Zn, n = 1, 2, …, is a homogeneous Markov sequence. An algorithm, called forward improvement iteration, is presented by which an optimal stopping time can be computed. Using an iterative step, this algorithm computes a sequence B0 ⊇ B1 ⊇ B2 ⊇ · · · of subsets of the state space such that the first entrance time into the intersection F of these sets is an optimal stopping time. Various applications are given.
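On a finite state space the iterative step can be sketched as follows (a minimal implementation of my own, assuming a bounded reward g and a chain for which the relevant first entrance times are almost surely finite; the paper's setting is more general). Each pass computes h(x) = E_x[g(Z_σ)], where σ is the first time n ≥ 1 at which the chain is in the current set B, and removes from B the states where stopping immediately is strictly worse than continuing.

```python
import numpy as np

def forward_improvement_iteration(P, g, tol=1e-12):
    """Compute the decreasing sets B0 ⊇ B1 ⊇ ...; the first entrance
    time into the limit set is the candidate optimal stopping time."""
    n = len(g)
    B = set(range(n))
    while True:
        inB = sorted(B)
        out = sorted(set(range(n)) - B)
        # u(x) = E_x[g(Z_T)] with T = inf{n >= 0 : Z_n in B}
        u = np.zeros(n)
        u[inB] = g[inB]
        if out:
            A = np.eye(len(out)) - P[np.ix_(out, out)]
            u[out] = np.linalg.solve(A, P[np.ix_(out, inB)] @ g[inB])
        h = P @ u  # h(x) = E_x[g(Z_sigma)], sigma = inf{n >= 1 : Z_n in B}
        newB = {x for x in B if g[x] >= h[x] - tol}
        if newB == B:
            return B
        B = newB
```

For a two-state chain with uniform transitions and rewards g = (0, 1), one pass removes state 0 and the iteration stabilises at B = {1}, i.e. stop on first entrance into state 1.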
We investigate the asymptotic behaviour of homogeneous multidimensional Markov chains whose states have nonnegative integer components. We obtain growth rates for these models in a situation similar to the near-critical case for branching processes, provided that they converge to infinity with positive probability. Finally, the general theoretical results are applied to a class of controlled multitype branching processes in which random control is allowed.
In this paper, we present an iterative procedure to calculate explicitly the Laplace transform of the distribution of the maximum for a Lévy process with positive jumps of phase type. We derive error estimates showing that this iteration converges geometrically fast. Subsequently, we determine the Laplace transform of the law of the upcrossing ladder process and give an explicit pathwise construction of this process.
The purpose of this paper is to determine the exact distribution of the final size of an epidemic for a wide class of models of susceptible–infective–removed type. First, a nonstationary version of the classical Reed–Frost model is constructed that allows us to incorporate, in particular, random levels of resistance to infection in the susceptibles. Then, a randomized version of this nonstationary model is considered in order to take into account random levels of infectiousness in the infectives. It is shown that, in both cases, the distribution of the final number of infected individuals can be obtained in terms of Abel–Gontcharoff polynomials. The new methodology followed also provides a unified approach to a number of recent works in the literature.
Competing patterns are compound patterns that compete to be the first to occur pattern-specific numbers of times. They represent a generalisation of the sooner waiting time problem and of start-up demonstration tests with both acceptance and rejection criteria. Through the use of finite Markov chain imbedding, the waiting time distribution of competing patterns in multistate trials that are Markovian of a general order is derived. Also obtained are probabilities that each particular competing pattern will be the first to occur its respective prescribed number of times, both in finite time and in the limit.
A cut-off phenomenon is shown to occur in a sample of n independent, identically distributed Ornstein-Uhlenbeck processes and its average. Their distributions stay far from equilibrium before a certain O(log(n)) time, and converge to it exponentially fast afterwards. Precise estimates show that the total variation distance drops from almost 1 to almost 0 over an interval of time of length O(1) around log(n)/(2α), where α is the viscosity coefficient of the sampled process. The distribution of the hitting time of 0 by the average of the sample is computed. As n tends to infinity, the hitting time becomes concentrated around the cut-off instant, and its tails match the estimates given for the total variation distance.
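Since the average of n independent OU processes started at x0 is Gaussian at every time, the total variation distance to equilibrium can be evaluated numerically. The sketch below uses my own normalisation (unit diffusion coefficient, so the equilibrium law of the average is N(0, 1/(2αn))) and exhibits the drop around log(n)/(2α).

```python
import numpy as np

def tv_gaussians(m1, s1, m2, s2, npts=400_001):
    """Total variation distance between two normal laws via a fine grid."""
    lo = min(m1 - 8 * s1, m2 - 8 * s2)
    hi = max(m1 + 8 * s1, m2 + 8 * s2)
    x = np.linspace(lo, hi, npts)
    f = np.exp(-0.5 * ((x - m1) / s1) ** 2) / (s1 * np.sqrt(2 * np.pi))
    g = np.exp(-0.5 * ((x - m2) / s2) ** 2) / (s2 * np.sqrt(2 * np.pi))
    return 0.5 * np.abs(f - g).sum() * (x[1] - x[0])

def tv_from_equilibrium(t, n, alpha=1.0, x0=1.0):
    """TV distance of the law of the sample average at time t from its
    equilibrium law N(0, 1/(2*alpha*n))."""
    m = x0 * np.exp(-alpha * t)
    s = np.sqrt((1.0 - np.exp(-2 * alpha * t)) / (2 * alpha * n))
    return tv_gaussians(m, s, 0.0, np.sqrt(1.0 / (2 * alpha * n)))
```

With n = 10000 and α = 1, the distance is still near 1 at one fifth of log(n)/(2α) and near 0 at twice that time.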
We introduce the idea of controlling branching processes by means of another branching process, using the fractional thinning operator of Steutel and van Harn. This idea is then adapted to the model of alternating branching, where two Markov branching processes act alternately at random observation and treatment times. We study the extinction probability and limit theorems for reproduction by n cycles, as n → ∞.
We extend the definition of level-crossing ordering of stochastic processes, proposed by Irle and Gani (2001), to the case in which the times to exceed levels are compared using an arbitrary stochastic order, and work, in particular, with integral stochastic orders closed for convolution. Using a sample-path approach, we establish level-crossing ordering results for the case in which the slower of the processes involved in the comparison is skip-free to the right. These results are specially useful in simulating processes that are ordered in level crossing, and extend results of Irle and Gani (2001), Irle (2003), and Ferreira and Pacheco (2005) for skip-free-to-the-right discrete-time Markov chains, semi-Markov processes, and continuous-time Markov chains, respectively.
We derive a large deviation principle for a Brownian immigration branching particle system, where the immigration is governed by a Poisson random measure with a Lebesgue intensity measure.
We consider the planar random motion of a particle that moves with constant finite speed c and, at Poisson-distributed times, changes its direction θ with uniform law in [0, 2π). This model represents the natural two-dimensional counterpart of the well-known Goldstein–Kac telegraph process. For the particle's position (X(t), Y(t)), t > 0, we obtain the explicit conditional distribution when the number of changes of direction is fixed. From this, we derive the explicit probability law f(x, y, t) of (X(t), Y(t)) and show that the density p(x, y, t) of its absolutely continuous component is the fundamental solution to the planar wave equation with damping. We also show that, under the usual Kac condition on the velocity c and the intensity λ of the Poisson process, the density p tends to the transition density of planar Brownian motion. Some discussions concerning the probabilistic structure of wave diffusion with damping are presented and some applications of the model are sketched.
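An event-driven simulation of this motion is straightforward (a sketch with my own function name and parameter values): run straight segments of exponential duration at speed c and redraw the direction uniformly at each Poisson event. Every sample necessarily lies in the disc of radius ct, the support of the law described above.

```python
import numpy as np

rng = np.random.default_rng(3)

def planar_position(t, c=1.0, lam=2.0, rng=rng):
    """Position (X(t), Y(t)) of the planar motion with speed c and
    direction renewed uniformly on [0, 2*pi) at rate-lam Poisson times."""
    x = y = s = 0.0
    theta = rng.uniform(0.0, 2 * np.pi)
    while True:
        dt = min(rng.exponential(1.0 / lam), t - s)
        x += c * dt * np.cos(theta)
        y += c * dt * np.sin(theta)
        s += dt
        if s >= t:
            return x, y
        theta = rng.uniform(0.0, 2 * np.pi)  # new direction at a Poisson event
```

By rotational symmetry of the initial direction, the mean position is the origin, and all samples at time t lie within distance c·t of it.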
Let (Xt)t≥0 be a continuous-time irreducible Markov chain on a finite state space E, let v: E→ℝ\{0}, and let (φt)t≥0 be defined by φt = ∫0t v(Xs) ds. We consider the case in which the process (φt)t≥0 is oscillating and that in which (φt)t≥0 has a negative drift. In each of these cases, we condition the process (Xt,φt)t≥0 on the event that (φt)t≥0 hits level y before hitting 0 and prove weak convergence of the conditioned process as y→∞. In addition, we show the relationship between the conditioning of the process (φt)t≥0 with a negative drift to oscillate and the conditioning of it to stay nonnegative for a long time, and the relationship between the conditioning of (φt)t≥0 with a negative drift to drift to ∞ and the conditioning of it to hit large levels before hitting 0.
We consider risk processes that locally behave like Brownian motion with some drift and variance, these both depending on an underlying Markov chain that is also used to generate the claims arrival process. Thus, claims arrive according to a renewal process with waiting times of phase type. Positive claims (downward jumps) are always possible but negative claims (upward jumps) are also allowed. The claims are assumed to form an independent, identically distributed sequence, independent of everything else. As main results, the joint Laplace transform of the time to ruin and the undershoot at ruin, as well as the probability of ruin, are explicitly determined under the assumption that the Laplace transform of the positive claims is a rational function. Both the joint Laplace transform and the ruin probability are decomposed according to the type of ruin: ruin by jump or ruin by continuity. The methods used involve finding certain martingales by first finding partial eigenfunctions for the generator of the Markov process composed of the risk process and the underlying Markov chain. We also use certain results from complex function theory as important tools.
We derive necessary and sufficient conditions for the existence of bounded or summable solutions to systems of linear equations associated with Markov chains. This substantially extends a famous result of G. E. H. Reuter, which provides a convenient means of checking various uniqueness criteria for birth-death processes. Our result allows chains with much more general transition structures to be accommodated. One application is to give a new proof of an important result of M. F. Chen concerning upwardly skip-free processes. We then use our generalization of Reuter's lemma to prove new results for downwardly skip-free chains, such as the Markov branching process and several of its many generalizations. This permits us to establish uniqueness criteria for several models, including the general birth, death, and catastrophe process, extended branching processes, and asymptotic birth-death processes, the latter being neither upwardly skip-free nor downwardly skip-free.
We study the following model for a phylogenetic tree on n extant species: the origin of the clade is a random time in the past whose (improper) distribution is uniform on (0,∞); thereafter, the process of extinctions and speciations is a continuous-time critical branching process of constant rate, conditioned on there being the prescribed number n of species at the present time. We study various mathematical properties of this model as n→∞: namely the time of origin and of the most recent common ancestor, the pattern of divergence times within lineage trees, the time series of the number of species, the total number of extinct species, the total number of species ancestral to the extant ones, and the ‘local’ structure of the tree itself. We emphasize several mathematical techniques: the association of walks with trees; a point process representation of lineage trees; and Brownian limits.
In this paper, we investigate the geometric growth of homogeneous multitype Markov chains whose states have nonnegative integer coordinates. Such models are considered in a situation similar to the supercritical case for branching processes. Finally, our general theoretical results are applied to a class of controlled multitype branching processes in which the control is random.