We consider a Lévy process with no negative jumps, reflected at a stochastic boundary that is a positive constant multiple of an age process associated with a Poisson process. We show that the stability condition for this process is identical to the one for the case of reflection at the origin. In particular, there exists a unique stationary distribution that is independent of initial conditions. We identify the Laplace–Stieltjes transform of the stationary distribution and observe that it satisfies a decomposition property. In fact, it is the distribution of a sum of two independent random variables, one of which has the stationary distribution of the process reflected at the origin, and the other the stationary distribution of a certain clearing process. The latter is itself distributed as an infinite sum of independent random variables. Finally, we discuss the tail behavior of the stationary distribution and in particular observe that the second distribution in the decomposition always has a light tail.
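A compact way to record this decomposition, in notation introduced here only for illustration rather than taken from the paper: writing W for a random variable with the stationary distribution, W_0 for one with the stationary distribution of the process reflected at the origin, and V for one with the stationary distribution of the clearing process,

    W \overset{d}{=} W_0 + V, \qquad W_0 \perp V, \qquad V \overset{d}{=} \sum_{k \ge 1} V_k,

with the V_k independent and, by the last statement above, V always light tailed.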
Admission control can be employed to avoid congestion in queueing networks subject to overload. In distributed networks, the admission decisions are often based on imperfect measurements of the network state. In this paper, we study how the lack of complete state information affects the system performance, by considering a simple network model for distributed admission control. The stability region of the network is characterized and it is shown how feedback signaling makes the system very sensitive to its parameters.
In this paper, we present an iterative procedure to calculate explicitly the Laplace transform of the distribution of the maximum for a Lévy process with positive jumps of phase type. We derive error estimates showing that this iteration converges geometrically fast. Subsequently, we determine the Laplace transform of the law of the upcrossing ladder process and give an explicit pathwise construction of this process.
In this paper, we introduce a generalization of the two-color multitype contact process intended to mimic a biological process called allelopathy. To be precise, we have two types of particle. Particles of each type give birth to particles of the same type, and die at rate 1. When a particle of type 1 dies, it gives way to a frozen site that blocks particles of type 2 for an exponentially distributed amount of time. We investigate in detail the phase transitions and the duality properties of this interacting particle system.
We consider a stochastic model for the spread of a susceptible–infective–removed (SIR) epidemic among a closed, finite population, in which there are two types of severity of infectious individuals, namely mild and severe. The type of severity depends on the amount of infectious exposure an individual receives, in that infectives are always initially mild but may become severe if additionally exposed. Large-population properties of the model are derived. In particular, a coupling argument is used to provide a rigorous branching process approximation to the early stages of an epidemic, and an embedding argument is used to derive a strong law and an associated central limit theorem for the final outcome of an epidemic in the event of a major outbreak. The basic reproduction number, which determines whether or not a major outbreak can occur given few initial infectives, depends only on parameters of the mild infectious state, whereas the final outcome in the event of a major outbreak depends also on parameters of the severe state. Moreover, the limiting final size proportions need not even be continuous in the model parameters.
The purpose of this paper is to determine the exact distribution of the final size of an epidemic for a wide class of models of susceptible–infective–removed type. First, a nonstationary version of the classical Reed–Frost model is constructed that allows us to incorporate, in particular, random levels of resistance to infection in the susceptibles. Then, a randomized version of this nonstationary model is considered in order to take into account random levels of infectiousness in the infectives. It is shown that, in both cases, the distribution of the final number of infected individuals can be obtained in terms of Abel–Gontcharoff polynomials. The new methodology also provides a unified approach to a number of recent works in the literature.
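For readers unfamiliar with the classical Reed–Frost model being generalised here, a minimal chain-binomial simulation is sketched below. It implements only the stationary, homogeneous version (not the nonstationary or randomized constructions of the paper), and the population size, escape probability, and number of initial infectives are illustrative choices.

    import numpy as np

    def reed_frost_final_size(n_susceptible, n_infective, p, rng):
        """One realisation of the classical Reed-Frost chain-binomial model.

        p is the probability that a given infective infects a given susceptible
        in one generation; each susceptible escapes all I_t current infectives
        independently, with probability (1 - p) ** I_t. Returns the final number
        of individuals ever infected, excluding the initial infectives.
        """
        s, i, total_infected = n_susceptible, n_infective, 0
        while i > 0 and s > 0:
            # A susceptible is infected unless it escapes every current infective.
            p_infection = 1.0 - (1.0 - p) ** i
            new_infectives = rng.binomial(s, p_infection)
            s -= new_infectives
            i = new_infectives
            total_infected += new_infectives
        return total_infected

    rng = np.random.default_rng(seed=1)
    sizes = [reed_frost_final_size(100, 1, 0.02, rng) for _ in range(10_000)]
    print("mean final size:", np.mean(sizes))

The per-generation escape probability (1 - p) ** I_t is exactly the ingredient that the nonstationary construction above randomises through individual levels of resistance.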
We introduce the idea of controlling branching processes by means of another branching process, using the fractional thinning operator of Steutel and van Harn. This idea is then adapted to the model of alternating branching, where two Markov branching processes act alternately at random observation and treatment times. We study the extinction probability and limit theorems for reproduction by n cycles, as n → ∞.
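In the integer-valued case, the Steutel and van Harn fractional thinning operator α ∘ X reduces to binomial thinning: conditionally on X, α ∘ X is binomially distributed with X trials and success probability α. The sketch below is only a toy illustration of a thinning step controlling a branching step; the Poisson offspring law, the value of α, and the strictly alternating schedule are assumptions made for the example, not the random observation and treatment times of the alternating-branching model.

    import numpy as np

    rng = np.random.default_rng(seed=2)

    def thin(x, alpha, rng):
        """Binomial (Steutel-van Harn) thinning: alpha ∘ x = Binomial(x, alpha)."""
        return rng.binomial(x, alpha)

    def branch(x, offspring_mean, rng):
        """One Galton-Watson generation with Poisson(offspring_mean) offspring."""
        return rng.poisson(offspring_mean, size=x).sum() if x > 0 else 0

    # Alternate a growth (branching) step with a control (thinning) step,
    # purely to illustrate how thinning can damp a supercritical process.
    z = 10
    trajectory = [z]
    for _ in range(20):
        z = branch(z, offspring_mean=1.5, rng=rng)   # reproduction
        z = thin(z, alpha=0.6, rng=rng)              # control by thinning
        trajectory.append(z)
    print(trajectory)

With offspring mean 1.5 and α = 0.6 the combined per-cycle mean is 0.9, so the controlled process drifts towards extinction even though the uncontrolled branching is supercritical.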
We extend the definition of level-crossing ordering of stochastic processes, proposed by Irle and Gani (2001), to the case in which the times to exceed levels are compared using an arbitrary stochastic order, and work, in particular, with integral stochastic orders closed for convolution. Using a sample-path approach, we establish level-crossing ordering results for the case in which the slower of the processes involved in the comparison is skip-free to the right. These results are especially useful in simulating processes that are ordered in level crossing, and extend results of Irle and Gani (2001), Irle (2003), and Ferreira and Pacheco (2005) for skip-free-to-the-right discrete-time Markov chains, semi-Markov processes, and continuous-time Markov chains, respectively.
We construct random dynamics for collections of nonintersecting planar contours, leaving invariant the distributions of length- and area-interacting polygonal Markov fields with V-shaped nodes. The first of these dynamics is based on the dynamic construction of consistent polygonal fields, as presented in the original articles by Arak (1983) and Arak and Surgailis (1989), (1991), and it provides an easy-to-implement Metropolis-type simulation algorithm. The second dynamics leads to a graphical construction in the spirit of Fernández et al. (1998), (2002) and yields a perfect simulation scheme in a finite window in the infinite-volume limit. This algorithm seems difficult to implement, yet its value lies in the fact that it allows for a theoretical analysis of the thermodynamic limit behaviour of length-interacting polygonal fields. The results thus obtained include, in the class of infinite-volume Gibbs measures without infinite contours, the uniqueness and exponential α-mixing of the thermodynamic limit of such fields in the low-temperature region. Outside this class, we conjecture the existence of an infinite number of extreme phases breaking both the translational and rotational symmetries.
We consider the planar random motion of a particle that moves with constant finite speed c and, at Poisson-distributed times, changes its direction θ with uniform law in [0, 2π). This model represents the natural two-dimensional counterpart of the well-known Goldstein–Kac telegraph process. For the particle's position (X(t), Y(t)), t > 0, we obtain the explicit conditional distribution when the number of changes of direction is fixed. From this, we derive the explicit probability law f(x, y, t) of (X(t), Y(t)) and show that the density p(x, y, t) of its absolutely continuous component is the fundamental solution to the planar wave equation with damping. We also show that, under the usual Kac condition on the velocity c and the intensity λ of the Poisson process, the density p tends to the transition density of planar Brownian motion. Some discussions concerning the probabilistic structure of wave diffusion with damping are presented and some applications of the model are sketched.
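The motion itself is simple to sample, which makes the limit results easy to explore numerically. The sketch below draws one position (X(t), Y(t)) by generating the Poisson number of direction changes on (0, t], placing them as uniform order statistics, and summing the resulting straight-line displacements; the values of c, λ, and t are illustrative.

    import numpy as np

    def planar_telegraph_position(c, lam, t, rng):
        """Sample (X(t), Y(t)) for a particle moving at constant speed c that
        changes direction at the jump times of a Poisson process of rate lam,
        each new direction being uniform on [0, 2*pi)."""
        # Poisson number of direction changes on (0, t], placed uniformly.
        n_changes = rng.poisson(lam * t)
        change_times = np.sort(rng.uniform(0.0, t, size=n_changes))
        # Holding times between changes, including the final stretch up to t.
        epochs = np.concatenate(([0.0], change_times, [t]))
        holding = np.diff(epochs)
        # A fresh uniform direction for each stretch (including the initial one).
        angles = rng.uniform(0.0, 2.0 * np.pi, size=len(holding))
        x = np.sum(c * holding * np.cos(angles))
        y = np.sum(c * holding * np.sin(angles))
        return x, y

    rng = np.random.default_rng(seed=3)
    samples = np.array([planar_telegraph_position(c=1.0, lam=2.0, t=5.0, rng=rng)
                        for _ in range(5000)])
    print("sample covariance of (X(t), Y(t)):\n", np.cov(samples.T))

Increasing c and λ with c²/λ held fixed, the sample covariance settles towards that of a planar Brownian motion, consistent with the convergence result stated above.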
We consider risk processes that locally behave like Brownian motion with some drift and variance, these both depending on an underlying Markov chain that is also used to generate the claims arrival process. Thus, claims arrive according to a renewal process with waiting times of phase type. Positive claims (downward jumps) are always possible but negative claims (upward jumps) are also allowed. The claims are assumed to form an independent, identically distributed sequence, independent of everything else. As main results, the joint Laplace transform of the time to ruin and the undershoot at ruin, as well as the probability of ruin, are explicitly determined under the assumption that the Laplace transform of the positive claims is a rational function. Both the joint Laplace transform and the ruin probability are decomposed according to the type of ruin: ruin by jump or ruin by continuity. The methods used involve finding certain martingales by first finding partial eigenfunctions for the generator of the Markov process composed of the risk process and the underlying Markov chain. We also use certain results from complex function theory as important tools.
We define an inverse subordinator as the passage times of a subordinator to increasing levels. It has previously been noted that such processes have many similarities to renewal processes. Here we present an expression for the joint moments of the increments of an inverse subordinator. This is an analogue of a result for renewal processes. The main tool is a theorem on processes which are both renewal processes and Cox processes.
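As a concrete illustration of the definition, the sketch below simulates a subordinator on a fine time grid, using a gamma subordinator purely as an example, and reads off its passage times to a sequence of increasing levels; the resulting values form a discretised sample of the inverse subordinator. The parameters and grid step are arbitrary choices for the illustration.

    import numpy as np

    rng = np.random.default_rng(seed=4)

    # Simulate a gamma subordinator S on a grid: increments over a step of
    # length dt are Gamma(shape=a*dt, scale=b), independent across steps.
    a, b, dt, horizon = 2.0, 1.0, 1e-3, 50.0
    n_steps = int(horizon / dt)
    increments = rng.gamma(shape=a * dt, scale=b, size=n_steps)
    S = np.concatenate(([0.0], np.cumsum(increments)))
    grid = np.arange(n_steps + 1) * dt

    # Inverse subordinator: L(x) = inf{t >= 0 : S(t) > x}, approximated on the grid.
    levels = np.arange(1.0, 20.0, 1.0)
    passage_idx = np.minimum(np.searchsorted(S, levels, side="right"), n_steps)
    L = grid[passage_idx]
    print(dict(zip(levels.tolist(), L.round(3).tolist())))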
A simple, widely applicable method is described for determining factorial moments of N̂_t, the number of occurrences in (0,t] of some event defined in terms of an underlying Markov renewal process, and asymptotic expressions for these moments as t → ∞. The factorial moment formulae combine to yield an expression for the probability generating function of N̂_t, and thereby further properties of such counts. The method is developed by considering counting processes associated with events that are determined by the states at two successive renewals of a Markov renewal process, for which it both simplifies and generalises existing results. More explicit results are given in the case of an underlying continuous-time Markov chain. The method is used to provide novel, probabilistically illuminating solutions to some problems arising in the stochastic modelling of ion channels.
Consider two sequences of bounded random variables, a value and a timing process, that satisfy the large deviation principle (LDP) with rate function J(⋅,⋅) and whose cumulative process satisfies the LDP with rate function I(⋅). Under mixing conditions, an LDP for estimates of I constructed by transforming an estimate of J is proved. For the case of a cumulative renewal process it is demonstrated that this approach is preferable to a more direct method, as it ensures that the laws of the estimates converge weakly to a Dirac measure at I.
In this paper, we consider the asymptotic behavior of stationary probability vectors of Markov chains of GI/G/1 type. The generating function of the stationary probability vector is explicitly expressed by the R-measure. This expression of the generating function is more convenient for the asymptotic analysis than those in the literature. The RG-factorization of both the repeating row and the Wiener–Hopf equations for the boundary row is used to provide the necessary spectral properties. The stationary probability vector of a Markov chain of GI/G/1 type is shown to be light tailed if the blocks of the repeating row and the blocks of the boundary row are light tailed. We derive two classes of explicit expressions for the asymptotic behavior, namely the geometric tail and the semigeometric tail, based on the repeating row, the boundary row, or the minimal positive solution of a crucial equation involved in the generating function, and we discuss the singularity classes of the stationary probability vector.
Consider the single-server queue with an infinite buffer and a first-in–first-out discipline, either of type M/M/1 or Geom/Geom/1. Denote by 𝒜 the arrival process and by s the services. Assume the stability condition to be satisfied. Denote by 𝒟 the departure process in equilibrium and by r the time spent by the customers at the very back of the queue. We prove that (𝒟, r) has the same law as (𝒜, s), which is an extension of the classical Burke theorem. In fact, r can be viewed as the sequence of departures from a dual storage model. This duality between the two models also appears when studying the transient behaviour of a tandem by means of the Robinson–Schensted–Knuth algorithm: the first and last rows of the resulting semistandard Young tableau are respectively the last instant of departure from the queue and the total number of departures from the store.
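The classical Burke theorem being extended here is easy to check by simulation: for an M/M/1 queue in equilibrium, the departure process is Poisson with the arrival rate. The following sketch starts the queue from its stationary geometric queue-length distribution and inspects the interdeparture times; the rates and run length are illustrative.

    import numpy as np

    rng = np.random.default_rng(seed=5)
    lam, mu, horizon = 0.7, 1.0, 100_000.0   # arrival rate, service rate, run length
    rho = lam / mu

    # Start from the stationary queue-length distribution on {0, 1, ...}:
    # P(N = k) = (1 - rho) * rho**k.
    n = rng.geometric(1.0 - rho) - 1
    t, departures = 0.0, []
    next_arrival = rng.exponential(1.0 / lam)
    next_departure = rng.exponential(1.0 / mu) if n > 0 else np.inf

    while t < horizon:
        if next_arrival < next_departure:
            t = next_arrival
            n += 1
            if n == 1:                      # server was idle, start a service
                next_departure = t + rng.exponential(1.0 / mu)
            next_arrival = t + rng.exponential(1.0 / lam)
        else:
            t = next_departure
            n -= 1
            departures.append(t)
            next_departure = t + rng.exponential(1.0 / mu) if n > 0 else np.inf

    gaps = np.diff(departures)
    print("mean interdeparture time:", gaps.mean(), " vs 1/lambda =", 1.0 / lam)
    print("squared CV of interdeparture times:", gaps.var() / gaps.mean() ** 2)

The mean interdeparture time comes out close to 1/λ and the squared coefficient of variation close to 1, as expected for a Poisson departure stream; checking the joint law of (𝒟, r) claimed above of course requires more than these two summary statistics.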
We study the impact of service time distributions on the distribution of the maximum queue length during a busy period for the M^X/G/1 queue. The maximum queue length is an important random variable to understand when designing the buffer size for finite-buffer (M/G/1/n) systems. We show the somewhat surprising result that, for three variations of the preemptive last-come–first-served discipline, the maximum queue length during a busy period is smaller when service times are more variable (in the convex sense).
We consider a hard-sphere model in ℝ^d generated by a stationary point process N and the lilypond growth protocol: at time 0, every point of N starts growing with unit speed in all directions to form a system of balls in which any particular ball ceases its growth at the instant that it collides with another ball. Some quite general conditions are given, under which it is shown that the model is well defined and exhibits no percolation. The absence of percolation is attributable to the fact that, under our assumptions, there can be no descending chains in N. The proof of this fact forms a significant part of the paper. It is also shown that, in the absence of descending chains, mutual-nearest-neighbour matching can be used to construct a bijective point map as defined by Thorisson.
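The lilypond growth protocol has a simple greedy realisation for a finite configuration of points, obtained by repeatedly executing the earliest pending collision. The sketch below is only a toy finite-window illustration of the protocol (it ignores boundary effects and says nothing about the well-definedness and percolation questions treated in the paper), with the point pattern and window size chosen arbitrarily.

    import numpy as np

    def lilypond_radii(points):
        """Lilypond stopping radii for a finite planar configuration.

        Every ball grows at unit speed from time 0 and stops the instant it
        touches another ball, so its final radius equals its stopping time.
        Greedy construction: repeatedly realise the earliest pending collision.
        """
        n = len(points)
        dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
        radius = np.full(n, np.nan)             # nan means still growing

        while np.isnan(radius).any():
            best_time, best_pair, both_growing = np.inf, None, False
            for i in np.flatnonzero(np.isnan(radius)):
                for j in range(n):
                    if j == i:
                        continue
                    if np.isnan(radius[j]):     # both growing: they meet halfway
                        cand, growing = dist[i, j] / 2.0, True
                    else:                       # j already stopped at radius[j]
                        cand, growing = dist[i, j] - radius[j], False
                    if cand < best_time:
                        best_time, best_pair, both_growing = cand, (i, j), growing
            i, j = best_pair
            radius[i] = best_time
            if both_growing:
                radius[j] = best_time
        return radius

    rng = np.random.default_rng(seed=6)
    pts = rng.uniform(0.0, 10.0, size=(30, 2))   # toy point configuration
    r = lilypond_radii(pts)
    # Sanity check: no two balls overlap (off-diagonal r_i + r_j - d_ij <= 0).
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    print("max overlap:", np.max(r[:, None] + r[None, :] - d - np.eye(len(pts)) * 1e9))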
We study the expected time to ruin in a risk process in which dividends are paid when the surplus is above a barrier. We consider the case in which the dividend rate is smaller than the premium rate. We obtain results for the classical compound Poisson risk process with phase-type claim size. When the ruin probability is 1, we derive the expected time to ruin and the expected dividends paid. When the ruin probability is less than 1, these quantities are derived conditional on the event that ruin occurs.
Let (Φ_t)_{t∈ℝ_+} be a Harris ergodic continuous-time Markov process on a general state space, with invariant probability measure π. We investigate the rates of convergence of the transition function P^t(x, ·) to π; specifically, we find conditions under which r(t)||P^t(x, ·) − π|| → 0 as t → ∞, for suitable subgeometric rate functions r(t), where ||·|| denotes the usual total variation norm for a signed measure. We derive sufficient conditions for the convergence to hold, in terms of the existence of suitable points on which the first hitting time moments are bounded. In particular, for stochastically ordered Markov processes, explicit bounds on subgeometric rates of convergence are obtained. These results are illustrated in several examples.