An asymptotic expansion for the expected number, μ(t), of particles of an age-dependent branching process is obtained with a general submultiplicative estimate for the remainder term. The influence of the roots of the characteristic equation on the asymptotic behaviour of μ(t) is taken into account.
We prove a d-dimensional renewal theorem, with an estimate on the rate of convergence, for Markov random walks. This result is applied to a variety of boundary crossing problems for a Markov random walk (X_n, S_n), n ≥ 0, in which X_n takes values in a general state space and S_n takes values in ℝ^d. In particular, for the case d = 1, we use this result to derive an asymptotic formula for the variance of the first passage time when S_n exceeds a high threshold b, generalizing Smith's classical formula in the case of i.i.d. positive increments for S_n. For d > 1, we apply this result to derive an asymptotic expansion of the distribution of (X_T, S_T), where T = inf{n : S_{n,1} > b} and S_{n,1} denotes the first component of S_n.
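As a point of reference for the d = 1 case, the following sketch estimates the mean and variance of the first passage time T = inf{n : S_n > b} by Monte Carlo and compares the variance with the classical i.i.d. asymptotic b σ²/μ³. It uses i.i.d. Exp(1) increments (so μ = σ² = 1), an assumed special case rather than the Markov-modulated setting of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def first_passage_stats(b, n_paths=10000):
    """Monte Carlo mean and variance of T = inf{n : S_n > b} for a random walk
    with i.i.d. Exp(1) increments (assumed special case, mu = sigma^2 = 1)."""
    n_max = int(2 * b + 100)                  # long enough with overwhelming probability
    S = rng.exponential(1.0, size=(n_paths, n_max)).cumsum(axis=1)
    T = (S > b).argmax(axis=1) + 1            # first index at which the walk exceeds b
    return T.mean(), T.var()

for b in (10.0, 50.0, 200.0):
    m, v = first_passage_stats(b)
    print(f"b={b:6.0f}   E[T] = {m:8.2f}   Var[T] = {v:8.2f}   (Smith: b*sigma^2/mu^3 = {b:.0f})")
```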
We study a service system in which, in each service period, the server performs the current set B of tasks as a batch, taking time s(B), where the function s(·) is subadditive. A natural definition of ‘traffic intensity under congestion’ in this setting is ρ := lim_{t→∞} t^{-1} E[s(all tasks arriving during [0, t])]. We show that ρ < 1, together with a finite mean for the individual service times, is necessary and sufficient for stability of the system. A key observation is that the numbers of arrivals during successive service periods form a Markov chain {A_n}, enabling us to apply classical regenerative techniques and to express the stationary distribution of the process in terms of the stationary distribution of {A_n}.
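The following is a minimal simulation sketch of the embedded chain {A_n}, under assumed ingredients not specified in the abstract: Poisson arrivals at rate λ and an affine, hence subadditive, batch service time s(B) = a + b|B|, for which ρ = bλ.

```python
import numpy as np

rng = np.random.default_rng(3)

def s(batch_size, a=1.0, b=0.3):
    # Hypothetical subadditive batch service time: a setup cost plus a per-task cost.
    return a + b * batch_size if batch_size > 0 else 0.0

def simulate_A_chain(lam, n_periods=100000):
    """Markov chain A_n = number of arrivals during the n-th service period."""
    A = 1                               # start with a single task in the first batch
    trace = []
    for _ in range(n_periods):
        batch = A if A > 0 else 1       # if no arrivals, wait for one task and serve it alone
        A = rng.poisson(lam * s(batch)) # arrivals during this service period
        trace.append(A)
    return np.array(trace)

lam = 2.0                               # rho = b * lam = 0.6 < 1, so the chain should be stable
trace = simulate_A_chain(lam)
print("mean batch size over the second half of the run:", trace[len(trace)//2:].mean())
```

With a = 1, b = 0.3 and λ = 2, the setup cost a does not affect ρ, and the simulated batch sizes settle around a finite mean, consistent with the stability criterion.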
In this paper we study random variables related to a shock reliability model. Our models can be used to study systems that fail when k consecutive shocks of critical magnitude (e.g. above or below a certain critical level) occur. We obtain properties of the distribution functions of the random variables involved, as well as their limiting behaviour as k tends to infinity or as the probability of entering the critical set tends to zero. This model generalises the Poisson shock model.
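For illustration, the sketch below simulates the simplest version of such a model, in which each shock is critical independently with probability p and the system fails at the k-th consecutive critical shock; the closed-form mean used as a sanity check is the standard waiting-time formula for k consecutive successes in Bernoulli trials, not a result of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def shocks_to_failure(k, p):
    """Number of shocks until k consecutive critical shocks occur.
    Each shock is critical independently with probability p (assumed Bernoulli model)."""
    run, n = 0, 0
    while run < k:
        n += 1
        run = run + 1 if rng.random() < p else 0
    return n

k, p = 3, 0.2
samples = [shocks_to_failure(k, p) for _ in range(5000)]
print("simulated mean number of shocks to failure:", np.mean(samples))
# standard formula for the mean waiting time for k consecutive successes in Bernoulli(p) trials
print("theoretical mean:", (1 - p**k) / ((1 - p) * p**k))
```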
We study the convergence of certain matrix sequences that arise in quasi-birth-and-death (QBD) Markov chains and we identify their limits. In particular, we focus on a sequence of matrices whose elements are absorption probabilities into some boundary states of the QBD. We prove that, under certain technical conditions, that sequence converges. Its limit is either the minimal nonnegative solution G of the standard nonlinear matrix equation, or it is a stochastic solution that can be explicitly expressed in terms of G. Similar results are obtained relative to the standard matrix R that arises in the matrix-geometric solution of the QBD. We present numerical examples that clarify some of the technical issues of interest.
Daley and Vesilo (1997) introduced long-range count dependence (LRcD) for stationary point processes on the real line as a natural augmentation of the classical long-range dependence of the corresponding interpoint sequence. They studied LRcD for some renewal processes and some output processes of queueing systems, continuing the earlier research on such processes of Daley (1968), (1975). Subsequently, Daley (1999) showed that a necessary and sufficient condition for a stationary renewal process to be LRcD is that, under its Palm measure, the generic lifetime distribution has infinite second moment. We show that point processes dominating, in the sense of stochastic ordering, LRcD point processes are themselves LRcD, and as a corollary we obtain that, for an arbitrary stationary point process with finite intensity, a sufficient condition for LRcD is that under its Palm measure the interpoint distances are positively dependent (associated) and have infinite second moment. We give many examples of LRcD point processes, among them exchangeable, cluster, moving-average, Wold and semi-Markov processes, as well as some examples of LRcD point processes with finite second Palm moment of the interpoint distances. These examples show that, in general, infiniteness of the second moment is not necessary for LRcD. It is an open question whether an infinite second Palm moment of the interpoint distances suffices to make a stationary point process LRcD.
Consider a renewal process. The renewal events partition the process into i.i.d. renewal cycles. Assume that on each cycle, a rare event called ‘success’ can occur. Such successes lend themselves naturally to approximation by Poisson point processes. If each success occurs after a random delay, however, Poisson convergence may be relatively slow, because each success corresponds to a time interval, not a point. In 1996, Altschul and Gish proposed a finite-size correction to a particular approximation by a Poisson point process. Their correction is now used routinely (about once a second) when computers compare biological sequences, although it lacks a mathematical foundation. This paper generalizes their correction. For a single renewal process or several renewal processes operating in parallel, this paper gives an asymptotic expansion that contains in successive terms a Poisson point approximation, a generalization of the Altschul-Gish correction, and a correction term beyond that.
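The sketch below illustrates only the leading-order Poisson point approximation, not the Altschul-Gish finite-size correction itself. Renewal cycles with an assumed gamma length distribution carry a rare success with probability p per cycle, and the count of successes by time t is compared with a Poisson law of mean pt/μ, where μ is the mean cycle length.

```python
import numpy as np

rng = np.random.default_rng(4)

def successes_by_time(t, p, mean_cycle=1.0, shape=2.0):
    """Count the renewal cycles completed by time t that contain a rare 'success'.
    Cycle lengths are assumed gamma-distributed; each cycle independently contains
    a success with probability p."""
    time, count = 0.0, 0
    while True:
        time += rng.gamma(shape, mean_cycle / shape)
        if time > t:
            return count
        if rng.random() < p:
            count += 1

t, p = 400.0, 0.02
samples = [successes_by_time(t, p) for _ in range(2000)]
print("mean number of successes:", np.mean(samples), "   Poisson mean p*t/mu =", p * t)
print("variance:                ", np.var(samples), "   (close to the mean if the Poisson approximation is good)")
```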
As models for molecular evolution, immune response, and local search algorithms, various authors have used a stochastic process called the evolutionary walk, which goes as follows. Assign a random number to each vertex of the infinite N-ary tree, and start with a particle on the root. A step of the process consists of searching for a child with a higher number and moving the particle there; if no such child exists, the process stops. The average number of steps in this process is asymptotic, as N → ∞, to log N, a result first proved by Macken and Perelson. This paper relates the evolutionary walk to a process called random bisection, familiar from combinatorics and number theory, which can be thought of as a transformed Poisson process. We first give a thorough treatment of the exact walk length, computing its distribution, moments and moment generating function. Next we show that the walk length is asymptotically normally distributed. We also treat it as a mixture of Poisson random variables and compute the asymptotic distribution of the Poisson parameter. Finally, we show that in an evolutionary walk with uniform vertex numbers, the ‘jumps’, ordered by size, have the same asymptotic distribution as the normalized cycle lengths in a random permutation.
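A short simulation sketch of the evolutionary walk reproduces the Macken-Perelson log N behaviour of the mean walk length. It assumes uniform vertex numbers and, since the choice rule is not specified above, moves to a uniformly chosen fitter child at each step.

```python
import numpy as np

rng = np.random.default_rng(5)

def walk_length(N):
    """One evolutionary walk on the infinite N-ary tree with i.i.d. Uniform(0,1) vertex
    numbers. The particle moves to a uniformly chosen child whose number exceeds the
    current one (the tie-breaking rule among fitter children is an assumption)."""
    current, steps = rng.random(), 0
    while True:
        children = rng.random(N)              # fresh i.i.d. numbers for the N children
        better = children[children > current]
        if better.size == 0:
            return steps                      # no fitter child: the walk stops
        current = rng.choice(better)
        steps += 1

for N in (10, 100, 1000):
    mean_len = np.mean([walk_length(N) for _ in range(4000)])
    print(f"N={N:5d}   mean walk length = {mean_len:5.2f}   log N = {np.log(N):5.2f}")
```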
We consider a controlled M/M/1 queueing system in which customers may be subject to two potential rejections. The first occurs upon arrival and depends on the number of customers in the queue and the service rate of the customer currently in service. The second, which may or may not take place, occurs immediately before the customer receives service. That is, after each service completion the customer at the front of the queue is assessed and the service rate of that customer is revealed. If the second decision-maker recommends rejection, the customer is denied service with a fixed probability. We show the existence of long-run average optimal monotone switching-curve policies. Further, we show that the average reward is increasing in the probability that the second decision-maker's recommendation of rejection is honored. Applications include call centers with delayed classifications and manufacturing systems in which the server is responsible for multiple tasks.
We define and analyse a random coverage process of the d-dimensional Euclidean space which allows us to describe a continuous spectrum that ranges from the Boolean model to the Poisson–Voronoi tessellation to the Johnson–Mehl model. As for the Boolean model, the minimal stochastic setting consists of a Poisson point process on this Euclidean space and a sequence of real valued random variables considered as marks of this point process. In this coverage process, the cell attached to a point is defined as the region of the space where the effect of the mark of this point exceeds an affine function of the cumulative effect of all marks. This cumulative effect is defined as the shot-noise process associated with the marked point process. In addition to analysing and visualizing this spectrum, we study various basic properties of the coverage process such as the probability that a point or a pair of points be covered by a typical cell. We also determine the distribution of the number of cells which cover a given point, and show how to provide deterministic bounds on this number. Finally, we also analyse convergence properties of the coverage process using the framework of closed sets, and its differentiability properties using perturbation analysis. Our results require a pathwise continuity property for the shot-noise process for which we provide sufficient conditions. The model in question stems from wireless communications where several antennas share the same (or different but interfering) channel(s). In this case, the area where the signal of a given antenna can be received is the area where the signal to interference ratio is large enough. We describe this class of problems in detail in the paper. The results obtained allow us to compute quantities of practical interest within this setting: for instance the outage probability is obtained as the complement of the volume fraction; the law of the number of cells covering a point allows us to characterize handover strategies, and so on.
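The following sketch is a toy instance of such a coverage process, under assumptions chosen purely for illustration: a Poisson pattern in a square, unit marks, a power-law response function, and the cell of a point defined by its own response exceeding A times the shot noise of the other points plus B. It estimates the covered volume fraction and the law of the number of covering cells; note that for this toy coverage condition at most 1 + 1/A cells can cover any location (here three), an example of the kind of deterministic bound mentioned above.

```python
import numpy as np

rng = np.random.default_rng(6)

def coverage_counts(lam=0.05, side=40.0, A=0.5, B=0.01, beta=3.0, grid=80):
    """Number of cells covering each point of a grid, for a hypothetical shot-noise
    coverage model: point i covers y if its response there is at least
    A * (shot noise of the other points at y) + B."""
    n = rng.poisson(lam * side * side)
    pts = rng.uniform(0, side, size=(n, 2))
    marks = np.ones(n)                                   # unit marks, for simplicity
    xs = np.linspace(0, side, grid)
    X, Y = np.meshgrid(xs, xs)
    locs = np.stack([X.ravel(), Y.ravel()], axis=1)
    d = np.linalg.norm(locs[:, None, :] - pts[None, :, :], axis=2)
    signal = marks[None, :] * (1.0 + d) ** (-beta)       # response of each point at each location
    total = signal.sum(axis=1, keepdims=True)            # shot-noise field (all points)
    covered = signal >= A * (total - signal) + B         # own response vs. the others' shot noise
    return covered.sum(axis=1)

counts = coverage_counts()
print("fraction of locations covered by at least one cell:", (counts > 0).mean())
print("distribution of the number of covering cells:", np.bincount(counts))
```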
If τ is the lifetime of a coherent system, then the signature of the system is the vector of probabilities that the lifetime coincides with the ith order statistic of the component lifetimes. The signature can be useful in comparing different systems. In this treatment we give a characterization of the signature of a system with independent identically distributed components in terms of the number of path sets in the system as well as in terms of the number of what we call ordered cut sets. We consider, in particular, the signatures of indirect majority systems and compare them with the signatures of simple majority systems of the same size. We note that the signature of an indirect majority system of size r × s = n is symmetric around (n + 1)/2, and use this to show that the expected lifetime of an r × s = n indirect majority system exceeds that of a simple (direct) majority system of size n when the components are exponentially distributed with the same parameter.
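As a concrete check of these notions, the sketch below computes signatures from path-set counts, using the standard representation s_i = a_{n-i+1}/C(n, n-i+1) − a_{n-i}/C(n, n-i) under i.i.d. continuous lifetimes, where a_j is the number of path sets of size j (this is the classical formula, not necessarily the paper's own characterization). The 3 × 3 indirect majority system and the direct 5-out-of-9 majority system serve as concrete instances.

```python
from itertools import combinations
from math import comb

def majority(bits):
    # simple majority vote: 1 if strictly more than half the entries are 1
    return int(sum(bits) > len(bits) / 2)

def indirect_majority_3x3(x):
    # 9 components in 3 groups of 3: majority within each group, then majority of the groups
    return majority([majority(x[3*g:3*g+3]) for g in range(3)])

def direct_majority_9(x):
    return majority(x)

def signature(phi, n):
    """Signature via path-set counts a_j (number of working subsets of size j):
    s_i = a_{n-i+1}/C(n, n-i+1) - a_{n-i}/C(n, n-i), i.i.d. continuous lifetimes."""
    a = [0] * (n + 1)
    for j in range(n + 1):
        for working in combinations(range(n), j):
            state = tuple(1 if k in working else 0 for k in range(n))
            a[j] += phi(state)
    return [a[n-i+1] / comb(n, n-i+1) - a[n-i] / comb(n, n-i) for i in range(1, n + 1)]

print("indirect 3x3 majority:", signature(indirect_majority_3x3, 9))
print("direct 5-of-9 majority:", signature(direct_majority_9, 9))
```

The indirect system's signature comes out as (0, 0, 0, 3/14, 4/7, 3/14, 0, 0, 0), symmetric around 5, while the direct 5-out-of-9 majority places all its mass on the fifth failure.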
In this paper two burn-in procedures for a general failure model are considered. The general failure model admits two types of failure: a Type I failure (minor failure), which can be removed by a minimal repair or a complete repair, and a Type II failure (catastrophic failure), which can be removed only by a complete repair. During the burn-in process, under burn-in Procedure I the failed component is repaired completely regardless of the type of failure, whereas under burn-in Procedure II only a minimal repair is done for a Type I failure and a complete repair is performed for a Type II failure. In field use, the component is replaced by a new burned-in component at the ‘field use age’ T or at the time of the first Type II failure, whichever occurs first. Under this model, the problems of determining the optimal burn-in time and the optimal replacement policy are considered. The two burn-in procedures are compared in cases where both are applicable.
We consider an M/G/1 queue with the special feature that the speed of the server alternates between two constant values s_L and s_H, with s_H > s_L. The high-speed periods are exponentially distributed, and the low-speed periods have a general distribution. Our main results are: (i) for the case that the distribution of the low-speed periods has a rational Laplace–Stieltjes transform, we obtain the joint distribution of the buffer content and the state of the server speed; (ii) for the case that the distribution of the low-speed periods and/or the service request distribution is regularly varying at infinity, we obtain explicit asymptotics for the tail of the buffer content distribution. The two cases in which the offered traffic load is smaller or larger than the low service speed are shown to result in completely different asymptotics.
We consider a stochastic model for the spread of an SIR (susceptible → infective → removed) epidemic among a closed, finite population that contains several types of individuals and is partitioned into households. The infection rate between two individuals depends on the types of the transmitting and receiving individuals and also on whether the infection is local (i.e., within a household) or global (i.e., between households). The exact distribution of the final outcome of the epidemic is outlined. A branching process approximation for the early stages of the epidemic is described and made fully rigorous, by considering a sequence of epidemics in which the number of households tends to infinity and using a coupling argument. This leads to a threshold theorem for the epidemic model. A central limit theorem for the final outcome of epidemics which take off is derived, by exploiting an embedding representation.
We consider the problem of estimating the rate of convergence to stationarity of a continuous-time, finite-state Markov chain. This is done via an estimator of the second-largest eigenvalue of the transition matrix, which in turn is based on conventional inference in a parametric model. We obtain a limiting distribution for the eigenvalue estimator. As an example we treat an M/M/c/c queue, and show that the method allows us to estimate the time to stationarity τ within a time comparable to τ.
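For orientation, the quantity being estimated can be computed directly when the rates are known. The sketch below builds the generator of an M/M/c/c queue (with known rather than estimated parameters, and working with the generator rather than a transition matrix) and reads off the second-largest eigenvalue, whose reciprocal magnitude sets the time scale τ of convergence to stationarity.

```python
import numpy as np

def mmcc_generator(lam, mu, c):
    """Generator matrix Q of an M/M/c/c (Erlang loss) chain on states 0..c."""
    Q = np.zeros((c + 1, c + 1))
    for i in range(c + 1):
        if i < c:
            Q[i, i + 1] = lam        # arrival while i servers are busy
        if i > 0:
            Q[i, i - 1] = i * mu     # departure from one of i busy servers
        Q[i, i] = -Q[i].sum()
    return Q

lam, mu, c = 3.0, 1.0, 5
Q = mmcc_generator(lam, mu, c)
eigs = np.sort(np.linalg.eigvals(Q).real)[::-1]   # eigenvalues in decreasing order
gap = -eigs[1]                                    # second-largest eigenvalue (the largest is 0)
print("spectral gap:", gap, "   relaxation time scale 1/gap:", 1.0 / gap)
```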
In this paper, results on spectrally negative Lévy processes are used to study the ruin probability under some risk processes. These processes include the compound Poisson process and the gamma process, both perturbed by diffusion. In addition, the first time the risk process hits a given level is also studied. In the case of the classical risk process, the joint distribution of the ruin time and the first recovery time is obtained. Some results in this paper have appeared before (e.g., Dufresne and Gerber (1991), Gerber (1990), dos Reis (1993)). We revisit them from the point of view of Lévy process theory, in a unified and simple way.
We study a Markovian model for a perishable inventory system with random input and an external source of obsolescence: at Poisson random times the whole current content of the system is spoilt and must be scrapped. The system can be described by its virtual death time process. We derive its stationary distribution in closed form and find an explicit formula for the Laplace transform of the cycle length, defined as the time between two consecutive item arrivals in an empty system. The results are used to compute several cost functionals. We also derive these functionals under the corresponding heavy traffic approximation, which is modeled using a Brownian motion in [0,1] reflected at 0 and 1 and restarted at 1 at the Poisson disaster times.
We consider a reflected superposition of a Brownian motion and a compound Poisson process as a model for the workload process of a queueing system with two types of customers under heavy traffic. The distributions of the duration of a busy cycle and the maximum workload during a cycle are determined in closed form.
The multiplexing of variable bit rate traffic streams in a packet network gives rise to two types of queueing. On a small time-scale, the rates at which the sources send are more or less constant, but there is queueing due to simultaneous packet arrivals (packet-level effect). On a somewhat larger time-scale, queueing is the result of a relatively high number of sources sending at a rate that is higher than their average rate (burst-level effect). This paper explores these effects. In particular, we give asymptotics of the overflow probability in the combined packet/burst scale model. It is shown that there is a specific size of the buffer (the ‘critical buffer size’) below which packet-scale effects are dominant, and above which burst-scale effects essentially determine the performance; strikingly, there is a sharp demarcation, the so-called ‘phase transition’. The results are asymptotic in the number of sources n. We scale buffer space B and link rate C by n, to nb and nc, respectively; then we let n grow large. Applying large deviations theory, we show that in this regime the overflow probability decays exponentially in the number of sources n. For small buffers the corresponding decay rate can be calculated explicitly; for large buffers we derive an asymptote (linear in b). The results for small and large buffers give rise to an approximation for the decay rate (for general b), as well as for the critical buffer size. A numerical example (multiplexing of voice streams) confirms the accuracy of these approximations.
Some consequences of restarting stochastic search algorithms are studied. It is shown under reasonable conditions that restarting when certain patterns occur yields probabilities that the goal state has not been found by the nth epoch which converge to zero at least geometrically fast in n. These conditions are shown to hold for restarted simulated annealing employing a local generation matrix, a cooling schedule Tn ∼ c/n and restarting after a fixed number r + 1 of duplications of energy levels of states when r is sufficiently large. For simulated annealing with logarithmic cooling these probabilities cannot decrease to zero this fast. Numerical comparisons between restarted simulated annealing and several modern variations on simulated annealing are also presented and in all cases the former performs better.
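The following is a minimal sketch of the restarted scheme on a toy landscape: a fixed random energy function on a cycle, a local ±1 proposal, cooling T_n = c/n, and a restart triggered once the energy level has been duplicated r + 1 times in succession. These concrete choices, and this reading of the restart rule, are assumptions made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

M = 200
energy = rng.normal(size=M)     # a fixed random energy landscape on a cycle of M states
goal = energy.min()

def restarted_sa(n_steps=200000, c=1.0, r=20):
    """Simulated annealing on the cycle with cooling T_n = c/n, restarted from a
    uniformly chosen state after the energy level has been duplicated r + 1 times
    in a row (one reading of the restart rule described above)."""
    state = rng.integers(M)
    best, dup = energy[state], 0
    for n in range(1, n_steps + 1):
        T = c / n
        prop = (state + rng.choice([-1, 1])) % M               # local generation: a neighbour
        dE = energy[prop] - energy[state]
        new_state = prop if dE <= 0 or rng.random() < np.exp(-dE / T) else state
        dup = dup + 1 if energy[new_state] == energy[state] else 0
        state = new_state
        if dup >= r + 1:                                        # restart
            state, dup = rng.integers(M), 0
        best = min(best, energy[state])
    return best

print("best energy found:", restarted_sa(), "   global minimum:", goal)
```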