Consider the total service time of a job on an unreliable server under the preemptive-repeat-different and preemptive-resume service disciplines. Under identical initial conditions, the distributions of the total service time under these two disciplines coincide when the original service time (without interruptions due to server failures) is exponential and independent of the server reliability. We show that, under varying server reliability, this coincidence is a characterization of the exponential distribution. Further, we show, under the same initial conditions, that the coincidence of the mean values also leads to the same characterization.
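As an illustration of the coincidence in the exponential case, the following sketch simulates both disciplines; the exponential up-times for the server, the particular rates, and the reading of 'total service time' as the time the job actually occupies the server are illustrative assumptions, not the paper's model.

```python
# Hedged sketch: Monte Carlo check that the time a job occupies an unreliable
# server has the same distribution under preemptive-resume and
# preemptive-repeat-different when the interruption-free service time is
# exponential.  Rates MU (service) and XI (failure) are illustrative.
import numpy as np

rng = np.random.default_rng(0)
MU, XI, N = 1.0, 0.4, 100_000   # service rate, failure rate, replications

def resume():
    # Work is conserved across interruptions, so the occupied time is just
    # the original exponential requirement.
    return rng.exponential(1.0 / MU)

def repeat_different():
    # After each failure the job restarts with a fresh, independent requirement.
    total = 0.0
    while True:
        need = rng.exponential(1.0 / MU)   # new requirement for this attempt
        up = rng.exponential(1.0 / XI)     # time until the next server failure
        if need <= up:
            return total + need
        total += up

res = np.array([resume() for _ in range(N)])
rep = np.array([repeat_different() for _ in range(N)])
qs = [0.25, 0.5, 0.75, 0.95]
print("means:", res.mean(), rep.mean())
print("quantiles (resume):", np.quantile(res, qs))
print("quantiles (repeat):", np.quantile(rep, qs))
```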
We suggest a new universal method of stochastic simulation that allows rather efficient generation of random vectors with arbitrary densities in a connected open region or on its boundary. Our method belongs to the class of dynamic Monte Carlo procedures and is based on a special construction of a Markov chain on the boundary of the region. A remarkable feature is that this chain admits a simple simulation based on a universal stochastic driver, one depending only on the dimensionality of the space.
We consider the M^X/G/∞ queue in which the customers of a batch belong to k different types, and a customer of type i requires a non-negative service time with general distribution function B_i(s) (1 ≦ i ≦ k). The number of customers in a batch is stochastic. The joint probability generating function of the numbers of customers of each type in service at a fixed time t > 0 is derived by the method of collective marks.
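The following sketch estimates, by simulation, the mean number of each type in service at a fixed time t for a batch-arrival infinite-server queue; Poisson batch arrivals, geometric batch sizes, independent type labels with probabilities P, and the gamma/lognormal service laws are illustrative choices rather than the paper's assumptions.

```python
# Hedged sketch: simulation of a batch-arrival infinite-server queue with typed
# customers, estimating the mean number of each type still in service at t.
import numpy as np

rng = np.random.default_rng(1)
LAM, T, K = 2.0, 10.0, 2              # batch arrival rate, observation time, types
P = [0.6, 0.4]                        # assumed type probabilities within a batch
SERVICE = [lambda n: rng.gamma(2.0, 1.5, n),      # B_1: gamma service times
           lambda n: rng.lognormal(0.0, 1.0, n)]  # B_2: lognormal service times

def one_run():
    counts = np.zeros(K, dtype=int)
    n_batches = rng.poisson(LAM * T)
    arrivals = rng.uniform(0.0, T, n_batches)     # batch arrival epochs on (0, T)
    for a in arrivals:
        size = rng.geometric(0.5)                 # stochastic batch size
        types = rng.choice(K, size=size, p=P)
        for i in range(K):
            n_i = int((types == i).sum())
            if n_i:
                # a customer is still in service at T if arrival + service > T
                counts[i] += int((a + SERVICE[i](n_i) > T).sum())
    return counts

runs = np.array([one_run() for _ in range(10_000)])
print("mean number in service at t =", T, "by type:", runs.mean(axis=0))
```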
Consider the GI/G/1 queueing system with traffic intensity 1, and let w_k and l_k denote the actual waiting time of the kth unit and the number of units present in the system at the kth arrival (including the kth unit), respectively. Furthermore, let τ denote the number of units served during the first busy period and μ the intensity of the service. It is shown that, as k → ∞, the appropriately normalized processes converge jointly, where a is some known constant; the limiting processes are independent, one being a Brownian meander and another a Wiener process. A similar result is also given for the difference of the virtual waiting time and queue length processes. These results are also extended to a class of queueing systems wider than GI/G/1 queues and to a scheme of queues in series.
Suppose that a device is subjected to shocks governed by a counting process N = {N(t), t ≧ 0}. The probability that the device survives beyond time t is then H̄(t) = Σ_{k=0}^{∞} Q̄_k ℙ[N(t) = k], where Q̄_k is the probability of surviving k shocks. It is known that H is NBU if the interarrivals U_k, k ∈ ℕ_+, are independent and NBU, and Q̄_{k+j} ≦ Q̄_k · Q̄_j holds whenever k, j ∈ ℕ. Similar results hold for the classes of the NBUE and HNBUE distributions. We show that some other ageing classes have similar properties.
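A minimal numerical illustration of the quoted NBU result is sketched below, taking N to be a Poisson process (so the interarrivals are exponential, hence NBU) and Q̄_k = q^{k²}, which satisfies the submultiplicativity condition; the particular shock rate and q are assumptions made only for the example.

```python
# Hedged sketch: numerical check of the NBU inequality H(s+t) <= H(s)H(t) for
# Hbar(t) = sum_k Qbar_k P[N(t)=k], with N a Poisson process of rate LAM and
# Qbar_k = Q**(k*k), which satisfies Qbar_{k+j} <= Qbar_k * Qbar_j.
import math

LAM, Q, KMAX = 1.0, 0.8, 200

def H_bar(t):
    # survival probability beyond t, truncating the series at KMAX shocks;
    # the Poisson probabilities are built iteratively to avoid factorials
    total, pois = 0.0, math.exp(-LAM * t)   # pois = P[N(t)=k], starting at k=0
    for k in range(KMAX):
        total += (Q ** (k * k)) * pois
        pois *= LAM * t / (k + 1)
    return total

grid = [0.5 * i for i in range(1, 11)]
worst = max(H_bar(s + t) - H_bar(s) * H_bar(t) for s in grid for t in grid)
print("max of H(s+t) - H(s)H(t) on the grid:", worst)  # <= 0 confirms NBU there
```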
A problem of regrinding and recycling worn train wheels leads to a Markov population process with distinctive properties, including a product-form equilibrium distribution. A convenient framework for analyzing this process is via the notion of dynamic reversal, a natural extension of ordinary (time) reversal. The dynamically reversed process is of the same type as the original process, which allows a simple derivation of some important properties. The process seems not to belong to any class of Markov processes for which stationary distributions are known.
A central limit theorem for cumulative processes was first derived by Smith (1955). No remainder term was given. We use a different approach to obtain such a term here. The rate of convergence is the same as that in the central limit theorems for sequences of independent random variables.
We derive two kinds of rate conservation law for describing the time-dependent behavior of a process defined in terms of a stationary marked point process and starting at time 0. These formulas are called TRCLs (time-dependent rate conservation laws). It is shown that TRCLs are useful for studying the transient behavior of risk and storage processes with stationary claim and supply processes and with general premium and release rates, respectively. Detailed discussions are given for the severity of the risk process, and for the workload process of a single-server queue.
We present two forms of weak majorization, namely very weak majorization and p-weak majorization, that can be used as sample-path criteria in the analysis of queueing systems. We demonstrate how these two criteria can be used to compare the joint queue lengths of queueing systems with blocking and/or multiple classes, by capturing an interesting interaction between state and performance descriptors. As a result, stochastic orderings on performance measures such as the cumulative number of losses can be derived. We describe applications that involve the determination of optimal policies in the context of load balancing and scheduling.
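For orientation, the sketch below implements the classical weak (sub)majorization order on queue-length vectors; the paper's "very weak" and "p-weak" variants are refinements whose precise definitions are not reproduced here.

```python
# Hedged sketch: classical weak submajorization.  x is weakly submajorized by y
# if every partial sum of the decreasingly sorted entries of x is dominated by
# the corresponding partial sum for y.
def weakly_submajorized(x, y):
    xs = sorted(x, reverse=True)
    ys = sorted(y, reverse=True)
    sx = sy = 0.0
    for a, b in zip(xs, ys):
        sx += a
        sy += b
        if sx > sy + 1e-12:          # small tolerance for float comparisons
            return False
    return True

# Example: joint queue lengths of two systems, each with three queues.
print(weakly_submajorized([2, 1, 1], [3, 1, 0]))  # True: (2,1,1) is below (3,1,0)
print(weakly_submajorized([3, 3, 0], [3, 1, 1]))  # False: second partial sum fails
```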
Let ℱ be a countable plane triangulation embedded in ℝ² in such a way that no bounded region contains more than finitely many vertices, and let P_p be Bernoulli(p) product measure on the vertex set of ℱ. Let E be the event that a fixed vertex belongs to an infinite path whose vertex states alternate along the path. It is shown that the AB percolation probability function θ_AB(p) = P_p(E) is non-decreasing in p for 0 ≦ p ≦ ½. By symmetry, θ_AB(p) is therefore unimodal on [0, 1]. This result partially verifies a conjecture due to Halley and stands in contrast to the examples of Łuczak and Wierman.
By an argument that involves matching sample paths, some useful equations for the probability distribution of the fundamental period in the MAP/G/1 queue are derived with less computational effort than in earlier proofs. It is further shown that analogous equations hold for the MAP/SM/1 queueing model. These results are then used to derive explicit formulas for the mean vectors of the number served during, and of the duration of, the fundamental period.
A rumour model due to Maki and Thompson (1973) is slightly modified to incorporate a continuous-time random contact process and varying individual behaviour towards the rumour. Two important measures of the final extent of the rumour are the ultimate number of people who have heard the rumour and the total personal time units during which the rumour is spread. Our purpose in this note is to derive the exact joint distribution of these two statistics. This is done by constructing a family of martingales for the rumour process and then using a particular family of Gontcharoff polynomials.
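The two statistics can also be sampled directly; the sketch below simulates the classical homogeneous Maki-Thompson dynamics (the paper's individual-level variation is not reproduced) and records the ultimate audience and the total personal spreading time.

```python
# Hedged sketch: Gillespie-style simulation of the classical Maki-Thompson
# rumour model.  Returns the ultimate number of people who have heard the
# rumour and the total personal spreading time (time-integral of the number
# of active spreaders).
import numpy as np

rng = np.random.default_rng(2)

def maki_thompson(n):
    ignorants, spreaders = n - 1, 1
    heard, spread_time = 1, 0.0
    while spreaders > 0:
        rate = float(spreaders)                 # each spreader contacts at rate 1
        dt = rng.exponential(1.0 / rate)
        spread_time += spreaders * dt           # personal time units of spreading
        # the contacted individual is uniform among the other n-1 people
        if rng.random() < ignorants / (n - 1):
            ignorants -= 1; spreaders += 1; heard += 1   # ignorant becomes spreader
        else:
            spreaders -= 1                      # initiator becomes a stifler
    return heard, spread_time

samples = np.array([maki_thompson(100) for _ in range(5_000)])
print("mean ultimate audience:", samples[:, 0].mean())
print("mean total spreading time:", samples[:, 1].mean())
```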
For a network of G/∞ service facilities, the transient joint distribution of the facility populations is shown, by new and simple methods, to have a simple Poisson product form with explicit formulas for the means. In the network it is assumed that: (a) each facility has an infinite number of servers; (b) the service time distributions are general; (c) external traffic is non-homogeneous in time; (d) arrivals have random or deterministic routes through the network, possibly returning to the same facility more than once; (e) arrivals use the facilities on their route sequentially or in parallel (as in the case of a circuit-switched telecommunication network). The results have relevance to communication networks and manufacturing systems.
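The single-facility special case is the classical M_t/G/∞ result that, starting from an empty system, the population at time t is Poisson with mean ∫_0^t λ(u) Ḡ(t − u) du; the sketch below evaluates this mean numerically for an assumed sinusoidal arrival rate and gamma service law.

```python
# Hedged sketch: Poisson mean of an M_t/G/infinity facility population at time
# t, m(t) = int_0^t lambda(u) * Gbar(t - u) du, starting from an empty system.
from scipy import integrate, stats
import numpy as np

def mean_in_service(t, lam, service_survival):
    # numerical evaluation of the integral defining the Poisson mean m(t)
    integrand = lambda u: lam(u) * service_survival(t - u)
    value, _ = integrate.quad(integrand, 0.0, t)
    return value

lam = lambda u: 5.0 + 2.0 * np.sin(u)        # assumed time-varying external rate
service = stats.gamma(a=2.0, scale=1.0)      # assumed service-time distribution
print("Poisson mean of the facility population at t = 8:",
      mean_in_service(8.0, lam, service.sf))
```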
In this contribution we consider an M/M/1 queueing model with general server vacations. Transient and steady-state analyses are carried out in discrete time by combinatorial methods. Using weak convergence of discrete-parameter Markov chains, we also obtain formulas for the corresponding continuous-time queueing model. As a special case we briefly discuss a queueing system operating under a T-policy.
In this paper we study the behavior of a delayed compound renewal process S about some fixed level L. In the basic scenario, the jump process S increases at random times τ_1, τ_2, …, by random increments until it crosses L, and S is then terminated after a random number ν of phases, at time τ_ν. In many applications, a more general termination scenario assumes that S may evolve through either ν or σ random phases, whichever of the two is smaller (their minimum is denoted by T). The number T of actual phases is called the termination index, and we evaluate a joint functional of T, the termination time τ_T and the termination level S_T. We also seek information about the process S one step before its termination, and derive a joint functional of all relevant processes. Examples of these processes and their applications to various stochastic models are discussed.
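As a rough illustration of the quantities involved, the sketch below estimates by Monte Carlo the mean of (T, τ_T, S_T) together with the pre-termination pair; exponential inter-jump times and increments, a geometric censoring index σ, the level L, and the neglect of the initial delay are all assumptions made only for the example.

```python
# Hedged sketch: simulation of a compound renewal process terminated at
# T = min(nu, sigma), where nu is the first index at which S crosses L and
# sigma is an independent random number of phases.
import numpy as np

rng = np.random.default_rng(3)
L, RATE, JUMP_MEAN, P_SIGMA = 10.0, 1.0, 1.5, 0.05

def one_path():
    t = s = 0.0
    prev_t = prev_s = 0.0
    sigma = rng.geometric(P_SIGMA)          # independent censoring index
    k = 0
    while True:
        k += 1
        prev_t, prev_s = t, s               # state one phase before termination
        t += rng.exponential(1.0 / RATE)    # next jump epoch
        s += rng.exponential(JUMP_MEAN)     # random increment
        if s >= L or k == sigma:            # nu-crossing or sigma-censoring
            return k, t, s, prev_t, prev_s

paths = np.array([one_path() for _ in range(50_000)])
labels = ["T", "tau_T", "S_T", "tau_{T-1}", "S_{T-1}"]
print({name: round(float(col), 3) for name, col in zip(labels, paths.mean(axis=0))})
```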
Certain last-exit and first-passage probabilities for random walks are approximated via a heuristic method suggested by a ladder variable argument. They yield satisfactory approximations of the first- and second-order moments of the queue length within a busy period of an M/D/1 queue. The approximation is applied to the wider class of random walks that arise in studying M/GI/1 queues. For gamma-distributed service times the queue length distribution is independent of the arrival rate. For other distributions where the arrival rate affects the queue length distribution, we have to use conjugate distributions in order to exploit a local central limit property. The limit underlying the approximation has the nature of a Brownian excursion.
The source of the problem lies in recent queueing inference work; the connection with Takács' interests comes from both queueing theory and the ballot theorem.
We consider single-server queueing systems that are modulated by a discrete-time Markov chain on a countable state space. The underlying stochastic process is a Markov random walk (MRW) whose increments can be expressed as differences between service times and interarrival times. We derive the joint distributions of the waiting and idle times in the presence of the modulating Markov chain. Our approach is based on properties of the ladder sets associated with this MRW and its time-reversed counterpart. The special case of a Markov-modulated M/M/1 queueing system is then analysed and results analogous to the classical case are obtained.
We consider the standard single-server queue with unlimited waiting space and the first-in first-out service discipline, but without any explicit independence conditions on the interarrival and service times. We find conditions for the steady-state waiting-time distribution to have asymptotics of the form x⁻¹ log P(W > x) → −θ∗ as x → ∞ for θ∗ > 0. We require only stationarity of the basic sequence of service times minus interarrival times and a Gärtner–Ellis condition for the cumulant generating function of the associated partial sums, i.e. n⁻¹ log E exp(θS_n) → ψ(θ) as n → ∞, plus regularity conditions on the decay rate function ψ. The asymptotic decay rate θ∗ is the root of the equation ψ(θ) = 0. This result in turn implies a corresponding asymptotic result for the steady-state workload in a queue with general non-decreasing input. This asymptotic result covers the case of multiple independent sources, so that it provides additional theoretical support for a concept of effective bandwidths for admission control in multiclass queues based on asymptotic decay rates.
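In the i.i.d. special case of an M/M/1 queue the root equation ψ(θ) = 0 can be solved explicitly, and θ∗ = μ − λ; the sketch below recovers this numerically (the particular rates are illustrative).

```python
# Hedged sketch: the decay rate theta* as the positive root of
# psi(theta) = log E exp(theta * (S - A)) = 0 for Exp(MU) service times and
# Exp(LAMBDA) interarrival times, where the root is known to equal MU - LAMBDA.
import math
from scipy.optimize import brentq

LAMBDA, MU = 0.7, 1.0

def psi(theta):
    # cumulant generating function of S - A; finite for theta < MU
    return math.log(MU / (MU - theta)) + math.log(LAMBDA / (LAMBDA + theta))

theta_star = brentq(psi, 1e-9, MU - 1e-9)   # bracket the positive root
print("decay rate theta* =", theta_star, "; exact value MU - LAMBDA =", MU - LAMBDA)
```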
For positive recurrent, nearest-neighbour, semi-homogeneous random walks on the lattice {0, 1, 2, …} × {0, 1, 2, …}, the bivariate generating function of the stationary distribution is analysed for the case where one-step transitions to the north, north-east and east at interior points of the state space all have zero probability. It is shown that this generating function can be represented by meromorphic functions. The construction of this representation is presented for a variety of one-step transition vectors at the boundary points of the state space.
A Markov chain of M/G/1 type, which arose in a problem of communications engineering, is analyzed by a combination of matrix analysis and appropriate series expansions. The highly explicit results obtainable for this model owe much to analytic methods introduced by Professor Takács. Special features of the numerical implementation of the proposed solution are also discussed. The simplifications proper to this model can also be used in some related, more complex models to be discussed elsewhere.