In this paper, we consider kth-order two-state Markov chains {Xi} with stationary transition probabilities. For k = 1, we construct in detail an upper bound for the total variation d(Sn, Y) = Σx |𝐏(Sn = x) − 𝐏(Y = x)|, where Sn = X1 + · · · + Xn and Y is a compound Poisson random variable. We also show that, under certain conditions, d(Sn, Y) converges to 0 as n tends to ∞. For k = 2, the corresponding results are given without derivation. For general k ≧ 3, a conjecture is proposed.
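As a rough numerical illustration of the quantities in this abstract (not the paper's own construction), the following sketch computes the exact distribution of Sn for a stationary two-state chain by dynamic programming and compares it in total variation with one plausible compound Poisson approximation; the transition parameters, the Poisson rate and the geometric clump-size law are all assumptions made for the example.

```python
import numpy as np

def exact_pmf_Sn(n, a, b):
    """Exact pmf of S_n = X_1 + ... + X_n for a stationary two-state chain,
    where a = P(0 -> 1) and b = P(1 -> 1), computed by dynamic programming."""
    pi1 = a / (1.0 + a - b)                 # stationary probability of state 1
    P = np.array([[1.0 - a, a], [1.0 - b, b]])
    f = np.zeros((2, n + 1))                # f[s, k] = P(current state s, partial sum k)
    f[0, 0] = 1.0 - pi1
    f[1, 1] = pi1
    for _ in range(n - 1):
        g = np.zeros_like(f)
        for s in (0, 1):
            g[0, :] += f[s, :] * P[s, 0]    # moving to state 0 adds nothing to the sum
            g[1, 1:] += f[s, :-1] * P[s, 1] # moving to state 1 adds 1 to the sum
        f = g
    return f.sum(axis=0)

def compound_poisson_pmf(lam, q, kmax):
    """pmf of Y = Z_1 + ... + Z_N with N ~ Poisson(lam) and Z_i iid on {1, 2, ...}
    with pmf q (q[j-1] = P(Z = j)), via the Panjer recursion."""
    p = np.zeros(kmax + 1)
    p[0] = np.exp(-lam)
    for k in range(1, kmax + 1):
        p[k] = (lam / k) * sum(j * q[j - 1] * p[k - j]
                               for j in range(1, min(k, len(q)) + 1))
    return p

n, a, b = 200, 0.05, 0.4                    # illustrative parameters (assumed)
pi1 = a / (1.0 + a - b)
pmf_S = exact_pmf_Sn(n, a, b)

lam = n * pi1 * (1.0 - b)                   # heuristic rate: expected number of 1-clumps
q = [(1.0 - b) * b ** (j - 1) for j in range(1, 80)]   # geometric clump-size law
pmf_Y = compound_poisson_pmf(lam, q, n)

# d(S_n, Y) as in the abstract, truncated to {0, ..., n} (mass of Y above n ignored).
d_tv = np.abs(pmf_S - pmf_Y).sum()
print(f"approximate d(S_n, Y) = {d_tv:.4f}")
```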
It is well known that most commonly used discrete distributions fail to belong to the domain of maximal attraction for any extreme value distribution. Despite this negative finding, C. W. Anderson showed that for a class of discrete distributions including the negative binomial class, it is possible to asymptotically bound the distribution of the maximum. In this paper we extend Anderson's result to discrete-valued processes satisfying the usual mixing conditions for extreme value results for dependent stationary processes. We apply our result to obtain bounds for the distribution of the maximum based on negative binomial autoregressive processes introduced by E. McKenzie and by Al-Osh and Alzaid. A simulation study illustrates the bounds for small sample sizes.
This paper deals with a class of discrete-time Markov chains for which the invariant measures can be expressed in terms of generalized continued fractions. The representation covers a wide class of stochastic models and is well suited for numerical applications. The results obtained can easily be extended to continuous-time Markov chains.
We study conditions for the existence of non-trivial quasi-stationary distributions for the birth-and-death chain with 0 as absorbing state. We reduce our problem to one of continued fractions, which can be solved by using extensions of classical results of this theory. We also prove that there exist normalized quasi-stationary distributions if and only if 0 is geometrically absorbing.
We shall establish functional limit laws for the concentration of the various species in simple chemical reactions. These results allow us to conclude that, under quite general conditions, the concentration has an approximate normal distribution. We provide estimates for the mean and the variance which are valid at all stages of the reaction, in particular, the non-equilibrium phase. We also provide a detailed comparison of our results with the earlier work of Dunstan and Reynolds ([7], [8]).
The paper puts forward steady-state Markov chain models for the Heine and Euler distributions. The models for oil exploration strategies that were discussed by Benkherouf and Bather (1988) are reinterpreted as current-age models for discrete renewal processes. Steady-state success-runs processes with non-zero probabilities that a trial is abandoned, Foster processes, and equilibrium random walks corresponding to elective M/M/1 queues are also examined.
In this paper we show that to each distance d defined on the finite state space S of a strongly ergodic Markov chain there corresponds a coefficient ρd of ergodicity based on the Wasserstein metric. For a class of stochastically monotone transition matrices P, the infimum over all such coefficients is given by the spectral radius of P − R, where R = limk Pk, and this infimum is attained. This result has a probabilistic interpretation as a control of the speed of convergence of the chain by the metric d, and is linked to the second eigenvalue of P.
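A quick numerical check of the stated identity, with an example matrix chosen here rather than taken from the paper: the sketch below approximates R = limk Pk by a high matrix power and compares the spectral radius of P − R with the modulus of the second-largest eigenvalue of P.

```python
import numpy as np

# An illustrative stochastically monotone transition matrix P (its rows increase
# in stochastic order); this particular matrix is an assumption made for the example.
P = np.array([[0.6, 0.3, 0.1],
              [0.3, 0.4, 0.3],
              [0.1, 0.3, 0.6]])

R = np.linalg.matrix_power(P, 200)                     # numerical stand-in for R = lim_k P^k
spectral_radius = max(abs(np.linalg.eigvals(P - R)))
second_eigenvalue = sorted(abs(np.linalg.eigvals(P)))[-2]

print("spectral radius of P - R:", spectral_radius)
print("|second eigenvalue of P|:", second_eigenvalue)  # the two quantities coincide
```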
We generalize a two-type mutation process in which particles reproduce by binary fission, inheriting the parental type, but which can mutate with small probability during their lifetimes to the opposite type. The generalization allows an arbitrary offspring distribution. The branching process structure of this scheme is exploited to obtain a variety of limit theorems, some of which extend known results for the binary case. In particular, practically usable asymptotic normality results are obtained when the initial population size is large.
We provide a general framework for interconnecting a collection of quasi-reversible nodes in such a way that the resulting process exhibits a product-form invariant measure. The individual nodes can be quite general, although some degree of internal balance will be assumed. Any of the nodes may possess a feedback mechanism. Indeed, we pay particular attention to a class of feedback queues, characterized by the fact that their state description allows one to maintain a record of the order in which events occur. We also examine in some detail the problem of determining for which values of the arrival rates a node does exhibit quasi-reversibility.
We consider an increasing supercritical branching process in a random environment and obtain bounds on the Laplace transform and distribution function of the limiting random variable. There are two possibilities that can be distinguished depending on the nature of the component distributions of the environment. If the minimum family size of each is 1, the growth will be as a power depending on a parameter α. If the minimum family sizes of some are greater than 1, it will be exponential, depending on a parameter γ. We obtain bounds on the distribution function analogous to those found for the simple Galton-Watson case. It is not possible to obtain exact estimates and we are only able to obtain bounds to within ε of the parameters.
In this paper we connect various topological and probabilistic forms of stability for discrete-time Markov chains. These include tightness on the one hand and Harris recurrence and ergodicity on the other. We show that these concepts of stability are largely equivalent for a major class of chains (chains with continuous components), or if the state space has a sufficiently rich class of appropriate sets (‘petite sets’).
We use a discrete formulation of Dynkin's formula to establish unified criteria for these stability concepts, through bounding of moments of first entrance times to petite sets. This gives a generalization of Lyapunov–Foster criteria for the various stability conditions to hold. Under these criteria, ergodic theorems are shown to be valid even in the non-irreducible case. These results allow a more general test function approach for determining rates of convergence of the underlying distributions of a Markov chain, and provide strong mixing results and new versions of the central limit theorem and the law of the iterated logarithm.
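As a toy illustration of the kind of drift criterion referred to above (checked numerically for a simple chain rather than derived from the paper's discrete Dynkin formula), the sketch below verifies a one-step Foster–Lyapunov inequality for a reflected random walk on the non-negative integers; the chain, the test function V and the exceptional set are assumptions made for the example.

```python
import numpy as np

# Reflected random walk on {0, 1, 2, ...}: from x > 0 move up with probability p
# and down with probability 1 - p; from 0 move up with probability p, else stay.
p = 0.3                        # p < 1/2 gives negative drift (assumed example)
V = lambda x: float(x)         # linear test function

def one_step_drift(x):
    """E[V(X_1) | X_0 = x] - V(x) for the reflected walk."""
    if x == 0:
        return p * V(1) + (1 - p) * V(0) - V(0)
    return p * V(x + 1) + (1 - p) * V(x - 1) - V(x)

# Outside the finite set C = {0} the drift is <= -(1 - 2p) < 0, while it stays
# bounded on C: the shape of inequality a Foster-Lyapunov criterion requires.
for x in range(10):
    print(x, one_step_drift(x))
```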
The classical one-compartment model with no input or pure death process is shown to be a limiting case of a ‘binomial cascade’ model which has the same mean and in which particles exit the compartment in binomial clusters. The transition probabilities of the binomial cascade process are derived in closed form. The model is easily modified to allow Poisson input into the compartment. Distributional results are given for this model also. In particular, it is shown that the M/M/∞ queue is a limiting case.
In this paper we introduce and define for the first time the concept of a non-homogeneous semi-Markov system (NHSMS). The problem of finding the expected population structure is studied and a method is provided in order to find it in closed analytic form with the basic parameters of the system. Moreover, the problem of the expected duration structure in the state is studied. It is also proved that all expected duration structures maintainable by recruitment control belong to a convex set, the vertices of which are specified. Finally an illustration is provided of the present results in a manpower system.
The M/G/1 queue with batch arrivals and a queueing discipline which is a generalization of processor sharing is studied by means of Crump–Mode–Jagers branching processes. A number of theorems are proved, including investigation of heavy traffic and overloaded queues. Most of the results obtained are also new for the M/G/1 queue with processor sharing. By use of a limiting procedure we also derive new results concerning M/G/1 queues with shortest residual processing time discipline.
We consider a Wiener process between a reflecting and an absorbing barrier and derive a series solution for the transition density of the process and for the density of the time to absorption. If the drift is towards the reflecting barrier, the variance is not too large, and the distance between the barriers is not too small, the leading term of the series derives from imaginary solutions of the basic eigenvalue equation of this problem. It is shown that these leading terms often make the dominant contribution to the complete series. Finally, we consider previous attempts by Fürth [3], Cox and Miller [1], and Goel and Richter-Dyn [5] to solve the stated problem and point out, in some detail, why their solutions are wrong.
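To make the setting concrete (this does not reproduce the series solution of the paper), the following Monte Carlo sketch simulates a Wiener process with drift towards a reflecting barrier at 0 by an Euler scheme with reflection and records the time to absorption at an upper barrier a; all numerical parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def absorption_time(mu=-0.5, sigma=1.0, a=2.0, x0=1.0, dt=1e-2, t_max=500.0):
    """One sample of the time to absorption at the upper barrier a for a Wiener
    process with drift mu (pointing towards the reflecting barrier at 0),
    simulated by an Euler scheme with reflection at 0."""
    x, t = x0, 0.0
    while t < t_max:
        x = abs(x + mu * dt + sigma * np.sqrt(dt) * rng.standard_normal())
        t += dt
        if x >= a:
            return t
    return np.inf              # not absorbed within the time horizon

times = np.array([absorption_time() for _ in range(500)])
finite = times[np.isfinite(times)]
print("estimated mean time to absorption:", finite.mean())
print("fraction absorbed within the horizon:", finite.size / times.size)
```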
If φ is a convex function and X a random variable then (by Jensen's inequality) ψφ(X) = Eφ(X) − φ(EX) is non-negative, and is 0 iff either φ is linear in the range of X or X is degenerate. So if φ is not linear then ψφ(X) is a measure of non-degeneracy of the random variable X. For φ(x) = x², ψφ(X) is simply the variance V(X), which is additive in the sense that V(X + Y) = V(X) + V(Y) if X and Y are uncorrelated. In this note it is shown that if φ″(·) is monotone non-increasing then ψφ is sub-additive for all (X, Y) such that EX ≧ 0, P(Y ≧ 0) = 1 and E(X | Y) = EX w.p.1, and is additive essentially only if φ is quadratic. Thus, it confirms the unique role of variance as a measure of non-degeneracy. An application to branching processes is also given.
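A small numerical check of the sub-additivity statement, using a convex φ whose second derivative is non-increasing and taking X independent of Y so that E(X | Y) = EX holds trivially; the particular φ and the discrete distributions are assumptions chosen for illustration.

```python
import numpy as np

phi = lambda x: (x + 1.0) * np.log(x + 1.0)   # convex; phi''(x) = 1/(x+1) is non-increasing

def psi(values, probs):
    """psi_phi(X) = E phi(X) - phi(EX) for a discrete random variable."""
    values, probs = np.asarray(values, float), np.asarray(probs, float)
    return probs @ phi(values) - phi(probs @ values)

# X and Y independent and non-negative, so EX >= 0, P(Y >= 0) = 1, E(X|Y) = EX.
xv, xp = np.array([0.0, 2.0]), np.array([0.5, 0.5])
yv, yp = np.array([0.0, 1.0, 3.0]), np.array([0.3, 0.4, 0.3])

# Distribution of X + Y under independence.
sv = (xv[:, None] + yv[None, :]).ravel()
sp = (xp[:, None] * yp[None, :]).ravel()

print("psi(X + Y)      :", psi(sv, sp))
print("psi(X) + psi(Y) :", psi(xv, xp) + psi(yv, yp))   # sub-additivity: first <= second
```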
In this paper, distributional questions which arise in certain mathematical finance models are studied: the distribution of the integral over a fixed time interval [0, T] of the exponential of Brownian motion with drift is computed explicitly, with the help of computations previously made by the author for Bessel processes. The moments of this integral are obtained independently and take a particularly simple form. A subordination result involving this integral and previously obtained by Bougerol is recovered and related to an important identity for Bessel functions. When the fixed time T is replaced by an independent exponential time, the distribution of the integral is shown to be related to last-exit-time distributions and the fixed time case is recovered by inverting Laplace transforms.
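As a sanity check on the simplest of these quantities (only the elementary first moment, not the distributional results of the paper), the sketch below simulates the integral over [0, T] of the exponential of Brownian motion with drift and compares its Monte Carlo mean with the closed-form first moment; the drift ν, the horizon T and the discretisation are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def exp_bm_integral(T=1.0, nu=0.5, n_steps=2000):
    """One sample of A_T = int_0^T exp(B_s + nu*s) ds, by a left Riemann sum
    over a discretised Brownian path."""
    dt = T / n_steps
    B = np.concatenate(([0.0], np.cumsum(rng.standard_normal(n_steps) * np.sqrt(dt))))
    s = np.linspace(0.0, T, n_steps + 1)
    return np.sum(np.exp(B[:-1] + nu * s[:-1])) * dt

T, nu = 1.0, 0.5
samples = np.array([exp_bm_integral(T, nu) for _ in range(4000)])

# The first moment is elementary: E A_T = int_0^T e^{(nu + 1/2)s} ds.
exact_mean = (np.exp((nu + 0.5) * T) - 1.0) / (nu + 0.5)
print("Monte Carlo mean of A_T :", samples.mean())
print("closed-form first moment:", exact_mean)
```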
The complete set of eigenvalues is found for the (unlabeled) infinitely-many-neutral-alleles diffusion model. The transition density for the process, originally derived by Griffiths, is rederived as an eigenfunction expansion.
This paper gives an overview of recurrence and ergodicity properties of a Markov chain. Two new notions for ergodicity and recurrence are introduced. They are called μ-geometric ergodicity and μ-geometric recurrence respectively. The first condition generalises geometric as well as strong ergodicity. Our key theorem shows that μ-geometric ergodicity is equivalent to weak μ-geometric recurrence. The latter condition is verified for the time-discretised two-centre open Jackson network. Hence, the corresponding two-dimensional Markov chain is μ-geometrically and geometrically ergodic, but not strongly ergodic. A consequence of μ-geometric ergodicity with μ of product-form is the convergence of the Laplace–Stieltjes transforms of the marginal distributions. Consequently all moments converge.
The existence of a class of multitype measure branching processes is deduced from a single-type model introduced by Li [8], which extends the work of Gorostiza and Lopez-Mimbela [5] and shows that the study of a multitype process can sometimes be reduced to that of a single-type one.