A measure-valued diffusion approximation to a two-level branching structure was introduced in Dawson and Hochberg (1991) where it was shown that conditioned on non-extinction at time t, and appropriately rescaled, the process converges as t → ∞ to a non-trivial limiting distribution. Here we discuss a different approach to conditioning on non-extinction (popular in one-level branching) and relate the two limiting distributions.
This paper is concerned with the problem of estimation for the diffusion coefficient of a diffusion process on R, in a non-parametric situation. The drift function can be unknown and considered as a nuisance parameter. We propose an estimator of σ based on discrete observations of the diffusion X over a given finite time interval. We describe the asymptotic behaviour of this estimator when the step of discretization tends to zero. We prove consistency and asymptotic normality, the rate of convergence to the normal law being a random variable linked to the local time of the diffusion or to a suitable discrete approximation of it. This can also be interpreted as convergence to a mixture of normal laws.
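The simplest instance of such an estimator can be sketched numerically. The code below is an illustrative sketch only, not the paper's estimator: it assumes a constant diffusion coefficient and estimates it from the realized quadratic variation of a discretely observed Euler path; the drift sin(x) and all parameter values are arbitrary choices made for the demonstration.

```python
import math
import random

def estimate_sigma(path, dt):
    """Estimate a constant diffusion coefficient sigma from discrete
    observations via realized quadratic variation:
    sigma^2 is approximated by (sum of squared increments) / (total time)."""
    qv = sum((b - a) ** 2 for a, b in zip(path, path[1:]))
    return math.sqrt(qv / (dt * (len(path) - 1)))

# Simulate dX = b(X) dt + sigma dW by Euler's scheme.  The drift b is a
# nuisance parameter: its contribution to the quadratic variation
# vanishes as dt -> 0, so the estimator barely sees it.
random.seed(0)
sigma, dt, n = 0.7, 1e-4, 100_000
x, path = 0.0, [0.0]
for _ in range(n):
    x += math.sin(x) * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
    path.append(x)

est = estimate_sigma(path, dt)  # should be close to the true sigma = 0.7
```

The non-parametric setting of the paper (state-dependent σ) localizes the same idea with the diffusion's local time, which is what drives the mixed-normal limit.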
The tail behaviour of the limit of the normalized population size in the simple supercritical branching process, W, is studied. Most of the results concern those cases when a tail of the distribution function of W decays exponentially quickly. In essence, knowledge of the behaviour of transforms can be combined with some ‘large-deviation’ theory to get detailed information on the oscillation of the distribution function of W near zero or at infinity. In particular we show how an old result of Harris (1948) on the asymptotics of the moment-generating function of W translates to tail behaviour.
A two-parameter Ehrenfest urn model is derived according to the approach taken by Karlin and McGregor [7] where Krawtchouk polynomials are used. Furthermore, formulas for the mean passage times of finite homogeneous Markov chains with general tridiagonal transition matrices are given. In the special case of the Ehrenfest model they have quite a different structure as compared with those of Blom [2] or Kemperman [9].
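Mean passage times for a tridiagonal (birth-death) chain can be checked numerically. The sketch below is illustrative only: rather than the closed-form formulas of the paper, it solves the first-passage linear system directly by Gaussian elimination, and the two-ball classical Ehrenfest example is an assumption made for the demonstration.

```python
def mean_passage_times(p, q, target):
    """Mean first-passage times to state `target` for a birth-death chain
    on {0,...,N} with up-probabilities p[i] and down-probabilities q[i]
    (stay-probability 1 - p[i] - q[i]).  Solves the system
    (p[i]+q[i]) h[i] - p[i] h[i+1] - q[i] h[i-1] = 1, with h[target] = 0."""
    n = len(p)
    A = [[0.0] * n for _ in range(n)]
    b = [0.0] * n
    for i in range(n):
        if i == target:
            A[i][i] = 1.0          # h[target] = 0
            continue
        A[i][i] = p[i] + q[i]      # = 1 - stay probability
        if i + 1 < n:
            A[i][i + 1] = -p[i]
        if i - 1 >= 0:
            A[i][i - 1] = -q[i]
        b[i] = 1.0
    # naive Gaussian elimination with partial pivoting
    for k in range(n):
        piv = max(range(k, n), key=lambda r: abs(A[r][k]))
        A[k], A[piv] = A[piv], A[k]
        b[k], b[piv] = b[piv], b[k]
        for r in range(k + 1, n):
            f = A[r][k] / A[k][k]
            b[r] -= f * b[k]
            for c in range(k, n):
                A[r][c] -= f * A[k][c]
    h = [0.0] * n
    for k in range(n - 1, -1, -1):
        s = b[k] - sum(A[k][c] * h[c] for c in range(k + 1, n))
        h[k] = s / A[k][k]
    return h

# Classical Ehrenfest urn with N = 2 balls: p[i] = (N-i)/N, q[i] = i/N.
# By hand: h[1] = 3, h[0] = 1 + h[1] = 4.
h = mean_passage_times([1.0, 0.5, 0.0], [0.0, 0.5, 1.0], target=2)
```

Solving the system directly scales poorly compared with tridiagonal formulas, but it gives an independent check on any closed-form expression.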
Brownian flow systems, i.e. multidimensional Brownian motion with regulating barriers, can model queueing and inventory systems in which the behavior of different queues is correlated because of shared input processes. The behavior of such systems is typically difficult to describe exactly. We show how Brownian models of such systems, conditioned on one queue length exceeding a large value, decompose asymptotically into smaller subsystems. This conditioning induces a change in drift of the system's net input process and its components. The results here are analogous to results for jump-Markov queues recently obtained by Shwartz and Weiss. The Brownian setting leads to a simple description of the component processes' asymptotic behavior, as well as to explicit distributional results.
This paper considers the absorption of a non-decreasing compound Poisson process of finite order in a general upper boundary. The problem is relevant in fields such as risk theory, Kolmogorov–Smirnov statistics and sequential analysis. The probability of absorption and first-passage times are given in terms of a generating function which depends on the boundary only and can be computed readily. Absorption is certain or not according as the asymptotic slope of the boundary is less than or greater than the expected increase of the process in unit time. The case of the linear boundary is considered in detail.
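The dichotomy between the boundary's slope and the process's mean drift is easy to see by simulation. The sketch below is an illustrative Monte Carlo check, not the paper's generating-function method: it assumes the simplest non-decreasing compound Poisson process (unit jumps, i.e. a Poisson process) and a linear boundary, with all parameter values chosen for the demonstration.

```python
import random

def absorbed(rate, a, b, horizon, rng):
    """One path of a unit-jump Poisson process of the given rate; return
    True if it reaches the linear upper boundary a + b*t before `horizon`.
    (A general compound Poisson process would add a random jump size.)"""
    t, level = 0.0, 0
    while True:
        t += rng.expovariate(rate)  # next jump epoch
        if t >= horizon:
            return False
        level += 1                  # unit jump
        if level >= a + b * t:
            return True

rng = random.Random(42)
# Mean increase per unit time is rate * E[jump] = 1.0, which exceeds the
# boundary slope 0.5, so absorption is certain; a finite-horizon Monte
# Carlo estimate should therefore be very close to 1.
est = sum(absorbed(1.0, 2.0, 0.5, 200.0, rng) for _ in range(2000)) / 2000
```

Rerunning with a slope above the drift (say b = 1.5) gives an estimate visibly below 1, matching the stated dichotomy.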
This paper describes a set of stochastic processes that is useful for modeling and analyzing a new genetic mapping method. Some of the processes are Markov chains, and some are best described as functions of Markov chains. The central issue is boundary-crossing probabilities, which correspond to p-values for the existence of genes for particular traits. The methods elaborated by Aldous (1989) provide very accurate approximate p-values, as spot-checked against simulations.
The supercritical Markov branching process is examined in the case where the minimal version of the process has strictly substochastic transition laws. This provides a nice example of the general construction theory for discrete-state Markov processes.
Entrance laws corresponding to the minimal process are characterised. Limit properties of the processes constructed from these entrance laws are examined. All such processes which are honest and cannot hit zero are ergodic. Otherwise these processes are λ-positive and limit theorems conditional on not having left the positive states are given.
A connection is made with recent work on the general construction problem when a λ-subinvariant measure is given. The case where immigration is allowed is mentioned.
In this paper, optimal stopping problems for semi-Markov processes are studied in a fairly general setting. In such a process transitions are made from state to state in accordance with a Markov chain, but the amount of time spent in each state is random. The times spent in each state follow a general renewal process. They may depend on the present state as well as on the state into which the next transition is made.
Our goal is to maximize the expected net return, which is given as a function of the state at time t minus some cost function. Discounting may or may not be considered. The main theorems (Theorems 3.5 and 3.11) are expressions for the optimal stopping time in the undiscounted and discounted case. These theorems generalize results of Zuckerman [16] and Boshuizen and Gouweleeuw [3]. Applications are given in various special cases.
The results developed in this paper can also be applied to semi-Markov shock models, as considered in Taylor [13], Feldman [6] and Zuckerman [15].
During the last few years, several variants of P. Lévy's formula for the stochastic area of complex Brownian motion have been obtained. These are of interest in various domains of applied probability, particularly in relation to polymer studies. The method used by most authors is the diagonalization procedure of Paul Lévy.
Here we derive one such variant of Lévy's formula, due to Chan, Dean, Jansons and Rogers, via a change of probability method, which reduces the computation of Laplace transforms of Brownian quadratic functionals to the computations of the means and variances of some adequate Gaussian variables.
We then show that with the help of linear algebra and invariance properties of the distribution of Brownian motion, we are able to derive simply three other variants of Lévy's formula.
In this paper we extend the results of Meyn and Tweedie (1992b) from discrete-time parameter to continuous-parameter Markovian processes Φ evolving on a topological space.
We consider a number of stability concepts for such processes in terms of the topology of the space, and prove connections between these and standard probabilistic recurrence concepts. We show that these structural results hold for a major class of processes (processes with continuous components) in a manner analogous to discrete-time results, and that complex operations research models such as storage models with state-dependent release rules, or diffusion models such as those with hypoelliptic generators, have this property. Also analogous to discrete time, ‘petite sets’, which are known to provide test sets for stability, are here also shown to provide conditions for continuous components to exist.
New ergodic theorems for processes with irreducible and countably reducible skeleton chains are derived, and we show that when these conditions do not hold, then the process may be decomposed into an uncountable orbit of skeleton chains.
The steady-state analysis of a quasi-birth-death process is possible by matrix-geometric procedures in which the root of a quadratic matrix equation is found. A recent method that can be used for analyzing quasi-birth-death processes involves expanding the state space and using a linear matrix equation instead of the quadratic form. One of the difficulties of using the linear matrix equation approach concerns the boundary conditions and obtaining the norming equation. In this paper, we present a method for calculating the boundary values and use the operator-machine interference problem as a vehicle to compare the two approaches for solving quasi-birth-death processes.
A continuous-time Markov chain on the non-negative integers is called skip-free to the left (right) if the governing infinitesimal generator A = (aij) has the property that aij = 0 for j ≦ i − 2 (j ≧ i + 2). If a Markov chain is skip-free both to the left and to the right, it is called a birth-death process. Quasi-limiting distributions of birth-death processes have been studied in detail in their own right and from the standpoint of finite approximations. In this paper, we generalize, to some extent, results for birth-death processes to Markov chains that are skip-free to the left in continuous time. In particular the decay parameter of skip-free Markov chains is shown to have a similar representation to the birth-death case and a result on convergence of finite quasi-limiting distributions is obtained.
In Part I we developed stability concepts for discrete chains, together with Foster–Lyapunov criteria for them to hold. Part II was devoted to developing related stability concepts for continuous-time processes. In this paper we develop criteria for these forms of stability for continuous-parameter Markovian processes on general state spaces, based on Foster–Lyapunov inequalities for the extended generator.
Such test function criteria are found for non-explosivity, non-evanescence, Harris recurrence, and positive Harris recurrence. These results are proved by systematic application of Dynkin's formula.
We also strengthen known ergodic theorems, and especially exponential ergodic results, for continuous-time processes. In particular we are able to show that the test function approach provides a criterion for f-norm convergence, and bounding constants for such convergence in the exponential ergodic case.
We apply the criteria to several specific processes, including linear stochastic systems under non-linear feedback, work-modulated queues, general release storage processes and risk processes.
In this paper we introduce a multilevel birth-death particle system and consider its diffusion approximation, which can be characterized as an M(R+)-valued process. The tightness of rescaled processes is proved and we show that the limiting M(R+)-valued process is the unique solution of the M(R+)-valued martingale problem for the limiting generator. We also study the moment structures of the limiting diffusion process.
The skeleton of a (super-)critical Galton-Watson process with offspring mean 1 + r, r ≧ 0, and finite offspring variance, is considered. When r = 0 it is trivial. If r > 0 is small and the time unit is taken as α/r generations (α > 0) then the skeleton can be approximated by a Yule (linear pure birth) process of rate α. This approximation can be used to study the evolution of genetic types over a long period of time in an exponentially growing population.
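The approximating object is easy to simulate. The sketch below illustrates the limiting Yule process only, not the Galton-Watson skeleton itself: it simulates a linear pure birth process of rate α started from a single individual and checks the classical identity E[N(t)] = e^{αt} by Monte Carlo; the parameter values are chosen for the demonstration.

```python
import math
import random

def yule_population(alpha, t, rng):
    """Population size at time t of a Yule (linear pure birth) process
    of rate alpha, started from one individual: with n individuals
    alive, the next split occurs after an Exp(alpha * n) holding time."""
    n, clock = 1, 0.0
    while True:
        clock += rng.expovariate(alpha * n)
        if clock > t:
            return n
        n += 1

rng = random.Random(1)
alpha, t, runs = 1.0, 1.0, 20_000
mean = sum(yule_population(alpha, t, rng) for _ in range(runs)) / runs
# For a Yule process, E[N(t)] = exp(alpha * t); here exp(1) ~ 2.718.
```

In fact N(t) is geometrically distributed with success probability e^{−αt}, which is what makes the Yule approximation tractable for genealogical calculations.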
A new approach is used to obtain the transient probabilities of the M/M/1 queueing system. The first step of this approach deals with the generating function of the transient probabilities of the uniformized Markov chain associated with this queue. The second step consists of the inversion of this generating function. A new analytical expression of the transient probabilities of the M/M/1 queue is then obtained.
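The first step, uniformization, can be sketched numerically. The code below is an illustrative sketch, not the paper's method: instead of inverting a generating function it evaluates the Poisson-weighted series over powers of the uniformized chain directly, on a truncated state space, with illustrative parameter values and the queue started empty.

```python
import math

def mm1_transient(lam, mu, t, n_states=50, n_terms=200):
    """Transient state probabilities of an M/M/1 queue at time t via
    uniformization: p(t) = sum_n e^{-Lt} (Lt)^n / n! * p0 P^n, where P is
    the uniformized DTMC with rate L = lam + mu.  State space truncated
    to {0,...,n_states-1}; the queue starts empty."""
    L = lam + mu

    def step(p):  # one transition of the uniformized chain
        q = [0.0] * n_states
        for i, pi in enumerate(p):
            if pi == 0.0:
                continue
            up = lam / L
            down = (mu / L) if i > 0 else 0.0
            if i + 1 < n_states:
                q[i + 1] += pi * up
            else:
                q[i] += pi * up          # reflect at the truncation boundary
            if i > 0:
                q[i - 1] += pi * down
            q[i] += pi * (1.0 - up - down)
        return q

    p = [0.0] * n_states
    p[0] = 1.0                           # start empty
    out = [0.0] * n_states
    weight = math.exp(-L * t)            # Poisson(Lt) weight for n = 0
    for n in range(n_terms):
        out = [o + weight * pi for o, pi in zip(out, p)]
        p = step(p)
        weight *= L * t / (n + 1)
    return out

probs = mm1_transient(0.5, 1.0, t=10.0)
# Starting empty with rho = 0.5, probs[0] lies between the stationary
# value 1 - rho = 0.5 and the initial value 1.
```

Truncating the Poisson series at n_terms is safe once n_terms comfortably exceeds Lt; the analytical expression in the paper avoids both truncations.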
We consider lumpability for continuous-time Markov chains and provide a simple probabilistic proof of necessary and sufficient conditions for strong lumpability, valid in circumstances not covered by known theory. We also consider the following marginalisability problem. Let {X(t)} = {(X1(t), X2(t), ···, Xm(t))} be a continuous-time Markov chain. Under what conditions are the marginal processes {X1(t)}, {X2(t)}, ···, {Xm(t)} also continuous-time Markov chains? We show that this is related to lumpability and, if no two of the marginal processes can jump simultaneously, then they are continuous-time Markov chains if and only if they are mutually independent. Applications to ion channel modelling and birth-death processes are discussed briefly.
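The classical strong-lumpability condition (the one covered by known theory, not the paper's extension) is straightforward to check on a generator matrix: for every ordered pair of distinct blocks, the aggregate rate into the destination block must not depend on which state of the source block the chain occupies. A minimal sketch, with the example matrices chosen for illustration:

```python
def is_strongly_lumpable(Q, partition, tol=1e-12):
    """Classical strong-lumpability check for a CTMC generator Q (list of
    rows) and a partition (list of blocks of state indices): for each
    pair of distinct blocks A, B, the total rate sum_{j in B} Q[i][j]
    must be the same for every state i in A."""
    for A in partition:
        for B in partition:
            if A is B:
                continue
            rates = [sum(Q[i][j] for j in B) for i in A]
            if max(rates) - min(rates) > tol:
                return False
    return True

# Symmetric 3-state chain: lumping {1, 2} together works, since both
# states send total rate 1 to state 0.
Q_good = [[-2.0, 1.0, 1.0],
          [1.0, -2.0, 1.0],
          [1.0, 1.0, -2.0]]
ok = is_strongly_lumpable(Q_good, [[0], [1, 2]])

# Here states 0 and 2 send different total rates (2 vs 3) into {1},
# so the partition {{0, 2}, {1}} fails the condition.
Q_bad = [[-2.0, 2.0, 0.0],
         [1.0, -2.0, 1.0],
         [0.0, 3.0, -3.0]]
bad = is_strongly_lumpable(Q_bad, [[0, 2], [1]])
```

When the condition holds, the lumped process is itself a CTMC with block-to-block rates given by those common row sums.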
Quasi-birth-death processes are commonly used Markov chain models in queueing theory, computer performance, teletraffic modeling and other areas. We provide a new, simple algorithm for the matrix-geometric rate matrix. We demonstrate that it has quadratic convergence. We show theoretically and through numerical examples that it converges very fast and provides extremely accurate results even for almost unstable models.
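The object being computed can be illustrated in the simplest possible setting. The sketch below is not the paper's quadratically convergent algorithm: it is the classical linearly convergent functional iteration for the minimal solution of the quadratic equation A0 + R A1 + R² A2 = 0, shown for 1×1 blocks (scalars), where an M/M/1 queue gives R = λ/μ; real use requires matrix arithmetic in place of the scalar operations.

```python
def qbd_rate_scalar(a0, a1, a2, tol=1e-12, max_iter=10_000):
    """Minimal solution R of a0 + R*a1 + R^2*a2 = 0 by the classical
    functional iteration R <- -(a0 + R^2*a2)/a1, starting from R = 0.
    With 1x1 blocks the 'matrices' are plain scalars."""
    R = 0.0
    for _ in range(max_iter):
        R_next = -(a0 + R * R * a2) / a1
        if abs(R_next - R) < tol:
            return R_next
        R = R_next
    return R

# M/M/1 as a QBD with scalar blocks: A0 = lam (up), A1 = -(lam + mu)
# (local), A2 = mu (down).  The quadratic has roots lam/mu and 1; the
# iteration converges monotonically to the minimal root rho = lam/mu.
lam, mu = 0.5, 1.0
R = qbd_rate_scalar(lam, -(lam + mu), mu)
```

The convergence rate of this iteration degrades as the model approaches instability (the two roots coalesce at ρ = 1), which is precisely the regime where quadratically convergent algorithms such as the one in the paper pay off.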
A stochastic process, called reallocatable GSMP (RGSMP for short), is introduced in order to study insensitivity of its stationary distribution. RGSMP extends GSMP with interruptions, and is applicable to a wide range of queues, from standard models such as BCMP and Kelly's network queues to new ones such as their modifications with interruptions and Serfozo's (1989) non-product-form network queues, and can be used to study their insensitivity in a unified way. We prove that RGSMP supplemented by the remaining lifetimes is product-form decomposable, i.e. its stationary distribution splits into independent components, if and only if a version of the local balance equations holds, which implies insensitivity of the RGSMP scheme in a certain extended sense. Various examples of insensitive queues are given, which include new results. Our proofs are based on the characterization of a stationary distribution for the SCJP (self-clocking jump process) of Miyazawa (1991).