In this paper, we consider the question of which convergence properties of Markov chains are preserved under small perturbations. Properties considered include geometric ergodicity and rates of convergence. Perturbations considered include roundoff error from computer simulation. We are motivated primarily by interest in Markov chain Monte Carlo algorithms.
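A toy numerical sketch (a hypothetical illustration, not from the paper) of a roundoff-type perturbation: the contracting autoregressive chain below is geometrically ergodic, and rounding its state to a coarse grid at every step, which perturbs the transition kernel slightly, leaves the long-run average nearly unchanged. The chain, the grid width, and the function names are all assumptions of the sketch.

```python
import random

def ar1_mean(n, rho=0.5, grid=None, seed=0):
    """Long-run average of the chain X_{k+1} = rho*X_k + U_k, with U_k
    uniform on [-1, 1].  If `grid` is given, the state is rounded to a
    multiple of `grid` at every step, imitating roundoff error."""
    rng = random.Random(seed)
    x, total = 0.0, 0.0
    for _ in range(n):
        x = rho * x + rng.uniform(-1.0, 1.0)
        if grid is not None:
            x = round(x / grid) * grid   # the perturbation
        total += x
    return total / n

exact = ar1_mean(100_000)
rounded = ar1_mean(100_000, grid=0.01)   # same driving noise, perturbed kernel
```

Because the map is a contraction, the rounded and exact trajectories stay within a bounded distance of each other, so the two empirical averages agree closely.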
We show in this paper how the Laplace transform θ* of the duration θ of an excursion by the occupation process {Λt} of an M/M/∞ system above a given threshold can be obtained by means of continued fraction analysis. The representation of θ* by a continued fraction is established, and the [m−1/m] Padé approximants are computed by means of well-known orthogonal polynomials, namely the associated Charlier polynomials. It turns out that the continued fraction considered is an S-fraction and, as a consequence, is the Stieltjes transform of some spectral measure. Then, using classic asymptotic expansion properties of hypergeometric functions, the representation of the Laplace transform θ* by means of Kummer's function is obtained. This allows us to recover an earlier result obtained via complex analysis and the use of the strong Markov property satisfied by the occupation process {Λt}. The continued fraction representation enables us to further characterize the distribution of the random variable θ.
In this paper we consider a position–velocity Ornstein–Uhlenbeck process in an external gradient force field pushing it toward a smoothly embedded submanifold of Euclidean space. The force is chosen so that the submanifold is asymptotically stable for the associated deterministic flow. We examine the asymptotic behavior of the system when the force intensity diverges together with the diffusion and the damping coefficients, at appropriate speeds. We prove that, under some natural conditions on the initial data, the sequence of position processes is relatively compact, and that any limit process is constrained to the submanifold and satisfies an explicit stochastic differential equation which, for a compact submanifold, has a unique solution.
Dependability evaluation is a basic component in the assessment of the quality of repairable systems. We develop a model taking simultaneously into account the occurrence of failures and repairs, together with the observation of user-defined success events. The model is built from a Markovian description of the behavior of the system. We obtain the distribution function of the joint number of observed failures and of delivered services on a fixed mission period of the system. In particular, the marginal distribution of the number of failures can be directly related to the distribution of the Markovian arrival process extensively used in queueing theory. We give both the analytical expressions of the considered distributions and the algorithmic solutions for their evaluation. An asymptotic analysis is also provided.
We describe two chain-transformations which explain and extend identities for order statistics and quantiles proved by Wendel, Port and, more recently, by Dassios.
Recently, Asmussen and Koole (Journal of Applied Probability 30, pp. 365–372) showed that any discrete or continuous time marked point process can be approximated by a sequence of arrival streams modulated by finite state continuous time Markov chains. If the original process is customer (time) stationary then so are the approximating processes. Also, the moments in the stationary case converge. For discrete marked point processes we construct a sequence of discrete processes modulated by discrete time finite state Markov chains. All the above features of approximating sequences of Asmussen and Koole continue to hold. For discrete arrival sequences (to a queue) which are modulated by a countable state Markov chain we form a different sequence of approximating arrival streams by which, unlike in the Asmussen and Koole case, even the stationary moments of waiting times can be approximated. Explicit constructions for the output process of a queue and the total input process of a discrete time Jackson network with these characteristics are obtained.
This paper provides a detailed stochastic analysis of leucocyte cell movement based on the dynamics of a rigid body. The cell's behavior is studied in two relevant anisotropic environments displaying adhesion mediated movement (haptotaxis) and stimulus mediated movement (chemotaxis). This behavior is modeled by diffusion processes on three successively longer time scales, termed locomotion, translocation, and migration.
The gating mechanism of a single ion channel is usually modelled by a continuous-time Markov chain with a finite state space, partitioned into two classes termed ‘open’ and ‘closed’. It is possible to observe only which class the process is in. A burst of channel openings is defined to be a succession of open sojourns separated by closed sojourns all having duration less than t0. Let N(t) be the number of bursts commencing in (0, t]. Functionals of N(t), such as its asymptotic mean and variance, are then measures of the degree of temporal clustering of bursts. We develop two methods for determining the above measures. The first method uses an embedded Markov renewal process and remains valid when the underlying channel process is semi-Markov and/or brief sojourns in either the open or closed classes of states are undetected. The second method uses a ‘backward’ differential-difference equation.
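A minimal simulation of the burst count N(t), assuming for illustration a single open and a single closed state with exponential sojourns and unit rates (the abstract's setting allows general finite state spaces); all names and rate values below are hypothetical.

```python
import random

def count_bursts(t_end, q_open=1.0, q_closed=1.0, t0=0.5, seed=1):
    """Count bursts of a two-state open/closed channel up to time t_end:
    an opening starts a new burst exactly when the closed sojourn
    preceding it lasted at least t0 (the very first opening is counted
    only if preceded by a long closed gap -- a simplification)."""
    rng = random.Random(seed)
    t, bursts = 0.0, 0
    while True:
        gap = rng.expovariate(q_closed)        # closed sojourn
        t += gap
        if t >= t_end:
            break
        if gap >= t0:
            bursts += 1                        # long gap ended: new burst
        t += rng.expovariate(q_open)           # open sojourn
        if t >= t_end:
            break
    return bursts

n_bursts = count_bursts(1000.0)
```

With unit rates a closed sojourn exceeds t0 = 0.5 with probability e^(-0.5) ≈ 0.61, so over [0, 1000] one expects roughly 300 bursts out of about 500 openings.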
The observed channel process when brief sojourns are undetected can be modelled by an embedded Markov renewal process, whose kernel is shown, by exploiting connections with bursts when all sojourns are detected, to satisfy a differential-difference equation. This permits a unified derivation of both exact and approximate expressions for the kernel, and leads to a thorough asymptotic analysis of the kernel as the length of undetected sojourns tends to zero.
A new Markov process is introduced, describing growth or spread in two dimensions, via the aggregation of particles or the filling of cells. States of the process are configurations of part of the boundary of the growing aggregate, and transitions are captures or escapes of single particles. For suitably chosen transition rates, the process is dynamically reversible, leading to an explicit stationary distribution and a statistical description of the boundary. The growth rate is calculated and growth behaviour described. Different asymptotic relations between transition rates lead to different growth patterns or regimes. Besides the regimes familiar in polymer crystal growth, several new ones are described. The aggregate can have a porous structure resembling thin solid films deposited from vapour. Two measures of porosity, one for the boundary and one for the bulk, are calculated. The process is relevant to growing colonies of bacteria or the like, to the spread of epidemics and grass or forest fires, and to voter models.
We consider weak lumpability of finite homogeneous Markov chains: the property that the chain lumped with respect to a partition of the initial state space is again a homogeneous Markov chain. We show that weak lumpability is equivalent to the existence of a direct sum of polyhedral cones that is positively invariant under the transition probability matrix of the original chain. This allows us, in a unified way, to derive new results on lumpability of reducible Markov chains and to obtain spectral properties associated with lumpability.
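For contrast, the classical Kemeny and Snell criterion for strong lumpability, which weak lumpability relaxes, is straightforward to check in code: a partition is strongly lumpable precisely when all states within a block have equal total transition probability into every block. The matrices below are illustrative examples, not taken from the paper.

```python
def is_strongly_lumpable(P, partition):
    """Kemeny-Snell test: the partition is strongly lumpable for P iff,
    within each block, all states have the same total transition
    probability into every block (rounded to absorb float noise)."""
    for block in partition:
        for target in partition:
            row_sums = {round(sum(P[i][j] for j in target), 12) for i in block}
            if len(row_sums) > 1:
                return False
    return True

P = [[0.50, 0.25, 0.25],
     [0.25, 0.50, 0.25],
     [0.25, 0.25, 0.50]]
lumpable = is_strongly_lumpable(P, [[0], [1, 2]])        # True
Q = [[0.5, 0.5, 0.0],
     [0.2, 0.3, 0.5],
     [0.1, 0.1, 0.8]]
not_lumpable = is_strongly_lumpable(Q, [[0], [1, 2]])    # False
```

A chain can fail this row-sum test and still be weakly lumpable for particular initial distributions, which is exactly the situation the polyhedral-cone characterization addresses.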
Consider the following self-organizing rule called POS(i): after a book in the jth position of a shelf is borrowed, it is moved up one position if j ≤ i, and is moved to the ith position if j > i. This is a family of move-forward rules, with POS(1) being the move-to-front rule and POS(n − 1) being the transposition rule, where n is the number of books to be organized. We derive explicitly the stationary distribution under the POS(i) rule and show that its search cost compares favorably with that of the move-to-front rule under any book access probabilities p1, p2, ···, pn.
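The POS(i) rule is easy to simulate, and simulation gives a quick check of the search-cost comparison; the sketch below (with hypothetical access probabilities) estimates the stationary expected search cost by a long run.

```python
import random

def pos_search_cost(i, probs, steps=50_000, seed=2):
    """Estimate the stationary expected search cost of POS(i) (i is
    1-based, so i=1 is move-to-front and i=n-1 is transposition)."""
    rng = random.Random(seed)
    n = len(probs)
    shelf = list(range(n))
    total = 0
    for _ in range(steps):
        b = rng.choices(range(n), weights=probs)[0]
        j = shelf.index(b)                  # 0-based position of book b
        total += j + 1                      # search cost = 1-based position
        shelf.pop(j)
        # POS(i): move up one place if 1-based position <= i, else to i
        shelf.insert(max(j - 1, 0) if j + 1 <= i else i - 1, b)
    return total / steps

probs = [0.5, 0.2, 0.15, 0.1, 0.05]
mtf_cost = pos_search_cost(1, probs)        # move-to-front, POS(1)
transp_cost = pos_search_cost(4, probs)     # transposition, POS(n-1)
```

For these access probabilities the known closed-form move-to-front cost is about 2.4, which the simulation reproduces.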
A spatial process is considered in which two general birth-death processes are linked by migration of individuals. We examine conditions for weak symmetry and regularity, and develop necessary and sufficient conditions for recurrence. The results are easily extended to the k-process case.
In this paper we study a Volterra integral equation of the second kind, including two arbitrary continuous functions, in order to determine first-passage-time probability density functions through time-dependent boundaries for time-non-homogeneous one-dimensional diffusion processes with natural boundaries. These results generalize those which were obtained for time-homogeneous diffusion processes by Giorno et al. [3], and for some particular classes of time-non-homogeneous diffusion processes by Gutiérrez et al. [4], [5].
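Equations of this second kind can be solved numerically by a simple trapezoidal scheme; the generic sketch below is a standard textbook discretization, not the authors' method, and is checked against a case with a known closed-form solution.

```python
def solve_volterra2(g, K, T, n):
    """Trapezoidal-rule solver for the second-kind Volterra equation
    f(t) = g(t) + integral_0^t K(t, s) f(s) ds  on [0, T]:
    at each node t_k, solve for f(t_k) from the implicit trapezoid sum."""
    h = T / n
    t = [k * h for k in range(n + 1)]
    f = [g(t[0])]
    for k in range(1, n + 1):
        acc = g(t[k]) + h * (0.5 * K(t[k], t[0]) * f[0]
                             + sum(K(t[k], t[j]) * f[j] for j in range(1, k)))
        f.append(acc / (1.0 - 0.5 * h * K(t[k], t[k])))
    return t, f

# Sanity check on a case with a known solution: g = 1, K = 1 gives f(t) = e^t.
_, f = solve_volterra2(lambda t: 1.0, lambda t, s: 1.0, 1.0, 200)
```

The trapezoidal rule is second-order accurate, so on 200 subintervals the endpoint value agrees with e to well under one part in a thousand.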
Bidimensional processes defined by dx(t) = ρ(x, y)dt and dy(t) = m(x, y)dt + [2v(x, y)]^{1/2}dW(t), where W(t) is a Wiener process, are considered. Let T(x, y) be the first time the process (x(t), y(t)), starting from (x, y), hits the boundary of a given region in the plane. A theorem is proved that gives necessary and sufficient conditions for a given complex function to be considered as the moment generating function of T(x, y) for some bidimensional diffusion process. Examples are given where the theorem is used to construct explicit solutions to first hitting time problems and to compute the infinitesimal moments that correspond to the chosen moment generating function.
Recently Miyazawa and Taylor (1997) proposed a new class of queueing networks with batch arrival, batch service, and assemble-transfer features. In such networks customers arrive and are served in batches, and a batch may change size when it transfers from one node to another. With the assumption of an additional arrival process at each node when it is empty, they obtain a simple product-form steady-state probability distribution, which is a (stochastic) upper bound for the original network. This paper shows that this class of networks possesses a set of non-standard partial balance equations, and it is demonstrated that the condition of the additional arrival process introduced by Miyazawa and Taylor is there precisely to satisfy the partial balance equations, i.e. it is necessary and sufficient not only for having a product-form solution, but also for the partial balance equations to hold.
For Markov chains of M/G/1 type that are not skip-free to the left, the corresponding G matrix is shown to have special structure and to be determined by its first block row. An algorithm that takes advantage of this structure is developed for computing G. For non-skip-free M/G/1 type Markov chains, the algorithm significantly reduces the computational complexity of calculating the G matrix, when compared with reblocking to a system that is skip-free to the left and then applying the usual iteration schemes to find G. A similar algorithm to calculate the R matrix for G/M/1 type Markov chains that are not skip-free to the right is also described.
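In the simplest possible setting, blocks of size 1, the standard fixed-point iteration for G reduces to a scalar recursion; the sketch below is an illustration of that classical scheme, not the structured algorithm of the paper, and computes the minimal solution for a skip-free-to-the-left random walk.

```python
def g_fixed_point(A, tol=1e-12, max_iter=100_000):
    """Scalar (1x1 block) illustration of the classical iteration
    G_{m+1} = sum_k A[k] * G_m**k for M/G/1-type chains: A[k] is the
    probability of moving up k-1 levels (A[0] is the down-one step),
    and G is the probability of ever going down one level.  Starting
    from 0, the iteration converges to the minimal nonnegative root."""
    g = 0.0
    for _ in range(max_iter):
        new = sum(a * g**k for k, a in enumerate(A))
        if abs(new - g) < tol:
            return new
        g = new
    return g

# Random walk stepping down w.p. 0.3, holding w.p. 0.1, up w.p. 0.6:
# g solves g = 0.3 + 0.1*g + 0.6*g**2, whose minimal root is 0.5.
g = g_fixed_point([0.3, 0.1, 0.6])
```

The matrix analogue of this recursion is exactly what becomes expensive after reblocking a non-skip-free chain, which is the cost the paper's structured algorithm avoids.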
This paper is a study of the error in approximating the global maximum of a Brownian motion on the unit interval by observing the value at randomly chosen points. One point of view is to look at the error from random sampling for a given fixed Brownian sample path; another is to look at the error with both the path and observations random. In the first case we show that for almost all Brownian paths the error, normalized by multiplying by the square root of the number of observations, does not converge in distribution, while in the second case the normalized error does converge in distribution. We derive the limiting distribution of the normalized error averaged over all paths.
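The random-sampling scheme is easy to reproduce numerically: the sketch below (an illustration under assumed grid and seed choices) builds a discrete Brownian path, estimates its maximum from randomly chosen observation points, and returns the sampling error, which typically shrinks as more points are observed.

```python
import random, math

def max_sampling_error(n_obs, n_grid=4096, seed=3):
    """Simulate a Brownian path on a fine grid over [0, 1], estimate its
    maximum from n_obs uniformly chosen grid points, and return the
    (nonnegative) shortfall of the sampled maximum below the full-grid
    maximum."""
    rng = random.Random(seed)
    path, w = [0.0], 0.0
    step = math.sqrt(1.0 / n_grid)          # sd of each Gaussian increment
    for _ in range(n_grid):
        w += rng.gauss(0.0, step)
        path.append(w)
    true_max = max(path)
    sampled = max(path[rng.randrange(n_grid + 1)] for _ in range(n_obs))
    return true_max - sampled

err_200 = max_sampling_error(200)
err_2000 = max_sampling_error(2000)   # same path (same seed), more samples
```

With a fixed seed the path is identical in both calls and the first 200 observation points coincide, so the error with 2000 observations can never exceed the error with 200.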
In this paper the L²-convergence of a superadditive bisexual Galton–Watson branching process is studied. Necessary and sufficient conditions for the convergence of the suitably normed process are given. In the final section, a result about one of the most important bisexual models is proved.
Extreme value results for a class of shot noise processes with heavy-tailed amplitudes are considered. For a process driven by the points {τk} of a renewal process and i.i.d. amplitudes {Ak} whose distribution function has a regularly varying tail, the limiting behavior of the maximum is determined. The extremal index is computed, and any value in (0, 1) is possible. The associated two-dimensional point processes are shown to converge to a compound Poisson point process limit. As a corollary to this result, the joint limiting distribution of high local maxima is obtained.