A class of non-linear stochastic models is introduced. Phase transitions, critical points and the domain of attraction are discussed in detail for some concrete examples. For one of the examples, an explicit expression for the domain of attraction and the rates of convergence are obtained.
In this paper we extend the class of zero-order threshold autoregressive models to a much richer class of mixture models. The new class has the important property of duality which, as we show, corresponds to time reversal. We are then able to obtain the time reversals of the zero-order threshold models and to characterise the time-reversible members of this subclass. These turn out to be quite trivial. The time-reversible models of the more general class do not suffer in this way. The complete stationary distributional structure is given, as are various moments, in particular the autocovariance function. This is shown to be of ARMA type. Finally we give two examples, the second of which extends from the finite to the countable mixture case. The general theory for this extension will be given elsewhere.
In this paper, we discuss a variety of methods for computing the Wiener–Hopf factorization of a finite Markov chain associated with a fluctuating additive functional. The importance of this is that the equilibrium law of a fluid model can be expressed in terms of these Wiener–Hopf factors. The diagonalization methods considered are actually quite efficient, and provide an effective solution to the problem.
Krafft and Schaefer [14] considered a two-parameter Ehrenfest urn model and found the n-step transition probabilities using representations by Krawtchouk polynomials. For a special case of the model Palacios [17] calculated some of the expected first-passage times. This note investigates a generalization of the two-parameter Ehrenfest urn model where the transition probabilities pi,i+1 and pi,i−1 are allowed to be quadratic functions of the current state i. The approach used in this paper is based on the integral representations of Karlin and McGregor [9] and can also be used for Markov chains with an infinite state space.
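As a rough illustration of the kind of chain described above (not the integral-representation approach of the paper), the following sketch simulates a birth-death chain on {0, ..., N} whose up and down transition probabilities are quadratic in the current state; the particular quadratic form is an assumption made purely for the example.

```python
import numpy as np

def simulate_quadratic_ehrenfest(N=20, steps=50_000, seed=0):
    """Simulate a birth-death chain on {0, ..., N} whose transition
    probabilities p(i, i+1) and p(i, i-1) are quadratic in i.
    The quadratic form used here is only an illustrative choice."""
    rng = np.random.default_rng(seed)
    i = N // 2
    counts = np.zeros(N + 1, dtype=int)
    for _ in range(steps):
        # Quadratic pull toward the centre; at i = 0 the chain must move up,
        # at i = N it must move down, so no boundary special-casing is needed.
        p_up = (N - i) ** 2 / ((N - i) ** 2 + i ** 2)
        i = i + 1 if rng.random() < p_up else i - 1
        counts[i] += 1
    return counts / steps  # empirical occupation frequencies

if __name__ == "__main__":
    freqs = simulate_quadratic_ehrenfest()
    print("mode of empirical distribution:", int(np.argmax(freqs)))
```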
For a Markov chain in dimension n = 2, drift vectors (conditional expected jumps) on the interior and the boundaries distinguish between recurrence and transience. The result of this paper is that the analogous proposition fails in the case n = 3.
In [14] a necessary and sufficient condition was obtained for there to exist uniquely a Q-process with a specified invariant measure, under the assumption that Q is a stable, conservative, single-exit matrix. The purpose of this note is to demonstrate that, for an arbitrary stable and conservative q-matrix, the same condition suffices for the existence of a suitable Q-process, but that this process might not be unique. A range of examples is considered, including pure-birth processes, a birth process with catastrophes, birth-death processes and the Markov branching process with immigration.
We consider the two-dimensional process {X(t), V(t)} where {V(t)} is Brownian motion with drift, and {X(t)} is its integral. In this note we derive the joint density function of T and V(T) where T is the time at which the process {X(t)} first returns to its initial value. A series expansion of the marginal density of T is given in the zero-drift case. When V(0) and the drift are both positive there is a positive probability that {X(t)} never returns to its initial value. We show how this probability grows for small drift. Finally, using the Kontorovich–Lebedev transform pair we obtain the escape probability explicitly for arbitrary values of the drift parameter.
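A minimal Monte Carlo sketch of this process, using a crude Euler discretisation; the parameter values and the finite horizon standing in for 'escape' are assumptions for illustration only, not the paper's analytical results.

```python
import numpy as np

def first_return_sample(v0=1.0, mu=0.5, dt=1e-2, t_max=20.0, seed=None):
    """Euler simulation of V (Brownian motion with drift mu) and its
    integral X, started at X(0)=0, V(0)=v0.  Returns (T, V(T)) for the
    first return of X to 0, or (None, None) if no return before t_max
    (a crude stand-in for 'escape').  Purely illustrative."""
    rng = np.random.default_rng(seed)
    x, v, t = 0.0, v0, 0.0
    while t < t_max:
        v += mu * dt + np.sqrt(dt) * rng.standard_normal()
        x += v * dt
        t += dt
        if x <= 0.0:          # first return of the integral to its start
            return t, v
    return None, None

if __name__ == "__main__":
    runs = [first_return_sample(seed=s) for s in range(500)]
    escaped = sum(1 for T, _ in runs if T is None)
    print("fraction with no return before t_max:", escaped / len(runs))
```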
Consider the convex hull of n independent, identically distributed points in the plane. Functionals of interest are the number of vertices Nn, the perimeter Ln and the area An of the convex hull. We study the asymptotic behaviour of these three quantities when the points are standard normally distributed. In particular, we derive the variances of Nn, Ln and An for large n and prove a central limit theorem for each of these random variables. We enlarge on a method developed by Groeneboom (1988) for uniformly distributed points supported on a bounded planar region. The process of vertices of the convex hull is of central importance. Poisson approximation and martingale techniques are used.
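The three functionals can be computed directly for simulated Gaussian samples, e.g. with scipy.spatial.ConvexHull; this is only a way to visualise the quantities studied, not the asymptotic analysis itself.

```python
import numpy as np
from scipy.spatial import ConvexHull

def hull_functionals(n, seed=0):
    """Number of vertices N_n, perimeter L_n and area A_n of the convex
    hull of n i.i.d. standard normal points in the plane."""
    rng = np.random.default_rng(seed)
    pts = rng.standard_normal((n, 2))
    hull = ConvexHull(pts)
    # For 2-D input, scipy reports the perimeter as `area` and the
    # enclosed area as `volume`.
    return len(hull.vertices), hull.area, hull.volume

if __name__ == "__main__":
    for n in (100, 1_000, 10_000):
        N_n, L_n, A_n = hull_functionals(n, seed=n)
        print(f"n={n:6d}  vertices={N_n:3d}  perimeter={L_n:.2f}  area={A_n:.2f}")
```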
The paper deals with large trunk line systems of the type appearing in telephone networks. There are many nodes or input sources, each pair of which is connected by a trunk line containing many individual circuits. Traffic arriving at either end of a trunk line wishes to communicate to the node at the other end. If the direct route is full, a rerouting might be attempted via an alternative route containing several trunks and connecting the same endpoints. The basic questions concern whether to reroute, and if so how to choose the alternative path. If the network is ‘large’ and fully connected, then the overflow traffic which is offered for rerouting to any trunk comes from many other trunks in the network with no one dominating. In this case one expects that some sort of averaging method can be used to approximate the rerouting requests and hence simplify the analysis. Essentially, the overflow traffic that a trunk offers the network for rerouting is in some average sense similar to the overflow traffic offered to that trunk. Indeed, a formalization of this idea involves the widely used (but generally heuristic) ‘fixed point’ approximation method. One sets up the fixed point equations for appropriate rerouting strategies and then solves them to obtain an approximation to the system loss. In this paper we work in the heavy traffic regime, where the external offered traffic to any trunk is close to the service capacity of that trunk. It is shown that, as the number of links and circuits within each link go to infinity and for a variety of rerouting strategies, the system can be represented by an averaged limit. This limit is a reflected diffusion of the McKean–Vlasov (propagation of chaos) type, where the driving terms depend on the mean values of the solution of the equation. The averages occur due to the symmetry of the network and the averaging effects of the many interactions. This provides a partial justification for the fixed point method. The concrete dynamical systems flavor of the approach and the representations of the limit processes provide a useful way of visualizing the system and promise to be useful for the development of numerical methods and further analysis.
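For orientation, the following sketch solves the classical symmetric fixed-point (Erlang) equation for a fully connected network in which blocked calls attempt one two-link alternative route; the specific equation B = E(ν + 2νB(1 − B), C) is a standard textbook form and is assumed here only as an illustration of the fixed-point method mentioned above, not as the rerouting strategies analysed in the paper.

```python
def erlang_b(rho, C):
    """Erlang-B blocking probability for offered load rho on C circuits,
    computed with the standard stable recursion."""
    b = 1.0
    for k in range(1, C + 1):
        b = rho * b / (k + rho * b)
    return b

def symmetric_fixed_point(nu, C, iters=500, damp=0.5):
    """Assumed symmetric fixed-point equation B = E(nu + 2*nu*B*(1-B), C)
    for one two-link alternative route, solved by damped iteration."""
    B = 0.5
    for _ in range(iters):
        B_new = erlang_b(nu + 2 * nu * B * (1 - B), C)
        B = damp * B + (1 - damp) * B_new   # damping helps convergence
    return B

if __name__ == "__main__":
    # Heavy-traffic flavour: offered load close to capacity.
    print("fixed-point blocking estimate:", symmetric_fixed_point(nu=95.0, C=100))
```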
We have two aims in this paper. First, we generalize the well-known theory of matrix-geometric methods of Neuts to more complicated Markov chains. Second, we use the theory to solve a last-come-first-served queue with a generalized preemptive resume (LCFS-GPR) discipline. The structure of the Markov chain considered in this paper is one in which one of the variables can take values in a countable set, which is arranged in the form of a tree. The other variable takes values from a finite set. Each node of the tree can branch out into d other nodes. The steady-state solution of this Markov chain has a matrix product-form, which can be expressed as a function of d matrices R1, ···, Rd. We then use this theory to solve a multiclass LCFS-GPR queue, in which the service times have PH-distributions and arrivals are according to a Markov-modulated Poisson process. In this discipline, when a customer's service is preempted in phase j (due to a new arrival), the resumption of service at a later time could take place in a phase which depends on j. We also obtain a closed form solution for the stationary distribution of an LCFS-GPR queue when the arrivals are Poisson. This result generalizes the known result on an LCFS preemptive resume queue, which can be obtained from Kelly's symmetric queue.
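For comparison, a minimal sketch of the classical level-independent case: Neuts' successive-substitution iteration for the matrix R of an ordinary quasi-birth-death chain (the tree-structured generalization of the paper replaces this single R by the matrices R1, ···, Rd). The example generator blocks describe an assumed Markov-modulated M/M/1 queue.

```python
import numpy as np

def neuts_R(A0, A1, A2, tol=1e-12, max_iter=10_000):
    """Successive substitution for the minimal nonnegative solution R of
    A0 + R A1 + R^2 A2 = 0 (level-independent QBD); a sketch of the
    classical method, not the tree-structured extension of the paper."""
    A1_inv = np.linalg.inv(A1)
    R = np.zeros_like(A0)
    for _ in range(max_iter):
        R_new = -(A0 + R @ R @ A2) @ A1_inv
        if np.max(np.abs(R_new - R)) < tol:
            return R_new
        R = R_new
    raise RuntimeError("R iteration did not converge")

if __name__ == "__main__":
    lam, mu, sig = np.array([1.0, 3.0]), 3.0, np.array([1.0, 1.0])
    A0 = np.diag(lam)                                        # level up (arrivals)
    A2 = mu * np.eye(2)                                      # level down (services)
    A1 = np.array([[-sig[0], sig[0]],
                   [sig[1], -sig[1]]]) - np.diag(lam + mu)   # within-level
    R = neuts_R(A0, A1, A2)
    print("spectral radius of R (< 1 for stability):",
          max(abs(np.linalg.eigvals(R))))
```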
We present two sufficient conditions for detection of optimal and non-optimal actions in (ergodic) average-cost MDPs. They are easily interpreted and can be implemented as detection tests in both policy iteration and linear programming methods. An efficient implementation of a recent new policy iteration scheme is discussed.
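A generic sketch of policy iteration for a finite unichain average-reward MDP, to fix notation; the detection tests of the paper are not reproduced here, and the two-state example data are assumed.

```python
import numpy as np

def average_reward_policy_iteration(P, r, max_iter=100):
    """Policy iteration for a finite, unichain, average-reward MDP.
    P[a] is the transition matrix under action a, r[a] the reward vector.
    A textbook scheme, not the specific detection tests of the paper."""
    n_actions, n_states = r.shape
    policy = np.zeros(n_states, dtype=int)
    for _ in range(max_iter):
        # Policy evaluation: solve h + g*1 = r_pi + P_pi h with h[0] = 0.
        P_pi = P[policy, np.arange(n_states)]
        r_pi = r[policy, np.arange(n_states)]
        A = np.zeros((n_states + 1, n_states + 1))
        A[:n_states, :n_states] = np.eye(n_states) - P_pi
        A[:n_states, n_states] = 1.0     # coefficient of the gain g
        A[n_states, 0] = 1.0             # normalisation h[0] = 0
        b = np.concatenate([r_pi, [0.0]])
        sol = np.linalg.solve(A, b)
        h, g = sol[:n_states], sol[n_states]
        # Policy improvement using the bias h.
        q = r + P @ h                    # q[a, s] = r(s,a) + sum_s' P(s'|s,a) h(s')
        new_policy = np.argmax(q, axis=0)
        if np.array_equal(new_policy, policy):
            return policy, g, h
        policy = new_policy
    return policy, g, h

if __name__ == "__main__":
    # A tiny assumed two-state, two-action example.
    P = np.array([[[0.9, 0.1], [0.2, 0.8]],      # action 0
                  [[0.5, 0.5], [0.6, 0.4]]])     # action 1
    r = np.array([[1.0, 0.0],
                  [0.5, 2.0]])
    policy, gain, bias = average_reward_policy_iteration(P, r)
    print("optimal policy:", policy, " long-run average reward:", gain)
```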
This paper shows how to calculate solutions to Poisson's equation for the waiting time sequence of the recurrent M/G/1 queue. The solutions are used to construct martingales that permit us to study additive functionals associated with the waiting time sequence. These martingales provide asymptotic expressions for the mean of additive functionals that reflect dependence on the initial state of the process. In addition, we show how to explicitly calculate the scaling constants that appear in the central limit theorems for additive functionals of the waiting time sequence.
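A minimal simulation sketch of the waiting-time sequence itself (the Lindley recursion), which is the chain whose additive functionals are studied; the Gamma service-time distribution and traffic intensity are assumptions for illustration.

```python
import numpy as np

def mg1_waiting_times(lam=0.8, n=200_000, seed=0):
    """Simulate the Lindley recursion W_{k+1} = max(W_k + S_k - A_k, 0)
    for an M/G/1 queue with Poisson(lam) arrivals and, as an assumed
    example, Gamma-distributed service times with mean 1."""
    rng = np.random.default_rng(seed)
    A = rng.exponential(1.0 / lam, n)             # interarrival times
    S = rng.gamma(shape=2.0, scale=0.5, size=n)   # service times, mean 1
    W = np.empty(n)
    w = 0.0
    for k in range(n):
        W[k] = w
        w = max(w + S[k] - A[k], 0.0)
    return W

if __name__ == "__main__":
    W = mg1_waiting_times()
    # Additive functional f(W_k) = W_k: Monte Carlo estimate of its mean.
    print("estimated stationary mean waiting time:", W.mean())
```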
In this research, we present a statistical theory, and an algorithm, to identify one-pixel-wide closed object boundaries in gray-scale images. Closed-boundary identification is an important problem because boundaries of objects are major features in images. In spite of this, most statistical approaches to image restoration and texture identification place inappropriate stationary model assumptions on the image domain. One way to characterize the structural components present in images is to identify one-pixel-wide closed boundaries that delineate objects. By defining a prior probability model on the space of one-pixel-wide closed boundary configurations and appropriately specifying transition probability functions on this space, a Markov chain Monte Carlo algorithm is constructed that theoretically converges to a statistically optimal closed boundary estimate. Moreover, this approach ensures that any approximation to the statistically optimal boundary estimate will have the necessary property of closure.
We study a bivariate stochastic process {X(t)} = {(XE(t), Z(t))}, where {XE(t)} is a continuous-time Markov chain describing the environment and {Z(t)} is the process of interest. In the context which motivated this study, {Z(t)} models the gating behaviour of a single ion channel. It is assumed that, given {XE(t)}, the channel process {Z(t)} is a continuous-time Markov chain with infinitesimal generator at time t dependent on XE(t), and that the environment process {XE(t)} is not dependent on {Z(t)}. We derive necessary and sufficient conditions for {X(t)} to be time reversible, showing that then its equilibrium distribution has a product form which reflects independence of the state of the environment and the state of the channel. In the special case when the environment controls the speed of the channel process, we derive transition probabilities and sojourn time distributions for {Z(t)} by exploiting connections with Markov reward processes. Some of these results are extended to a stationary environment. Applications to problems arising in modelling multiple ion channel systems are discussed. In particular, we present ways in which a multichannel model in a random environment does and does not exhibit behaviour identical to a corresponding model based on independent and identically distributed channels.
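A minimal Gillespie-style sketch of such a bivariate process, in the special case where a two-state environment only rescales the speed of a two-state (closed/open) channel; all rate values are assumed for illustration.

```python
import numpy as np

def simulate_modulated_channel(t_end=1_000.0, seed=0):
    """Simulate a two-state channel Z (0 = closed, 1 = open) whose
    switching rates are scaled by a two-state environment XE; the
    environment evolves independently of the channel.  All rate values
    below are assumptions made purely for illustration."""
    rng = np.random.default_rng(seed)
    env_rates = np.array([0.5, 0.5])      # environment switching rates
    speed = np.array([1.0, 5.0])          # environment-dependent speed factor
    open_rate, close_rate = 2.0, 3.0      # base channel rates
    t, e, z = 0.0, 0, 0
    open_time = 0.0
    while t < t_end:
        r_env = env_rates[e]
        r_chan = speed[e] * (open_rate if z == 0 else close_rate)
        total = r_env + r_chan
        dt = rng.exponential(1.0 / total)
        if t + dt >= t_end:                # censor at the horizon
            if z == 1:
                open_time += t_end - t
            break
        if z == 1:
            open_time += dt
        t += dt
        if rng.random() < r_env / total:
            e = 1 - e                      # environment jumps
        else:
            z = 1 - z                      # channel opens/closes
    return open_time / t_end

if __name__ == "__main__":
    print("fraction of time the channel is open:", simulate_modulated_channel())
```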
We prove that the quasi-invariant measures associated with a Brownian motion with negative drift X form a one-parameter family. The minimal one is a probability measure inducing the transition density of a three-dimensional Bessel process, and it is shown to be the density of the limit distribution limt→∞ Px(Xt ∈ A | τ > t). It is also shown that the minimal quasi-invariant measure of infinite mass induces the density of a limit distribution which is the law of a Bessel process with drift.
In the theory of homogeneous Markov chains, states are classified according to their connectivity to other states and this classification leads to a classification of the Markov chains themselves. In this paper we classify Markov set-chains analogously, particularly into ergodic, regular, and absorbing Markov set-chains. A weak law of large numbers is developed for regular Markov set-chains. Examples are used to illustrate analysis of behavior of Markov set-chains.
The Jackson network under study receives batch arrivals at i.i.d. intervals and features Markovian routing and exponentially distributed service times. The system is shown to be stable, in the sense of not being overloaded, if and only if, for each node, the total arrival rate of external and internal customers is less than the service rate. The method of proof is of more general interest.
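The stability criterion above can be made concrete by solving the usual traffic equations for the total arrival rate at each node; the three-node data below are assumed purely as an example.

```python
import numpy as np

def effective_arrival_rates(a, P):
    """Solve the traffic equations lambda = a + P^T lambda for an open
    Jackson-type network with external arrival rates a and routing
    matrix P (row i gives the routing probabilities out of node i)."""
    n = len(a)
    return np.linalg.solve(np.eye(n) - P.T, a)

if __name__ == "__main__":
    # A small assumed three-node example.
    a = np.array([1.0, 0.5, 0.0])                # external arrival rates
    P = np.array([[0.0, 0.6, 0.2],
                  [0.0, 0.0, 0.8],
                  [0.1, 0.0, 0.0]])              # remaining mass leaves the network
    mu = np.array([3.0, 2.0, 2.5])               # service rates
    lam = effective_arrival_rates(a, P)
    print("total arrival rates:", lam)
    print("stable at every node:", bool(np.all(lam < mu)))
```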
An algorithm for the transient solution of a denumerable-state Markov process with an arbitrary initial distribution is given in this paper. The transient queue length distribution for a general Markovian queueing system can be obtained by this algorithm. As examples, some numerical results are presented.
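The abstract does not spell out the algorithm, but a standard way to compute such transient distributions for a (truncated) Markovian queue is uniformization, sketched below for an M/M/1 queue truncated at 20 customers; this is a generic illustration, not necessarily the paper's method.

```python
import numpy as np

def transient_distribution(Q, p0, t, eps=1e-12):
    """Transient distribution p(t) = p0 * exp(Q t) of a finite (e.g.
    truncated) continuous-time Markov chain, computed by the standard
    uniformization method."""
    Lam = max(-np.diag(Q)) * 1.01 + 1e-12     # uniformization rate
    P = np.eye(len(p0)) + Q / Lam             # uniformized transition matrix
    term = np.exp(-Lam * t)                   # Poisson(Lam*t) weight for k = 0
    v = p0.copy()
    p = term * v
    k = 0
    while term > eps or k < Lam * t:
        k += 1
        v = v @ P
        term *= Lam * t / k
        p += term * v
    return p

if __name__ == "__main__":
    # M/M/1 queue truncated at 20 customers, lambda = 0.8, mu = 1.0, start empty.
    N, lam, mu = 20, 0.8, 1.0
    Q = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        if i < N:
            Q[i, i + 1] = lam
        if i > 0:
            Q[i, i - 1] = mu
        Q[i, i] = -Q[i].sum()
    p0 = np.zeros(N + 1)
    p0[0] = 1.0
    pt = transient_distribution(Q, p0, t=5.0)
    print("P(queue empty at t = 5):", pt[0])
```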
This paper is concerned with a model for the spread of an epidemic in a closed, homogeneously mixing population in which new infections occur at rate f(x, y) and removals occur at rate g(x, y), where x and y are the numbers of susceptible and infective individuals, respectively, and f and g are arbitrary but specified positive real-valued functions. Sequences of such epidemics, indexed by the initial number of susceptibles n, are considered and conditions are derived under which the epidemic processes converge almost surely to a birth and death process as n tends to infinity. Thus a threshold theorem for such an epidemic model is obtained. The results are extended to models which incorporate immigration and emigration of susceptibles. The theory is illustrated by several examples of models taken from the epidemic literature. Generalizations to multipopulation epidemics are discussed briefly.
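The model as described is straightforward to simulate by the Gillespie method; the sketch below uses the standard general-epidemic choice f(x, y) = βxy/n, g(x, y) = γy as an assumed example.

```python
import numpy as np

def simulate_general_epidemic(x0, y0, f, g, seed=0):
    """Gillespie simulation of the epidemic described above: with x
    susceptibles and y infectives, new infections occur at rate f(x, y)
    and removals at rate g(x, y)."""
    rng = np.random.default_rng(seed)
    x, y, t = x0, y0, 0.0
    while y > 0:
        rate_inf = f(x, y) if x > 0 else 0.0
        rate_rem = g(x, y)
        total = rate_inf + rate_rem
        t += rng.exponential(1.0 / total)
        if rng.random() < rate_inf / total:
            x, y = x - 1, y + 1           # infection
        else:
            y -= 1                        # removal
    return x0 - x, t                      # final size and duration

if __name__ == "__main__":
    n = 1_000
    beta, gamma = 1.5, 1.0                # assumed example rates (standard SIR)
    size, dur = simulate_general_epidemic(
        n, 5,
        f=lambda x, y: beta * x * y / n,  # standard general-epidemic choice
        g=lambda x, y: gamma * y)
    print(f"final size {size} of {n}, duration {dur:.2f}")
```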
Let Φ = {Φn} be an aperiodic, positive recurrent Markov chain on a general state space, π its invariant probability measure and f ≥ 1. We consider the rate of (uniform) convergence of Ex[g(Φn)] to the stationary limit π(g) for |g| ≤ f: specifically, we find conditions under which
r(n) sup|g|≤f |Ex[g(Φn)] − π(g)| → 0
as n → ∞, for suitable subgeometric rate functions r. We give sufficient conditions for this convergence to hold in terms of
(i) the existence of suitably regular sets, i.e. sets on which (f, r)-modulated hitting time moments are bounded, and
(ii) the existence of (f, r)-modulated drift conditions (Foster–Lyapunov conditions).
The results are illustrated for random walks and for more general state space models.
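For orientation, the classical (unmodulated) Foster–Lyapunov drift condition for f-ergodicity, of which the (f, r)-modulated conditions in (ii) above are a refinement, reads as follows; this is a standard statement from the literature, not the paper's precise condition.

```latex
% Classical Foster--Lyapunov drift condition for f-ergodicity:
% there exist a function V >= 1, a petite set C and a constant b < infinity with
\[
  PV(x) := \int P(x,\mathrm{d}y)\,V(y)
  \;\le\; V(x) - f(x) + b\,\mathbf{1}_C(x),
  \qquad x \in \mathsf{X}.
\]
```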