In this paper, we investigate computable lower bounds for the best strongly ergodic rate of convergence of the transient probability distribution to the stationary distribution for stochastically monotone continuous-time Markov chains and for reversible continuous-time Markov chains, using a drift function and the expectation of the first hitting time on some state. We apply these results to birth–death processes, branching processes, and population processes.
The TCP window size process can be modeled as a piecewise-deterministic Markov process that increases linearly and experiences downward jumps at Poisson times. We present a transient analysis of this window size process. Our main result is the Laplace transform of the transient moments. Formulae for the integer and fractional moments are derived, as well as an explicit characterization of the speed of convergence to steady state. Central to our approach are the infinitesimal generator and Dynkin's martingale.
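The piecewise-deterministic dynamics described above can be illustrated with a toy simulation (not the paper's transform-based analysis): the window grows linearly at unit speed and is halved at the epochs of a Poisson process. All parameter values and the halving factor below are assumptions for the sketch; the long-run mean of the pre-jump window for this parameterization is 2/λ by a simple generator balance.

```python
import random

def simulate_window(rate=1.0, jump_factor=0.5, t_end=1000.0, w0=1.0, seed=42):
    """Simulate a toy piecewise-deterministic window process: linear growth
    at unit speed between jumps, multiplicative downward jumps at the epochs
    of a rate-`rate` Poisson process. Returns the average pre-jump window,
    a crude estimate of a stationary moment."""
    rng = random.Random(seed)
    t, w = 0.0, w0
    samples = []
    while t < t_end:
        dt = rng.expovariate(rate)   # exponential time to the next Poisson jump
        t += dt
        w += dt                      # deterministic linear increase between jumps
        samples.append(w)            # record the window just before the jump
        w *= jump_factor             # downward jump at the Poisson epoch
    return sum(samples) / len(samples)
```

With rate 1 and halving, the empirical mean should settle near the stationary value 2, consistent with setting the drift of the generator applied to w to zero: 1 − (λ/2)E[W] = 0.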
We show that the positive Wiener-Hopf factor of a Lévy process with positive jumps having a rational Fourier transform is itself a rational function, expressed in terms of the parameters of the jump distribution and the roots of an associated equation. Based on this, we give the closed form of the ruin probability for a Lévy process with completely arbitrary negative jumps and finite-intensity positive jumps whose distribution is characterized by a rational Fourier transform. We also obtain results for the ladder process and its Laplace exponent. A key role is played by the analytic properties of the characteristic exponent of the process and by a Baxter-Donsker-type formula, which we derive, for the positive factor.
Weak local linear approximations have played a prominent role in the construction of effective inference methods and numerical integrators for stochastic differential equations. In this note, two weak local linear approximations for stochastic differential equations with jumps are introduced as a generalization of previous ones. Their respective orders of convergence are obtained as well.
A comparison theorem is stated for Markov processes in Polish state spaces. We consider a general class of stochastic orderings induced by a cone of real functions. The main result states that stochastic monotonicity of one process and comparability of the infinitesimal generators imply ordering of the processes. Several applications to convex type and to dependence orderings are given. In particular, Liggett's theorem on the association of Markov processes is a consequence of this comparison result.
We describe the random meeting motion of a finite number of investors in markets with friction as a Markov pure-jump process with interactions. Using a sequence of such processes, we prove a functional law of large numbers relating the motions of the finite markets to the so-called continuum of agents.
The reduced Markov branching process is a stochastic model for the genealogy of an unstructured biological population. Its limit behavior in the critical case is well studied for the Zolotarev-Slack regularity parameter α ∈ (0, 1]. We turn to the case of very heavy-tailed reproduction distribution α = 0 assuming that Zubkov's regularity condition holds with parameter β ∈ (0, ∞). Our main result gives a new asymptotic pattern for the reduced branching process conditioned on nonextinction during a long time interval.
A dual Markov branching process (DMBP) is by definition a Siegmund predual of some Markov branching process (MBP). Such a process does exist and is uniquely determined by the so-called dual-branching property. Its q-matrix Q is derived and proved to be regular and monotone. Several equivalent definitions of a DMBP are given. Criteria for transience, positive recurrence, strong ergodicity, and the Feller property are established. The invariant distributions are given in explicit form, with a geometric limit law.
Inference of evolutionary trees and rates from biological sequences is commonly performed using continuous-time Markov models of character change. The Markov process evolves along an unknown tree while observations arise only from the tips of the tree. Rate heterogeneity is present in most real data sets and is accounted for by the use of flexible mixture models where each site is allowed its own rate. Very little has been rigorously established concerning the identifiability of the models currently in common use in data analysis, although nonidentifiability was proven for a semiparametric model and an incorrect proof of identifiability was published for a general parametric model (GTR + Γ + I). Here we prove that one of the most widely used models (GTR + Γ) is identifiable for generic parameters, and for all parameter choices in the case of four-state (DNA) models. This is the first proof of identifiability of a phylogenetic model with a continuous distribution of rates.
We consider decay properties, including the decay parameter, invariant measures, invariant vectors, and quasistationary distributions, of a Markovian bulk-arriving queue that stops immediately after hitting the zero state. Investigating such behavior is crucial to understanding the busy period and other related properties of Markovian bulk-arriving queues. The exact value of the decay parameter λC is obtained and expressed explicitly. The invariant measures, invariant vectors, and quasistationary distributions are then presented. We show that there exists a family of invariant measures indexed by λ ∈ [0, λC], and that, under some conditions, there exists a family of quasistationary distributions, also indexed by λ ∈ [0, λC]. The generating functions of these invariant measures and quasistationary distributions are presented. We further show that a stopped Markovian bulk-arriving queue is always λC-transient, and some deeper properties are revealed. A clear geometric interpretation of the decay parameter is given. A few examples illustrate the results obtained in this paper.
We consider a branching particle system in which an individual particle gives birth to a random number of offspring at the place where it dies. The probability that the number of offspring equals k is pk, k = 2, 3, …. The corresponding branching process is related to a semilinear partial differential equation for x ∈ ℝd, where A is the infinitesimal generator of a multiplicative semigroup and the pk, k = 2, 3, …, are nonnegative functions. We obtain sufficient conditions for the existence of global positive solutions to semilinear equations of this form. Our results extend previous work by Nagasawa and Sirao (1969) and others.
Expressions for the joint distribution of the longest and second longest excursions, as well as the marginal distributions of the three longest excursions, in the Brownian bridge are obtained. The method, which primarily makes use of the weak convergence of the random walk to Brownian motion, in principle makes it possible to obtain any desired joint or marginal distribution. Numerical illustrations of the results are also given.
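The random-walk approximation underlying the abstract above is easy to demonstrate numerically: a simple random-walk bridge (n up-steps and n down-steps in uniformly random order) converges weakly to the Brownian bridge, and its excursion lengths between consecutive zeros approximate the bridge excursions. The step count and seed below are assumptions for the sketch, not values from the paper.

```python
import random

def bridge_excursion_lengths(n=500, seed=3):
    """Sample a simple random-walk bridge of length 2n and return its
    excursion lengths between consecutive zeros, longest first, as
    fractions of the total length 2n."""
    rng = random.Random(seed)
    steps = [1] * n + [-1] * n
    rng.shuffle(steps)              # uniform among walks conditioned to end at 0
    lengths, s, last_zero = [], 0, 0
    for t, step in enumerate(steps, start=1):
        s += step
        if s == 0:                  # walk returns to 0: an excursion ends here
            lengths.append(t - last_zero)
            last_zero = t
    return sorted((l / (2 * n) for l in lengths), reverse=True)
```

Averaging the first and second entries over many independent samples gives Monte Carlo approximations to the marginal laws of the longest and second longest excursions of the bridge.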
We consider a critical discrete-time branching process with generation-dependent immigration. For the case in which the mean number of immigrating individuals tends to ∞ with the generation number, we prove functional limit theorems for centered and normalized processes. The limiting processes are deterministically time-changed Wiener processes, with three different covariance functions depending on the behavior of the mean and variance of the number of immigrants. As an application, we prove that the conditional least-squares estimator of the offspring mean is asymptotically normal, which demonstrates an alternative case of normality of the estimator for a process with nondegenerate offspring distribution. The norming factor depends on α(n), the mean number of immigrating individuals in the nth generation.
When {Xn} is an irreducible, stationary, aperiodic Markov chain on the countable state space X = {i, j, …}, the study of long-range dependence of any square-integrable functional {Yn} := {y(Xn)} of the chain, for any real-valued function {y(i): i ∈ X}, involves in an essential manner the functions Qij(n) = ∑r=1,…,n (pij(r) − πj), where pij(r) = P{Xr = j | X0 = i} is the r-step transition probability of the chain and {πi: i ∈ X}, πi = P{Xn = i}, is its stationary distribution. The simplest functional arises when Yn is the indicator sequence of visits to some particular state i, In(i) = I{Xn = i} say, in which case lim supn→∞ n⁻¹ var(Y1 + ⋯ + Yn) = lim supn→∞ n⁻¹ var(Ni(0, n]) = ∞ if and only if the generic return time Tii of the chain to state i, starting from i, has infinite second moment (here Ni(0, n] denotes the number of visits of Xr to state i at the epochs {1, …, n}). This condition is equivalent to Qji(n) → ∞ for one (and then every) state j, or to E(Tjj²) = ∞ for one (and then every) state j, and, when it holds, (Qij(n)/πj)/(Qkk(n)/πk) → 1 as n → ∞ for any triplet of states i, j, k.
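The partial sums Qij(n) above are straightforward to compute for a small chain. The sketch below uses a hypothetical two-state chain (all numbers are assumptions for illustration); since a finite irreducible chain has return times with all moments finite, the sums Qij(n) converge rather than diverge, and for this chain the limit Q00(∞) = π1 λ/(1 − λ) = 7/9 with second eigenvalue λ = 0.7.

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def Q(n, P, pi):
    """Partial sums Q_ij(n) = sum_{r=1..n} (P^r[i][j] - pi[j])."""
    d = len(P)
    total = [[0.0] * d for _ in range(d)]
    Pr = [[1.0 if i == j else 0.0 for j in range(d)] for i in range(d)]
    for _ in range(n):
        Pr = mat_mul(Pr, P)               # Pr is now P^r
        for i in range(d):
            for j in range(d):
                total[i][j] += Pr[i][j] - pi[j]
    return total

# Hypothetical 2-state chain used purely for illustration.
P = [[0.9, 0.1],
     [0.2, 0.8]]
pi = [2/3, 1/3]   # stationary distribution: pi P = pi
```

Evaluating Q(n, P, pi) for increasing n shows the entries stabilizing geometrically fast, the finite-second-moment regime of the dichotomy stated in the abstract.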
The analysis of stochastic loss networks has long been of interest in computer and communications networks and is becoming important in the areas of service and information systems. In traditional settings computing the well-known Erlang formula for blocking probabilities in these systems becomes intractable for larger resource capacities. Using compound point processes to capture stochastic variability in the request process, we generalize existing models in this framework and derive simple asymptotic expressions for the blocking probabilities. In addition, we extend our model to incorporate reserving resources in advance. Although asymptotic, our experiments show an excellent match between derived formulae and simulation results even for relatively small resource capacities and relatively large values of the blocking probabilities.
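For context on the intractability claim above: the classical Erlang B blocking probability admits a numerically stable recursion in the capacity, which is exact but must be iterated once per resource unit, so very large capacities motivate the asymptotic expressions the paper derives. This sketch shows only the classical recursion, not the paper's compound-point-process model; the parameter values in the test are assumptions.

```python
def erlang_b(capacity, offered_load):
    """Blocking probability of an Erlang loss system (Erlang B formula),
    computed with the standard stable recursion:
        B(0) = 1,  B(c) = a*B(c-1) / (c + a*B(c-1)),
    where a is the offered load in erlangs and c the number of servers."""
    b = 1.0
    for c in range(1, capacity + 1):
        b = offered_load * b / (c + offered_load * b)
    return b
```

The recursion avoids the overflow-prone direct ratio a^c/c! over the partial sums, but its linear cost in the capacity is exactly what asymptotic formulae sidestep.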
By establishing general relationships between branching transformations (Harris-Sevastyanov, Lamperti-Ney, time reversals, and Asmussen-Sigman) and Markov chain transforms (Doob's h-transform, time reversal, and the cone dual), we uncover a deeper connection between these transformations and the harmonic functions and invariant measures of the process itself and of its space-time process. We give a classification of the duals into Doob's h-transform, pathwise time reversal, and cone reversal. Explicit results are obtained for the linear fractional offspring distribution. Remarkably, in this case all reversals turn out to be Galton-Watson processes with a dual reproduction law and an eternal particle or some kind of immigration. In particular, we generalize a result of Klebaner and Sagitov (2002), in which only a geometric offspring distribution was considered. A new graphical representation in terms of an associated simple random walk on ℕ² allows for illuminating picture proofs of our main results concerning transformations of the linear fractional Galton-Watson process.
We consider the generalized version, in continuous time, of the parking problem of Knuth introduced in Bansaye (2006). Files arrive following a Poisson point process and are stored on a hardware identified with the real line, to the right of their arrival point. Here we study the evolution of the endpoints of the data block straddling 0, which is empty at time 0 and becomes equal to ℝ at a deterministic time.
An alternative version of the necessary and sufficient condition for almost sure fixation in the conditional branching process model is derived. This formulation provides an insight into why the examples considered in Buckley and Seneta (1983) all have the same condition for fixation.
Algorithms are introduced that produce optimal Markovian couplings for large finite-state-space discrete-time Markov chains with sparse transition matrices; these algorithms are applied to some toy models motivated by fluid-dynamical mixing problems at high Péclet number. An alternative definition of the time-scale of a mixing process is suggested. Finally, these algorithms are applied to the problem of coupling diffusion processes in an acute-angled triangle, and some of the simplifications that occur in continuum coupling problems are discussed.
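As background for the coupling algorithms mentioned above, the simplest (generally non-optimal) Markovian coupling runs two copies of the chain independently until they meet and then moves them together; the meeting time then bounds mixing via the coupling inequality. The code below is only this classical construction, not the paper's optimal-coupling algorithms, and the example transition matrix is an assumption for illustration.

```python
import random

def coupling_time(P, x0, y0, seed=1):
    """Return the meeting time of two copies of a finite chain under the
    classical 'independent until they meet, then together' coupling.
    P is a row-stochastic transition matrix as a list of lists.
    (Optimal couplings instead solve a transportation problem per step.)"""
    rng = random.Random(seed)

    def step(state):
        u, acc = rng.random(), 0.0
        for j, pj in enumerate(P[state]):
            acc += pj
            if u < acc:
                return j
        return len(P[state]) - 1     # guard against rounding at acc ~ 1

    x, y, t = x0, y0, 0
    while x != y:
        x, y, t = step(x), step(y), t + 1
    return t
```

For a chain whose rows are identical, the two copies meet at a geometric time; for slowly mixing chains the meeting time grows, which is what motivates seeking couplings with the smallest possible meeting times.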
The transmission control protocol (TCP) is a transport protocol used in the Internet. In Ott (2005), a more general class of candidate transport protocols called ‘protocols in the TCP paradigm’ was introduced. The long-term objective of studying this class is to find protocols with promising performance characteristics. In this paper we study Markov chain models derived from protocols in the TCP paradigm. Protocols in the TCP paradigm, like TCP itself, protect the network from congestion by decreasing the ‘congestion window’ (i.e. the amount of data allowed to be sent but not yet acknowledged) when there is packet loss or packet marking, and increasing it when there is no loss. When losses of different packets are assumed to be independent events and the probability p of loss is assumed to be constant, the protocol gives rise to a Markov chain {Wn}, where Wn is the size of the congestion window after the transmission of the nth packet. For a wide class of such Markov chains, we prove weak convergence results, after appropriate rescaling of time and space, as p → 0. The limiting processes are defined by stochastic differential equations. Depending on certain parameter values, the stochastic differential equation can define an Ornstein-Uhlenbeck process or can be driven by a Poisson process.
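The chain {Wn} described above is easy to simulate for classical TCP-style dynamics: with probability p the window is halved (multiplicative decrease), otherwise it grows by 1/W per acknowledged packet (additive increase). This is a toy instance of the paradigm, not the paper's general class, and the loss probability, floor, and step count are assumptions; the well-known square-root law predicts a typical window of order 1/√p.

```python
import random

def window_chain(p=0.01, n_steps=100_000, w0=1.0, seed=7):
    """Simulate a toy AIMD congestion-window chain {Wn}: halve on loss
    (probability p per packet), otherwise grow by 1/W. Returns the
    time-average window size."""
    rng = random.Random(seed)
    w = w0
    total = 0.0
    for _ in range(n_steps):
        if rng.random() < p:
            w = max(w / 2.0, 1.0)   # multiplicative decrease on loss, floored at 1
        else:
            w += 1.0 / w            # additive increase per acknowledged packet
        total += w
    return total / n_steps
```

Rerunning with smaller p and rescaling time by p and space by √p illustrates the kind of diffusion limit, as p → 0, established in the paper.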