A sequence of first-order integer-valued autoregressive (INAR(1)) processes is investigated, where the autoregressive-type coefficient converges to 1. It is shown that the limiting distribution of the conditional least squares estimator for this coefficient is normal and the rate of convergence is n^{3/2}. Nearly critical Galton–Watson processes with unobservable immigration are also discussed.
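As a concrete illustration of the model in this abstract, the following sketch simulates a nearly critical INAR(1) path with binomial thinning and computes the conditional least squares estimate of the autoregressive-type coefficient. The Poisson innovation law, sample size and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_inar1(n, alpha, lam, x0=0):
    """Simulate an INAR(1) path X_t = alpha∘X_{t-1} + eps_t with binomial
    thinning and Poisson(lam) innovations (the innovation law is an
    assumption here, not fixed by the abstract)."""
    x = np.empty(n + 1, dtype=int)
    x[0] = x0
    for t in range(1, n + 1):
        x[t] = rng.binomial(x[t - 1], alpha) + rng.poisson(lam)
    return x

def cls_estimate(x):
    """Conditional least squares estimate of (alpha, lam): ordinary least
    squares regression of X_t on X_{t-1} with an intercept."""
    y, z = x[1:], x[:-1]
    n = len(y)
    alpha_hat = (n * (y * z).sum() - y.sum() * z.sum()) / (n * (z * z).sum() - z.sum() ** 2)
    lam_hat = (y.sum() - alpha_hat * z.sum()) / n
    return alpha_hat, lam_hat

# A nearly critical coefficient, alpha close to 1, as in the abstract.
x = simulate_inar1(n=5000, alpha=0.98, lam=1.0)
print(cls_estimate(x))
```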
The aim of this paper is to define the entropy of a finite semi-Markov process. We define the entropy of the finite distributions of the process, and obtain explicitly its entropy rate by extending the Shannon–McMillan–Breiman theorem to this class of nonstationary continuous-time processes. The particular cases of pure jump Markov processes and renewal processes are considered. The relative entropy rate between two semi-Markov processes is also defined.
For Markov processes on the positive integers with the origin as an absorbing state, Ferrari, Kesten, Martínez and Picco studied the existence of quasi-stationary and limiting conditional distributions by characterizing quasi-stationary distributions as fixed points of a transformation Φ on the space of probability distributions on {1, 2, …}. In the case of a birth–death process, the components of Φ(ν) can be written down explicitly for any given distribution ν. Using this explicit representation, we will show that Φ preserves likelihood ratio ordering between distributions. A conjecture of Kryscio and Lefèvre concerning the quasi-stationary distribution of the SIS logistic epidemic follows as a corollary.
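For readers who want to see a quasi-stationary distribution numerically, the sketch below computes the QSD of a finite-state SIS logistic epidemic as the normalized left eigenvector of the transition-rate matrix restricted to the transient states. This is a standard eigenvector computation, not the fixed-point map Φ of Ferrari, Kesten, Martínez and Picco; the rates β, μ and the population size are arbitrary illustrative choices.

```python
import numpy as np

# SIS logistic epidemic on {0, 1, ..., N}: infection rate beta*i*(N - i)/N,
# recovery rate mu*i; state 0 is absorbing. Illustrative parameters only.
N, beta, mu = 50, 2.0, 1.0

# Generator restricted to the transient states {1, ..., N}.
Q = np.zeros((N, N))
for k in range(1, N + 1):
    up, down = beta * k * (N - k) / N, mu * k
    i = k - 1
    Q[i, i] = -(up + down)
    if k < N:
        Q[i, i + 1] = up
    if k > 1:
        Q[i, i - 1] = down          # from state 1, the down-jump is absorbed

# The quasi-stationary distribution is the (normalized, nonnegative)
# left eigenvector of Q associated with its largest real eigenvalue.
vals, vecs = np.linalg.eig(Q.T)
qsd = np.real(vecs[:, np.argmax(np.real(vals))])
qsd = np.abs(qsd) / np.abs(qsd).sum()
print(qsd[:5])   # mass on the lowest infection levels
```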
Let T = (T_1, T_2, …) be a sequence of real random variables with ∑_{j=1}^∞ 1_{|T_j| > 0} < ∞ almost surely. We consider the following equation for distributions μ: W ≅ ∑_{j=1}^∞ T_j W_j, where W, W_1, W_2, … have distribution μ and T, W_1, W_2, … are independent. We show that general solutions can be represented as mixtures of certain infinitely divisible distributions. This result can be applied to investigate the existence of symmetric solutions for T_j ≥ 0: essentially under the condition that E ∑_{j=1}^∞ T_j^2 log^+ T_j^2 < ∞, the existence of nontrivial symmetric solutions is exactly determined, revealing a connection with the existence of positive solutions of a related fixed-point equation. Furthermore, we derive results about a special class of canonical symmetric solutions, including statements about Lebesgue densities and moments.
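The fixed-point equation above can be explored by Monte Carlo iteration of the associated map on samples (a population-dynamics style heuristic). The sketch below is an illustration only: the deterministic weights T_1 = T_2 = 1/√2 and the starting distribution are assumptions, not the general setting of the paper; for this choice, centered normal laws are symmetric fixed points.

```python
import numpy as np

rng = np.random.default_rng(1)

# One step of the map nu -> law of T_1*W_1 + T_2*W_2, acting on an
# empirical sample, with the illustrative weights T_1 = T_2 = 1/sqrt(2).
T = np.array([1.0, 1.0]) / np.sqrt(2.0)

def smoothing_step(sample):
    w1 = rng.choice(sample, size=sample.size)
    w2 = rng.choice(sample, size=sample.size)
    return T[0] * w1 + T[1] * w2

# Start from a symmetric but clearly non-normal law (uniform on [-1, 1]).
sample = rng.uniform(-1.0, 1.0, size=200_000)
for _ in range(25):
    sample = smoothing_step(sample)

# The variance is preserved and the empirical kurtosis drifts toward 3,
# the Gaussian value, illustrating convergence to a symmetric fixed point.
print(sample.var(), ((sample - sample.mean()) ** 4).mean() / sample.var() ** 2)
```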
This paper is a first study of two-person zero-sum games for denumerable continuous-time Markov chains determined by given transition rates, with an average payoff criterion. The transition rates are allowed to be unbounded, and the payoff rates may have neither upper nor lower bounds. In the spirit of the ‘drift and monotonicity’ conditions for continuous-time Markov processes, we give conditions on the controlled system's primitive data under which the existence of the value of the game and a pair of strong optimal stationary strategies is ensured by using the Shapley equations. Also, we present a ‘martingale characterization’ of a pair of strong optimal stationary strategies. Our results are illustrated with a birth-and-death game.
We consider the following ordering for stochastic processes as introduced by Irle and Gani (2001). A process (Y_t) is said to be slower in level crossing than a process (Z_t) if it takes (Y_t) stochastically longer than (Z_t) to exceed any given level. In Irle and Gani (2001), this ordering was investigated for Markov chains in discrete time. Here these results are carried over to semi-Markov processes, with particular attention to birth-and-death processes and also to Wiener processes.
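A simulation-based way to inspect this ordering for two concrete birth-and-death chains is sketched below: estimate, for each process, the distribution of the first time a given level is exceeded and compare the empirical survival functions. The two chains and the level are arbitrary illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def crossing_time(p_up, p_down, level, max_steps=10_000):
    """First time a simple birth-and-death chain on {0, 1, 2, ...} started
    at 0 exceeds `level` (capped at max_steps)."""
    x = 0
    for t in range(1, max_steps + 1):
        u = rng.random()
        if u < p_up:
            x += 1
        elif u < p_up + p_down and x > 0:
            x -= 1
        if x > level:
            return t
    return max_steps

# Chain Z drifts upward faster than chain Y, so Y should be
# "slower in level crossing" than Z.
level, n_runs = 10, 2000
tau_y = np.array([crossing_time(0.45, 0.45, level) for _ in range(n_runs)])
tau_z = np.array([crossing_time(0.55, 0.35, level) for _ in range(n_runs)])

for t in (20, 50, 100):
    print(t, (tau_y > t).mean(), (tau_z > t).mean())
```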
We study the uniform ergodicity of Markov processes (Z_n, n ≥ 1) of order 2 with a general state space (Z, 𝒵). Markov processes of order higher than 1 were defined in the literature long ago, but scarcely treated in detail. We take as the basis for our considerations the natural transition probability Q of such a process. A Markov process of order 2 is transformed into one of order 1 by combining two consecutive variables Z_{2n−1} and Z_{2n} into one variable Y_n with values in the Cartesian product space (Z × Z, 𝒵 ⊗ 𝒵). Thus, a Markov process (Y_n, n ≥ 1) of order 1 with transition probability R is generated. Uniform ergodicity for the process (Z_n, n ≥ 1) is defined in terms of the same property for (Y_n, n ≥ 1). We give some conditions on the transition probability Q which transfer to R and thus ensure the uniform ergodicity of (Z_n, n ≥ 1). We apply the general results to study the uniform ergodicity of Markov processes of order 2 which arise in some nonlinear time series models and as sequences of smoothed values in sequential smoothing procedures of Markovian observations. As for the time series models, Markovian noise sequences are covered.
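The order-reduction device described here is easy to make concrete for a finite state space: pair consecutive coordinates and build the transition kernel R on the product space from the order-2 kernel Q. The sketch below does this for a small finite chain; the particular kernel Q is an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(3)
S = 3  # finite state space {0, 1, 2}; illustrative only

# An arbitrary order-2 transition kernel Q:
# Q[a, b, c] = P(Z_{t+1} = c | Z_{t-1} = a, Z_t = b).
Q = rng.random((S, S, S))
Q /= Q.sum(axis=2, keepdims=True)

# Build the order-1 kernel R on pairs Y_n = (Z_{2n-1}, Z_{2n}):
# R[(a, b), (c, d)] = Q[a, b, c] * Q[b, c, d].
R = np.zeros((S * S, S * S))
for a in range(S):
    for b in range(S):
        for c in range(S):
            for d in range(S):
                R[a * S + b, c * S + d] = Q[a, b, c] * Q[b, c, d]

# R is a proper stochastic matrix on the product space.
print(np.allclose(R.sum(axis=1), 1.0))
```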
This work is devoted to asymptotic properties of singularly perturbed Markov chains in discrete time. The motivation stems from applications in discrete-time control and optimization problems, manufacturing and production planning, stochastic networks, and communication systems, in which finite-state Markov chains are used to model large-scale and complex systems. To reduce the complexity of the underlying system, the states in each recurrent class are aggregated into a single state. Although the aggregated process may not be Markovian, its continuous-time interpolation converges to a continuous-time Markov chain whose generator is a function determined by the invariant measures of the recurrent states. Sequences of occupation measures are defined. A mean square estimate on a sequence of unscaled occupation measures is obtained. Furthermore, it is proved that a suitably scaled sequence of occupation measures converges to a switching diffusion.
We prove a result concerning the joint distribution of alleles at linked loci on a chromosome drawn from the population at stationarity. For a neutral locus, the allele is a draw from the stationary distribution of the mutation process. Furthermore, this allele is independent of the alleles at different loci on any chromosomes in the population.
Consider a sequence of outcomes from Markov dependent two-state (success-failure) trials. In this paper, the exact distributions are derived for three longest-run statistics: the longest failure run, longest success run, and the maximum of the two. The method of finite Markov chain imbedding is used to obtain these exact distributions, and their bounds and large deviation approximation are also studied. Numerical comparisons among the exact distributions, bounds, and approximations are provided to illustrate the theoretical results. With some modifications, we show that the results can be easily extended to Markov dependent multistate trials.
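The finite Markov chain imbedding technique mentioned here can be sketched directly: to obtain the distribution of the longest success run in Markov-dependent two-state trials, track the pair (last outcome, current success-run length capped at k) as an auxiliary Markov chain and read off P(longest run ≤ k). The transition probabilities below are illustrative, and this simplified version handles the success run only.

```python
import numpy as np

def prob_longest_success_run_le(n, k, p, pi):
    """P(longest success run in n Markov-dependent binary trials <= k),
    via finite Markov chain imbedding.
    p[i, j] = P(next trial = j | current trial = i), 0 = failure, 1 = success;
    pi = distribution of the first trial."""
    # Imbedded states: 0 -> last trial was a failure (run length 0),
    # j = 1..k -> last trial was a success, current run length j,
    # k+1 -> absorbing state "a run of length k+1 has occurred".
    m = k + 2
    M = np.zeros((m, m))
    M[0, 0], M[0, 1] = p[0, 0], p[0, 1]
    for j in range(1, k + 1):
        M[j, 0] = p[1, 0]
        M[j, min(j + 1, k + 1)] = p[1, 1]
    M[k + 1, k + 1] = 1.0
    init = np.zeros(m)
    init[0], init[1] = pi[0], pi[1]
    dist = init @ np.linalg.matrix_power(M, n - 1)
    return 1.0 - dist[k + 1]

# Example: a "sticky" success chain, 20 trials, longest success run at most 5.
P = np.array([[0.6, 0.4], [0.3, 0.7]])
print(prob_longest_success_run_le(20, 5, P, pi=np.array([0.5, 0.5])))
```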
Continuous-space-time branching processes (CSBP) are investigated in order to model random energy cascades. CSBPs are based on spectrally positive Lévy processes and, as such, are characterized by their corresponding Laplace exponents. Special emphasis is put on the CSBPs of Feller, Lamperti and Neveu and on their Poisson point process representations. The Neveu model (either supercritical or subcritical) is of particular interest in physics for its connection with the random energy model of Derrida, as revisited by Ruelle. Exploiting some connections between the partition functions of energy and the Poisson–Dirichlet distributions of Pitman and Yor, some information on the zero-temperature limit is extracted. Finally, for the subcritical versions of the three models, we compute the distribution of some of their interesting features: extinction time and probability, area under the profile (total energy) and width (maximal energy).
This paper studies the first passage times to flat boundaries for a double exponential jump diffusion process, which consists of a continuous part driven by a Brownian motion and a jump part with jump sizes having a double exponential distribution. Explicit solutions of the Laplace transforms, of both the distribution of the first passage times and the joint distribution of the process and its running maxima, are obtained. Because of the overshoot problems associated with general jump diffusion processes, the double exponential jump diffusion process offers a rare case in which analytical solutions for the first passage times are feasible. In addition, it leads to several interesting probabilistic results. Numerical examples are also given. The finance applications include pricing barrier and lookback options.
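Although the paper's point is that the Laplace transforms are available in closed form, a crude Monte Carlo check of a first passage probability for such a process is easy to sketch. The drift, volatility, jump intensity and the two exponential rates below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def first_passage_prob(b, T, x0=0.0, mu=0.05, sigma=0.2,
                       lam=1.0, p=0.4, eta1=10.0, eta2=5.0,
                       n_paths=20_000, n_steps=2_000):
    """Monte Carlo estimate of P(sup_{t<=T} X_t >= b) for a double
    exponential jump diffusion, simulated on a time grid. Crossings between
    grid points are ignored, so the true probability is slightly
    underestimated; parameters are illustrative."""
    dt = T / n_steps
    hit = np.zeros(n_paths, dtype=bool)
    x = np.full(n_paths, x0)
    for _ in range(n_steps):
        # Brownian part
        x += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
        # compound Poisson part with double exponential jump sizes
        # (at most one jump per small step, effectively)
        jumpers = rng.poisson(lam * dt, n_paths) > 0
        if jumpers.any():
            up = rng.random(jumpers.sum()) < p
            size = np.where(up,
                            rng.exponential(1.0 / eta1, jumpers.sum()),
                            -rng.exponential(1.0 / eta2, jumpers.sum()))
            x[jumpers] += size
        hit |= x >= b
    return hit.mean()

print(first_passage_prob(b=0.3, T=1.0))
```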
Denote by α_t^{(μ)} the probability law of A_t^{(μ)} = ∫_0^t exp(2(B_s + μs)) ds for a Brownian motion {B_s, s ≥ 0}. It is well known that α_t^{(μ)} is of interest in a number of domains, e.g. mathematical finance, diffusion processes in random environments, stochastic analysis on hyperbolic spaces and so on, but that it has complicated expressions. Recently, Dufresne obtained some remarkably simple expressions for α_t^{(0)} and α_t^{(1)}, as well as an equally remarkable relationship between α_t^{(μ)} and α_t^{(ν)} for two different drifts μ and ν. In this paper, hinging on previous results about α_t^{(μ)}, we give different proofs of Dufresne's results and present extensions of them for the processes {A_t^{(μ)}, t ≥ 0}.
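For a numerical point of reference, A_t^{(μ)} is straightforward to approximate by discretizing the integral along simulated Brownian paths; the sketch below estimates its mean and a few quantiles. Step size, horizon and drift are illustrative choices only.

```python
import numpy as np

rng = np.random.default_rng(5)

def sample_A(t, mu, n_paths=50_000, n_steps=1_000):
    """Approximate samples of A_t^(mu) = int_0^t exp(2(B_s + mu*s)) ds
    via a Riemann sum along simulated Brownian paths."""
    dt = t / n_steps
    b = np.zeros(n_paths)   # current value of B_s + mu*s on the grid
    a = np.zeros(n_paths)   # running Riemann sum
    for _ in range(n_steps):
        b += mu * dt + np.sqrt(dt) * rng.standard_normal(n_paths)
        a += np.exp(2.0 * b) * dt
    return a

a = sample_A(t=1.0, mu=1.0)
print(a.mean(), np.quantile(a, [0.5, 0.9, 0.99]))
```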
We study the genealogical structure of a population with stochastically fluctuating size. If such fluctuations, after suitable rescaling, can be approximated by a nice continuous-time process, we prove weak convergence in the Skorokhod topology of the scaled ancestral process to a stochastic time change of Kingman's coalescent, the time change being given by an additive functional of the limiting backward size process.
A new structure is considered in which an instantaneous reflection barrier is imposed on an ordinary birth–death process. An easily checked criterion for the existence of such Markov processes is first obtained. The uniqueness criterion is then established. In the nonunique case, all the honest processes are explicitly constructed. Ergodicity properties for these processes are investigated. It is proved that honest processes are always ergodic, without imposing any extra conditions. Equilibrium distributions for all these ergodic processes are established. Several examples are provided to illustrate our results.
Consider N towers, each made up of a number of counters. At each step a tower is chosen at random, a counter is removed from it, and this counter is added to another tower, also chosen at random. The probability distribution for the time needed to empty one of the towers is obtained in the case N = 3. Arguments are set forward as to why no simple formulae can be expected for N > 3. An asymptotic expression for the mean time before one of the towers becomes empty is derived in the case of four towers when they all initially contain a comparably large number of counters. We then study related problems, in particular the ruin problem for three players. Here we use simple martingale methodology as well as a solution proposed by T. S. Ferguson for a slightly modified problem. Throughout the paper it is our main objective to shed light on the reasons why the case N > 3 is so substantially different from the case N ≤ 3.
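The process in question is simple to simulate; the sketch below estimates the mean time until one of N towers first becomes empty, for N = 3 and N = 4, starting from equal tower sizes. The initial sizes and run counts are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(6)

def emptying_time(counts):
    """Number of steps until one tower first becomes empty. At each step a
    source tower and a distinct destination tower are chosen uniformly at
    random and one counter is moved."""
    counts = np.array(counts, dtype=int)
    n, steps = len(counts), 0
    while counts.min() > 0:
        src = rng.integers(n)
        dst = (src + 1 + rng.integers(n - 1)) % n   # uniform over the other towers
        counts[src] -= 1
        counts[dst] += 1
        steps += 1
    return steps

for start in ([10, 10, 10], [10, 10, 10, 10]):
    times = [emptying_time(start) for _ in range(5_000)]
    print(start, np.mean(times))
```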
This paper is devoted to a study of the integral of the workload process of the single server queue, in particular during one busy period. Firstly, we find asymptotics of the area 𝒜 swept under the workload process W(t) during the busy period when the service time distribution has a regularly varying tail. We also investigate the case of a light-tailed service time distribution. Secondly, we consider the problem of obtaining an explicit expression for the distribution of 𝒜. In the general GI/G/1 case, we use a sequential approximation to find the Laplace–Stieltjes transform of 𝒜. In the M/M/1 case, this transform is obtained explicitly in terms of Whittaker functions. Thirdly, we consider moments of 𝒜 in the GI/G/1 queue. Finally, we show asymptotic normality of the integral of the workload process.
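In the M/M/1 case, the area under the workload during a busy period is easy to simulate: between arrivals the workload decreases linearly at rate 1, so the area of each trapezoid can be accumulated exactly. The arrival and service rates below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def busy_period_area(lam, mu):
    """Area under the M/M/1 workload process W(t) during one busy period,
    computed exactly as a sum of trapezoids and a final triangle."""
    w = rng.exponential(1.0 / mu)       # workload brought by the initiating customer
    area = 0.0
    while True:
        a = rng.exponential(1.0 / lam)  # time until the next arrival
        if a >= w:                      # the workload hits 0 first: busy period ends
            return area + 0.5 * w * w
        area += 0.5 * (w + (w - a)) * a # trapezoid over the interarrival interval
        w = w - a + rng.exponential(1.0 / mu)

lam, mu = 0.8, 1.0                      # stable queue, illustrative rates
areas = np.array([busy_period_area(lam, mu) for _ in range(50_000)])
print(areas.mean(), np.quantile(areas, [0.5, 0.9, 0.99]))
```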
This paper studies the law of any real power of the integral of geometric Brownian motion over finite time intervals. As its main results, an apparently new integral representation is derived and its interrelations with the integral representations for these laws due to Yor and to Dufresne are established. In fact, our representation is found to furnish what seems to be a natural bridge between these other two representations. Our results are obtained by enhancing the Hartman–Watson Ansatz of Yor, based on Bessel processes and the Laplace transform, with complex analytic techniques. Systematizing this idea in order to overcome the limits of Yor's theory seems to be the main methodological contribution of the paper.
We consider random recursive fractals and prove fine results about their local behaviour. We show that for a class of random recursive fractals the usual multifractal spectrum is trivial in that all points have the same local dimension. However, by examining the local behaviour of the measure at typical points in the set, we establish the size of fine fluctuations in the measure. The results are proved using a large deviation principle for a class of general branching processes which extends the known large deviation estimates for the supercritical Galton–Watson process.
We study irreducible time-homogeneous Markov chains with finite state space in discrete time. We obtain results on the sensitivity of the stationary distribution and other statistical quantities with respect to perturbations of the transition matrix. We define a new closeness relation between transition matrices, and use graph-theoretic techniques, in contrast with the matrix analysis techniques previously used.
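A quick numerical illustration of the kind of sensitivity in question: perturb a transition matrix slightly and compare the resulting stationary distributions. The example chain and the size of the perturbation are arbitrary assumptions; the paper's graph-theoretic bounds are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(8)

def stationary(P):
    """Stationary distribution of an irreducible stochastic matrix P,
    computed from the left eigenvector for eigenvalue 1."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return np.abs(v) / np.abs(v).sum()

# A small irreducible chain (illustrative).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# A small random perturbation of the rows, renormalized to stay stochastic.
E = 0.01 * rng.random(P.shape)
P_pert = (P + E) / (P + E).sum(axis=1, keepdims=True)

pi, pi_pert = stationary(P), stationary(P_pert)
print(np.abs(P - P_pert).max(), np.abs(pi - pi_pert).max())
```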