In this paper we present some necessary conditions for the uniqueness, recurrence, and ergodicity of a class of multidimensional Q-processes, using the dual Yan-Chen comparison method. Then the coupling method is used to study the multidimensional processes in a specific space. As applications, three models of particle systems are illustrated.
In this paper we consider a discrete-time process which grows according to a random walk with nonnegative increments between crash times at which it collapses to 0. We assume that the probability of crashing depends on the level of the process. We study the stochastic stability of this growth-collapse process. Special emphasis is given to the case in which the probability of crashing tends to 0 as the level of the process increases. In particular, we show that the process may exhibit long-range dependence and that the crash sizes may have a power law distribution.
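The dynamics described above (random-walk growth punctuated by level-dependent collapses to 0) can be sketched in a few lines of Python; the Exp(1) increments and the crash probability 1/(1 + x) below are illustrative assumptions, not the paper's model:

```python
import random

def simulate_growth_collapse(n_steps, crash_prob, rng=random.Random(0)):
    """Simulate a growth-collapse process for n_steps steps.

    Between crashes the level grows like a random walk with nonnegative
    increments (Exp(1) here, an illustrative choice); at each step the
    process collapses to 0 with probability crash_prob(x), which may
    depend on the current level x.
    """
    x = 0.0
    path = [x]
    for _ in range(n_steps):
        if rng.random() < crash_prob(x):
            x = 0.0                       # crash: collapse to 0
        else:
            x += rng.expovariate(1.0)     # nonnegative random-walk increment
        path.append(x)
    return path

# The regime emphasized above: crash probability tending to 0 as the
# level grows (here 1/(1 + x), a hypothetical choice).
path = simulate_growth_collapse(10_000, lambda x: 1.0 / (1.0 + x))
```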
In this paper we propose a new genetic algorithm specifically based on mutation and selection in order to maximize a fitness function. This mutation-selection algorithm behaves as a gradient algorithm which converges to local maxima. In order to obtain convergence to global maxima we propose a new algorithm which is built by randomly perturbing the selection operator of the gradient-like algorithm. The perturbation is controlled by only one parameter: that which allows the selection pressure to be governed. We use the Markov model of the perturbed algorithm to prove its convergence to global maxima. The arguments used in the proofs are based on Freidlin and Wentzell's (1984) theory and large deviation techniques also applied in simulated annealing. Our main results are that (i) when the population size is greater than a critical value, the control of the selection pressure ensures the convergence to the global maxima of the fitness function, and (ii) the convergence also occurs when the population is the smallest possible, i.e. 1.
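A minimal sketch of a mutation-selection generation, assuming Gaussian mutations and exponential (Boltzmann-type) selection weights; the parameter beta stands in for the single selection-pressure parameter, and none of these specific choices are claimed to be the paper's construction:

```python
import math
import random

def mutation_selection_step(pop, fitness, beta, sigma=0.1, rng=random.Random(1)):
    """One generation of a mutation-selection scheme (illustrative sketch).

    Mutation: each individual receives a Gaussian perturbation.
    Selection: the next generation is resampled with probabilities
    proportional to exp(beta * fitness); beta plays the role of the
    single parameter governing the selection pressure.
    """
    mutated = [x + rng.gauss(0.0, sigma) for x in pop]
    fits = [fitness(x) for x in mutated]
    fmax = max(fits)
    weights = [math.exp(beta * (f - fmax)) for f in fits]  # shifted for stability
    return rng.choices(mutated, weights=weights, k=len(pop))

# Maximize f(x) = -(x - 2)^2: strong selection pressure (large beta)
# drives the population toward the global maximum at x = 2.
pop = [0.0] * 20
for _ in range(500):
    pop = mutation_selection_step(pop, lambda x: -(x - 2.0) ** 2, beta=50.0)
```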
In this paper we investigate sufficient conditions that ensure the optimality of threshold strategies for optimal stopping problems with finite or perpetual maturities. Our result is based on a local-time argument that enables us to give an alternative proof of the smooth-fit principle. Moreover, we present a class of optimal stopping problems for which the propagation of convexity fails.
In this paper we study two distributions, namely the distribution of the waiting times until given numbers of occurrences of compound patterns and the distribution of the numbers of occurrences of compound patterns in a fixed number of trials. We elucidate the interrelation between these two distributions in terms of the generating functions. We provide perspectives on the problems related to compound patterns in statistics and probability. As an application, the waiting time problem of counting runs of specified lengths is considered in order to illustrate how the distributions of waiting times can be derived from our theoretical results.
We complete a paper written by Edward Pollak in 1974 on a multitype branching process whose birth law has generating functions that are fractional linear with the same denominator. The main tool is a parameterization of these functions based on the mean matrix M and an element w of the first quadrant. We take this opportunity to give a self-contained presentation of Pollak's theory.
This paper establishes new Foster-type criteria for a Markov chain on a general state space to be Harris recurrent, positive Harris recurrent, or geometrically ergodic. The criteria are based on drift conditions involving stopping times rather than deterministic steps. Meyn and Tweedie (1994) developed similar criteria involving random-sized steps, independent of the Markov chain under study. They also posed an open problem of finding criteria involving stopping times. Our results essentially solve that problem. We also show that the assumption of ψ-irreducibility is not needed when stating our drift conditions for positive Harris recurrence or geometric ergodicity.
As an extension of the discrete-time case, this note investigates the variance of the total cumulative reward for continuous-time Markov reward chains with finite state spaces. The results correspond to discrete-time results. In particular, the variance growth rate is shown to be asymptotically linear in time. Expressions are provided to compute this growth rate.
We propose a simple model for interaction between gene candidates in the two strands of bacterial DNA (deoxyribonucleic acid). Our model assumes that ‘final’ genes appear in one of the two strands, that they do not overlap (in bacteria there is only a small percentage of overlap), and that the final genes maximize the occupancy rate, which is defined to be the proportion of the genome occupied by coding zones. We are more concerned with describing the organization and distribution of genes in bacterial DNA than with the very hard problem of identifying genes. To this end, an algorithm for selecting the final genes according to the previously outlined maximization criterion is proposed. We study the graphical and probabilistic properties of the model resulting from applying the maximization procedure to a Markovian representation of the genic and intergenic zones within the DNA strands, develop theoretical bounds on the occupancy rate (which, in our view, is a rather intractable quantity), and use the model to compute quantities of relevance to the Escherichia coli genome and compare these to annotation data. Although this work focuses on genomic modelling, we point out that the proposed model is not restricted to applications in this setting. It also serves to model other resource allocation problems.
We consider an epidemic model where the spread of the epidemic can be described by a discrete-time Galton-Watson branching process. Between times n and n + 1, any infected individual is detected with unknown probability π and the numbers of these detected individuals are the only observations we have. Detected individuals produce a reduced number of offspring in the time interval of detection, and no offspring at all thereafter. If only the generation sizes of a Galton-Watson process are observed, it is known that one can only estimate the first two moments of the offspring distribution consistently on the explosion set of the process (and, apart from some lattice parameters, no parameters that are not determined by those moments). Somewhat surprisingly, in our context, where we observe a binomially distributed subset of each generation, we are able to estimate three functions of the parameters consistently. In concrete situations, this often enables us to estimate π consistently, as well as the mean number of offspring. We apply the estimators to data for a real epidemic of classical swine fever.
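The observation scheme above can be illustrated by a small simulation; as simplifying assumptions (not the paper's exact model), offspring numbers are Poisson and detected individuals produce no offspring at all, rather than a reduced number in the detection interval:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's method for a Poisson(lam) draw (adequate for small lam)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def gw_epidemic(n_gen, mean_offspring, pi, rng=random.Random(5)):
    """Galton-Watson epidemic with detection (illustrative sketch).

    Each infected individual is detected with probability pi; detected
    individuals are removed and, as a simplification, produce no
    offspring. Only the per-generation detection counts are returned,
    mirroring the fact that these are the only observations available.
    """
    undetected = 1
    detected_counts = []
    for _ in range(n_gen):
        detected = sum(rng.random() < pi for _ in range(undetected))
        detected_counts.append(detected)
        survivors = undetected - detected
        undetected = sum(poisson(mean_offspring, rng) for _ in range(survivors))
        if undetected == 0:
            break
    return detected_counts

counts = gw_epidemic(20, mean_offspring=1.8, pi=0.3)
```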
In this paper we develop a constructive structure theory for a class of exponential functionals of Brownian motion which includes Asian option values. This is done in two stages of differing natures. As a first step, the functionals are represented as Laguerre reduction series obtained from the main results of Schröder (2006), this paper's companion paper. These reduction series are new and given in terms of the negative moments of the integral of geometric Brownian motion, whose structure theory is developed in a second step. Providing a new angle on these processes, this is done by establishing connections with theta functions. Integral representations and computable formulae for the negative moments are thus derived and then shown to furnish highly efficient ways for computing the negative moments. Applying this paper's Laguerre reduction series in numerical examples suggests that they yield one of the most efficient methods for the explicit valuation of Asian options. The paper also provides mathematical background results referred to in Schröder (2005c).
We give a criterion for extinction or local extinction of branching symmetric α-stable processes in terms of the principal eigenvalue for time-changed processes of symmetric α-stable processes. Here the branching rate and the branching mechanism are spatially dependent. In particular, the branching rate is allowed to be singular with respect to the Lebesgue measure. We apply this criterion to some branching processes.
In this paper we introduce the concepts of instantaneous reversibility and instantaneous entropy production rate for inhomogeneous Markov chains with denumerable state spaces. The following statements are proved to be equivalent: the inhomogeneous Markov chain is instantaneously reversible; it is in detailed balance; its entropy production rate vanishes. In particular, for a time-periodic birth-death chain, which can be regarded as a simple version of a physical model (Brownian motors), we prove that its rotation number is 0 when it is instantaneously reversible or periodically reversible. Hence, in our model of Markov chains, the directed transport phenomenon of Brownian motors can occur only in nonequilibrium and irreversible systems.
It was recently proved by Jelenković and Radovanović (2004) that the least-recently-used (LRU) caching policy, in the presence of semi-Markov-modulated requests that have a generalized Zipf's law popularity distribution, is asymptotically insensitive to the correlation in the request process. However, since the previous result is asymptotic, it remains unclear how small the cache size can become while still retaining the preceding insensitivity property. In this paper, assuming that requests are generated by a nearly completely decomposable Markov-modulated process, we characterize the critical cache size below which the dependency of requests dominates the cache performance. This critical cache size is small relative to the dynamics of the modulating process, and in fact is sublinear with respect to the sojourn times of the modulated chain that determines the dependence structure.
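For concreteness, the LRU policy and the Zipf's-law popularity distribution discussed above can be sketched as follows; the request stream here is i.i.d. rather than Markov-modulated, an illustrative simplification:

```python
import random
from collections import OrderedDict

def lru_hit_rate(requests, cache_size):
    """Fraction of requests served from an LRU cache of the given size."""
    cache = OrderedDict()
    hits = 0
    for r in requests:
        if r in cache:
            hits += 1
            cache.move_to_end(r)            # r becomes most recently used
        else:
            cache[r] = None
            if len(cache) > cache_size:
                cache.popitem(last=False)   # evict the least recently used
    return hits / len(requests)

# I.i.d. requests with Zipf's-law popularity P(item k) proportional to 1/k
# (an illustrative stand-in for the Markov-modulated request process).
N = 1000
weights = [1.0 / k for k in range(1, N + 1)]
requests = random.Random(4).choices(range(N), weights=weights, k=20_000)
rate = lru_hit_rate(requests, cache_size=100)
```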
We study the genealogy of so-called immortal branching processes, i.e. branching processes where each individual upon death is replaced by at least one new individual, and conclude that their marginal distributions are compound geometric. The result also implies that the limiting distributions of properly scaled supercritical branching processes are compound geometric. We exemplify our results with an expression for the marginal distribution for a class of branching processes that have recently appeared in the theory of coalescent processes and continuous stable random trees. The limiting distribution can be expressed in terms of the Fox H-function, and in special cases by the Meijer G-function.
We consider the random motion of a particle that moves with constant finite speed in the space ℝ⁴ and, at Poisson-distributed times, changes its direction with uniform law on the unit four-sphere. For the particle's position, X(t) = (X₁(t), X₂(t), X₃(t), X₄(t)), t > 0, we obtain the explicit forms of the conditional characteristic functions and conditional distributions when the number of changes of direction is fixed. From this we derive the explicit probability law, f(x, t), x ∈ ℝ⁴, t ≥ 0, of X(t). We also show that, under the Kac condition on the speed of the motion and the intensity of the switching Poisson process, the density, p(x, t), of the absolutely continuous component of f(x, t) tends to the transition density of the four-dimensional Brownian motion with zero drift and infinitesimal variance σ² = ½.
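The motion just described is easy to simulate directly; the sketch below samples one trajectory endpoint under the stated dynamics (exponential inter-switch times, directions uniform on the unit sphere in ℝ⁴), and is a simulation aid rather than part of the paper's analytic derivation:

```python
import math
import random

def random_flight_r4(t, speed, rate, rng=random.Random(2)):
    """Position at time t of a particle moving at constant speed in R^4,
    switching to a fresh direction, drawn uniformly on the unit sphere,
    at the epochs of a Poisson(rate) process."""
    def uniform_direction():
        # A normalized 4-dimensional Gaussian vector is uniform on the sphere.
        v = [rng.gauss(0.0, 1.0) for _ in range(4)]
        norm = math.sqrt(sum(c * c for c in v))
        return [c / norm for c in v]

    pos, clock, d = [0.0] * 4, 0.0, uniform_direction()
    while True:
        dt = rng.expovariate(rate)           # time until the next switch
        step = min(dt, t - clock)
        pos = [p + speed * step * c for p, c in zip(pos, d)]
        clock += step
        if clock >= t:
            return pos
        d = uniform_direction()

pos = random_flight_r4(t=5.0, speed=2.0, rate=1.5)
```

Since the speed is constant, the endpoint always lies within the ball of radius speed × t, which gives a quick sanity check on the simulation.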
We consider a diffusion process X(t) with a one-sided Brownian potential starting from the origin. The limiting behavior of the process as time goes to infinity is studied. For each t > 0, the sample space describing the random potential is divided into two parts, Ã_t and B̃_t, both having probability ½, in such a way that our diffusion process X(t) exhibits quite different limiting behavior depending on whether it is conditioned on Ã_t or on B̃_t (t → ∞). The asymptotic behavior of the maximum process of X(t) is also investigated. Our results improve those of Kawazu, Suzuki, and Tanaka (2001).
The northeast model is a spin system on the two-dimensional integer lattice that evolves according to the following rule: whenever a site's southerly and westerly nearest neighbors have spin 1, it may reset its own spin by tossing a p-coin; at all other times, its spin remains frozen. It is proved that the northeast model has a phase transition at p_c = 1 − β_c, where β_c is the critical parameter for oriented percolation. For p < p_c, the trivial measure, δ_0, that puts mass one on the configuration with all spins set at 0 is the unique ergodic, translation-invariant, stationary measure. For p ≥ p_c, the product Bernoulli-p measure on configuration space is the unique nontrivial, ergodic, translation-invariant, stationary measure for the system, and it is mixing. For p > ⅔, it is shown that there is exponential decay of correlations.
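The update rule above can be sketched as a discrete-time caricature of the continuous-time dynamics on a finite window; the pinned south and west boundaries are an assumed boundary condition introduced only so that updates can occur in the finite grid:

```python
import random

def run_northeast(n, p, sweeps, rng=random.Random(3)):
    """Discrete-time sweeps of northeast-model dynamics on an n x n window.

    A site may reset its spin by tossing a p-coin only when both its
    southerly and westerly nearest neighbors have spin 1; otherwise its
    spin stays frozen. The south row and west column are pinned at 1
    (an assumed boundary condition for the finite window).
    """
    spins = [[0] * n for _ in range(n)]
    for k in range(n):
        spins[0][k] = 1    # pinned south boundary
        spins[k][0] = 1    # pinned west boundary
    for _ in range(sweeps):
        for i in range(1, n):          # i: south -> north
            for j in range(1, n):      # j: west -> east
                if spins[i - 1][j] == 1 and spins[i][j - 1] == 1:
                    spins[i][j] = 1 if rng.random() < p else 0
    return spins

grid = run_northeast(10, p=0.5, sweeps=50)
```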
We compare dependence in stochastically monotone Markov processes with partially ordered Polish state spaces using the concordance and supermodular orders. We show necessary and sufficient conditions for the concordance order to hold both in terms of the one-step transition probabilities for discrete-time processes and in terms of the corresponding infinitesimal generators for continuous-time processes. We give examples showing that a stochastic monotonicity assumption is not necessary for such orderings. We indicate relations between dependence orderings and, variously, the asymptotic variance-reduction effect in Monte Carlo Markov chains, Cheeger constants, and positive dependence for Markov processes.