A compound Poisson process whose randomized time is an independent Poisson process is called a compound Poisson process with Poisson subordinator. We provide its probability distribution, which is expressed in terms of the Bell polynomials, and investigate in detail both the special cases in which the compound Poisson process has exponential jumps and normal jumps. Then for the iterated Poisson process we discuss some properties and provide convergence results to a Poisson process. The first-crossing time problem for the iterated Poisson process is finally tackled in the cases of (i) a decreasing and constant boundary, where we provide some closed-form results, and (ii) a linearly increasing boundary, where we propose an iterative procedure to compute the first-crossing time density and survival functions.
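The iterated Poisson process described above can be sampled directly: conditional on the inner count N2(t), the outer value N1(N2(t)) is Poisson with mean proportional to N2(t). A minimal Monte Carlo sketch (rates and horizon are illustrative choices, not taken from the paper):

```python
import numpy as np

# Sketch of an iterated Poisson process N1(N2(t)): a Poisson process N1
# (rate lam1) evaluated at an independent Poisson process N2 (rate lam2).
rng = np.random.default_rng(0)
lam1, lam2, t, n_samples = 2.0, 3.0, 1.0, 100_000

inner = rng.poisson(lam2 * t, size=n_samples)   # N2(t)
outer = rng.poisson(lam1 * inner)               # N1(N2(t)) given N2(t)

# The sample mean should be close to lam1 * lam2 * t = 6.
print(outer.mean())
```

The same two-stage recipe extends to the compound case: replace the outer Poisson draw by a sum of `outer` i.i.d. jumps (e.g. exponential or normal, as in the two special cases treated in the paper).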
We deal with a random graph model evolving in discrete time steps by duplicating and deleting the edges of randomly chosen vertices. We prove the existence of an almost sure asymptotic degree distribution with stretched exponential decay; more precisely, the proportion of vertices of degree d tends to some positive number c_d > 0 almost surely as the number of steps goes to ∞, and c_d ~ (eπ)^(1/2) d^(1/4) e^(−2√d) holds as d → ∞.
Full likelihood inference under Kingman's coalescent is a computationally challenging problem to which importance sampling (IS) and the product of approximate conditionals (PAC) methods have been applied successfully. Both methods can be expressed in terms of families of intractable conditional sampling distributions (CSDs), and rely on principled approximations for accurate inference. Recently, more general Λ- and Ξ-coalescents have been observed to provide better modelling fits to some genetic data sets. We derive families of approximate CSDs for finite sites Λ- and Ξ-coalescents, and use them to obtain ‘approximately optimal’ IS and PAC algorithms for Λ-coalescents, yielding substantial gains in efficiency over existing methods.
Retransmission-based failure recovery is a primary approach in existing communication networks for guaranteeing data delivery in the presence of channel failures. Recent work has shown that, when data sizes have infinite support, retransmissions can cause long-tailed delays even if all traffic and network characteristics are light-tailed. In this paper we investigate the practically important case of bounded data units, 0 ≤ L_b ≤ b, under the condition that the hazard functions of the distributions of data sizes and channel statistics are proportional. To this end, we provide an explicit and uniform characterization of the entire body of the retransmission distribution ℙ[N_b > n] in both n and b. Our main discovery is that this distribution can be represented as the product of a power law and a gamma distribution. This rigorous approximation clearly demonstrates the coupling of a power law distribution, which dominates the main body, and the gamma distribution, which determines the exponential tail. Our results are validated via simulation experiments and can be useful for designing retransmission-based systems with the required performance characteristics. From a broader perspective, this study applies to any other system, e.g. computing, where restart mechanisms are employed after a job processing failure.
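The retransmission count N_b is easy to simulate in a toy version of this setup: a data unit of size L in (0, b] is resent until the channel's available time exceeds L. The distributions below (uniform data sizes, unit-rate exponential channel availability) are illustrative assumptions, not the paper's model:

```python
import numpy as np

# Toy retransmission model with bounded data: count attempts N_b until
# the channel availability A exceeds the (fixed per unit) data size L.
rng = np.random.default_rng(1)
b, n_samples = 2.0, 200_000

L = b * rng.random(n_samples)                    # data sizes on (0, b]
counts = np.zeros(n_samples, dtype=int)
active = np.ones(n_samples, dtype=bool)
while active.any():
    counts[active] += 1
    A = rng.exponential(1.0, size=active.sum())  # channel availability
    idx = np.flatnonzero(active)
    active[idx[A >= L[idx]]] = False             # transmission succeeded

# Empirical survival function P[N_b > n] for small n.
survival = [(counts > n).mean() for n in range(5)]
print(survival)
```

Plotting `survival` against n on log scales is a quick way to see the power-law body and exponential tail discussed in the abstract.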
The infinite source Poisson arrival model with heavy-tailed workload distributions has attracted much attention, especially in the modeling of data packet traffic in communication networks. In particular, it is well known that under suitable assumptions on the source arrival rate, the centered and scaled cumulative workload input process for the underlying processing system can be approximated by fractional Brownian motion. In many applications one is interested in the stabilization of the work inflow to the system by modifying the net input rate, using an appropriate admission control policy. In this paper we study a natural family of admission control policies which keep the associated scaled cumulative workload input asymptotically close to a prespecified linear trajectory, uniformly over time. Under such admission control policies and with natural assumptions on arrival distributions, suitably scaled and centered cumulative workload input processes are shown to converge weakly in the path space to the solution of a d-dimensional stochastic differential equation driven by a Gaussian process. It is shown that the admission control policy achieves moment stabilization in that the second moment of the solution to the stochastic differential equation (averaged over the d-stations) is bounded uniformly for all times. In one special case of control policies, as time approaches ∞, we obtain a fractional version of a stationary Ornstein-Uhlenbeck process that is driven by fractional Brownian motion with Hurst parameter H > ½.
We consider a dynamic metapopulation involving one large population of size N surrounded by colonies of size ε_N N, usually called peripheral isolates in ecology, where N → ∞ and ε_N → 0 in such a way that ε_N N → ∞. The main population, as well as the colonies, independently send propagules to found new colonies (emigration), and each colony, independently, eventually merges with the main population (fusion). Our aim is to study the genealogical history of a finite number of lineages sampled at stationarity in such a metapopulation. We make assumptions on model parameters ensuring that the total outer population has size of the order of N and that each colony has a lifetime of the same order. We prove that under these assumptions, the scaling limit of the genealogical process of a finite sample is a censored coalescent where each lineage can be in one of two states: an inner lineage (belonging to the main population) or an outer lineage (belonging to some peripheral isolate). Lineages change state at constant rate and (only) inner lineages coalesce at constant rate per pair. This two-state censored coalescent is also shown to converge weakly, as the landscape dynamics accelerate, to a time-changed Kingman coalescent.
Under proportional transaction costs, a price process is said to have a consistent price system if there is a semimartingale with an equivalent martingale measure that evolves within the bid-ask spread. We show that a continuous, multi-asset price process has a consistent price system, under arbitrarily small proportional transaction costs, if it satisfies a natural multi-dimensional generalization of the stickiness condition introduced by Guasoni (2006).
Gaussian particles provide a flexible framework for modelling and simulating three-dimensional star-shaped random sets. In our framework, the radial function of the particle arises from a kernel smoothing, and is associated with an isotropic random field on the sphere. If the kernel is a von Mises-Fisher density, or uniform on a spherical cap, the correlation function of the associated random field admits a closed form expression. The Hausdorff dimension of the surface of the Gaussian particle reflects the decay of the correlation function at the origin, as quantified by the fractal index. Under power kernels we obtain particles with boundaries of any Hausdorff dimension between 2 and 3.
We study the Bayesian disorder problem for a negative binomial process. The aim is to determine a stopping time which is as close as possible to the random and unknown moment at which a sequentially observed negative binomial process changes the value of its characterizing parameter p ∈ (0, 1). The solution to this problem is explicitly derived through the reduction of the original optimal stopping problem to an integro-differential free-boundary problem. A careful analysis of the free-boundary equation and of the probabilistic nature of the boundary point allows us to specify when the smooth fit principle holds and when it breaks down in favour of the continuous fit principle.
The drawdown process of a one-dimensional regular diffusion process X is given by X reflected at its running maximum. The drawup process is given by X reflected at its running minimum. We calculate the probability that a drawdown precedes a drawup in an exponential time-horizon. We then study the law of the occupation times of the drawdown process and the drawup process. These results are applied to address problems in risk analysis and for option pricing of the drawdown process. Finally, we present examples of Brownian motion with drift and three-dimensional Bessel processes, where we prove an identity in law.
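The drawdown and drawup processes are simple path functionals, so the probability that a drawdown precedes a drawup can be estimated directly on discretized paths. The sketch below uses Brownian motion with drift (one of the paper's examples); the drift, thresholds, and discretization are illustrative choices:

```python
import numpy as np

# Monte Carlo sketch: for Brownian motion with drift mu, estimate the
# probability that a drawdown of size a occurs before a drawup of size b.
rng = np.random.default_rng(2)
mu, a, b = 0.5, 1.0, 1.0
dt, n_steps, n_paths = 1e-3, 10_000, 1_000

hits_dd_first = 0
for _ in range(n_paths):
    x = np.cumsum(mu * dt + np.sqrt(dt) * rng.standard_normal(n_steps))
    drawdown = np.maximum.accumulate(x) - x    # X reflected at running max
    drawup = x - np.minimum.accumulate(x)      # X reflected at running min
    t_dd = np.argmax(drawdown >= a) if (drawdown >= a).any() else n_steps
    t_du = np.argmax(drawup >= b) if (drawup >= b).any() else n_steps
    hits_dd_first += t_dd < t_du

print(hits_dd_first / n_paths)
```

With positive drift and equal thresholds, the drawup tends to win, so the estimate should fall below one half; the paper gives the exact law in an exponential time-horizon.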
The standard Hawkes process is constructed from a homogeneous Poisson process and uses the same exciting function for different generations of offspring. We propose an extension of this process by considering different exciting functions. This consideration may be important in a number of fields; e.g. in seismology, where main shocks produce aftershocks with possibly different intensities. The main results are devoted to the asymptotic behavior of this extension of the Hawkes process: a law of large numbers and a central limit theorem are established. These results allow us to analyze the asymptotic behavior of the process when unpredictable marks are considered.
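A generation-dependent Hawkes process of this kind can be simulated via its branching (cluster) representation: immigrants arrive as a homogeneous Poisson process, and each event of generation g spawns offspring through its own exciting function. The exponential kernels and decreasing branching ratios below are illustrative assumptions, not the paper's specification:

```python
import numpy as np

# Branching simulation sketch of a Hawkes process whose generation-g
# offspring use exciting function phi_g(t) = eta_g * beta * exp(-beta * t).
rng = np.random.default_rng(3)
T, mu, beta = 50.0, 1.0, 2.0
eta = [0.6, 0.4, 0.2]   # branching ratio per generation; zero afterwards

# Generation 0: immigrants from a homogeneous Poisson process on [0, T].
parents = list(rng.uniform(0.0, T, rng.poisson(mu * T)))
all_events = list(parents)
for ratio in eta:
    children = []
    for t0 in parents:
        # Each parent spawns Poisson(ratio) offspring at Exp(beta) lags.
        lags = rng.exponential(1.0 / beta, rng.poisson(ratio))
        children.extend(t for t in t0 + lags if t < T)
    all_events.extend(children)
    parents = children

all_events.sort()
print(len(all_events))
```

Setting all entries of `eta` equal recovers a standard (single-kernel) Hawkes process, which makes the extension easy to compare against numerically.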
Consider a real-valued discrete-time stationary and ergodic stochastic process, called the noise process. For each dimension n, we can choose a stationary point process in ℝn and a translation invariant tessellation of ℝn. Each point is randomly displaced, with a displacement vector being a section of length n of the noise process, independent from point to point. The aim is to find a point process and a tessellation that minimizes the probability of decoding error, defined as the probability that the displaced version of the typical point does not belong to the cell of this point. We consider the Shannon regime, in which the dimension n tends to ∞, while the logarithm of the intensity of the point processes, normalized by dimension, tends to a constant. We first show that this problem exhibits a sharp threshold: if the sum of the asymptotic normalized logarithmic intensity and of the differential entropy rate of the noise process is positive, then the probability of error tends to 1 with n for all point processes and all tessellations. If it is negative then there exist point processes and tessellations for which this probability tends to 0. The error exponent function, which denotes how quickly the probability of error goes to 0 in n, is then derived using large deviations theory. If the entropy spectrum of the noise satisfies a large deviations principle, then, below the threshold, the error probability goes exponentially fast to 0 with an exponent that is given in closed form in terms of the rate function of the noise entropy spectrum. This is obtained for two classes of point processes: the Poisson process and a Matérn hard-core point process. New lower bounds on error exponents are derived from this for Shannon's additive noise channel in the high signal-to-noise ratio limit that hold for all stationary and ergodic noises with the above properties and that match the best known bounds in the white Gaussian noise case.
We consider a class of Gaussian processes which are obtained as height processes of some (d + 1)-dimensional dynamic random interface model on ℤd. We give an estimate of persistence probability, namely, large T asymptotics of the probability that the process does not exceed a fixed level up to time T. The interaction of the model affects the persistence probability and its asymptotics changes depending on the dimension d.
Let {X(s, t): s, t ≥ 0} be a centred homogeneous Gaussian field with almost surely continuous sample paths and correlation function r(s, t) = cov(X(s, t), X(0, 0)) such that r(s, t) = 1 − |s|^α1 − |t|^α2 + o(|s|^α1 + |t|^α2) as s, t → 0, with α1, α2 ∈ (0, 2], and r(s, t) < 1 for (s, t) ≠ (0, 0). In this contribution we derive an asymptotic expansion (as u → ∞) of P(sup_{(s n1(u), t n2(u)) ∈ [0, x] × [0, y]} X(s, t) ≤ u), where n1(u)n2(u) = u^(2/α1 + 2/α2) Ψ(u), which holds uniformly for (x, y) ∈ [A, B]^2 with A, B two positive constants and Ψ the survival function of an N(0, 1) random variable. We apply our findings to the analysis of extremes of homogeneous Gaussian fields over more complex parameter sets and a ball of random radius. Additionally, we determine the extremal index of the discretised random field determined by X(s, t).
The main result of this paper is the solution to the optimal stopping problem of maximizing the variance of a geometric Lévy process. We call this problem the variance problem. We show that, for some geometric Lévy processes, we achieve higher variances by allowing randomized stopping. Furthermore, for some geometric Lévy processes, the problem has a solution only if randomized stopping is allowed. When randomized stopping is allowed, we give a solution to the variance problem. We identify the Lévy processes for which the allowance of randomized stopping times increases the maximum variance. When it does, we also solve the variance problem without randomized stopping.
In this paper we propose a strategy that gives an optimal lower bound on the average gain for the two-envelope problem within the framework of McDonnell and Abbott (2009) and McDonnell et al. (2011). We obtain this result with only partial information about the probability distribution of the envelopes' contents.
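The gain from observation-dependent randomized switching, the mechanism studied in this framework, is easy to see in simulation. The sketch below uses a Cover-style rule (switch with probability e^(−y) after observing amount y) and fixed envelope amounts (x, 2x) with x = 1; both choices are illustrative assumptions, not the paper's strategy:

```python
import numpy as np

# Randomized switching for the two-envelope problem: because the switch
# probability exp(-y) is decreasing, we switch more often on the smaller
# amount, beating the never-switch average of 1.5x.
rng = np.random.default_rng(4)
n, x = 200_000, 1.0

first_is_small = rng.random(n) < 0.5           # which envelope we opened
observed = np.where(first_is_small, x, 2 * x)  # amount we see
switch = rng.random(n) < np.exp(-observed)     # randomized switch decision
final = np.where(switch, 3 * x - observed, observed)

print(final.mean())   # expected value 0.5*(1 + e**-1) + 0.5*(2 - e**-2) ~ 1.62
```

The paper's contribution is to optimize this kind of lower bound when only partial distributional information is available, rather than for a fixed pair of amounts.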
We study the convergence of centered and normalized sums of independent and identically distributed random elements of the space D of càdlàg functions, endowed with Skorokhod's J1 topology, to stable distributions in D. Our results are based on the concept of regular variation on metric spaces and on point process convergence. We provide some applications; in particular, to the empirical process of the renewal-reward process.
In this paper we analyse the fractional Poisson process where the state probabilities p_k^(ν_k)(t), t ≥ 0, are governed by time-fractional equations of order 0 < ν_k ≤ 1 depending on the number k of events that have occurred up to time t. We are able to obtain explicitly the Laplace transform of p_k^(ν_k)(t) and various representations of state probabilities. We show that the Poisson process with intermediate waiting times depending on ν_k differs from that constructed from the fractional state equations (in the case ν_k = ν for all k, they coincide with the time-fractional Poisson process). We also introduce a different form of fractional state-dependent Poisson process as a weighted sum of homogeneous Poisson processes. Finally, we consider the fractional birth process governed by equations with state-dependent fractionality.
In this paper we apply the recently established Wiener-Hopf Monte Carlo simulation technique for Lévy processes from Kuznetsov et al. (2011) to path functionals; in particular, first passage times, overshoots, undershoots, and the last maximum before the passage time. Such functionals have many applications, for instance, in finance (the pricing of exotic options in a Lévy model) and insurance (ruin time, debt at ruin, and related quantities for a Lévy insurance risk process). The technique works for any Lévy process whose running infimum and supremum evaluated at an independent exponential time can be sampled from. This includes classic examples such as stable processes, subclasses of spectrally one-sided Lévy processes, and large new families such as meromorphic Lévy processes. Finally, we present some examples. A particular aspect that is illustrated is that the Wiener-Hopf Monte Carlo simulation technique (provided that it applies) performs much better at approximating first passage times than a ‘plain’ Monte Carlo simulation technique based on sampling increments of the Lévy process.
Let W = {W_n: n ∈ ℕ} be a sequence of random vectors in ℝ^d, d ≥ 1. In this paper we consider the logarithmic asymptotics of the extremes of W; that is, for any vector q > 0 in ℝ^d, we find the asymptotics of log P(there exists n ∈ ℕ: W_n ≥ uq) as u → ∞. We follow the approach of the restricted large deviation principle introduced in Duffy (2003). That is, we assume that, for every q ≥ 0 and some scalings {a_n}, {v_n}, (1/v_n) log P(W_n / a_n ≥ uq) has a limit J_W(q) that is continuous in q. We allow the scalings {a_n} and {v_n} to be regularly varying with a positive index. This approach is general enough to incorporate sequences W such that the probability law of W_n / a_n satisfies the large deviation principle with continuous, not necessarily convex, rate functions. The equations for these asymptotics are in agreement with the literature.