This paper focuses on the averaging principle for fast–slow McKean–Vlasov stochastic differential equations driven by mixed fractional Brownian motion with Hurst parameter $\tfrac{1}{2} < H < 1$. The integral with respect to Brownian motion is the standard Itô integral, while the integral with respect to fractional Brownian motion is a generalized Riemann–Stieltjes integral. Under a non-Lipschitz condition and appropriate assumptions on the coefficients, we first establish an existence and uniqueness theorem for the fast–slow McKean–Vlasov stochastic differential equation driven by mixed fractional Brownian motion. We then prove the averaging principle for these fast–slow equations: the slow equation converges to the associated averaged equation in the mean-square sense.
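As a rough numerical illustration of the type of system considered here, the following sketch discretises a fast–slow McKean–Vlasov system with mixed Brownian/fractional Brownian noise by an interacting-particle Euler scheme, with the law of the slow component approximated by the empirical mean. All coefficients and parameter values are hypothetical placeholders, not the ones analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def fgn_increments(n_steps, dt, H, n_paths, rng):
    """Sample fractional Gaussian noise increments (Hurst H) from the Cholesky
    factor of their covariance matrix."""
    k = np.arange(n_steps)
    gamma = 0.5 * dt**(2 * H) * (np.abs(k + 1)**(2 * H) - 2 * np.abs(k)**(2 * H)
                                 + np.abs(k - 1)**(2 * H))
    cov = np.array([[gamma[abs(i - j)] for j in range(n_steps)] for i in range(n_steps)])
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n_steps))
    return L @ rng.standard_normal((n_steps, n_paths))

# hypothetical coefficients: the slow drift b depends on the law of X through its mean
def b(x, mean_x, y): return -x + 0.5 * mean_x + np.sin(y)   # slow drift
def sigma(x):        return 0.3                              # fBm coefficient (H > 1/2)
def f(x, y):         return -(y - x)                         # fast drift
def g(y):            return 0.5                              # fast diffusion (standard BM)

H, eps, T, n_steps, n_part = 0.7, 0.01, 1.0, 200, 500
dt = T / n_steps
dBH = fgn_increments(n_steps, dt, H, n_part, rng)    # one fractional path per particle
X = np.zeros(n_part)
Y = np.zeros(n_part)

for k in range(n_steps):
    mean_x = X.mean()                                # empirical approximation of the law of X
    dW = rng.standard_normal(n_part) * np.sqrt(dt)
    X = X + b(X, mean_x, Y) * dt + sigma(X) * dBH[k]
    Y = Y + f(X, Y) * dt / eps + g(Y) * dW / np.sqrt(eps)

print("slow component at T: mean %.4f, variance %.4f" % (X.mean(), X.var()))
```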
The aim of this article is to study the asymptotic behaviour of non-autonomous stochastic lattice systems. We first show the existence and uniqueness of a pullback measure attractor. Moreover, when the deterministic external forcing terms are periodic in time, we show that the pullback measure attractors are also periodic. We then study the upper semicontinuity of pullback measure attractors as the noise intensity goes to zero. Pullback asymptotic compactness of the family of probability distributions of the solutions is established by means of uniform a priori estimates for far-field values of the solutions.
In this paper, we present a sufficient framework for the sample path-wise asymptotic flocking dynamics of the Cucker–Smale model with a unit-speed constraint and randomly switching network topology. We employ a matrix formulation of the governing equation, which allows us to estimate the diameter of the velocities in terms of the adjacency matrix of the network. Unlike the previous result on the randomly switching Cucker–Smale model, the unit-speed constraint prevents the system from being treated as a non-autonomous linear ordinary differential equation in the velocity vector, which forces us to settle for a weaker form of the flocking estimate than the one available for the original Cucker–Smale model.
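A minimal simulation sketch of this kind of model follows: velocities evolve on the unit circle under a Cucker–Smale alignment force projected onto the tangent space (so the unit-speed constraint is respected), while the network topology is resampled at fixed intervals. The interaction weights, switching schedule and parameter values are illustrative assumptions, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

N, d, dt, T = 10, 2, 0.01, 20.0        # agents, velocity dimension, step, horizon
kappa = 1.0                            # coupling strength (illustrative)

def random_digraph(p=0.5):
    """Resample a random adjacency matrix; stands in for the switching topology."""
    A = (rng.random((N, N)) < p).astype(float)
    np.fill_diagonal(A, 0.0)
    return A

v = rng.standard_normal((N, d))
v /= np.linalg.norm(v, axis=1, keepdims=True)       # unit-speed constraint

A = random_digraph()
for k in range(int(T / dt)):
    if k % 50 == 0:                                  # switch the topology every 0.5 time units
        A = random_digraph()
    force = np.zeros_like(v)
    for i in range(N):
        rel = v - v[i]                               # v_j - v_i for all j
        tang = rel - (rel @ v[i])[:, None] * v[i]    # project onto the tangent space at v_i
        force[i] = kappa / N * (A[i] @ tang)
    v = v + dt * force
    v /= np.linalg.norm(v, axis=1, keepdims=True)    # renormalise: the discrete step leaves the sphere

diam = max(np.linalg.norm(v[i] - v[j]) for i in range(N) for j in range(N))
print("velocity diameter at time T = %.0f: %.6f" % (T, diam))
```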
Based on biochemical kinetics, a stochastic model is proposed in this paper to characterize wastewater treatment plants and the dynamics of river water quality under the influence of random fluctuations. The model describes the interaction between dissolved oxygen (DO) and biochemical oxygen demand (BOD), and takes the form of stochastic differential equations driven by multiplicative Gaussian noises. The stochastic persistence problem for the model is analysed. Further, a numerical simulation of the stationary probability distributions of BOD and DO, obtained by approximating the stochastic process solution, is presented. These results have implications for the prediction and control of pollutants.
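The abstract does not spell out the drift, so the sketch below uses a hypothetical Streeter–Phelps-type BOD–DO drift with multiplicative Gaussian noise, simulated by the Euler–Maruyama scheme, and estimates the stationary distributions from a long trajectory. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# hypothetical Streeter-Phelps-type drift with multiplicative Gaussian noise;
# parameter values are illustrative, not taken from the paper
k1, k2, Cs, I = 0.3, 0.4, 9.0, 1.0   # deoxygenation, reaeration, DO saturation, BOD input
s1, s2 = 0.1, 0.1                    # noise intensities
dt, n_steps, burn_in = 0.01, 100_000, 20_000

B, C = 2.0, 8.0                      # initial BOD and DO concentrations
samples = []
for k in range(n_steps):
    dW1, dW2 = rng.standard_normal(2) * np.sqrt(dt)
    B = B + (I - k1 * B) * dt + s1 * B * dW1              # BOD dynamics
    C = C + (k2 * (Cs - C) - k1 * B) * dt + s2 * C * dW2  # DO dynamics
    B, C = max(B, 0.0), max(C, 0.0)                       # concentrations stay non-negative
    if k >= burn_in:
        samples.append((B, C))

samples = np.array(samples)
print("approximate stationary mean (BOD, DO): (%.3f, %.3f)" % tuple(samples.mean(axis=0)))
print("approximate stationary std  (BOD, DO): (%.3f, %.3f)" % tuple(samples.std(axis=0)))
```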
We study a class of ordinary differential equations with a non-Lipschitz point singularity that admits non-unique solutions through this point. As a selection criterion, we introduce stochastic regularizations depending on a parameter $\nu $: the regularized dynamics is globally defined for each $\nu> 0$, and the original singular system is recovered in the limit of vanishing $\nu $. We prove that this limit yields a unique statistical solution independent of regularization when the deterministic system possesses a chaotic attractor having a physical measure with the convergence to equilibrium property. In this case, solutions become spontaneously stochastic after passing through the singularity: they are selected randomly with an intrinsic probability distribution.
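A classical one-dimensional toy example (not the system studied in the paper) illustrates the selection mechanism: for $dx/dt = x^{1/3}$ with $x(0) = 0$, the solutions $x \equiv 0$ and $x(t) = \pm(2t/3)^{3/2}$ are all admissible. Regularising with additive noise of intensity $\nu$ and letting $\nu \to 0$ selects the two escaping branches with probability $1/2$ each, i.e. a unique statistical solution. The sketch below estimates the law of $x(T)$ for decreasing $\nu$.

```python
import numpy as np

rng = np.random.default_rng(3)

# toy non-Lipschitz ODE dx/dt = x^{1/3}: from x(0) = 0 the solutions x = 0 and
# x(t) = +/- (2t/3)^{3/2} are all admissible (non-uniqueness).  Regularise with
# additive noise of intensity nu and look at the law of x(T) as nu -> 0.

def drift(x):
    return np.sign(x) * np.abs(x)**(1.0 / 3.0)   # real cube root

def terminal_values(nu, n_paths=5000, T=1.0, dt=1e-3):
    x = np.zeros(n_paths)
    for _ in range(int(T / dt)):
        x = x + drift(x) * dt + nu * np.sqrt(dt) * rng.standard_normal(n_paths)
    return x

branch = (2.0 / 3.0)**1.5            # |x(T)| on the deterministic escaping branch at T = 1
for nu in (0.1, 0.01, 0.001):
    xT = terminal_values(nu)
    print("nu=%.3f  P(x(T)>0)=%.3f  mean |x(T)|=%.3f  (branch value %.3f)"
          % (nu, np.mean(xT > 0), np.mean(np.abs(xT)), branch))
```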
We consider a collection of Markov chains that model the evolution of multitype biological populations. The state space of the chains is the positive orthant; the boundary of the orthant is absorbing for the Markov chain and represents the extinction states of the different population types. We are interested in the long-term behavior of the Markov chain away from extinction, under a small noise scaling. Under this scaling, the trajectory of the Markov process over any compact interval converges in distribution to the solution of an ordinary differential equation (ODE) evolving in the positive orthant. We study the asymptotic behavior of the quasi-stationary distributions (QSD) in this scaling regime. Our main result shows that, under suitable conditions, the limit points of the QSD are supported on the union of interior attractors of the flow determined by the ODE. We also give lower bounds on expected extinction times which scale exponentially with the system size. Results of this type have been studied by Faure and Schreiber (2014) in the setting where the deterministic dynamical system obtained in the scaling limit is given by a discrete-time evolution equation and the dynamics are essentially confined to a compact space (namely, the one-step map is a bounded function). Our results extend these to a setting of an unbounded state space and continuous-time dynamics. The proofs rely on uniform large deviation results for small noise stochastic dynamical systems and on methods from the theory of continuous-time dynamical systems.
In general, QSD for Markov chains with absorbing states and unbounded state spaces may not exist. We study a basic family of binomial-Poisson models in the positive orthant for which Lyapunov function methods can be used to establish the existence of QSD and the tightness of the QSD of the scaled sequence of Markov chains. The results from the first part are then used to characterize the support of the limit points of this sequence of QSD.
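For the binomial-Poisson family, a quick way to visualise a QSD numerically is a Fleming–Viot-style particle scheme: particles that hit the absorbing state 0 are restarted from the empirical distribution of the surviving particles. The transition mechanism and parameters below are an illustrative guess at such a model, not the exact chain analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# illustrative binomial-Poisson population chain (not the exact model of the paper):
# survivors are Binomial(X_n, s); offspring are Poisson with a density-dependent mean.
# The state 0 is absorbing (extinction).
s, b, K = 0.5, 1.2, 50.0

def step(x):
    survivors = rng.binomial(x, s)
    offspring = rng.poisson(b * x * np.exp(-x / K))
    return survivors + offspring

# Fleming-Viot style estimate of the quasi-stationary distribution: a particle
# that hits the absorbing state is restarted from a random surviving particle.
n_particles, n_steps, burn_in = 2000, 3000, 1000
particles = np.full(n_particles, 20)
samples = []
for t in range(n_steps):
    particles = step(particles)                     # vectorised over particles
    dead = particles == 0
    if dead.any():
        alive = particles[~dead]
        particles[dead] = rng.choice(alive, size=dead.sum())
    if t >= burn_in:
        samples.append(particles.copy())

samples = np.concatenate(samples)
print("estimated QSD mean: %.2f, std: %.2f" % (samples.mean(), samples.std()))
```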
The stochastic resonance phenomenon refers to a “positive” change in a system's behaviour when noise is added to the system. The phenomenon has found numerous applications in physics, neuroscience, biology, medicine, mechanics and other fields. The present paper concerns this phenomenon for parametrically excited stochastic systems, i.e. systems in which deterministic input signals affect parameters such as stiffness, damping or mass. Parametrically excited systems are now widely used for signal sensing, filtering and amplification, particularly in micro- and nanoscale applications, where noise and uncertainty can be essential; such systems can therefore potentially exhibit stochastic resonance. In the present paper, we use a “deterministic” approach to describe the stochastic resonance phenomenon, in which noise is replaced by deterministic high-frequency excitations. By means of this approach, we show that stochastic resonance can occur in parametrically excited systems and determine the corresponding resonance conditions.
We investigate the dynamics of a susceptible infected recovered (SIR) epidemic model on small networks with different topologies, as a stepping stone to determining how the structure of a contact network impacts the transmission of infection through a population. For an SIR model on a network of $N$ nodes, there are $3^{N}$ configurations that the network can be in. To simplify the analysis, we group the states together based on the number of nodes in each infection state and the symmetries of the network. We derive analytical expressions for the final epidemic size of an SIR model on small networks composed of three or four nodes with different topological structures. Differential equations which describe the transition of the network between states are also derived and solved numerically to confirm our analysis. A stochastic SIR model is numerically simulated on each of the small networks with the same initial conditions and infection parameters to confirm our results independently. We show that the structure of the network, degree of the initial infectious node, number of initial infectious nodes and the transmission rate all significantly impact the final epidemic size of an SIR model on small networks.
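As a sketch of the kind of computation involved, the snippet below computes the exact expected final epidemic size of a Markovian SIR model on some illustrative four-node topologies by recursing over the $3^{N}$ configurations via the embedded jump chain. The topologies, rates and choice of initially infectious node are assumptions for illustration, not the exact cases treated in the paper.

```python
from functools import lru_cache
import itertools

# exact expected final epidemic size of a Markovian SIR model on a small network,
# computed by recursion over the 3^N configurations (S=0, I=1, R=2).
tau, gamma = 1.0, 1.0                         # per-edge transmission and recovery rates (illustrative)

networks = {                                  # example 4-node topologies as edge lists
    "path":     [(0, 1), (1, 2), (2, 3)],
    "star":     [(0, 1), (0, 2), (0, 3)],
    "cycle":    [(0, 1), (1, 2), (2, 3), (3, 0)],
    "complete": list(itertools.combinations(range(4), 2)),
}

def expected_final_size(edges, n, initial_infected=(0,)):
    @lru_cache(maxsize=None)
    def value(state):
        # events: infection across each S-I edge (rate tau), recovery of each I node (rate gamma)
        events = []
        for u, v in edges:
            for s_node, i_node in ((u, v), (v, u)):
                if state[s_node] == 0 and state[i_node] == 1:
                    new = list(state); new[s_node] = 1
                    events.append((tau, tuple(new)))
        for i in range(n):
            if state[i] == 1:
                new = list(state); new[i] = 2
                events.append((gamma, tuple(new)))
        if not events:                               # absorbing: no infectious nodes remain
            return sum(1 for x in state if x == 2)   # final size = number recovered
        total = sum(r for r, _ in events)
        return sum(r / total * value(nxt) for r, nxt in events)

    start = tuple(1 if i in initial_infected else 0 for i in range(n))
    return value(start)

for name, edges in networks.items():
    print("%-8s expected final size: %.4f" % (name, expected_final_size(tuple(edges), 4)))
```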
In Achlioptas processes, starting from an empty graph, in each step two potential edges are chosen uniformly at random, and using some rule one of them is selected and added to the evolving graph. The evolution of the rescaled size of the largest component in such variations of the Erdős–Rényi random graph process has recently received considerable attention, in particular for Bollobás's ‘product rule’. In this paper we establish the following result for rules such as the product rule: the limit of the rescaled size of the ‘giant’ component exists and is continuous provided that a certain system of differential equations has a unique solution. In fact, our result applies to a very large class of Achlioptas-like processes.
Our proof relies on a general idea which relates the evolution of stochastic processes to an associated system of differential equations. Provided that the latter has a unique solution, our approach shows that certain discrete quantities converge (after appropriate rescaling) to this solution.
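For concreteness, the following sketch simulates an Achlioptas process with the product rule using a union-find structure and reports the rescaled size of the largest component at several edge densities; it is a plain simulation of the process, not the analytical machinery of the paper.

```python
import random

class DSU:
    """Disjoint-set union tracking component sizes."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x
    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]

def product_rule_process(n, steps, seed=0):
    """At each step draw two candidate edges uniformly at random and add the one
    whose endpoints' component sizes have the smaller product."""
    rng = random.Random(seed)
    dsu = DSU(n)
    best, trace = 1, []
    for _ in range(steps):
        e1 = (rng.randrange(n), rng.randrange(n))
        e2 = (rng.randrange(n), rng.randrange(n))
        p1 = dsu.size[dsu.find(e1[0])] * dsu.size[dsu.find(e1[1])]
        p2 = dsu.size[dsu.find(e2[0])] * dsu.size[dsu.find(e2[1])]
        u, v = e1 if p1 <= p2 else e2
        dsu.union(u, v)
        best = max(best, dsu.size[dsu.find(u)])      # running size of the largest component
        trace.append(best)
    return trace

n = 200_000
trace = product_rule_process(n, n)
for frac in (0.5, 0.8, 0.9, 1.0):
    print("m/n = %.1f  rescaled largest component = %.4f" % (frac, trace[int(frac * n) - 1] / n))
```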
In the original article [LMS J. Comput. Math. 15 (2012) 71–83], the authors use a discrete form of the Itô formula, developed by Appleby, Berkolaiko and Rodkina [Stochastics 81 (2009) no. 2, 99–127], to show that the almost sure asymptotic stability of a particular two-dimensional test system is preserved when the discretisation step size is small. In this Corrigendum, we identify an implicit assumption in the original proof of the discrete Itô formula that, left unaddressed, would preclude its application to the test system of interest. We resolve this problem by reproving the relevant part of the discrete Itô formula in such a way that confirms its applicability to our test equation. Thus, we reaffirm the main results and conclusions of the original article.
We perform an almost sure linear stability analysis of the θ-Maruyama method, selecting as our test equation a two-dimensional system of Itô differential equations with diagonal drift coefficient and two independent stochastic perturbations which capture the stabilising and destabilising roles of feedback geometry in the almost sure asymptotic stability of the equilibrium solution. For small values of the constant step-size parameter, we derive close-to-sharp conditions for the almost sure asymptotic stability and instability of the equilibrium solution of the discretisation that match those of the original test system. Our investigation demonstrates the use of a discrete form of the Itô formula in the context of an almost sure linear stability analysis.
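The abstract does not reproduce the test system itself, so the sketch below applies the θ-Maruyama method to a hypothetical two-dimensional linear Itô system with diagonal drift and two independent multiplicative noise terms, and empirically estimates the almost sure Lyapunov exponent of the discretisation; coefficients and step size are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(5)

# theta-Maruyama discretisation of a hypothetical 2-d linear Ito system
#   dX = A X dt + B1 X dW1 + B2 X dW2,  with A diagonal
# (illustrative coefficients, not the exact test system of the paper), together
# with an empirical estimate of the almost sure Lyapunov exponent (1/T) log |X(T)|.
A  = np.diag([-1.0, -1.0])                  # diagonal drift
B1 = np.array([[0.0, 0.8], [0.0, 0.0]])     # off-diagonal feedback: geometry matters
B2 = np.array([[0.0, 0.0], [0.8, 0.0]])

def lyapunov_estimate(theta, h, n_steps=100_000):
    I = np.eye(2)
    M = np.linalg.inv(I - h * theta * A)    # implicit drift part, solvable since the drift is linear
    x = np.array([1.0, 1.0])
    log_growth = 0.0
    for _ in range(n_steps):
        dW = rng.standard_normal(2) * np.sqrt(h)
        x = M @ (x + h * (1 - theta) * (A @ x) + dW[0] * (B1 @ x) + dW[1] * (B2 @ x))
        norm = np.linalg.norm(x)
        log_growth += np.log(norm)          # renormalise to avoid under/overflow
        x /= norm
    return log_growth / (n_steps * h)

for theta in (0.0, 0.5, 1.0):
    lam = lyapunov_estimate(theta, h=0.01)
    print("theta = %.1f  estimated a.s. Lyapunov exponent: %+.4f" % (theta, lam))
```

A negative estimate indicates almost sure asymptotic stability of the equilibrium of the discretisation for that θ and step size; sweeping h is a natural way to probe the small-step regime discussed in the abstract.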
Continuous-time discrete-state random Markov chains generated by a random linear differential equation with a random tridiagonal matrix are shown to have a random attractor consisting of singleton subsets, essentially a random path, in the simplex of probability vectors. The proof uses comparison theorems for Carathéodory random differential equations and the fact that the linear cocycle generated by the Markov chain is a uniformly contractive mapping of the positive cone into itself with respect to the Hilbert projective metric. It does not involve probabilistic properties of the sample path and is thus equally valid in the nonautonomous deterministic context of Markov chains with, say, periodically varying transition probabilities, in which case the attractor is a periodic path.
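A small numerical illustration of the contraction behind this result: two solutions of the forward equation $p' = Q(t)^{\top} p$ with a randomly resampled tridiagonal generator, started from different probability vectors, are driven together in the Hilbert projective metric, consistent with a singleton (pathwise) attractor. The generator distribution and switching schedule below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

n, dt, T = 5, 1e-3, 20.0

def random_tridiagonal_Q():
    """Generator matrix: positive rates on the sub/super-diagonal, rows summing to zero."""
    Q = np.zeros((n, n))
    up = rng.uniform(0.5, 1.5, n - 1)
    down = rng.uniform(0.5, 1.5, n - 1)
    for i in range(n - 1):
        Q[i, i + 1] = up[i]
        Q[i + 1, i] = down[i]
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q

def hilbert_distance(p, q):
    r = p / q
    return np.log(r.max() / r.min())

p = np.ones(n) / n                               # uniform start
q = np.zeros(n); q[0] = 1.0
q = 0.99 * q + 0.01 * np.ones(n) / n             # nearly degenerate start, kept strictly positive

Q = random_tridiagonal_Q()
for k in range(int(T / dt)):
    if k % 1000 == 0:
        Q = random_tridiagonal_Q()               # resample the generator every unit of time
    p = p + dt * (Q.T @ p)                       # explicit Euler for the forward equation
    q = q + dt * (Q.T @ q)
    p /= p.sum(); q /= q.sum()                   # renormalise for numerical safety

print("Hilbert projective distance at T = %.0f: %.2e" % (T, hilbert_distance(p, q)))
```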
We study linear jump parameter systems of differential and difference equations whose coefficients depend on the state of a semi-Markov process. We derive systems of equations for the first two moments of the random solutions of these jump parameter systems, and illustrate how moment equations can be used in examining their asymptotic stability.
Non-linear stochastic systems driven by white noise are analysed from the viewpoint of non-linear oscillation theory. Under various familiar hypotheses concerning dissipative and restorative dynamical forces, the existence and uniqueness, asymptotic growth, and oscillatory behavior of the solutions are demonstrated.
We study the equation $dY(t)/dt = f(Y(t), Eh(Y(t)))$ for random initial conditions, where $E$ denotes the expected value. It turns out that, in contrast to the deterministic case, local Lipschitz continuity of $f$ and $h$ is not sufficient to ensure uniqueness of the solutions. Finally, we also state some sufficient conditions for uniqueness.
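A straightforward way to approximate such an equation numerically is to replace the expectation by an empirical mean over many realisations of the random initial condition and integrate with an Euler step, as in the sketch below; the particular $f$ and $h$ are illustrative locally Lipschitz choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Monte Carlo / Euler approximation of dY/dt = f(Y, E h(Y)): the expectation is
# replaced by an empirical mean over many realisations of the random initial condition.
def f(y, m):
    return -y + np.sin(m)      # illustrative drift

def h(y):
    return y**2                # illustrative observable inside the expectation

n_samples, dt, T = 10_000, 1e-3, 2.0
Y = rng.standard_normal(n_samples)          # random initial conditions Y(0)
for _ in range(int(T / dt)):
    m = h(Y).mean()                         # empirical estimate of E h(Y(t))
    Y = Y + dt * f(Y, m)

print("E Y(T) ~ %.4f,  E h(Y(T)) ~ %.4f" % (Y.mean(), h(Y).mean()))
```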