The book is devoted to stochastic evolution equations with Lévy noise. Such equations are important because, roughly speaking, stochastic dynamical systems, or equivalently Markov processes, can be represented as solutions to such equations. This introductory chapter shows how this comes about. To better motivate the construction of the associated stochastic equations, the chapter starts with discrete-time systems.
Discrete-time dynamical systems
A deterministic discrete-time dynamical system consists of a set E, usually equipped with a σ-field 𝓔 of subsets of itself, and a mapping F, usually measurable, acting from E into E. If the position of the system at time t = 0, 1, … is denoted by X(t), then by definition X(t + 1) = F(X(t)), t = 0, 1, … The sequences (X(t), t = 0, 1, …) are the so-called trajectories or paths of the dynamical system, and their asymptotic properties are of prime interest in the theory. The set E is called the state space, and the transformation F determines the dynamics of the system.
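Although the book works at a general measure-theoretic level, the mechanism of iteration is easy to make concrete. The following Python sketch illustrates it; the choice of state space [0, 1], of the logistic map as F, and of the horizon n are illustrative assumptions, not choices made in the text.

```python
# A minimal sketch of a deterministic discrete-time dynamical system:
# the state space E is taken to be [0, 1] and F is the logistic map
# (both purely illustrative choices).  A trajectory is produced by
# iterating X(t + 1) = F(X(t)).

def F(x: float) -> float:
    """One step of the dynamics: here the logistic map F(x) = 4x(1 - x)."""
    return 4.0 * x * (1.0 - x)

def trajectory(x0: float, n: int) -> list[float]:
    """Return the path X(0), X(1), ..., X(n) started from X(0) = x0."""
    path = [x0]
    for _ in range(n):
        path.append(F(path[-1]))
    return path

if __name__ == "__main__":
    print(trajectory(0.2, 10))
```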
If the present state x determines only the probability P(x, Γ) that at the next moment the system will be in the set Γ, then one says that the system is stochastic. Thus a stochastic dynamical system consists of the state space E, a σ-field 𝓔, and a function P = P(x, Γ), x ∈ E, Γ ∈ 𝓔, such that, for each Γ ∈ 𝓔, P(·, Γ) is a measurable function and, for each x ∈ E, P(x, ·) is a probability measure. We call P the transition function or transition probability.
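In computational terms, a transition function is realized by a sampler that, given the current state x, draws the next state from the measure P(x, ·); iterating the sampler produces one random trajectory of the associated Markov chain. The sketch below assumes a specific toy kernel, a lazy random walk on the finite state space {0, …, N}; this kernel is purely illustrative and does not come from the text.

```python
# A minimal sketch of a stochastic dynamical system: P(x, .) is
# represented by a sampler for the next state.  The concrete kernel
# below (a lazy random walk on {0, ..., N}, with steps that would
# leave the state space clamped to the boundary) is an illustrative
# assumption, not an example taken from the text.

import random

N = 4  # the state space is {0, 1, ..., N}

def sample_next(x: int, rng: random.Random) -> int:
    """Draw X(t + 1) from P(x, .): move left, stay, or move right,
    each with probability 1/3; steps outside {0, ..., N} are clamped."""
    step = rng.choice([-1, 0, 1])
    return min(max(x + step, 0), N)

def sample_path(x0: int, n: int, seed: int = 0) -> list[int]:
    """Simulate one trajectory X(0), X(1), ..., X(n) of the chain."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n):
        path.append(sample_next(path[-1], rng))
    return path

if __name__ == "__main__":
    print(sample_path(2, 10))
```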