The calculus of deterministic functions revolves around continuous functions, derivatives, and integrals, all of which involve the notion of limits. See the appendix for a review of continuity, differentiation, and integration. In this chapter the same concepts are treated for random processes. We have seen four senses in which a sequence of random variables can converge: almost surely (a.s.), in probability (p.), in mean square (m.s.), and in distribution (d.). Of these, we will use mean square convergence the most, making use of the correlation version of the Cauchy criterion for m.s. convergence, and the associated facts that under m.s. convergence, the means of the limits are the limits of the means, and the correlations of the limits are the limits of the correlations (Proposition 2.11 and Corollaries 2.12 and 2.13). Ergodicity and the Karhunen–Loève expansion are discussed as applications of integration of random processes.
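As a quick numerical reminder of what m.s. convergence looks like, here is a minimal sketch (my own toy example, not from the text, assuming a Python environment with only the standard library): for Xₙ = X + Z/n with Z a standard normal independent of X, the mean square error E[(Xₙ − X)²] = 1/n², which tends to zero, so Xₙ → X in the m.s. sense. The simulation below estimates that error empirically.

```python
import random

random.seed(0)

def ms_error(n, trials=20000):
    # Empirical estimate of E[(X_n - X)^2] for X_n = X + Z/n,
    # with Z a standard normal. Since X_n - X = Z/n, the true
    # value is E[Z^2]/n^2 = 1/n^2.
    total = 0.0
    for _ in range(trials):
        z = random.gauss(0.0, 1.0)
        total += (z / n) ** 2
    return total / trials

for n in (1, 2, 4, 8):
    # Estimates should track 1/n^2: roughly 1, 0.25, 0.0625, 0.0156.
    print(n, ms_error(n))
```

The point of the sketch is only that the mean square error is a deterministic sequence of numbers, so m.s. convergence of a random sequence reduces to ordinary convergence of that sequence.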
Continuity of random processes
The topic of this section is the definition of continuity of a continuous-time random process, with a focus on continuity defined using m.s. convergence. Chapter 2 covers convergence of sequences. Limits for deterministic functions of a continuous variable can be defined in either of two equivalent ways. Specifically, a function f on ℝ has a limit y at tₒ, written as lim_{s→tₒ} f(s) = y, if either of the two equivalent conditions is true:
(1) (Definition based on ε and δ) Given ε > 0, there exists δ > 0 so that |f(s) − y| ≤ ε whenever |s − tₒ| ≤ δ.
(2) (Definition based on sequences) f(sₙ) → y for any sequence (sₙ) such that sₙ → tₒ.
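The two definitions above can be checked concretely. The following sketch (my own toy example, assuming a Python environment; the function f(s) = s², the point tₒ = 3, and the choice δ = min(1, ε/7) are not from the text) illustrates both: condition (2) along the sequence sₙ = tₒ + 1/n, and condition (1) using the bound |s² − 9| = |s − 3||s + 3| ≤ 7|s − 3| when |s − 3| ≤ 1.

```python
# Toy illustration of lim_{s -> t_o} f(s) = y for f(s) = s^2,
# t_o = 3, y = 9.

def f(s):
    return s * s

t_o, y = 3.0, 9.0

# (2) Sequence definition: f(s_n) -> y along s_n = t_o + 1/n.
seq_errors = [abs(f(t_o + 1.0 / n) - y) for n in (1, 10, 100, 1000)]

# (1) epsilon-delta definition: for |s - t_o| <= 1,
# |f(s) - y| = |s - t_o| * |s + t_o| <= 7 * |s - t_o|,
# so delta = min(1, eps/7) suffices.
def delta_for(eps):
    return min(1.0, eps / 7.0)

eps = 0.01
d = delta_for(eps)
# Sample points within delta of t_o and record the worst error.
worst = max(abs(f(t_o + u * d) - y) for u in (-1.0, -0.5, 0.5, 1.0))

print(seq_errors)    # strictly decreasing toward 0
print(worst <= eps)  # the delta from (1) keeps f within eps of y
```

The worked bound behind `delta_for` is the one-sentence version of how an ε–δ argument usually goes: factor the error, bound the harmless factor by a constant on a neighborhood, then shrink δ to control the remaining factor.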
Let us check that (1) and (2) are equivalent. Suppose (1) is true, and let (sₙ) be such that sₙ → tₒ. Let ε > 0 and then let δ be as in condition (1). Since sₙ → tₒ, it follows that there exists nₒ so that |sₙ − tₒ| ≤ δ for all n ≥ nₒ.