Probability

The Analysis of Data, volume 1


6.5. Random Processes and Measure Theory*

Consider a probability measure space $(\Omega,\mathcal{F},\P)$ and a random process $\mathcal{X}:\Omega\to\mathcal{H}$. In the case of discrete-time RPs, $\mathcal{H}=\R^{\infty}$ and we can consider the RP as a measurable function $\mathcal{X}:(\Omega,\mathcal{F})\to (\R^{\infty},\mathcal{B}(\R^{\infty}))$.

In some cases we are only interested in evaluating probabilities involving a subset of the random variables $\{X_t: t\in A\subset J\}$. In these cases, it is sometimes more convenient to work with a $\sigma$-algebra on $\Omega$ that is coarser than $\mathcal{F}$. For example, if we are only interested in probabilities involving $X_k$, we can replace $\mathcal{F}$ with a coarser $\sigma$-algebra $\mathcal{L}$ that makes $X_k:(\Omega,\mathcal{L})\to (\R,\mathcal{B}(\R))$ a random variable. The coarsest such $\sigma$-algebra is denoted $\sigma(X_k)$. We define this concept below and extend it to a collection of random variables.

Definition 6.5.1. Given a function $X:\Omega\to \R$, we denote by $\sigma(X)$ the smallest $\sigma$-algebra $\mathcal{L}$ on $\Omega$ that makes $X$ a measurable function $X:(\Omega,\mathcal{L})\to (\R,\mathcal{B}(\R))$. In other words, \[ \sigma(X) = \{X^{-1}(B): B\in\mathcal{B}(\R)\}.\]
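For a finite sample space the definition above can be computed directly: every subset of the finite range $X(\Omega)$ is Borel, so $\sigma(X)$ is exactly the set of preimages of subsets of the range. The following sketch illustrates this; the particular $\Omega$ and $X$ are illustrative choices, not taken from the text.

```python
from itertools import combinations

# Illustrative finite sample space and function X: Omega -> R.
Omega = ['a', 'b', 'c', 'd']
X = {'a': 0, 'b': 0, 'c': 1, 'd': 2}

values = set(X.values())  # the finite range X(Omega)

def preimage(B):
    """X^{-1}(B) as a frozenset of outcomes."""
    return frozenset(w for w in Omega if X[w] in B)

# sigma(X) = { X^{-1}(B) : B a subset of X(Omega) }.
sigma_X = set()
for r in range(len(values) + 1):
    for B in combinations(values, r):
        sigma_X.add(preimage(set(B)))

# X partitions Omega into 3 blocks ({a,b}, {c}, {d}), so |sigma(X)| = 2^3.
print(len(sigma_X))  # -> 8
```

Note that $\sigma(X)$ is generated by the partition of $\Omega$ into level sets of $X$: events in $\sigma(X)$ cannot distinguish outcomes that $X$ maps to the same value.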

Definition 6.5.1 generalizes to a collection of random variables as follows.

Definition 6.5.2. Given a collection of functions $X_{\theta}:\Omega\to \R$ indexed by $\theta\in\Theta$, we denote by $\sigma(\{X_{\theta}, \theta\in\Theta\})$ the smallest $\sigma$-algebra $\mathcal{L}$ on $\Omega$ that makes each $X_{\theta}:(\Omega,\mathcal{L})\to (\R,\mathcal{B}(\R))$, $\theta\in\Theta$, a measurable function. In other words, $\sigma(\{X_{\theta}, \theta\in\Theta\})$ is the intersection of all $\sigma$-algebras under which $X_{\theta}, \theta\in\Theta$ are measurable.
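For a finite $\Omega$ the generated $\sigma$-algebra of Definition 6.5.2 can be computed by starting from the preimages (level sets) of each $X_{\theta}$ and closing under complement and union (closure under intersection then follows by De Morgan's laws). A sketch, with illustrative functions that are not from the text:

```python
# Two illustrative functions on a four-point sample space.
Omega = frozenset({1, 2, 3, 4})
X1 = {1: 0, 2: 0, 3: 1, 4: 1}  # level sets {1,2}, {3,4}
X2 = {1: 0, 2: 1, 3: 0, 4: 1}  # level sets {1,3}, {2,4}

def level_sets(X):
    """The preimages X^{-1}({v}) for each value v in the range."""
    sets = {}
    for w, v in X.items():
        sets.setdefault(v, set()).add(w)
    return [frozenset(s) for s in sets.values()]

def generated_sigma_algebra(generators, Omega):
    """Close the generators under complement and finite union."""
    sigma = {frozenset(), Omega} | set(generators)
    changed = True
    while changed:
        changed = False
        for A in list(sigma):
            if Omega - A not in sigma:
                sigma.add(Omega - A)
                changed = True
        for A in list(sigma):
            for B in list(sigma):
                if A | B not in sigma:
                    sigma.add(A | B)
                    changed = True
    return sigma

sig = generated_sigma_algebra(level_sets(X1) + level_sets(X2), Omega)
# X1 and X2 jointly separate all four points, so the generated
# sigma-algebra is the full power set: 2^4 = 16 sets.
print(len(sig))  # -> 16
```

Here each $X_{\theta}$ alone generates only a four-element $\sigma$-algebra, but together they separate every pair of outcomes, so $\sigma(\{X_1,X_2\})$ is the full power set.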

Next, we prove Kolmogorov's extension theorem (Proposition 6.2.1) for discrete-time random processes ($J=\mathbb{N}$).

Proof of Kolmogorov's Extension Theorem for Discrete-Time Processes. We separate the proof into two parts: uniqueness and existence.

In the uniqueness part, we assume two distributions $\P$ and $\P'$ that agree on all finite dimensional marginals and show that $\P=\P'$. Note that the set of measurable cylinders is a $\pi$-system that generates the Borel $\sigma$-algebra of $\R^{\infty}$ (see Section F.5 and Example B.4.4), and that since $\P$ and $\P'$ agree on all finite dimensional marginals, they agree on all measurable cylinders. It follows from Corollary E.3.1 that $\P=\P'$, establishing uniqueness.
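The step from finite dimensional marginals to measurable cylinders can be written explicitly (a sketch; notation as in the text). A measurable cylinder constrains only finitely many coordinates, say the first $n$, via Borel sets $B_1,\dots,B_n\in\mathcal{B}(\R)$, so
\[
\P\big(B_1\times\cdots\times B_n\times\R\times\R\times\cdots\big)
= \P'\big(B_1\times\cdots\times B_n\times\R\times\R\times\cdots\big),
\]
since both sides are determined by the $n$-dimensional marginal on $B_1\times\cdots\times B_n$, on which $\P$ and $\P'$ agree by assumption.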

To show existence, we assume that we have a collection of finite dimensional marginal distributions over $X_n$, $n\in\mathbb{N}$, that do not contradict each other, and verify that $\mathcal{X}=(X_n:n\in\mathbb{N})$ is a random process. The agreement of the marginals of $\mathcal{X}$ with the given marginal distributions over $X_n$, $n\in\mathbb{N}$, holds by definition of $\mathcal{X}$. It remains to show that $\mathcal{X}:(\Omega,\mathcal{F})\to (\R^{\infty},\mathcal{B}(\R^{\infty}))$ is a measurable function. Recall that $\mathcal{B}(\R^{\infty})$ is generated by the open sets in $\R^{\infty}$, which are unions of finite intersections of measurable cylinders. Using the second countability of $\R^{\infty}$ we can ensure that each such union contains at most countably many terms. We thus have, for an open set $A$ in $\R^{\infty}$, the expression $A=\cup_{i\in\mathbb{N}} \cap_{j=1}^{k_i} A_{ij}$ where the $A_{ij}$ are measurable cylinders. It follows that \[ \mathcal{X}^{-1}(\cup_{i\in\mathbb{N}} \cap_{j=1}^{k_i} A_{ij}) = \cup_{i\in\mathbb{N}} \cap_{j=1}^{k_i} \mathcal{X}^{-1}(A_{ij}),\] which is a countable union of finite intersections of sets in $\mathcal{F}$ (each $\mathcal{X}^{-1}(A_{ij})$ lies in $\mathcal{F}$ since $A_{ij}$ constrains only finitely many coordinates), and therefore $\mathcal{X}^{-1}(A)\in\mathcal{F}$.
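The "do not contradict each other" hypothesis is the Kolmogorov consistency condition: marginalizing the $(n+1)$-dimensional distribution over its last coordinate must recover the $n$-dimensional one. The following sketch verifies this numerically for the illustrative choice of i.i.d. Bernoulli$(p)$ marginals (not an example from the text):

```python
from itertools import product

p = 0.3  # illustrative Bernoulli parameter

def marginal(xs):
    """The n-dimensional marginal p_n(x_1, ..., x_n) of an
    i.i.d. Bernoulli(p) process, for a binary tuple xs."""
    out = 1.0
    for x in xs:
        out *= p if x == 1 else 1 - p
    return out

# Consistency check: summing the (n+1)-dimensional marginal over the
# last coordinate recovers the n-dimensional marginal.
n = 4
for xs in product([0, 1], repeat=n):
    lhs = sum(marginal(xs + (x,)) for x in (0, 1))
    assert abs(lhs - marginal(xs)) < 1e-12
print("marginals are consistent")
```

By the theorem, any such consistent family of finite dimensional distributions arises as the family of marginals of a single random process, and that process's distribution is unique.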