In Notes 0, we introduced the notion of a measure space ${\Omega = (\Omega, {\mathcal F}, \mu)}$, which includes as a special case the notion of a probability space. By selecting one such probability space ${(\Omega,{\mathcal F},\mu)}$ as a sample space, one obtains a model for random events and random variables, with random events ${E}$ being modeled by measurable sets ${E_\Omega}$ in ${{\mathcal F}}$, and random variables ${X}$ taking values in a measurable space ${R}$ being modeled by measurable functions ${X_\Omega: \Omega \rightarrow R}$. We then defined some basic operations on these random events and variables:

• Given events ${E,F}$, we defined the conjunction ${E \wedge F}$, the disjunction ${E \vee F}$, and the complement ${\overline{E}}$. For countable families ${E_1,E_2,\dots}$ of events, we similarly defined ${\bigwedge_{n=1}^\infty E_n}$ and ${\bigvee_{n=1}^\infty E_n}$. We also defined the empty event ${\emptyset}$ and the sure event ${\overline{\emptyset}}$, and what it meant for two events to be equal.
• Given random variables ${X_1,\dots,X_n}$ in ranges ${R_1,\dots,R_n}$ respectively, and a measurable function ${F: R_1 \times \dots \times R_n \rightarrow S}$, we defined the random variable ${F(X_1,\dots,X_n)}$ in range ${S}$. (As the special case ${n=0}$ of this, every deterministic element ${s}$ of ${S}$ was also a random variable taking values in ${S}$.) Given a relation ${P: R_1 \times \dots \times R_n \rightarrow \{\hbox{true}, \hbox{false}\}}$, we similarly defined the event ${P(X_1,\dots,X_n)}$. Conversely, given an event ${E}$, we defined the indicator random variable ${1_E}$. Finally, we defined what it meant for two random variables to be equal.
• Given an event ${E}$, we defined its probability ${{\bf P}(E)}$.

These operations obey various axioms; for instance, the boolean operations on events obey the axioms of a Boolean algebra, and the probability function ${E \mapsto {\bf P}(E)}$ obeys the Kolmogorov axioms. However, we will not focus on the axiomatic approach to probability theory here, instead basing the foundations of probability theory on the sample space models as discussed in Notes 0. (But see this previous post for a treatment of one such axiomatic approach.)

It turns out that almost all of the other operations on random events and variables we need can be constructed in terms of the above basic operations. In particular, this allows one to safely extend the sample space in probability theory whenever needed, provided one uses an extension that respects the above basic operations; this is an important operation when one needs to add new sources of randomness to an existing system of events and random variables, or to couple together two separate such systems into a joint system that extends both of the original systems. We gave a simple example of such an extension in the previous notes, but now we give a more formal definition:

Definition 1 Suppose that we are using a probability space ${\Omega = (\Omega, {\mathcal F}, \mu)}$ as the model for a collection of events and random variables. An extension of this probability space is a probability space ${\Omega' = (\Omega', {\mathcal F}', \mu')}$, together with a measurable map ${\pi: \Omega' \rightarrow \Omega}$ (sometimes called the factor map) which is probability-preserving in the sense that

$\displaystyle \mu'( \pi^{-1}(E) ) = \mu(E) \ \ \ \ \ (1)$

for all ${E \in {\mathcal F}}$. (Caution: this does not imply that ${\mu(\pi(F)) = \mu'(F)}$ for all ${F \in {\mathcal F}'}$ – why not?)

An event ${E}$ that is modeled by a measurable subset ${E_\Omega}$ of the sample space ${\Omega}$ will be modeled by the measurable set ${E_{\Omega'} := \pi^{-1}(E_\Omega)}$ in the extended sample space ${\Omega'}$. Similarly, a random variable ${X}$ taking values in some range ${R}$ that is modeled by a measurable function ${X_\Omega: \Omega \rightarrow R}$ in ${\Omega}$ will instead be modeled by the measurable function ${X_{\Omega'} := X_\Omega \circ \pi}$ in ${\Omega'}$. We also allow the extension ${\Omega'}$ to model additional events and random variables that were not modeled by the original sample space ${\Omega}$ (indeed, this is one of the main reasons why we perform extensions in probability in the first place).
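For finite discrete spaces, Definition 1 can be checked exhaustively by machine. The sketch below (all function names are illustrative, not part of the notes) verifies the probability-preservation condition (1) for the two-dice extension of a single fair die roll, with the factor map the first-coordinate projection:

```python
from itertools import product

def prob(space, event):
    """Probability of an event (a set of outcomes) under a discrete measure,
    where `space` is a dict mapping outcomes to their probabilities."""
    return sum(p for omega, p in space.items() if omega in event)

def preimage(pi, space_prime, event):
    """pi^{-1}(E): the outcomes of the extension that map into E."""
    return {w for w in space_prime if pi(w) in event}

def is_probability_preserving(space, space_prime, pi):
    """Check mu'(pi^{-1}(E)) = mu(E) for every event E of the base space.
    (Feasible only because the base space here is finite.)"""
    outcomes = list(space)
    for mask in range(2 ** len(outcomes)):  # iterate over all subsets E
        E = {outcomes[i] for i in range(len(outcomes)) if mask >> i & 1}
        lhs = prob(space_prime, preimage(pi, space_prime, E))
        if abs(lhs - prob(space, E)) > 1e-12:
            return False
    return True

# Base space: one fair die.  Extension: two independent die rolls,
# with factor map the first-coordinate projection pi(i, j) = i.
die = {i: 1/6 for i in range(1, 7)}
two_dice = {(i, j): 1/36 for i, j in product(range(1, 7), repeat=2)}
pi = lambda w: w[0]

print(is_probability_preserving(die, two_dice, pi))  # True
```

Note that the converse direction flagged in the caution above fails: for ${F = \{(1,1)\}}$ one has ${\mu'(F) = 1/36}$ while ${\mu(\pi(F)) = \mu(\{1\}) = 1/6}$.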

Thus, for instance, the sample space ${\Omega'}$ in Example 3 of the previous post is an extension of the sample space ${\Omega}$ in that example, with the factor map ${\pi: \Omega' \rightarrow \Omega}$ given by the first coordinate projection ${\pi(i,j) := i}$. One can verify that all of the basic operations on events and random variables listed above are unaffected by the above extension (with one caveat, see remark below). For instance, the conjunction ${E \wedge F}$ of two events can be defined via the original model ${\Omega}$ by the formula

$\displaystyle (E \wedge F)_\Omega := E_\Omega \cap F_\Omega$

or via the extension ${\Omega'}$ via the formula

$\displaystyle (E \wedge F)_{\Omega'} := E_{\Omega'} \cap F_{\Omega'}.$

The two definitions are consistent with each other, thanks to the obvious set-theoretic identity

$\displaystyle \pi^{-1}( E_\Omega \cap F_\Omega ) = \pi^{-1}(E_\Omega) \cap \pi^{-1}(F_\Omega).$

Similarly, the assumption (1) is precisely what is needed to ensure that the probability ${\mathop{\bf P}(E)}$ of an event remains unchanged when one replaces a sample space model with an extension. We leave the verification of preservation of the other basic operations described above under extension as exercises to the reader.
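The consistency of the two definitions of conjunction, and the preservation of probability under extension, can be checked concretely on the die-roll example (the event names ${E}$, ${F}$ below are illustrative choices, not from the notes):

```python
from itertools import product
from fractions import Fraction

die = range(1, 7)
two_dice = list(product(die, repeat=2))
pi = lambda w: w[0]            # first-coordinate projection

def preimage(event):
    """Pull an event of the base space back to the extension."""
    return {w for w in two_dice if pi(w) in event}

E = {2, 4, 6}                  # die roll is even
F = {4, 5, 6}                  # die roll is at least 4

# Conjunction computed in the base model and then pulled back...
lhs = preimage(E & F)
# ...versus conjunction computed directly in the extension:
rhs = preimage(E) & preimage(F)
assert lhs == rhs              # the set-theoretic identity

# Probability is also unchanged: mu'(pi^{-1}(E ∩ F)) = mu(E ∩ F).
assert Fraction(len(lhs), len(two_dice)) == Fraction(len(E & F), len(die))
print("conjunction and probability agree under extension")
```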

Remark 2 There is one minor exception to this general rule if we do not impose the additional requirement that the factor map ${\pi}$ is surjective. Namely, for non-surjective ${\pi}$, it can become possible that two events ${E, F}$ are unequal in the original sample space model, but become equal in the extension (and similarly for random variables), although the converse never happens (events that are equal in the original sample space always remain equal in the extension). For instance, let ${\Omega}$ be the discrete probability space ${\{a,b\}}$ with ${p_a=1}$ and ${p_b=0}$, and let ${\Omega'}$ be the discrete probability space ${\{ a'\}}$ with ${p'_{a'}=1}$, and non-surjective factor map ${\pi: \Omega' \rightarrow \Omega}$ defined by ${\pi(a') := a}$. Then the event modeled by ${\{b\}}$ in ${\Omega}$ is distinct from the empty event when viewed in ${\Omega}$, but becomes equal to that event when viewed in ${\Omega'}$. Thus we see that extending the sample space by a non-surjective factor map can identify previously distinct events together (though of course, being probability preserving, this can only happen if those two events were already almost surely equal anyway). This turns out to be fairly harmless though; while it is nice to know if two given events are equal, or if they differ by a non-null event, it is almost never useful to know that two events are unequal if they are already almost surely equal. Alternatively, one can add the additional requirement of surjectivity in the definition of an extension, which is also a fairly harmless constraint to impose (this is what I chose to do in this previous set of notes).
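The two-point example in Remark 2 is small enough to compute directly: under the non-surjective factor map, the distinct events ${\{b\}}$ and ${\emptyset}$ pull back to the same event in the extension.

```python
# The example of Remark 2: Omega = {a, b} with p_a = 1, p_b = 0,
# Omega' = {a'}, and the non-surjective factor map pi(a') = a.
Omega = {'a': 1.0, 'b': 0.0}
Omega_prime = {"a'": 1.0}
pi = {"a'": 'a'}               # nothing in Omega' maps to b

def preimage(event):
    return {w for w in Omega_prime if pi[w] in event}

# {b} and the empty event are distinct in Omega, but their pullbacks agree:
print(preimage({'b'}))                     # set()
print(preimage({'b'}) == preimage(set()))  # True

# Consistent with probability preservation: both events were null anyway.
print(sum(Omega[w] for w in {'b'}))        # 0.0
```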

Roughly speaking, one can define probability theory as the study of those properties of random events and random variables that are model-independent in the sense that they are preserved by extensions. For instance, the cardinality ${|E_\Omega|}$ of the model ${E_\Omega}$ of an event ${E}$ is not a concept within the scope of probability theory, as it is not preserved by extensions: continuing Example 3 from Notes 0, the event ${E}$ that a die roll ${X}$ is even is modeled by a set ${E_\Omega = \{2,4,6\}}$ of cardinality ${3}$ in the original sample space model ${\Omega}$, but by a set ${E_{\Omega'} = \{2,4,6\} \times \{1,2,3,4,5,6\}}$ of cardinality ${18}$ in the extension. Thus it does not make sense in the context of probability theory to refer to the “cardinality of an event ${E}$”.
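The cardinality computation above is quick to confirm: the modeling set grows from ${3}$ to ${18}$ elements under the extension, even though the probability it assigns is unchanged.

```python
from itertools import product

# Cardinality of the modeling set is NOT preserved by extension,
# but the probability of the event is.
E_base = {2, 4, 6}                                   # E_Omega
E_ext = {w for w in product(range(1, 7), repeat=2)   # E_{Omega'}
         if w[0] in E_base}

print(len(E_base), len(E_ext))            # 3 18
print(len(E_base) / 6 == len(E_ext) / 36) # True: both probabilities are 1/2
```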

On the other hand, the supremum ${\sup_n X_n}$ of a collection of random variables ${X_n}$ in the extended real line ${[-\infty,+\infty]}$ is a valid probabilistic concept. This can be seen by manually verifying that this operation is preserved under extension of the sample space, but one can also see this by defining the supremum in terms of existing basic operations. Indeed, note from Exercise 24 of Notes 0 that a random variable ${X}$ in the extended real line is completely specified by the threshold events ${(X \leq t)}$ for ${t \in {\bf R}}$; in particular, two such random variables ${X,Y}$ are equal if and only if the events ${(X \leq t)}$ and ${(Y \leq t)}$ are surely equal for all ${t}$. From the identity

$\displaystyle (\sup_n X_n \leq t) = \bigwedge_{n=1}^\infty (X_n \leq t)$

we thus see that one can completely specify ${\sup_n X_n}$ in terms of the ${X_n}$ using only the basic operations provided in the above list (and in particular the countable conjunction ${\bigwedge_{n=1}^\infty}$). Of course, the same considerations hold if one replaces the supremum by the infimum, limit superior, limit inferior, or (if it exists) the limit.
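The threshold identity can be checked pointwise for a finite family of random variables on a small discrete sample space (the sample points and the values of the ${X_n}$ below are arbitrary illustrative choices):

```python
# Finite-family sketch of the identity (sup_n X_n <= t) = /\_n (X_n <= t),
# with events realized as subsets of a three-point sample space.
Omega = ['w1', 'w2', 'w3']
X = [                                   # three random variables on Omega
    {'w1': 0.5, 'w2': 2.0, 'w3': -1.0},
    {'w1': 1.5, 'w2': 0.0, 'w3': -2.0},
    {'w1': 1.0, 'w2': 3.0, 'w3':  0.0},
]

def event_le(Xn, t):
    """The threshold event (X_n <= t), as a subset of Omega."""
    return {w for w in Omega if Xn[w] <= t}

sup_X = {w: max(Xn[w] for Xn in X) for w in Omega}

for t in [-2.0, -0.5, 0.0, 1.0, 2.5, 3.0]:
    lhs = event_le(sup_X, t)
    rhs = set(Omega)
    for Xn in X:                        # conjunction = intersection of events
        rhs &= event_le(Xn, t)
    assert lhs == rhs
print("threshold identity verified at all sampled t")
```

The same pattern works verbatim for the infimum (replace `max` by `min` and intersect the events ${(X_n \geq t)}$ instead).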

In this set of notes, we will define some further important operations on scalar random variables, in particular the expectation of these variables. In the sample space models, expectation corresponds to the notion of integration on a measure space. As we will need to use both expectation and integration in this course, we will thus begin by quickly reviewing the basics of integration on a measure space, although we will then translate the key results of this theory into probabilistic language.

As the finer details of the Lebesgue integral construction are not the core focus of this probability course, some of the details of this construction will be left to exercises. See also Chapter 1 of Durrett, or these previous blog notes, for a more detailed treatment.
