To define an expectation conditional on an event in a probability space is essentially no more than defining a conditional probability, then constructing the expectation as an integral with respect to this measure. At an abstract level, it is often more useful to define an expectation conditional on a sigma-algebra. Informally, we want to define the expectation conditional on every event in a sigma-algebra simultaneously. Most importantly, the result is measurable with respect to this sigma-algebra, and so is itself a random variable under suitable conditions.

We want to prove that such a construction exists. We take an $\mathcal{F}$-measurable, integrable random variable $X$, and consider a sub-sigma-algebra $\mathcal{G}\subseteq\mathcal{F}$. We want to define $\mathbb{E}[X|\mathcal{G}]$, which is integrable, $\mathcal{G}$-measurable, and such that

$\mathbb{E}[X\mathbf{1}_A]=\mathbb{E}\big[\mathbb{E}[X|\mathcal{G}]\mathbf{1}_A\big]$, for all $A\in\mathcal{G}$.

We also want to show that this conditional expectation is, up to null events, unique. When $\mathcal{G}$ is generated by a countable collection of disjoint events $(A_n)_{n\geq 1}$, then we can define

$\mathbb{E}[X|\mathcal{G}]=\sum_n \frac{\mathbb{E}[X\mathbf{1}_{A_n}]}{\mathbb{P}(A_n)}\mathbf{1}_{A_n}$, taking each term to be zero whenever $\mathbb{P}(A_n)=0$,

and verify that this satisfies the conditions.
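As a numerical sketch of the countable-partition case (the sample space, the choice of $X$, and the four-cell partition are all illustrative assumptions, not part of the construction above), the formula and its defining property can be checked by simulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: omega uniform on [0, 1), X = sin(2*pi*omega).
omega = rng.random(100_000)
X = np.sin(2 * np.pi * omega)

# Partition A_n = [n/4, (n+1)/4), n = 0, ..., 3, generating the sigma-algebra.
labels = np.floor(omega * 4).astype(int)

# E[X|G] = sum_n (E[X 1_{A_n}] / P(A_n)) 1_{A_n}: constant on each cell,
# equal there to the cell average of X.
cond_exp = np.empty_like(X)
for n in range(4):
    cell = labels == n
    cond_exp[cell] = X[cell].mean()

# Defining property: E[X 1_A] = E[E[X|G] 1_A] for each generating event A.
for n in range(4):
    cell = labels == n
    assert abs((X * cell).mean() - (cond_exp * cell).mean()) < 1e-10
```

The same check passes for any event in the generated sigma-algebra, since such events are unions of cells and both sides are additive over disjoint unions.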

We proceed in the general case. *Uniqueness* is easy. Suppose we have $Y_1,Y_2$ satisfying the conditions. Then for each $\epsilon>0$, the event $A_\epsilon:=\{Y_1-Y_2\geq\epsilon\}\in\mathcal{G}$. Substituting into the definition gives:

$\mathbb{E}[Y_1\mathbf{1}_{A_\epsilon}]=\mathbb{E}[X\mathbf{1}_{A_\epsilon}]=\mathbb{E}[Y_2\mathbf{1}_{A_\epsilon}]$.

But $\mathbb{E}[(Y_1-Y_2)\mathbf{1}_{A_\epsilon}]\geq\epsilon\,\mathbb{P}(A_\epsilon)$, and so we conclude that $\mathbb{P}(A_\epsilon)=0$ for every $\epsilon>0$, and $Y_1\leq Y_2$ almost surely. Of course the reverse argument applies also, and so $Y_1=Y_2$ a.s.

For *existence*, we exploit a property of Hilbert spaces. We initially assume $X\in L^2(\mathbb{P})$. We can decompose the host space as

$L^2(\mathbb{P})=L^2(\mathcal{G})\oplus L^2(\mathcal{G})^\perp$,

and $X=Y+Z$, with $Y\in L^2(\mathcal{G})$ and $Z\in L^2(\mathcal{G})^\perp$,

in this orthogonal decomposition. The inner product on this space is $\langle U,V\rangle=\mathbb{E}[UV]$, and for any $A\in\mathcal{G}$ we have $\mathbf{1}_A\in L^2(\mathcal{G})$, and so

$\mathbb{E}[X\mathbf{1}_A]=\mathbb{E}[Y\mathbf{1}_A]+\mathbb{E}[Z\mathbf{1}_A]=\mathbb{E}[Y\mathbf{1}_A]$.

From this, we conclude that $Y=\mathbb{E}[X|\mathcal{G}]$ is suitable. For what follows, observe that $Y$ is $\mathcal{G}$-measurable, and so by a similar argument to before, $X\geq 0$ a.s. implies $Y\geq 0$ a.s.
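When $\mathcal{G}$ is generated by a finite partition, the Hilbert-space step can be made concrete: $L^2(\mathcal{G})$ is the span of the cell indicators, and ordinary least squares computes the orthogonal projection onto it (the empirical inner product is the averaged dot product, and the averaging factor cancels). A sketch, in which the partition and the choice of $X$ are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
omega = rng.random(10_000)
X = omega ** 2

# L^2(G) for the partition sigma-algebra is spanned by the cell indicators.
labels = np.floor(omega * 5).astype(int)
B = np.stack([(labels == n).astype(float) for n in range(5)], axis=1)

# Orthogonal projection of X onto span(B): Y plays the role of E[X|G],
# and Z = X - Y lies in the orthogonal complement.
coef, *_ = np.linalg.lstsq(B, X, rcond=None)
Y = B @ coef
Z = X - Y

# Y is exactly the cell-average version of X...
for n in range(5):
    assert np.allclose(Y[labels == n], X[labels == n].mean())

# ...and Z is orthogonal to every indicator 1_{A_n}, so E[X 1_A] = E[Y 1_A].
assert np.allclose(B.T @ Z, 0, atol=1e-8)
```

Because the indicator columns are mutually orthogonal, the least-squares coefficients are just the per-cell averages, which is the partition formula from earlier.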

For general non-negative $X$, set $X_n:=X\wedge n\in L^2(\mathbb{P})$, and $Y_n:=\mathbb{E}[X_n|\mathcal{G}]$. By our previous observation, everything relevant is almost surely non-negative and non-decreasing in $n$, and we can apply monotone convergence to both sides of the relation

$\mathbb{E}[X_n\mathbf{1}_A]=\mathbb{E}[Y_n\mathbf{1}_A]$, for all $A\in\mathcal{G}$,

to obtain the definition $\mathbb{E}[X|\mathcal{G}]:=\lim_n Y_n$, and take $A=\Omega$ to check integrability. Separating a general $X$ into positive and negative parts $X=X^+-X^-$ gives the result for integrable random variables of arbitrary sign.
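The truncation step can also be sketched numerically for a partition sigma-algebra (the unbounded choice of $X$, the clamp avoiding division by zero, and the two-cell partition are all illustrative assumptions): the conditional expectations of the truncations $X\wedge n$ increase with $n$, and once the truncation level exceeds the realised values of $X$ they agree with the conditional expectation of $X$ itself.

```python
import numpy as np

rng = np.random.default_rng(2)
omega = rng.random(50_000)
# Non-negative, unbounded-looking X; the clamp keeps it finite (< 1e6).
X = 1.0 / np.sqrt(np.maximum(omega, 1e-12))

labels = (omega < 0.5).astype(int)  # two-cell partition

def cond_exp(W):
    """Conditional expectation w.r.t. the partition: cell averages."""
    out = np.empty_like(W)
    for n in (0, 1):
        out[labels == n] = W[labels == n].mean()
    return out

# Y_n = E[X_n | G] for truncations X_n = min(X, n) is non-decreasing in n.
prev = cond_exp(np.minimum(X, 1))
for n in (2, 4, 8, 1e7):
    cur = cond_exp(np.minimum(X, n))
    assert np.all(cur >= prev - 1e-12)
    prev = cur

# At a level above max(X), the truncation is inactive, so the limit is E[X|G].
assert np.allclose(prev, cond_exp(X))
```

In the proof proper the limit exists by monotone convergence rather than by the truncation becoming inactive, but the monotonicity seen here is exactly the ingredient the argument uses.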