Theorem 4.4.1 If $X$ and $Y$ are any two random variables, then \[EX = E(E(X|Y)),\] provided that the expectations exist.
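As a brief illustration (the particular distributions here are chosen only for the example), suppose $X|Y \sim \mathrm{binomial}(Y, p)$ and $Y \sim \mathrm{Poisson}(\lambda)$. Since $E(X|Y) = pY$, Theorem 4.4.1 gives \[EX = E\big(E(X|Y)\big) = E(pY) = p\lambda,\] so the marginal mean of $X$ is obtained without first computing the marginal distribution of $X$.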
A random variable $X$ is said to have a mixture distribution if the distribution of $X$ depends on a quantity that also has a distribution.
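In the illustrative hierarchy above, for instance, the marginal distribution of $X$ is a mixture: it is obtained by averaging the $\mathrm{binomial}(y, p)$ distributions over the $\mathrm{Poisson}(\lambda)$ distribution of $Y$, \[P(X = x) = \sum_{y=x}^{\infty} \binom{y}{x} p^x (1-p)^{y-x}\, \frac{e^{-\lambda}\lambda^y}{y!},\] a sum that can be shown to reduce to a $\mathrm{Poisson}(p\lambda)$ probability.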
Theorem 4.4.2 (Conditional Variance Identity) For any two random variables $X$ and $Y,$ \[Var(X) = E(Var(X|Y)) + Var(E(X|Y)),\] provided that the expectations exist.
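Continuing the same illustrative binomial--Poisson hierarchy, we have $Var(X|Y) = Yp(1-p)$ and $E(X|Y) = pY$, so Theorem 4.4.2 gives \[Var(X) = E\big(Yp(1-p)\big) + Var(pY) = \lambda p(1-p) + p^2\lambda = p\lambda,\] which agrees with the variance of a $\mathrm{Poisson}(p\lambda)$ random variable.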