Conditional Distributions and Independence

Let $(X,Y)$ be a discrete bivariate random vector with joint pmf $f(x,y)$ and marginal pmfs $f_X(x)$ and $f_Y(y)$. For any $x$ such that $P(X=x)=f_X(x)>0$, the conditional pmf of $Y$ given that $X=x$ is the function of $y$ denoted by $f(y|x)$ and defined by $f(y|x)=P(Y=y|X=x)=f(x,y)/f_X(x)$. For any $y$ such that $P(Y=y)=f_Y(y)>0$, the conditional pmf of $X$ given that $Y=y$ is the function of $x$ denoted by $f(x|y)$ and defined by $f(x|y)=P(X=x|Y=y)=f(x,y)/f_Y(y)$.
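As a quick illustration of the discrete definition, the sketch below builds a conditional pmf by normalizing one row of a small joint pmf table. The table is hypothetical, chosen only so the arithmetic is easy to follow.

```python
import numpy as np

# Hypothetical joint pmf f(x, y): rows indexed by x in {0, 1},
# columns by y in {0, 1, 2}.  Entries sum to 1.
joint = np.array([[0.10, 0.20, 0.10],
                  [0.30, 0.20, 0.10]])

f_X = joint.sum(axis=1)          # marginal pmf of X: [0.40, 0.60]

# Conditional pmf of Y given X = 1: divide the x = 1 row by f_X(1).
f_y_given_x1 = joint[1] / f_X[1]
print(f_y_given_x1)              # [0.5, 1/3, 1/6], sums to 1
```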

Let $(X,Y)$ be a continuous bivariate random vector with joint pdf $f(x,y)$ and marginal pdfs $f_X(x)$ and $f_Y(y)$. For any $x$ such that $f_X(x)>0$, the conditional pdf of $Y$ given that $X=x$ is the function of $y$ denoted by $f(y|x)$ and defined by $f(y|x)=f(x,y)/f_X(x)$. For any $y$ such that $f_Y(y)>0$, the conditional pdf of $X$ given that $Y=y$ is the function of $x$ denoted by $f(x|y)$ and defined by $f(x|y)=f(x,y)/f_Y(y)$.
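For the continuous case, a symbolic sketch with sympy, using the hypothetical joint pdf $f(x,y)=e^{-y}$ on $0<x<y<\infty$, computes the marginal and the conditional pdf directly from the definition.

```python
import sympy as sp

x, y = sp.symbols("x y", positive=True)
joint = sp.exp(-y)                        # hypothetical f(x, y) = e^{-y} on 0 < x < y

f_X = sp.integrate(joint, (y, x, sp.oo))  # marginal of X: e^{-x}
f_cond = sp.simplify(joint / f_X)         # f(y|x) = e^{-(y - x)} for y > x
print(f_X, f_cond)
```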

If $g(Y)$ is a function of $Y$, then the conditional expected value of $g(Y)$ given that $X=x$ is denoted by $E(g(Y)|x)$ and is given by $E(g(Y)|x)=\sum_y g(y)f(y|x)$ and $E(g(Y)|x)=\int_{-\infty}^{\infty} g(y)f(y|x)\,dy$ in the discrete and continuous cases, respectively.
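Continuing the continuous sketch above, the conditional mean $E(Y|x)$ comes from integrating $y\,f(y|x)$ over the conditional support $y>x$; under the assumed pdf it works out to $x+1$.

```python
import sympy as sp

x, y = sp.symbols("x y", positive=True)
f_cond = sp.exp(x - y)                    # f(y|x) from the previous sketch, y > x

E_Y_given_x = sp.integrate(y * f_cond, (y, x, sp.oo))
print(sp.simplify(E_Y_given_x))           # x + 1
```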

The variance of the probability distribution described by $f(y|x)$ is called the conditional variance of $Y$ given $X=x$ and is given by $\mathrm{Var}(Y|x)=E(Y^2|x)-(E(Y|x))^2$.
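Using the same assumed conditional pdf, the conditional variance follows from the formula above; here it simplifies to the constant $1$, since a shifted exponential has variance $1$ regardless of the shift.

```python
import sympy as sp

x, y = sp.symbols("x y", positive=True)
f_cond = sp.exp(x - y)                    # assumed f(y|x), y > x

E_Y  = sp.integrate(y * f_cond,    (y, x, sp.oo))   # x + 1
E_Y2 = sp.integrate(y**2 * f_cond, (y, x, sp.oo))   # x**2 + 2*x + 2
var  = sp.simplify(E_Y2 - E_Y**2)
print(var)                                # 1
```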

Let $(X,Y)$ be a bivariate random vector with joint pdf or pmf $f(x,y)$ and marginal pdfs or pmfs $f_X(x)$ and $f_Y(y)$. Then $X$ and $Y$ are called independent random variables if, for every $x, y \in \mathbb{R}$, $f(x,y)=f_X(x)f_Y(y)$.

If $X$ and $Y$ are independent, the conditional pdf of $Y$ given $X=x$ is

$$f(y|x)=\frac{f(x,y)}{f_X(x)}=\frac{f_X(x)f_Y(y)}{f_X(x)}=f_Y(y),$$

regardless of the value of $x$.
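Independence in the discrete case can be checked mechanically: the joint pmf table must equal the outer product of its marginals. A minimal sketch, using a hypothetical table built to be independent by construction:

```python
import numpy as np

f_X = np.array([0.4, 0.6])               # hypothetical marginal of X
f_Y = np.array([0.5, 0.3, 0.2])          # hypothetical marginal of Y
joint = np.outer(f_X, f_Y)               # f(x, y) = f_X(x) f_Y(y) by construction

# Independence holds iff the joint equals the outer product of its marginals.
print(np.allclose(joint, np.outer(joint.sum(axis=1), joint.sum(axis=0))))  # True
```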

Lemma 4.2.1 Let $(X,Y)$ be a bivariate random vector with joint pdf or pmf $f(x,y)$. Then $X$ and $Y$ are independent random variables if and only if there exist functions $g(x)$ and $h(y)$ such that, for every $x, y \in \mathbb{R}$, $f(x,y)=g(x)h(y)$.
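For example, the (hypothetical) joint pdf $f(x,y)=\tfrac{1}{2}xy^2e^{-x-y}$ for $x,y>0$ factors as $g(x)=xe^{-x}$ times $h(y)=\tfrac{1}{2}y^2e^{-y}$, so the lemma gives independence of $X$ and $Y$ immediately, without computing either marginal; note that $g$ and $h$ need not themselves be pdfs.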

Theorem 4.2.2 Let X and Y be independent random variables.
a) For any $A \subset \mathbb{R}$ and $B \subset \mathbb{R}$, $P(X\in A, Y\in B)=P(X\in A)P(Y\in B)$; that is, the events $\{X\in A\}$ and $\{Y\in B\}$ are independent events.
b) Let $g(x)$ be a function only of $x$ and $h(y)$ be a function only of $y$. Then $E(g(X)h(Y))=(E\,g(X))(E\,h(Y))$; a numerical sanity check is sketched below.
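A quick Monte Carlo check of part (b), with hypothetical choices $g(x)=x^2$, $h(y)=e^{-y}$, and independent uniform draws; the two sides should agree up to simulation noise.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
X = rng.uniform(0, 1, n)                 # X and Y independent by construction
Y = rng.uniform(0, 1, n)

g = lambda x: x**2                       # hypothetical g, function of x only
h = lambda y: np.exp(-y)                 # hypothetical h, function of y only

print(np.mean(g(X) * h(Y)))              # ~ E[g(X)h(Y)]
print(np.mean(g(X)) * np.mean(h(Y)))     # ~ (Eg(X))(Eh(Y)); both are about 0.211
```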

Theorem 4.2.3 Let $X$ and $Y$ be independent random variables with moment generating functions $M_X(t)$ and $M_Y(t)$. Then the moment generating function of the random variable $Z=X+Y$ is given by $M_Z(t)=M_X(t)M_Y(t)$.
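Theorem 4.2.3 makes Theorem 4.2.4 below almost immediate: since $M_X(t)=e^{\mu t+\sigma^2 t^2/2}$ for $X\sim N(\mu,\sigma^2)$, independence gives $M_Z(t)=M_X(t)M_Y(t)=e^{(\mu+\gamma)t+(\sigma^2+\tau^2)t^2/2}$, which is the mgf of a $N(\mu+\gamma,\,\sigma^2+\tau^2)$ random variable.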

Theorem 4.2.4 Let $X\sim N(\mu,\sigma^2)$ and $Y\sim N(\gamma,\tau^2)$ be independent normal random variables. Then the random variable $Z=X+Y$ has a $N(\mu+\gamma,\,\sigma^2+\tau^2)$ distribution.
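A simulation sketch of Theorem 4.2.4 with hypothetical parameters $\mu=1$, $\sigma^2=4$, $\gamma=3$, $\tau^2=1$; the sample mean and variance of $X+Y$ should be close to $4$ and $5$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
X = rng.normal(loc=1, scale=2, size=n)   # N(1, 4): scale is the standard deviation
Y = rng.normal(loc=3, scale=1, size=n)   # N(3, 1), independent of X
Z = X + Y

print(Z.mean(), Z.var())                 # approximately 4 and 5, matching N(4, 5)
```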