The proof of Theorem 2.3.2 relied on the assumption that we could differentiate under the integral sign. Since this is a useful technique in theoretical statistics, we now characterize the conditions under which interchanging derivatives and integrals is justified.
We wish to calculate
\[\frac{d}{d\theta} \int_{a(\theta)}^{b(\theta)} f(x, \theta) \, dx\]
where $-\infty < a(\theta), b(\theta) < \infty$ for all $\theta.$ The rule for differentiating the above is called Leibnitz's Rule and is an application of the Fundamental Theorem of Calculus and the chain rule.
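When $f(x,\theta),$ $a(\theta),$ and $b(\theta)$ are all differentiable with respect to $\theta,$ the rule takes the familiar form
\[\frac{d}{d\theta} \int_{a(\theta)}^{b(\theta)} f(x, \theta) \, dx = f\bigl(b(\theta), \theta\bigr)\,\frac{d}{d\theta}b(\theta) - f\bigl(a(\theta), \theta\bigr)\,\frac{d}{d\theta}a(\theta) + \int_{a(\theta)}^{b(\theta)} \frac{\partial}{\partial\theta} f(x, \theta) \, dx.\]
If $a(\theta)$ and $b(\theta)$ are constant in $\theta,$ only the last term remains, and the question reduces to whether differentiation and integration can be interchanged.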
The first result gives sufficient conditions for interchanging a limit and an integral. Suppose $h(x,y)$ is continuous at $y_0$ for each $x,$ and there exists a function $g(x)$ satisfying

i. $|h(x,y)| \leq g(x)$ for all $x$ and $y,$
ii. $\int_{-\infty}^{\infty} g(x) \, dx < \infty.$
Then \[\lim_{y \rightarrow y_0} \int_{-\infty}^{\infty} h(x,y) \, dx = \int_{-\infty}^{\infty} \lim_{y \rightarrow y_0} h(x,y) \, dx.\]
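As a simple illustration of these conditions, take $h(x,y) = e^{-x^2}\cos(xy).$ Then $|h(x,y)| \leq e^{-x^2} = g(x)$ for all $x$ and $y,$ and $\int_{-\infty}^{\infty} e^{-x^2}\,dx = \sqrt{\pi} < \infty$; since $h(x,y)$ is continuous in $y$ for each $x,$ the limit as $y \rightarrow y_0$ may be taken inside the integral.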
The same argument, applied to difference quotients, yields conditions for differentiating under the integral sign. Suppose $f(x,\theta)$ is differentiable at $\theta = \theta_0,$ and there exist a function $g(x,\theta_0)$ and a constant $\delta_0 > 0$ such that

i. $\displaystyle\left\vert \frac{f(x, \theta_0 + \delta) - f(x, \theta_0)}{\delta}\right\vert \leq g(x,\theta_0),$ for all $x$ and $0 < |\delta| \leq \delta_0,$
ii. $\int_{-\infty}^{\infty} g(x,\theta_0) \, dx < \infty.$
Then \[\frac{d}{d\theta}\int_{-\infty}^{\infty} f(x,\theta) \, dx \bigg\vert_{\theta=\theta_0} = \int_{-\infty}^{\infty} \left[\frac{\partial}{\partial\theta} f(x, \theta)\bigg\vert_{\theta=\theta_0}\right]\,dx.\]
More generally, if $f(x,\theta)$ is differentiable in $\theta$ for all $\theta,$ then
\begin{equation} \label{eq:interchange}
\frac{d}{d\theta} \int_{-\infty}^{\infty} f(x,\theta) \, dx = \int_{-\infty}^{\infty} \frac{\partial}{\partial \theta} f(x,\theta) \, dx
\end{equation}
holds provided there exist a function $g(x,\theta)$ with $\int_{-\infty}^{\infty} g(x,\theta)\,dx < \infty$ and a constant $\delta_0 > 0$ such that
\begin{equation} \label{eq:bound}
\left\vert \frac{\partial}{\partial\theta} f(x,\theta) \bigg\vert_{\theta=\theta^{\prime}} \right\vert \leq g(x,\theta) \;\;\;\;\text{ for all $\theta^{\prime}$ such that $|\theta^{\prime}-\theta | \leq \delta_0$.}
\end{equation}
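To see how the bound in \eqref{eq:bound} is verified in practice, consider, as an illustration, $f(x,\theta) = \theta e^{-\theta x}$ for $x > 0$ (and $0$ otherwise), with $\theta > 0,$ so that $\int_{-\infty}^{\infty} f(x,\theta)\,dx = 1.$ Here $\frac{\partial}{\partial\theta} f(x,\theta) = (1 - \theta x) e^{-\theta x}$ for $x > 0,$ and for any $0 < \delta_0 < \theta$ and all $\theta^{\prime}$ with $|\theta^{\prime} - \theta| \leq \delta_0,$
\[\left\vert (1 - \theta^{\prime} x) e^{-\theta^{\prime} x} \right\vert \leq \bigl(1 + (\theta + \delta_0) x\bigr) e^{-(\theta - \delta_0)x} = g(x,\theta),\]
which is integrable on $(0,\infty).$ Equation \eqref{eq:interchange} then gives
\[0 = \frac{d}{d\theta}(1) = \int_{0}^{\infty} (1 - \theta x) e^{-\theta x}\,dx = \frac{1}{\theta} - \theta\cdot\frac{1}{\theta^{2}} = 0,\]
consistent with direct calculation.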
Finally, an analogous result holds for infinite sums. Suppose the series $\sum_{x=0}^{\infty} h(\theta,x)$ converges for all $\theta$ in an interval $(a,b)$ of real numbers, and

i. $\frac{\partial}{\partial\theta}h(\theta,x)$ is continuous in $\theta$ for each $x,$
ii. $\sum_{x=0}^{\infty}\frac{\partial}{\partial\theta} h(\theta,x)$ converges uniformly on every closed bounded subinterval of $(a,b).$
Then \[\frac{d}{d\theta}\sum_{x=0}^{\infty}h(\theta,x) = \sum_{x=0}^{\infty}\frac{\partial}{\partial\theta} h(\theta,x).\]
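As an illustration, take $h(\theta,x) = \theta^{x}/x!$ for $\theta$ in an interval $(a,b)$ with $0 < a < b < \infty,$ so that $\sum_{x=0}^{\infty} h(\theta,x) = e^{\theta}.$ Each derivative $\frac{\partial}{\partial\theta} h(\theta,x) = \theta^{x-1}/(x-1)!$ (for $x \geq 1$; the $x=0$ term is zero) is continuous in $\theta,$ and on any closed bounded subinterval $[c,d] \subset (a,b)$ the series of derivatives is dominated term by term by $d^{x-1}/(x-1)!,$ whose sum is $e^{d} < \infty,$ so it converges uniformly there by the Weierstrass $M$-test. Term-by-term differentiation therefore gives $\frac{d}{d\theta} e^{\theta} = \sum_{x=1}^{\infty} \theta^{x-1}/(x-1)! = e^{\theta},$ as it must.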