In short

The Leibniz rule tells you how to differentiate a definite integral with respect to a parameter. In its simplest form (fixed limits), you just move the derivative inside the integral. When the limits themselves depend on the parameter, two extra boundary terms appear — each one is the integrand evaluated at a moving limit, multiplied by that limit's derivative.

Here is a function defined in an unusual way:

F(x) = \int_0^x t^2 \, dt

What is F'(x)? You could evaluate the integral first: F(x) = x^3/3. Then F'(x) = x^2. Simple enough.

But look at what just happened. The derivative of \int_0^x t^2 \, dt turned out to be x^2 — the integrand itself, with t replaced by x. You integrated, then differentiated, and the two operations cancelled, leaving you right back where you started.

That is not a coincidence. It is the Fundamental Theorem of Calculus at work: differentiation undoes integration. But what if the setup is more complicated? What if the upper limit is not just x but x^2? What if both limits move? What if the integrand itself contains the parameter?

The Leibniz rule answers all of these questions in one formula.

The simplest case: one moving limit

Start with the case you just saw. Define

F(x) = \int_a^x f(t) \, dt

where a is a constant and f is continuous. The Fundamental Theorem of Calculus says:

F'(x) = f(x)

Differentiation undoes integration. The derivative of the "area so far" function is the height of the curve at the current point.
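This is easy to sanity-check on a computer. The sketch below (illustrative, not part of the theorem) approximates the area-so-far function with a simple Simpson's-rule integrator, differentiates it by central differences, and compares against the integrand; f(t) = cos t is my choice of example.

```python
import math

def simpson(f, a, b, n=1000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# Area-so-far function: F(x) = integral of cos t from 0 to x.
def F(x):
    return simpson(math.cos, 0.0, x)

# The FTC predicts F'(x) = cos(x); check by central difference.
x, h = 0.8, 1e-5
numeric = (F(x + h) - F(x - h)) / (2 * h)
print(abs(numeric - math.cos(x)))  # tiny: differentiation undid integration
```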

The curve $f(t) = t^2$. The area under the curve from $0$ to $x$ is $F(x) = \int_0^x t^2\,dt$. The rate at which this area grows as $x$ increases is exactly the height of the curve at $x$, which is $f(x) = x^2$.

Now suppose the upper limit is not x but some function g(x). For example:

F(x) = \int_0^{x^2} t^2 \, dt

The integral runs from 0 to x^2. As x changes, the upper limit x^2 changes, and the integral changes accordingly. To find F'(x), use the chain rule: think of the integral as a composition. Let G(u) = \int_0^u t^2 \, dt, so F(x) = G(x^2). By the chain rule:

F'(x) = G'(x^2) \cdot \frac{d}{dx}(x^2) = (x^2)^2 \cdot 2x = 2x^5

The pattern: differentiate the integral as if the upper limit were a plain variable (giving the integrand evaluated at the upper limit), then multiply by the derivative of the upper limit.

The full Leibniz rule

Now for the general case. Suppose both limits and the integrand itself depend on a parameter x:

F(x) = \int_{a(x)}^{b(x)} f(x, t) \, dt

Leibniz integral rule

If f(x, t) and \frac{\partial f}{\partial x}(x, t) are continuous (in both variables), and a(x) and b(x) are differentiable, then

\frac{d}{dx}\int_{a(x)}^{b(x)} f(x,t)\,dt = f\bigl(x,\, b(x)\bigr)\cdot b'(x) \;-\; f\bigl(x,\, a(x)\bigr)\cdot a'(x) \;+\; \int_{a(x)}^{b(x)} \frac{\partial f}{\partial x}(x,t)\,dt

Three terms, each with a clear geometric meaning:

  1. f(x, b(x)) \cdot b'(x) — the integrand at the upper limit, scaled by how fast the upper limit is moving. If the upper limit moves to the right, you are adding a thin strip of area; its height is the integrand's value at the boundary, and its width is b'(x)\,dx.

  2. -f(x, a(x)) \cdot a'(x) — the integrand at the lower limit, with a minus sign. If the lower limit moves to the right, you are removing area from the left end of the region.

  3. \int_{a(x)}^{b(x)} \frac{\partial f}{\partial x}\,dt — the change due to the integrand itself shifting. Even if the limits are fixed, the integrand depends on x, so as x changes, every point of the integrand moves up or down. This term captures the cumulative effect of that shift.
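All three terms can be checked numerically at once. The sketch below picks an illustrative integrand f(x, t) = sin(xt) with a(x) = x and b(x) = x² (my choice, not from the text), assembles the three Leibniz terms, and compares the total against a finite-difference derivative of the integral:

```python
import math

def simpson(f, a, b, n=2000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

f     = lambda x, t: math.sin(x * t)      # integrand f(x, t)
df_dx = lambda x, t: t * math.cos(x * t)  # partial derivative in x

def F(x):  # integral of f(x, t) for t from a(x) = x to b(x) = x^2
    return simpson(lambda t: f(x, t), x, x**2)

x = 1.7
leibniz = (f(x, x**2) * 2 * x                          # upper-boundary term, b'(x) = 2x
           - f(x, x) * 1.0                             # lower-boundary term, a'(x) = 1
           + simpson(lambda t: df_dx(x, t), x, x**2))  # integrand-shift term

h = 1e-5
numeric = (F(x + h) - F(x - h)) / (2 * h)
print(abs(leibniz - numeric))  # small: the three-term formula checks out
```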


Why the formula is true

Break the problem into pieces. Write

F(x) = \int_{a(x)}^{b(x)} f(x, t)\,dt

and define an auxiliary function G(x, u, v) = \int_u^v f(x, t)\,dt, where u = a(x) and v = b(x). By the multivariable chain rule:

F'(x) = \frac{\partial G}{\partial x} + \frac{\partial G}{\partial v} \cdot v'(x) + \frac{\partial G}{\partial u} \cdot u'(x)

Now compute each partial derivative. \frac{\partial G}{\partial x} means: hold u and v fixed and differentiate under the integral sign. This gives \int_u^v \frac{\partial f}{\partial x}\,dt.

\frac{\partial G}{\partial v} means: hold x and u fixed, and differentiate with respect to the upper limit. By the Fundamental Theorem, this is f(x, v). Similarly, \frac{\partial G}{\partial u} = -f(x, u) (the minus sign because changing the lower limit subtracts area).

Substituting back u = a(x) and v = b(x):

F'(x) = \int_{a(x)}^{b(x)} \frac{\partial f}{\partial x}\,dt + f(x, b(x)) \cdot b'(x) - f(x, a(x)) \cdot a'(x)

which is the Leibniz rule.

Worked examples

Example 1: Variable upper limit — find $F'(x)$ when $F(x) = \int_1^{x^3} \frac{\sin t}{t}\,dt$

Step 1. Identify the pieces. The integrand f(t) = \frac{\sin t}{t} does not depend on x. The lower limit is the constant 1. The upper limit is b(x) = x^3.

Why: since f does not contain x and the lower limit is fixed, only the "upper boundary" term survives in the Leibniz rule.

Step 2. Apply the formula: F'(x) = f(b(x)) \cdot b'(x).

F'(x) = \frac{\sin(x^3)}{x^3} \cdot 3x^2

Why: evaluate the integrand at t = x^3 to get \frac{\sin(x^3)}{x^3}, then multiply by the derivative of the upper limit, \frac{d}{dx}(x^3) = 3x^2.

Step 3. Simplify.

F'(x) = \frac{3x^2 \sin(x^3)}{x^3} = \frac{3\sin(x^3)}{x}

Why: cancel one factor of x from x^2/x^3.

Step 4. Check at a specific point. At x = 1: F'(1) = \frac{3\sin(1)}{1} = 3\sin 1 \approx 2.52. This says the area under \frac{\sin t}{t} from 1 to x^3 is growing at rate \approx 2.52 when x passes through 1 — which is plausible, since the integrand at t = 1 is \sin 1 \approx 0.84, and the upper limit is racing away at rate 3x^2 = 3.

Result: F'(x) = \dfrac{3\sin(x^3)}{x}.
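A numerical spot check (an illustrative sketch using a basic Simpson's-rule integrator) confirms the result at several points:

```python
import math

def simpson(f, a, b, n=2000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

def F(x):  # F(x) = integral of sin(t)/t from 1 to x^3
    return simpson(lambda t: math.sin(t) / t, 1.0, x**3)

for x in (0.7, 1.0, 1.5):
    h = 1e-5
    numeric = (F(x + h) - F(x - h)) / (2 * h)
    exact = 3 * math.sin(x**3) / x
    print(x, abs(numeric - exact))  # differences are tiny at each point
```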

The solid red curve is $F'(x) = \frac{3\sin(x^3)}{x}$. For comparison, the dashed curve shows the integrand $\frac{\sin t}{t}$ itself. The derivative $F'(x)$ oscillates with increasing frequency because the upper limit $x^3$ sweeps through the oscillations of $\sin t$ ever faster as $x$ grows.

The graph shows F'(x) oscillating rapidly for larger x, which makes sense: as the upper limit x^3 races through the oscillations of \sin t / t, the accumulated area alternately grows and shrinks.

Example 2: Both limits variable — find $F'(x)$ when $F(x) = \int_{x}^{x^2} e^{-t^2}\,dt$

Step 1. Identify the pieces. The integrand f(t) = e^{-t^2} does not depend on x. The lower limit is a(x) = x. The upper limit is b(x) = x^2.

Why: both limits are functions of x, so both boundary terms contribute. The "parameter inside" term vanishes because f does not involve x.

Step 2. Apply the Leibniz rule: F'(x) = f(b(x)) \cdot b'(x) - f(a(x)) \cdot a'(x).

F'(x) = e^{-(x^2)^2} \cdot 2x \;-\; e^{-x^2} \cdot 1 = 2x\,e^{-x^4} - e^{-x^2}

Why: at the upper limit t = x^2, the integrand is e^{-x^4}, multiplied by b'(x) = 2x. At the lower limit t = x, the integrand is e^{-x^2}, multiplied by a'(x) = 1.

Step 3. Check the sign at x = 1. F'(1) = 2e^{-1} - e^{-1} = e^{-1} \approx 0.368. Positive, which makes sense: at x = 1, the upper limit x^2 = 1 equals the lower limit x = 1, so F(1) = 0. But the upper limit is racing away faster (b'(1) = 2) than the lower limit (a'(1) = 1), so the interval is widening and the integral is growing.

Step 4. Check at x = 2. F'(2) = 4e^{-16} - e^{-4} \approx 4(1.1 \times 10^{-7}) - 0.0183 \approx -0.0183. Negative — the integrand e^{-t^2} is extremely small for t near 4 (the upper limit at x = 2), so the upper boundary contributes almost nothing, and the lower boundary is eating area away.

Result: F'(x) = 2x\,e^{-x^4} - e^{-x^2}.
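Because \int e^{-t^2}\,dt has a closed form in terms of the error function, this example can be checked with nothing but the standard library. The sketch below uses the identity F(x) = \frac{\sqrt{\pi}}{2}\bigl(\operatorname{erf}(x^2) - \operatorname{erf}(x)\bigr) and differentiates it numerically:

```python
import math

def F(x):
    # F(x) = integral of e^{-t^2} from x to x^2, via the error function:
    # erf(u) = (2/sqrt(pi)) * integral of e^{-t^2} from 0 to u
    return math.sqrt(math.pi) / 2 * (math.erf(x**2) - math.erf(x))

def F_prime(x):
    # The Leibniz-rule answer derived above.
    return 2 * x * math.exp(-x**4) - math.exp(-x**2)

for x in (0.5, 1.0, 2.0):
    h = 1e-5
    numeric = (F(x + h) - F(x - h)) / (2 * h)
    print(x, abs(numeric - F_prime(x)))  # tiny at every test point
```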

The derivative $F'(x) = 2xe^{-x^4} - e^{-x^2}$. It starts at $-1$ (at $x = 0$ the upper-boundary term vanishes, since $b'(0) = 2x = 0$, while the lower limit sweeps past the integrand value $e^0 = 1$ at rate $a'(0) = 1$, giving $-1$), crosses zero, rises to a maximum of about $0.53$ near $x \approx 0.8$, and then decays back toward zero.

The graph reveals a natural story: for small x, the lower boundary term dominates and F' is negative. Near x = 1, the upper boundary catches up and F' turns positive. For large x, both exponentials die out and F' approaches zero.

Applications

Evaluating integrals that depend on a parameter

One powerful application of the Leibniz rule is to compute a family of integrals at once by treating the answer as a function of a parameter and setting up a differential equation.

Consider I(\alpha) = \int_0^\infty e^{-\alpha t} \frac{\sin t}{t}\,dt for \alpha > 0. This integral is hard to evaluate directly. But differentiate with respect to \alpha:

I'(\alpha) = \int_0^\infty \frac{\partial}{\partial \alpha}\left(e^{-\alpha t} \frac{\sin t}{t}\right) dt = -\int_0^\infty e^{-\alpha t} \sin t \, dt

The t in the denominator was cancelled by the factor of -t that differentiating e^{-\alpha t} with respect to \alpha brings down. The remaining integral is a standard Laplace transform:

I'(\alpha) = -\frac{1}{1 + \alpha^2}

Integrate: I(\alpha) = -\arctan \alpha + C. To find C, note that as \alpha \to \infty, I(\alpha) \to 0 (the exponential kills the integrand). So 0 = -\pi/2 + C, giving C = \pi/2. Therefore:

I(\alpha) = \frac{\pi}{2} - \arctan \alpha = \arctan\frac{1}{\alpha}

Setting \alpha = 0 (taking a limit) recovers the famous Dirichlet integral: \int_0^\infty \frac{\sin t}{t}\,dt = \frac{\pi}{2}.
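The formula I(\alpha) = \arctan(1/\alpha) is easy to test numerically. The sketch below truncates the improper integral at T = 80 (safe for the \alpha values shown, since e^{-\alpha t} crushes the tail) and compares against \arctan(1/\alpha):

```python
import math

def simpson(f, a, b, n=20000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

def I(alpha, T=80.0):
    # Truncate [0, infinity) at T; start just above 0, where sin(t)/t -> 1.
    return simpson(lambda t: math.exp(-alpha * t) * math.sin(t) / t, 1e-12, T)

for alpha in (0.25, 0.5, 1.0, 2.0):
    print(alpha, abs(I(alpha) - math.atan(1 / alpha)))  # all tiny
```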

Continuity of integrals with variable limits

If F(x) = \int_a^{g(x)} f(t)\,dt and f is continuous, then F is continuous — and in fact differentiable — wherever g is differentiable. This is useful for proving that solutions to certain differential equations are smooth.


Going deeper

The main content above covers the mechanics you need for JEE-level problems. What follows is a proof sketch for the general case, and a discussion of the conditions under which differentiation under the integral sign is valid.

When can you swap d/dx and \int?

For a proper integral over a finite interval, continuity is enough: if \frac{\partial f}{\partial x}(x, t) is continuous on a rectangle [x_0 - \delta, x_0 + \delta] \times [a, b], then \int_a^b \frac{\partial f}{\partial x}\,dt is continuous in x, and the swap is valid. For improper integrals (where b = \infty or where f has a singularity), the key condition is uniform convergence: you need \int_a^\infty \frac{\partial f}{\partial x}\,dt to converge uniformly in x — meaning the tail of the integral can be made small independently of x.

This is why the Dirichlet integral computation above requires care: \int_0^\infty e^{-\alpha t} \sin t\,dt converges uniformly for \alpha \geq \epsilon > 0, but not at \alpha = 0. To recover the value at \alpha = 0, you take a limit — which is valid because I(\alpha) turns out to be continuous at \alpha = 0.

Leibniz rule in higher dimensions

The one-dimensional Leibniz rule generalises. If F(\mathbf{x}) = \int_{D(\mathbf{x})} f(\mathbf{x}, t)\,dt where D is a region that depends on a vector of parameters \mathbf{x}, the derivative picks up boundary contributions proportional to the speed of the boundary and the value of f on it. This generalisation — called the Reynolds transport theorem in fluid mechanics — is the foundation of conservation laws in physics.

Connection to differential equations

Many classical differential equations are solved by defining F(x) = \int g(x, t)\,dt and using the Leibniz rule to show that F satisfies the equation. The Laplace transform, the Fourier transform, and the method of Green's functions all rely on differentiating under the integral sign. Feynman, in particular, famously championed this method — he would differentiate a known integral with respect to a parameter, evaluate the result, and obtain a new identity that seemed to come from nowhere.

Where this leads next