In short
A functional equation describes a function by a relationship it must satisfy — like f(x + y) = f(x) + f(y) — without giving a formula directly. When the function is known to be differentiable, you can differentiate the equation to find f'(x), which often pins down f itself. The key techniques are: substituting special values, differentiating with respect to one variable while holding the other fixed, and using the limit definition of the derivative.
Here is a puzzle. A function f satisfies two conditions: f(x + y) = f(x) \cdot f(y) for all real x and y, and f'(0) = 2.
What is f?
No formula is given. The function is described only by a rule — what it does when you add its inputs — and one piece of information about its derivative. From this alone, you can determine f completely.
Try a few substitutions. Set y = 0: you get f(x) = f(x) \cdot f(0). If f(x) \neq 0 for some x, this forces f(0) = 1. Good — you know one value already.
Now use the derivative. By definition, f'(0) = \lim_{h \to 0} \frac{f(0 + h) - f(0)}{h} = \lim_{h \to 0} \frac{f(h) - 1}{h} = 2.
This tells you how f behaves near zero. Can you get the derivative at every point? Start from the functional equation and compute f'(x) directly:

f'(x) = \lim_{h \to 0} \frac{f(x + h) - f(x)}{h} = \lim_{h \to 0} \frac{f(x) \cdot f(h) - f(x)}{h} = f(x) \cdot \lim_{h \to 0} \frac{f(h) - 1}{h} = 2f(x).
So f'(x) = 2f(x). A function whose derivative is a constant multiple of itself is an exponential: f(x) = e^{2x}. You can verify: f(0) = 1, f'(0) = 2, and f(x + y) = e^{2(x+y)} = e^{2x} \cdot e^{2y} = f(x) \cdot f(y). Everything checks out.
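As a quick sanity check, a minimal Python sketch (not part of the derivation; the sample points and tolerances are arbitrary choices) can test both conditions numerically:

```python
import math

# Illustrative check: f(x) = e^{2x} should satisfy f(x + y) = f(x) * f(y)
# and f'(0) = 2. Sample points and tolerances are arbitrary.
f = lambda x: math.exp(2 * x)

# Functional equation at a few sample points.
for x, y in [(0.3, -1.2), (2.0, 0.5), (-0.7, -0.1)]:
    assert abs(f(x + y) - f(x) * f(y)) < 1e-9

# Derivative at 0 via a small difference quotient.
h = 1e-6
assert abs((f(h) - f(0)) / h - 2) < 1e-4
print("f(x) = e^{2x} passes both conditions")
```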
That is the method. A functional equation tells you something about the function's structure. Differentiation converts that structural information into a differential equation — an equation involving f' — which you can then solve.
This approach works because a functional equation constrains a function globally (what it does at x + y in terms of what it does at x and y), while a derivative is a local property (what the function does at a single point). Differentiating bridges the two: it turns the global constraint into a local equation — a differential equation — that you can integrate to recover the function.
The core technique: differentiate and substitute
The strategy has three moves, used in almost every problem.
Move 1: Substitute special values. Setting one variable to 0 (or to x, or to -x) in the functional equation often reveals f(0), f(-x) in terms of f(x), or other constraints. Do this first, before any calculus.
Move 2: Differentiate with respect to one variable. In a functional equation like f(x + y) = f(x) + f(y), treat x as a constant and differentiate both sides with respect to y (or vice versa). The left side becomes f'(x + y). The right side simplifies because f(x) is constant with respect to y.
Move 3: Substitute back. After differentiating, set y = 0 (or another convenient value). This often collapses the equation into a differential equation involving only f and f' evaluated at a single point.
These three moves — substitute, differentiate, substitute again — form a loop. You repeat until you have enough information to pin down f.
The Cauchy functional equation
The simplest functional equation is

f(x + y) = f(x) + f(y) for all real x and y.
This is called the Cauchy functional equation. It says: the function is additive. The output of a sum is the sum of the outputs.
Finding all continuous solutions
Step 1. Set y = 0: f(x) = f(x) + f(0), so f(0) = 0.
Step 2. Set y = x: f(2x) = 2f(x). Set y = 2x: f(3x) = f(x) + f(2x) = 3f(x). By induction, f(nx) = nf(x) for all positive integers n.
Step 3. Put x = 1: f(n) = n \cdot f(1). Let c = f(1). So f equals cx at every positive integer.
Step 4. For rationals: from f(nx) = nf(x), set x = p/q and n = q: f(p) = q \cdot f(p/q), so f(p/q) = p \cdot f(1)/q = c(p/q). The function is f(x) = cx at every rational number.
Step 5. For negative values: set y = -x in the original equation: f(0) = f(x) + f(-x) = 0, so f(-x) = -f(x). The function is odd. Combined with Step 4, f(r) = cr for all rationals r, positive and negative.
Step 6. If f is continuous (or even just measurable, or monotone), the rational values force the function on all of \mathbb{R}: f(x) = cx for all x. The rationals are dense in the reals, and a continuous function that equals cx on a dense set must equal cx everywhere.
Solutions of the Cauchy functional equation
If f : \mathbb{R} \to \mathbb{R} satisfies f(x + y) = f(x) + f(y) for all x, y and f is continuous, then f(x) = cx for some constant c. If f is also differentiable, then f'(x) = c for all x.
Without the continuity assumption, there exist wild, discontinuous solutions (constructed using the axiom of choice) that are nothing like straight lines. But in every exam context, you can assume the solution is f(x) = cx.
The multiplicative cousin
Replace addition with multiplication: f(xy) = f(x) + f(y). This says the function turns products into sums — that is exactly what a logarithm does. If f is continuous and defined for x > 0, the only solution is f(x) = c \ln x for some constant c.
How to see this: set x = y = 1, getting f(1) = 2f(1), so f(1) = 0. Now substitute x = e^u, y = e^v. Define g(u) = f(e^u). Then g(u + v) = f(e^{u+v}) = f(e^u \cdot e^v) = f(e^u) + f(e^v) = g(u) + g(v). This is Cauchy's equation for g, so g(u) = cu, meaning f(e^u) = cu, meaning f(x) = c \ln x.
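The substitution can be watched numerically. In this sketch (illustrative only; the constant c = 2.5 and the sample points are arbitrary), f(x) = c \ln x is wrapped as g(u) = f(e^u), and g comes out additive and linear, exactly as Cauchy's equation predicts:

```python
import math

# Illustrative sketch: with f(x) = c * ln(x), the wrapped function
# g(u) = f(e^u) satisfies Cauchy's equation g(u + v) = g(u) + g(v).
c = 2.5  # arbitrary constant for illustration
f = lambda x: c * math.log(x)
g = lambda u: f(math.exp(u))   # g(u) = c * u

for u, v in [(0.4, 1.1), (-2.0, 0.9), (3.3, -0.3)]:
    assert abs(g(u + v) - (g(u) + g(v))) < 1e-9
    assert abs(g(u) - c * u) < 1e-9   # g is linear, as Cauchy predicts
print("g(u) = f(e^u) is additive")
```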
Finding f(x) from f'(x) conditions
Many JEE problems give a functional equation and one condition on the derivative (like f'(0) or f'(1)), and ask you to find f entirely. The method always follows the same pattern.
The exponential functional equation
You saw this at the start: f(x + y) = f(x) \cdot f(y) for all x, y, with f'(0) = k. Differentiate with respect to y and set y = 0:

f'(x + y) = f(x) \cdot f'(y). Setting y = 0: f'(x) = f(x) \cdot f'(0) = k \cdot f(x).
This is the ODE f' = kf, whose solution is f(x) = f(0) e^{kx}. From the functional equation, f(0) = 1 (set x = y = 0: f(0) = f(0)^2, so f(0) = 1 since f(0) = 0 would make f identically zero). Therefore f(x) = e^{kx}.
The logarithmic functional equation
The equation is f(xy) = f(x) + f(y) for all x, y > 0, with f'(1) = 1. Differentiate with respect to y (treating x as constant):

x \cdot f'(xy) = f'(y).
Set y = 1: x \cdot f'(x) = f'(1) = 1, so f'(x) = 1/x.
Integrating: f(x) = \ln|x| + C. Since f(1) = 0 (set x = y = 1 in the original equation), C = 0. So f(x) = \ln x (for x > 0).
The power-type functional equation
The equation is f(xy) = f(x) \cdot f(y) for all x, y > 0, with f'(1) = n. Differentiate both sides with respect to y (treating x as constant):

x \cdot f'(xy) = f(x) \cdot f'(y).
Set y = 1: x \cdot f'(x) = f(x) \cdot f'(1) = n \cdot f(x). This gives \frac{f'(x)}{f(x)} = \frac{n}{x}.
Integrate: \ln|f(x)| = n \ln|x| + C, so f(x) = Ax^n. From the functional equation, f(1) = f(1)^2, so f(1) = 1 (assuming f is not identically zero), which gives A = 1. Therefore f(x) = x^n.
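A short numeric check (illustrative; the exponent n = 3 and the test points are arbitrary choices) confirms that f(x) = x^n satisfies both the functional equation and the derivative condition:

```python
# Illustrative check: f(x) = x^n satisfies f(xy) = f(x) f(y) and f'(1) = n.
# The exponent n = 3 and the sample points are arbitrary.
n = 3
f = lambda x: x ** n

for x, y in [(1.5, 2.0), (0.3, 4.0), (2.0, 2.0)]:
    assert abs(f(x * y) - f(x) * f(y)) < 1e-9

h = 1e-6
assert abs((f(1 + h) - f(1)) / h - n) < 1e-4   # f'(1) = n
print("f(x) = x^n passes both conditions")
```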
This is the fourth member of the Cauchy family. The derivative condition f'(1) = n tells you the exponent — the function is a power function with that exact power.
Derivative from a limit condition
A trickier type: you are given

\lim_{h \to 0} \frac{f(x + h) - f(x - h)}{h} = g(x)

and asked to find f'(x). This is not quite the standard derivative (which uses f(x + h) - f(x)). Rewrite:

\frac{f(x + h) - f(x - h)}{h} = \frac{f(x + h) - f(x)}{h} + \frac{f(x) - f(x - h)}{h}.
The first term approaches f'(x) as h \to 0. The second term equals \frac{f(x + (-h)) - f(x)}{-h}, the difference quotient with increment -h, which also approaches f'(x). So if f is differentiable at x, the whole expression approaches 2f'(x), and therefore f'(x) = g(x)/2.
The converse is not true: the limit \frac{f(x+h) - f(x-h)}{h} can exist even when f'(x) does not. Take f(x) = |x| at x = 0: the numerator |h| - |-h| = 0 for all h, so the limit is 0. But f'(0) does not exist. This "symmetric derivative" is weaker than the actual derivative.
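The |x| counterexample is easy to compute directly. This sketch (stdlib Python, illustrative) shows the symmetric quotient vanishing for every h while the one-sided quotients disagree:

```python
# The symmetric quotient (f(x+h) - f(x-h)) / h can converge even when f'(x)
# does not exist. For f(x) = |x| at x = 0 the quotient is identically 0,
# while the one-sided quotients disagree.
f = abs

for h in [0.1, 0.01, 1e-6]:
    sym = (f(0 + h) - f(0 - h)) / h
    assert sym == 0.0                       # |h| - |-h| = 0 for every h
    right = (f(0 + h) - f(0)) / h           # right quotient: +1
    left = (f(0) - f(0 - h)) / h            # left quotient: -1
    assert right == 1.0 and left == -1.0    # one-sided derivatives disagree
print("symmetric limit exists, ordinary derivative does not")
```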
A common exam pattern: limit involving f(a + h)
Here is a pattern that appears frequently. You are told that f(a) and f'(a) are known, and asked to evaluate

\lim_{x \to a} \frac{a f(x) - x f(a)}{x - a}.

Rewrite the numerator: af(x) - xf(a) = a[f(x) - f(a)] - f(a)(x - a). Divide by (x - a):

\frac{a f(x) - x f(a)}{x - a} = a \cdot \frac{f(x) - f(a)}{x - a} - f(a).
As x \to a, the first term approaches a \cdot f'(a) and the second is a constant. So the limit is a \cdot f'(a) - f(a).
This is not a functional equation per se, but the technique is identical: rewrite the expression so that the derivative definition appears inside it, then use the known derivative.
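A numeric check of the pattern (illustrative; f = \sin and a = 1 are arbitrary stand-ins for the known function and point):

```python
import math

# Illustrative check: with f = sin and a = 1 (arbitrary choices),
# (a f(x) - x f(a)) / (x - a) should approach a f'(a) - f(a) = cos(1) - sin(1).
f, fprime, a = math.sin, math.cos, 1.0
expected = a * fprime(a) - f(a)

x = a + 1e-6
approx = (a * f(x) - x * f(a)) / (x - a)
assert abs(approx - expected) < 1e-4
print("limit matches a*f'(a) - f(a)")
```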
Example 1: Finding $f(x)$ from $f(x + y) = f(x) + f(y) + 2xy$
A differentiable function f satisfies f(x + y) = f(x) + f(y) + 2xy for all x, y, and f'(0) = 3. Find f(x).
Step 1. Substitute x = y = 0.

f(0) = f(0) + f(0) + 0, so f(0) = 0.
Why: setting both variables to zero is always the first move. It gives you f(0) immediately.
Step 2. Compute f'(x) from the definition.

f'(x) = \lim_{h \to 0} \frac{f(x + h) - f(x)}{h} = \lim_{h \to 0} \frac{f(x) + f(h) + 2xh - f(x)}{h} = \lim_{h \to 0} \frac{f(h)}{h} + 2x.
Why: using the functional equation to replace f(x+h) with f(x) + f(h) + 2xh. The f(x) terms cancel, leaving only f(h) and 2xh.
Step 3. Evaluate \lim_{h \to 0} \frac{f(h)}{h}.
Since f(0) = 0, this limit is \lim_{h \to 0} \frac{f(h) - f(0)}{h} = f'(0) = 3.
Why: the limit is exactly the derivative of f at 0, which is given.
Step 4. Combine.

f'(x) = 3 + 2x.
Integrate: f(x) = 3x + x^2 + C. Since f(0) = 0, C = 0.
Why: integrate f'(x) = 3 + 2x to get f(x), and use f(0) = 0 to pin down the constant.
Result: f(x) = x^2 + 3x.
Verification: f(x + y) = (x+y)^2 + 3(x+y) = x^2 + 2xy + y^2 + 3x + 3y. And f(x) + f(y) + 2xy = (x^2 + 3x) + (y^2 + 3y) + 2xy = x^2 + 2xy + y^2 + 3x + 3y. They match.
The answer is a parabola — which is fitting, because the 2xy term in the functional equation is the cross term of (x + y)^2, hinting at a quadratic. The derivative condition f'(0) = 3 fixed the linear part.
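The verification can also be run numerically. A minimal sketch (sample points and tolerances are arbitrary):

```python
# Numeric confirmation of Example 1 (illustrative, stdlib only):
# f(x) = x^2 + 3x should satisfy f(x+y) = f(x) + f(y) + 2xy and f'(0) = 3.
f = lambda x: x**2 + 3 * x

for x, y in [(1.0, 2.0), (-0.5, 3.2), (0.0, -4.0)]:
    assert abs(f(x + y) - (f(x) + f(y) + 2 * x * y)) < 1e-9

h = 1e-6
assert abs((f(h) - f(0)) / h - 3) < 1e-4   # f'(0) = 3
print("f(x) = x^2 + 3x verified")
```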
Example 2: Finding $f(x)$ from $f(xy) = xf(y) + yf(x)$
A differentiable function f satisfies f(xy) = xf(y) + yf(x) for all x, y > 0, and f(e) = 1. Find f(x).
Step 1. Substitute y = 1.

f(x) = x \cdot f(1) + f(x), so x \cdot f(1) = 0 for every x > 0.
Why: for this to hold for all x > 0, the coefficient f(1) must be zero.
Step 2. Differentiate with respect to y, treating x as a constant.

x \cdot f'(xy) = x \cdot f'(y) + f(x).
Why: on the left, chain rule gives f'(xy) \cdot x. On the right, xf(y) differentiates to xf'(y) and yf(x) differentiates to f(x) (since f(x) is a constant with respect to y).
Step 3. Set y = 1.

x \cdot f'(x) = x \cdot f'(1) + f(x).
Rearranging: x \cdot f'(x) - f(x) = x \cdot f'(1).
Why: setting y = 1 collapses the equation into an ODE involving f and f' at the same point x.
Step 4. Solve the ODE and find f'(1).
Divide by x: f'(x) - \frac{f(x)}{x} = f'(1).
This is a first-order linear ODE. Its general solution is f(x) = f'(1) \cdot x \ln x + Cx. Since f(1) = 0, C = 0. So f(x) = f'(1) \cdot x \ln x.
Use f(e) = 1: 1 = f'(1) \cdot e \cdot 1, so f'(1) = 1/e.
Why: the condition f(e) = 1 pins down the multiplicative constant, leaving no freedom.
Result: f(x) = \dfrac{x \ln x}{e}.
Verification: f(xy) = \frac{xy \ln(xy)}{e} = \frac{xy(\ln x + \ln y)}{e}. And xf(y) + yf(x) = \frac{xy \ln y}{e} + \frac{xy \ln x}{e} = \frac{xy(\ln x + \ln y)}{e}. They match.
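A numeric version of the same verification (illustrative sketch; test points chosen arbitrarily):

```python
import math

# Numeric check of Example 2 (illustrative): f(x) = x ln(x) / e should satisfy
# f(xy) = x f(y) + y f(x) for x, y > 0 and f(e) = 1.
f = lambda x: x * math.log(x) / math.e

for x, y in [(2.0, 3.0), (0.5, 4.0), (1.0, 7.0)]:
    assert abs(f(x * y) - (x * f(y) + y * f(x))) < 1e-9

assert abs(f(math.e) - 1) < 1e-12
print("f(x) = x ln x / e verified")
```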
The function x \ln x appears naturally in information theory (it is related to entropy) and in the study of algorithms (it governs the cost of comparison-based sorting). Here it emerged purely from the algebraic structure of the functional equation.
Common confusions
- "The functional equation uniquely determines f." Not always. The Cauchy equation f(x+y) = f(x) + f(y) has infinitely many discontinuous solutions. To get a unique answer, you need an extra condition: differentiability, continuity, or monotonicity. In exam problems this condition is always stated — if it says "a differentiable function f," that is doing real work, not just decoration.
- "Differentiate with respect to x and y simultaneously." No. The variables x and y are independent. You differentiate with respect to one variable (usually y), treating the other as a constant. This gives you one equation. If you need another, differentiate with respect to the other variable separately.
- "\lim_{h \to 0} \frac{f(h)}{h} = f'(0) always." Only if f(0) = 0. The limit \frac{f(h)}{h} as h \to 0 is \frac{f(h) - 0}{h - 0}, which equals f'(0) only when f(0) = 0. If f(0) \neq 0, the limit is \pm \infty or does not exist.
- "The functional equation approach works without differentiability." The substitution steps (finding f(0), f(-x), etc.) work without assuming differentiability. But the differentiation step requires it. If the problem does not say f is differentiable, you need a different approach — typically induction over the rationals, combined with continuity.
Going deeper
If you came here to learn how to solve functional-equation problems using derivatives, you have the core technique. The rest of this section explores why the method works and what can go wrong.
Why differentiating a functional equation is valid
When you write f(x + y) = f(x) \cdot f(y) and differentiate both sides with respect to y, you are applying a theorem: if two functions of y are equal for all y, their derivatives are also equal (provided both are differentiable). This is just the statement that differentiation respects equality. The reason you can treat x as a constant is that the equation holds for every fixed x — you are differentiating the family of equations parametrised by x.
The role of the axiom of choice
The Cauchy functional equation f(x+y) = f(x) + f(y) has solutions other than f(x) = cx. These "pathological" solutions are constructed by choosing a Hamel basis for \mathbb{R} over \mathbb{Q} — a set of real numbers such that every real number is a finite rational linear combination of basis elements. The axiom of choice guarantees such a basis exists, but you cannot write one down explicitly. On each basis element, you assign f an arbitrary value; additivity then determines f everywhere. The result is a function whose graph is dense in the plane — it passes arbitrarily close to every point in \mathbb{R}^2.
These pathological solutions are not measurable and not monotone. Any regularity condition — continuity, monotonicity, boundedness on an interval, or even measurability — rules them out and forces f(x) = cx. In practice, you will never encounter them in a calculation or an exam. But they are a vivid example of how the axiom of choice produces objects that are logically sound but geometrically wild.
The Cauchy family: four equations, four solutions
There are four classical functional equations, each with a clean unique continuous solution:
| Functional equation | Continuous solution (x > 0 where needed) |
|---|---|
| f(x + y) = f(x) + f(y) | f(x) = cx |
| f(x + y) = f(x) \cdot f(y) | f(x) = e^{cx} (or f \equiv 0) |
| f(xy) = f(x) + f(y) | f(x) = c \ln x |
| f(xy) = f(x) \cdot f(y) | f(x) = x^c (or f \equiv 0) |
Each of these can be derived from Cauchy's original equation by substitution (as we did for f(xy) = f(x) + f(y) by writing g(u) = f(e^u)). The four equations form a closed family under the substitutions x \to e^x and f \to \log f.
Knowing this table is valuable: when you see a functional equation on an exam, check whether it is one of these four (possibly in disguise). If it is, write down the answer immediately and verify.
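The whole table can be checked in one loop. This sketch (illustrative; the constant c = 2 and the sample points are arbitrary) pairs each way of combining inputs with the corresponding way of combining outputs:

```python
import math

# One illustrative check per row of the table, with c = 2 chosen arbitrarily.
# Each case is (combine inputs, combine outputs, claimed continuous solution).
c = 2.0
cases = [
    (lambda x, y: x + y, lambda u, v: u + v, lambda x: c * x),            # additive
    (lambda x, y: x + y, lambda u, v: u * v, lambda x: math.exp(c * x)),  # exponential
    (lambda x, y: x * y, lambda u, v: u + v, lambda x: c * math.log(x)),  # logarithmic
    (lambda x, y: x * y, lambda u, v: u * v, lambda x: x ** c),           # power
]
for combine_inputs, combine_outputs, f in cases:
    for x, y in [(0.7, 1.9), (2.0, 3.5)]:
        lhs = f(combine_inputs(x, y))
        rhs = combine_outputs(f(x), f(y))
        assert abs(lhs - rhs) < 1e-9
print("all four Cauchy-family solutions check out")
```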
Beyond two variables
Functional equations can also combine their inputs in ways other than a plain sum or product. For example:

f\left(\frac{x + y}{2}\right) = \frac{f(x) + f(y)}{2}.
This says f of the midpoint is the midpoint of the f-values. The inequality version, f\left(\frac{x+y}{2}\right) \le \frac{f(x) + f(y)}{2}, is midpoint convexity; for continuous f it implies full convexity, and for twice-differentiable f it corresponds to f''(x) \ge 0. But the only solutions of the equation itself (equality, not inequality) are the affine functions f(x) = ax + b.
To see this: differentiate with respect to x while holding y fixed. The left side gives \frac{1}{2} f'\!\left(\frac{x+y}{2}\right) and the right side gives \frac{f'(x)}{2}. So f'\!\left(\frac{x+y}{2}\right) = f'(x) for all y — meaning f' is constant.
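A quick check of the conclusion (illustrative; the coefficients a and b are arbitrary choices) that affine functions satisfy the midpoint equation:

```python
# Sketch: an affine f(x) = a x + b (constants chosen arbitrarily) satisfies
# the midpoint equation f((x+y)/2) = (f(x) + f(y)) / 2.
a, b = 2.0, -1.5
f = lambda x: a * x + b

for x, y in [(0.0, 4.0), (-3.0, 7.5), (1.1, 1.3)]:
    assert abs(f((x + y) / 2) - (f(x) + f(y)) / 2) < 1e-12
print("affine functions satisfy the midpoint equation")
```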
When differentiation gives you an ODE you cannot solve
Not every functional equation leads to a simple first-order ODE. Consider

f(x + y) + f(x - y) = 2 f(x) \cdot f(y).
This is the d'Alembert functional equation. Differentiating with respect to y and setting y = 0 gives f'(x) - f'(x) = 2f(x) \cdot f'(0)... which simplifies to 0 = 2f(x) f'(0). If f is not identically zero, this forces f'(0) = 0.
Differentiating twice with respect to y and setting y = 0 gives

f''(x + y) + f''(x - y) = 2 f(x) \cdot f''(y). Setting y = 0: 2f''(x) = 2 f(x) \cdot f''(0), i.e. f''(x) = f''(0) \cdot f(x).
This is a second-order ODE. If f''(0) = -\omega^2 < 0, the solution is f(x) = \cos(\omega x). If f''(0) = \omega^2 > 0, the solution is f(x) = \cosh(\omega x). The functional equation forced you to go to the second derivative to extract useful information — the first derivative step was a dead end.
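Both branches can be verified numerically. In this sketch (illustrative; \omega = 1.7 is an arbitrary choice), \cos(\omega x) and \cosh(\omega x) each satisfy d'Alembert's equation:

```python
import math

# Illustrative check: f(x) = cos(w x) and f(x) = cosh(w x) both satisfy
# d'Alembert's equation f(x+y) + f(x-y) = 2 f(x) f(y); w = 1.7 is arbitrary.
w = 1.7
for f in (lambda x: math.cos(w * x), lambda x: math.cosh(w * x)):
    for x, y in [(0.3, 1.2), (2.0, -0.5)]:
        assert abs(f(x + y) + f(x - y) - 2 * f(x) * f(y)) < 1e-9
print("cos and cosh both satisfy d'Alembert's equation")
```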
The lesson: if the first differentiation gives you something trivial (0 = 0 or a tautology), differentiate again. Each differentiation extracts one more piece of structural information from the equation.
Where this leads next
- Differentiation — Introduction — the definition of the derivative and the limit definition used throughout this article.
- Rules of Differentiation — product rule, quotient rule, and chain rule, which you need to differentiate the functional equation's terms.
- Derivatives of Basic Functions — the derivatives of e^x, \ln x, x^n, and the trig functions, used when identifying the solution from the ODE.
- Differentiability — the formal condition that makes the differentiation step valid: the left and right derivatives must exist and be equal.
- Differentiation of Special Functions — what happens when the function in the equation involves absolute values, floor functions, or piecewise definitions.