In short
A function is differentiable at a point when the limit in the derivative definition exists — that is, when the left and right derivatives both exist and are equal. Differentiability is a stronger condition than continuity: every differentiable function is continuous, but a continuous function can fail to be differentiable at corners, cusps, and vertical tangents.
Take the function f(x) = |x|. Its graph is a perfect V, meeting at the origin. The function is continuous there — no jumps, no holes, the graph passes through (0, 0) without a break.
Now try to draw a tangent line at the origin. From the left, the graph is a straight line with slope -1. From the right, it is a straight line with slope +1. At the corner itself, which slope should the tangent have? There is no answer. The two arms of the V pull in different directions, and there is no single line that serves as the tangent.
This is the distinction between continuity and differentiability. A continuous function has a connected graph — no jumps. A differentiable function has a smooth graph — no corners, no cusps, no sudden changes of direction. The second condition is strictly stronger than the first.
Left and right derivatives
The derivative at a point x = a is defined as

f'(a) = \lim_{h \to 0} \frac{f(a + h) - f(a)}{h}.
This limit asks h to approach 0 from both sides — positive and negative. Just as you can split a limit into left-hand and right-hand limits, you can split the derivative.
Left and right derivatives
The right-hand derivative (or right derivative) of f at a is

f'_+(a) = \lim_{h \to 0^+} \frac{f(a + h) - f(a)}{h}.
The left-hand derivative (or left derivative) of f at a is

f'_-(a) = \lim_{h \to 0^-} \frac{f(a + h) - f(a)}{h}.
The derivative f'(a) exists if and only if both one-sided derivatives exist and are equal: f'_+(a) = f'_-(a).
The right-hand derivative uses only points to the right of a (i.e., h > 0). Geometrically, it is the limiting slope of secant lines through (a, f(a)) and points to the right of a. The left-hand derivative does the same from the left.
For f(x) = |x| at a = 0:
Right derivative: h > 0, so |0 + h| = h, and f'_+(0) = \lim_{h \to 0^+} \frac{h}{h} = 1.
Left derivative: h < 0, so |0 + h| = |h| = -h, and f'_-(0) = \lim_{h \to 0^-} \frac{-h}{h} = -1.
Since f'_+(0) = 1 \neq -1 = f'_-(0), the derivative f'(0) does not exist. The function is not differentiable at x = 0.
At every other point, |x| is differentiable. For x > 0, |x| = x, and f'(x) = 1. For x < 0, |x| = -x, and f'(x) = -1. Only at x = 0 does differentiability fail.
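A quick numerical check makes this concrete. Below is a Python sketch (the helper name `one_sided_quotients` is illustrative, not a library function) that approximates both one-sided difference quotients with a small step h:

```python
def one_sided_quotients(f, a, h=1e-6):
    """Approximate the right- and left-hand difference quotients of f at a."""
    right = (f(a + h) - f(a)) / h      # h > 0: right-hand quotient
    left = (f(a - h) - f(a)) / (-h)    # step of -h: left-hand quotient
    return right, left

# For f(x) = |x| at a = 0 the two quotients are exactly 1 and -1
right, left = one_sided_quotients(abs, 0)
print(right, left)   # 1.0 -1.0
```

The mismatch between the two values is exactly the corner: no single slope serves as f'(0).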
Differentiability implies continuity
Here is one of the most important theorems in differential calculus: if a function is differentiable at a point, it is automatically continuous there.
Theorem: Differentiability implies continuity
If f is differentiable at x = a, then f is continuous at x = a.
Proof
Suppose f'(a) exists. You need to show that \lim_{x \to a} f(x) = f(a), which is the definition of continuity at a.
Write f(x) - f(a) in a revealing way:

f(x) - f(a) = \frac{f(x) - f(a)}{x - a} \cdot (x - a).
This is just multiplying and dividing by (x - a) — valid for x \neq a. Now take the limit as x \to a:
The limit of a product is the product of the limits (when both exist). The first factor has limit f'(a) (that is the definition of the derivative). The second factor has limit 0 (since x - a \to 0). So

\lim_{x \to a} \bigl( f(x) - f(a) \bigr) = f'(a) \cdot 0 = 0.
This means \lim_{x \to a} f(x) = f(a). So f is continuous at a. \square
The proof is three lines, and each line does exactly one thing. The key insight is the factorisation: the difference f(x) - f(a) can be written as a product of two things, one of which (the difference quotient) has a finite limit (because f is differentiable) and the other of which (x - a) goes to zero. A finite number times zero is zero.
The contrapositive
The contrapositive of "differentiable \implies continuous" is: if f is not continuous at a, then f is not differentiable at a. This gives you a quick test: if you spot a jump or a hole in the graph, you know immediately that the derivative does not exist there. You do not need to check the limit — discontinuity alone is enough to kill differentiability.
Continuity does not imply differentiability
The converse of the theorem is false. A function can be continuous at a point without being differentiable there. You have already seen the simplest example: f(x) = |x| at x = 0.
There are several ways a continuous function can fail to be differentiable. Here are the three main ones.
Corners
A corner (or kink) occurs when the graph changes direction abruptly. The left and right derivatives both exist but disagree. The function |x| at the origin is the classic example.
Another example: f(x) = |x - 2| + 1 has a corner at x = 2. The left derivative there is -1 and the right derivative is +1.
Cusps
A cusp occurs when the graph comes to a sharp point and the one-sided derivatives diverge to \pm\infty. Take f(x) = x^{2/3}. At x = 0:

\frac{f(0 + h) - f(0)}{h} = \frac{h^{2/3}}{h} = \frac{1}{h^{1/3}}.
As h \to 0^+, this goes to +\infty. As h \to 0^-, h^{1/3} is negative, so this goes to -\infty. Neither one-sided derivative exists as a finite number. The graph has a cusp at the origin — both arms approach vertically, but from opposite directions.
Vertical tangents
A vertical tangent occurs when the one-sided derivatives both go to +\infty (or both to -\infty). Take f(x) = x^{1/3} (the cube root). At x = 0:

\frac{f(0 + h) - f(0)}{h} = \frac{h^{1/3}}{h} = \frac{1}{h^{2/3}}.
As h \to 0 from either side, h^{2/3} \to 0^+, so the expression goes to +\infty. The tangent line is vertical — it exists as a geometric line, but its slope is infinite, and the derivative must exist as a finite limit. So f is not differentiable at x = 0.
The key distinction: at a cusp, the two sides diverge in opposite directions (+\infty and -\infty). At a vertical tangent, both sides diverge in the same direction.
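The sign pattern of the difference quotients distinguishes the two cases numerically. A Python sketch (the helpers `cbrt`, `quotient`, `f_cusp`, and `f_vert` are illustrative names):

```python
import math

def cbrt(x):
    # real cube root, valid for negative x as well
    return math.copysign(abs(x) ** (1 / 3), x)

def quotient(f, a, h):
    # difference quotient (f(a + h) - f(a)) / h
    return (f(a + h) - f(a)) / h

def f_cusp(x):
    return cbrt(x) ** 2    # x^(2/3): cusp at the origin

def f_vert(x):
    return cbrt(x)         # x^(1/3): vertical tangent at the origin

for h in (1e-2, 1e-4, 1e-6):
    # cusp: the two quotients have opposite signs and growing magnitude
    print("cusp:", quotient(f_cusp, 0, h), quotient(f_cusp, 0, -h))
    # vertical tangent: the two quotients share a sign and grow together
    print("vert:", quotient(f_vert, 0, h), quotient(f_vert, 0, -h))
```

As h shrinks, the cusp's quotients run off to +\infty and -\infty, while the vertical tangent's both run off to +\infty.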
A summary of the relationship
Here is the logical picture:

f \text{ differentiable at } a \implies f \text{ continuous at } a, \quad \text{but not conversely.}
The implication goes one way only. The differentiable functions form a subset of the continuous ones — a stronger, more exclusive property. Every differentiable function is continuous, but the set of continuous functions is larger and includes functions with corners, cusps, and vertical tangents.
Differentiability on intervals
So far, differentiability has been about a single point. In practice, you often need a function to be differentiable on an entire interval.
Differentiability on an interval
- f is differentiable on an open interval (a, b) if f'(x) exists for every x \in (a, b).
- f is differentiable on a closed interval [a, b] if f'(x) exists for every x \in (a, b), and in addition:
- the right-hand derivative f'_+(a) exists at the left endpoint,
- the left-hand derivative f'_-(b) exists at the right endpoint.
At the endpoints of a closed interval, you can only approach from one side, so only the one-sided derivative is required. This is parallel to the definition of continuity on [a, b], which requires one-sided limits at the endpoints.
A function like f(x) = \sqrt{x} is differentiable on (0, \infty) but not differentiable at x = 0 (the derivative \frac{1}{2\sqrt{x}} blows up as x \to 0^+). So f is differentiable on every interval [\epsilon, b] for \epsilon > 0, but not on [0, b].
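The blow-up at x = 0 is easy to see numerically: the right-hand difference quotient of \sqrt{x} at 0 is \sqrt{h}/h = 1/\sqrt{h}, which grows without bound (a small Python check):

```python
import math

# Right-hand difference quotient of sqrt at 0: sqrt(h) / h = 1 / sqrt(h)
for h in (1e-2, 1e-4, 1e-8):
    print(h, math.sqrt(h) / h)   # the quotient grows without bound as h -> 0+
```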
A polynomial is differentiable on every interval — indeed, on all of \mathbb{R}. So are \sin x, \cos x, and e^x. These are the "nice" functions of calculus.
Why intervals matter
Many of the major theorems of calculus — Rolle's theorem, the Mean Value Theorem, Taylor's theorem — have hypotheses of the form "continuous on [a, b] and differentiable on (a, b)." Notice the pattern: closed interval for continuity, open interval for differentiability. This is deliberate. Continuity at the endpoints keeps the function well-behaved at the edges. Differentiability on the open interior is where the real action happens — that is where the derivative gives you information about slopes, turning points, and rates of change. The endpoints are allowed to be corners or cusps without breaking the theorem.
Take f(x) = \sqrt{1 - x^2} on [-1, 1]. This is the upper half of the unit circle. It is continuous on [-1, 1], and differentiable on (-1, 1), but at x = -1 and x = 1 the curve is vertical — the derivative blows up. The function is differentiable on the open interval but not at the endpoints. And that is fine for most theorems: they only need differentiability on the interior.
Checking differentiability of piecewise functions
Piecewise-defined functions — functions given by different formulas on different pieces of the domain — are where differentiability questions most often arise. The recipe has two parts.
Part 1: Check continuity. If the function is not continuous at the junction point, it cannot be differentiable there (by the theorem). So first verify that the left-hand limit, right-hand limit, and function value all agree.
Part 2: Check that the left and right derivatives match. If the function is continuous at the junction, compute f'_-(a) and f'_+(a). If they are equal, the function is differentiable there, and the common value is f'(a).
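The two-part recipe can be sketched numerically. Below is a Python sketch (the function `check_junction` and its tolerances are illustrative, not a standard routine); it takes the two formulas, with the left formula assumed valid up to and including the junction:

```python
def check_junction(left_f, right_f, a, h=1e-6, tol=1e-4):
    """Two-part recipe at a junction x = a: continuity first, then slopes.

    left_f is the formula for x <= a, right_f the formula for x > a.
    Limits are approximated numerically with step h and loose tolerance tol.
    """
    # Part 1: left limit, right limit, and function value must agree
    value = left_f(a)
    continuous = (abs(left_f(a - h) - value) < tol and
                  abs(right_f(a + h) - value) < tol)
    if not continuous:
        return "not continuous, hence not differentiable"
    # Part 2: one-sided derivatives via difference quotients
    d_left = (left_f(a) - left_f(a - h)) / h
    d_right = (right_f(a + h) - right_f(a)) / h
    if abs(d_left - d_right) < tol:
        return f"differentiable, slope {round(d_right, 4)}"
    return f"corner: left slope {round(d_left, 4)}, right slope {round(d_right, 4)}"
```

Running it on the two examples that follow reports "differentiable" for the first and "corner" for the second.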
Example 1: A piecewise function that is differentiable
Consider

f(x) = \begin{cases} x^2, & x \le 1 \\ 2x - 1, & x > 1 \end{cases}
Step 1. Check continuity at x = 1.
From the left: \lim_{x \to 1^-} x^2 = 1. From the right: \lim_{x \to 1^+} (2x - 1) = 1. Function value: f(1) = 1^2 = 1. All three are equal, so f is continuous at x = 1.
Why: continuity is the necessary first condition. If the pieces don't join up, the derivative has no hope.
Step 2. Compute the left derivative at x = 1.
For x < 1, f(x) = x^2, so f'(x) = 2x. Thus f'_-(1) = 2(1) = 2.
Why: on the left piece, the derivative is straightforward — just differentiate x^2.
Step 3. Compute the right derivative at x = 1.
For x > 1, f(x) = 2x - 1, so f'(x) = 2. Thus f'_+(1) = 2.
Why: the right piece is a straight line with slope 2.
Step 4. Compare.
f'_-(1) = 2 = f'_+(1). The one-sided derivatives match, so f is differentiable at x = 1, and f'(1) = 2.
Why: matching one-sided derivatives means the tangent line transitions smoothly from one piece to the other — no corner, no jump in slope.
Result: f is differentiable at x = 1, with f'(1) = 2.
The parabola x^2 meets the line 2x - 1 at x = 1 with the same value (1) and the same slope (2). Geometrically, the transition is smooth — there is no corner, and a single tangent line exists at the junction.
Example 2: A piecewise function that is NOT differentiable
Consider

g(x) = \begin{cases} x^2, & x \le 1 \\ 3x - 2, & x > 1 \end{cases}
Step 1. Check continuity at x = 1.
From the left: \lim_{x \to 1^-} x^2 = 1. From the right: \lim_{x \to 1^+} (3x - 2) = 1. Function value: g(1) = 1. The function is continuous at x = 1.
Why: the pieces join — same value from both sides. Continuity is satisfied.
Step 2. Compute the left derivative.
For x < 1, g(x) = x^2, so g'(x) = 2x. Thus g'_-(1) = 2.
Why: the left piece has slope 2 at x = 1.
Step 3. Compute the right derivative.
For x > 1, g(x) = 3x - 2, so g'(x) = 3. Thus g'_+(1) = 3.
Why: the right piece is a line with slope 3, steeper than the left piece at the junction.
Step 4. Compare.
g'_-(1) = 2 \neq 3 = g'_+(1). The one-sided derivatives do not match, so g is not differentiable at x = 1.
Why: the slopes disagree. The graph has a corner at (1, 1) — the left piece arrives with slope 2 and the right piece departs with slope 3.
Result: g is continuous but not differentiable at x = 1.
The difference between Example 1 and Example 2 is a single number: the slope of the right piece. In Example 1 it was 2 (matching the left), and the function was differentiable. In Example 2 it was 3 (not matching), and the function had a corner. Differentiability at a junction is entirely about whether the slopes agree.
Common confusions
- "Differentiable and continuous are the same thing." They are not. Differentiability is strictly stronger. Every differentiable function is continuous, but |x| is continuous at 0 without being differentiable there. Think of differentiability as "continuity plus smoothness."
- "If the left and right limits of f'(x) exist and are equal, then f is differentiable." Be careful. The left and right derivatives f'_-(a) and f'_+(a) are defined by limits of the difference quotient, not by limits of f'(x) as x \to a. In most textbook examples these coincide, but in general they can differ. The correct test uses the difference quotient directly.
- "A corner means the function is discontinuous." A corner is a failure of differentiability, not of continuity. The graph is connected — it just has a sharp turn. Discontinuity means a jump or a gap; a corner means a sudden change of direction.
- "If f'(a) does not exist, f is not defined at a." The function can be perfectly well-defined (and even continuous) at a without being differentiable. The derivative is a property of the function at a, not a prerequisite for the function to exist there.
- "Differentiability on [a, b] requires the two-sided derivative at the endpoints." Only one-sided derivatives are needed at the endpoints, because you can only approach from inside the interval. At x = a, only the right derivative is required; at x = b, only the left.
Going deeper
If you came here to understand what differentiability means and how to check it, you have it — you can stop here. What follows is for readers who want the deeper picture.
How special is non-differentiability?
In the examples above, |x| fails to be differentiable at a single point. You might think this is typical: continuous functions are "usually" differentiable, with perhaps a handful of exceptional points.
That intuition is completely wrong. In 1872, Weierstrass constructed a function that is continuous everywhere but differentiable nowhere. The function is defined as an infinite series:

W(x) = \sum_{n=0}^{\infty} a^n \cos(b^n \pi x),
where 0 < a < 1, b is a positive odd integer, and ab > 1 + \frac{3\pi}{2}. This series converges uniformly (so W is continuous), but the resulting function oscillates so wildly at every scale that no tangent line can exist at any point.
The graph of W is a fractal — it looks jagged no matter how much you zoom in. This is the opposite of what happens with, say, x^2: zoom into a parabola and it looks more and more like a straight line (its tangent). Zoom into Weierstrass's function and it looks just as rough as before.
The Weierstrass function is not a curiosity. It shows that the "typical" continuous function is nowhere differentiable. The smooth, well-behaved functions of calculus — polynomials, trigonometric functions, exponentials — are the special ones. They happen to be the ones that arise naturally in physics and engineering, which is why calculus is so useful. But from the perspective of pure analysis, they are the exceptions.
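The partial sums are easy to evaluate, even though no finite computation can exhibit the non-differentiability (floating point cannot resolve the ever-finer oscillations). A Python sketch, assuming the parameter choice a = 0.5, b = 13 (b odd and ab = 6.5 > 1 + 3\pi/2, as the construction requires):

```python
import math

# Partial sums of the Weierstrass series with a = 0.5 and b = 13
def W(x, terms=40):
    return sum(0.5 ** n * math.cos(13 ** n * math.pi * x) for n in range(terms))

# At x = 0 every cosine equals 1, so the series is geometric:
# W(0) = 1 / (1 - 0.5) = 2
print(W(0))   # close to 2, up to a tiny truncation error
```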
Differentiability and the tangent line
When f'(a) exists, the tangent line at (a, f(a)) is

y = f(a) + f'(a)(x - a),
and it is the best linear approximation to f near a, in the precise sense that

f(x) = f(a) + f'(a)(x - a) + \varepsilon(x)(x - a),
where \varepsilon(x) \to 0 as x \to a. The error goes to zero faster than the linear term; when f has a second derivative at a, the error is of order (x - a)^2, so the tangent line hugs the curve tightly near the point of tangency.
This is sometimes taken as an alternative definition of differentiability: f is differentiable at a if and only if there exists a constant m (which will turn out to be f'(a)) such that f(x) - f(a) - m(x - a) goes to zero faster than x - a does. This viewpoint — differentiability as "the existence of a good linear approximation" — generalises to higher dimensions (where it becomes the Jacobian matrix) and to abstract spaces.
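The quality of the linear approximation is easy to check numerically. For f(x) = e^x at a = 0 the tangent line is 1 + x, and the error e^x - (1 + x) behaves like x^2/2 (a Python check of this specific case):

```python
import math

# Tangent-line approximation of exp at a = 0: L(x) = 1 + x, since exp'(0) = 1
# The error exp(x) - (1 + x) shrinks like x^2 / 2: faster than the linear term
for x in (1e-1, 1e-2, 1e-3):
    err = math.exp(x) - (1 + x)
    print(x, err, err / x ** 2)   # the ratio err / x^2 approaches 0.5
```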
Rolle's theorem (a preview)
Differentiability on an interval unlocks powerful results. Here is one that will appear in the next chapter on applications of derivatives.
Rolle's theorem. If f is continuous on [a, b] and differentiable on (a, b), and if f(a) = f(b), then there exists some c \in (a, b) with f'(c) = 0.
In words: if a differentiable function starts and ends at the same height, somewhere in between it must have a horizontal tangent — a point where the slope is zero. The Extreme Value Theorem guarantees that f has a maximum and minimum on [a, b]; since f(a) = f(b), at least one extreme value occurs in the interior, and at an interior extreme the derivative must be zero.
Rolle's theorem is the starting point for the Mean Value Theorem, which is one of the most-used results in all of calculus. Both require differentiability on the open interval — without it, the conclusion can fail.
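A numerical illustration of Rolle's theorem (a sketch, not a proof: bisection on a central-difference derivative, assuming f' changes sign on the interval; the helper names are illustrative):

```python
def deriv(f, x, h=1e-6):
    # central-difference approximation to f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

def rolle_point(f, a, b, tol=1e-8):
    """Bisection for a c in (a, b) with f'(c) = 0, assuming f'
    changes sign across the interval (a sketch, not a general solver)."""
    lo, hi = a + tol, b - tol
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if deriv(f, lo) * deriv(f, mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# f(x) = x(1 - x) satisfies f(0) = f(1) = 0, so Rolle applies on [0, 1];
# the horizontal tangent is at c = 1/2, where f'(c) = 1 - 2c = 0
c = rolle_point(lambda x: x * (1 - x), 0, 1)
print(c)   # approximately 0.5
```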
Where this leads next
You now know what differentiability means, how to check it, and why it matters. The next articles use differentiability as a hypothesis and explore its consequences.
- Reasons for Non-Differentiability — a detailed catalogue of corners, cusps, vertical tangents, and discontinuities, with graphical analysis.
- Derivatives of Basic Functions — the power rule, proved for all real exponents, plus the derivatives of constants and polynomials.
- Rules of Differentiation — sum, product, and quotient rules, each proved from the limit definition.
- Differentiation of Special Functions — differentiability of |x|, [x] (floor function), \{x\} (fractional part), and piecewise functions.
- Theorems on Continuous Functions — the Intermediate Value Theorem and Extreme Value Theorem, which underpin Rolle's theorem.