In short
The limit of f(x) as x approaches a is the single value that f(x) gets arbitrarily close to as x gets close to a, written \lim_{x \to a} f(x) = L. The limit exists if and only if the left-hand limit (approaching from below) and the right-hand limit (approaching from above) both exist and are equal. A function can have a limit at a point even if the function is not defined there — the limit cares about the approach, not the arrival.
Take the function

f(x) = \frac{x^2 - 1}{x - 1}
and try to evaluate it at x = 1. The numerator is 1 - 1 = 0 and the denominator is 1 - 1 = 0. You get \frac{0}{0}, which is not a number. The function is not defined at x = 1.
But watch what happens when you evaluate f at points near 1:
| x | f(x) = \dfrac{x^2 - 1}{x - 1} |
|---|---|
| 0.9 | 1.9 |
| 0.99 | 1.99 |
| 0.999 | 1.999 |
| 1.001 | 2.001 |
| 1.01 | 2.01 |
| 1.1 | 2.1 |
From the left, f(x) is heading toward 2. From the right, f(x) is also heading toward 2. The function never reaches 2 at x = 1 — it is not defined there — but it gets as close as you want. If you pick x within 0.001 of 1, then f(x) is within 0.001 of 2. If you pick x within 0.0001 of 1, then f(x) is within 0.0001 of 2.
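The table above can be regenerated with a short numerical probe. This is a sketch, not part of the original text; the helper name `probe` is ours:

```python
# Sample f(x) = (x**2 - 1) / (x - 1) at points approaching 1 from both sides,
# never evaluating at x = 1 itself (where the formula gives 0/0).

def f(x):
    return (x**2 - 1) / (x - 1)

def probe(func, a, steps=(0.1, 0.01, 0.001)):
    """Evaluate func at a - h and a + h for shrinking h, skipping a itself."""
    return [(a - h, func(a - h)) for h in steps] + \
           [(a + h, func(a + h)) for h in reversed(steps)]

for x, y in probe(f, 1.0):
    print(f"f({x}) = {y}")
```

Both columns of output close in on 2, matching the table.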
The number 2 is the limit of f(x) as x approaches 1.
The algebra explains why. Factor the numerator: x^2 - 1 = (x - 1)(x + 1). So

\frac{x^2 - 1}{x - 1} = \frac{(x - 1)(x + 1)}{x - 1} = x + 1 \quad \text{for } x \neq 1.
The function f is identical to the line y = x + 1 everywhere except at the single point x = 1, where it has a hole. The limit fills the hole.
This is the core idea. A limit does not ask "what is f(a)?" — it asks "what is f(x) heading toward as x gets close to a?" The two questions can have different answers, or one might be defined when the other is not.
The intuitive definition
Limit (intuitive)
The limit of f(x) as x approaches a is L, written

\lim_{x \to a} f(x) = L,
if f(x) can be made as close to L as you like by choosing x sufficiently close to a (but not equal to a).
The phrase "as close as you like" is doing real work. It does not mean "pretty close" or "close enough for engineering." It means: for every degree of closeness you demand, there is a way to achieve it. Want f(x) within 0.01 of L? Choose x within some distance of a, and it works. Want within 0.0001? Tighten the distance. Want within 10^{-100}? Still possible. No matter how tight your demand, the function can meet it — that is what a limit means.
Three critical points:

- The value at a does not matter. The limit only cares about the behaviour of f near a, not at a itself. The function might not even be defined at a (as in the example above), or it might be defined but have a different value from the limit. What happens at a is a separate question from what happens near a.
- x never equals a. The definition explicitly excludes x = a. You are always looking at points approaching a but never arriving. This is why the \frac{0}{0} problem at x = 1 did not stop you — you never actually evaluated f at 1.
- The limit must be a single number. If f(x) bounces between two values as x approaches a, there is no limit. If f(x) heads toward different values from the left and the right, there is no limit. The limit exists only when there is one definite value being approached.
Left-hand and right-hand limits
The last point above leads to a refinement. Sometimes a function behaves differently on the two sides of a point. To capture this, you define one-sided limits.
One-sided limits
The left-hand limit of f(x) as x approaches a (from below) is written

\lim_{x \to a^-} f(x) = L

and means: f(x) \to L as x approaches a through values less than a.
The right-hand limit of f(x) as x approaches a (from above) is written

\lim_{x \to a^+} f(x) = L

and means: f(x) \to L as x approaches a through values greater than a.
Take the function

g(x) = \begin{cases} x + 1 & \text{if } x < 2 \\ x + 3 & \text{if } x \geq 2 \end{cases}
As x approaches 2 from the left, g(x) approaches 2 + 1 = 3.
As x approaches 2 from the right, g(x) approaches 2 + 3 = 5.
The left-hand limit is 3 and the right-hand limit is 5. They disagree. The function jumps at x = 2, and the two-sided limit does not exist.
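The disagreement shows up immediately in a numerical check. A minimal sketch, with `one_sided_limit` as a crude helper of our own (it just evaluates a hair to one side of a, which suffices for this piecewise example):

```python
# Estimate the one-sided limits of the piecewise g at x = 2 by sampling
# just to the left and just to the right of 2.

def g(x):
    return x + 1 if x < 2 else x + 3

def one_sided_limit(func, a, side, h=1e-8):
    """Crude one-sided estimate: evaluate func slightly to one side of a."""
    return func(a - h) if side == "left" else func(a + h)

left = one_sided_limit(g, 2.0, "left")    # approaches 3
right = one_sided_limit(g, 2.0, "right")  # approaches 5
print(left, right)
```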
When does the limit exist?
Existence of a two-sided limit
The limit \lim_{x \to a} f(x) = L exists if and only if both one-sided limits exist and are equal:

\lim_{x \to a^-} f(x) = \lim_{x \to a^+} f(x) = L.
If the one-sided limits exist but are not equal, or if either one-sided limit fails to exist, then the two-sided limit does not exist.
This gives you a practical test: to check whether a limit exists at a point, check the left-hand limit and the right-hand limit separately, then compare them.
The three standard scenarios:

- Both sides agree. \lim_{x \to a^-} f(x) = \lim_{x \to a^+} f(x) = L. The limit exists and equals L. This is the "normal" case — most functions you meet at this stage are continuous, meaning the two sides always agree.
- The sides disagree. \lim_{x \to a^-} f(x) = L_1 and \lim_{x \to a^+} f(x) = L_2 with L_1 \neq L_2. The limit does not exist. The function has a jump discontinuity at a, like the piecewise function above.
- One or both sides blow up or oscillate. If f(x) \to \pm\infty as x \to a from either side, the limit does not exist (as a finite number). If f(x) oscillates without settling down — like \sin(1/x) as x \to 0 — the limit does not exist either.
The case of \sin(1/x) near x = 0
Take f(x) = \sin(1/x). As x approaches 0 from the right, 1/x grows without bound. So \sin(1/x) oscillates between -1 and +1 faster and faster, never settling on any single value.
No matter how close to 0 you look, you can always find points where \sin(1/x) = 1 and other points where \sin(1/x) = -1. The function is perpetually indecisive. Neither the left-hand limit nor the right-hand limit exists, so the two-sided limit certainly does not.
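The "always find points" claim can be made concrete. The sketch below constructs, for any radius r, explicit points inside (0, r) where \sin(1/x) equals +1 and -1; the helper name and construction are ours, based on the fact that \sin hits +1 at \pi/2 + 2\pi n and -1 at 3\pi/2 + 2\pi n:

```python
import math

# For any r > 0, exhibit x_plus, x_minus in (0, r) with
# sin(1/x_plus) = +1 and sin(1/x_minus) = -1.

def peak_and_trough_inside(r):
    """Return points in (0, r) where sin(1/x) attains +1 and -1."""
    n = math.ceil((1 / r) / (2 * math.pi)) + 1       # push 1/x well past 1/r
    x_plus = 1 / (math.pi / 2 + 2 * math.pi * n)     # 1/x = pi/2 + 2*pi*n
    x_minus = 1 / (3 * math.pi / 2 + 2 * math.pi * n)  # 1/x = 3*pi/2 + 2*pi*n
    return x_plus, x_minus

for r in (0.1, 0.001, 1e-6):
    xp, xm = peak_and_trough_inside(r)
    print(r, math.sin(1 / xp), math.sin(1 / xm))
```

However small r gets, both kinds of points exist, so the values never settle.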
Worked examples
Example 1: A limit that exists at a point where the function is undefined
Evaluate \displaystyle\lim_{x \to 3} \frac{x^2 - 9}{x - 3}.
Step 1. Try direct substitution: \frac{9 - 9}{3 - 3} = \frac{0}{0}. The function is not defined at x = 3.
Why: direct substitution is always worth trying first. If it gives a finite number, that number is the limit (for elementary functions). If it gives \frac{0}{0}, the limit might still exist — you just need to work harder.
Step 2. Factor the numerator and cancel: x^2 - 9 = (x - 3)(x + 3), so \frac{x^2 - 9}{x - 3} = x + 3 for x \neq 3.
Why: cancelling the (x - 3) factor is valid because you are taking a limit as x \to 3, not evaluating at x = 3. In the limit, x is always different from 3, so x - 3 \neq 0.
Step 3. Take the limit of the simplified expression: \lim_{x \to 3} (x + 3) = 6.
Why: x + 3 is a continuous function — its limit as x \to 3 is just 3 + 3 = 6.
Step 4. Verify numerically by checking a value close to 3: f(3.01) = \frac{9.0601 - 9}{0.01} = \frac{0.0601}{0.01} = 6.01. Heading toward 6.
Result: \displaystyle\lim_{x \to 3} \frac{x^2 - 9}{x - 3} = 6.
This is the most common pattern in limit problems: a \frac{0}{0} form that can be resolved by cancelling a common factor. The cancellation reveals what the function is "trying to be" at the problem point.
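As a quick cross-check of Example 1, the same two-sided sampling idea applies (the function name `f` is ours):

```python
# Sample (x**2 - 9) / (x - 3) on both sides of x = 3. Algebraically the
# function equals x + 3 away from 3, so f(3 - h) = 6 - h and f(3 + h) = 6 + h.

def f(x):
    return (x**2 - 9) / (x - 3)

for h in (0.1, 0.01, 0.001, 0.0001):
    print(f"f(3 - {h}) = {f(3 - h):.6f},  f(3 + {h}) = {f(3 + h):.6f}")
```

Both sides head toward 6, agreeing with the algebra.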
Example 2: A limit evaluated using one-sided limits
Evaluate \displaystyle\lim_{x \to 0} \frac{|x|}{x}.
Step 1. Write |x| using its piecewise definition: |x| = x if x > 0, and |x| = -x if x < 0.
Why: the absolute value sign hides a piecewise function. To check whether the limit exists, you need to examine the left and right sides separately — and the piecewise form makes each side explicit.
Step 2. Compute the right-hand limit. For x > 0: \frac{|x|}{x} = \frac{x}{x} = 1.
Why: when x is positive, |x| = x, and the fraction simplifies to 1 everywhere. A constant function has a constant limit.
Step 3. Compute the left-hand limit. For x < 0: \frac{|x|}{x} = \frac{-x}{x} = -1.
Why: when x is negative, |x| = -x, and the fraction simplifies to -1 everywhere.
Step 4. Compare: \lim_{x \to 0^+} \frac{|x|}{x} = 1 \neq -1 = \lim_{x \to 0^-} \frac{|x|}{x}.
Result: The limit \displaystyle\lim_{x \to 0} \frac{|x|}{x} does not exist.
This function is the sign detector: it outputs +1 for positive inputs and -1 for negative inputs. At the boundary x = 0, there is no single value being approached — the function makes a clean jump from -1 to +1. The one-sided limits are well-defined; the two-sided limit is not.
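The jump is visible in two lines of numerical checking; `sided_values` is a small helper of our own, not from the text:

```python
# |x|/x evaluates to -1 for any negative input and +1 for any positive input,
# so the two one-sided samples near 0 can never agree.

def f(x):
    return abs(x) / x

def sided_values(func, a, h=1e-10):
    """Evaluate func just to the left and just to the right of a."""
    return func(a - h), func(a + h)

left, right = sided_values(f, 0.0)
print(left, right)                     # -1.0 on the left, 1.0 on the right
print("two-sided limit exists:", left == right)
```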
When the limit equals f(a)
In many "nice" functions — polynomials, rational functions (away from zeros of the denominator), trigonometric functions, exponentials — direct substitution works: \lim_{x \to a} f(x) = f(a).
This is not a coincidence. It is the definition of continuity: a function is continuous at a if the limit exists, the function is defined at a, and the two are equal. In other words, continuous functions are exactly the functions where the limit question and the evaluation question give the same answer.
The interesting limit problems — the ones that require algebra, cancellation, or careful analysis — are precisely the problems where f is not continuous at a, or not defined there. The \frac{0}{0} form is the signature of such a problem.
Common confusions
- "The limit of f at a is f(a)." Only when f is continuous at a. Otherwise the limit and the function value can differ, or one might exist without the other. The function f(x) = \frac{x^2 - 1}{x - 1} has no value at x = 1, but its limit there is 2.
- "If f(a) is undefined, the limit does not exist." The limit can exist even when f(a) is not defined. The limit cares about what happens near a, not at a. The \frac{x^2 - 9}{x - 3} example is exactly this case.
- "A limit of \frac{0}{0} is 0, or 1, or undefined." The expression \frac{0}{0} is not a number — it is an indeterminate form, meaning you cannot determine the limit from the form alone. Different functions that produce \frac{0}{0} can have limits of 0, 1, 6, \pi, or any other number. You must simplify before substituting.
- "If f(x) \to \infty as x \to a, the limit is infinity." Technically, we say the limit "does not exist" as a finite number. Some textbooks write \lim_{x \to a} f(x) = \infty as shorthand, but this means f grows without bound — it is a statement about behaviour, not a finite limit value.
- "The left-hand limit and right-hand limit are the same as the limit." The one-sided limits are tools for checking whether the two-sided limit exists. They are separate objects. A function can have two perfectly good one-sided limits that are different numbers, in which case the two-sided limit fails to exist.
Going deeper
If you came here to learn what a limit is, how to check it from both sides, and when it exists, you have it — you can stop here. The rest of this section introduces the formal epsilon-delta definition and explains why it matters.
The epsilon-delta definition: making "as close as you like" precise
The intuitive definition uses phrases like "as close as you like" and "sufficiently close." These are vivid, but mathematicians need a definition that can be used in proofs — one where every word has a precise meaning and nothing is left to interpretation. The epsilon-delta definition achieves this.
The idea: turn the phrase "f(x) can be made as close to L as you like by choosing x sufficiently close to a" into a game between two players.
Player 1 (the challenger) picks a tolerance \varepsilon > 0 — how close to L they demand f(x) to be. This tolerance can be as small as they like: 0.01, 0.0001, 10^{-100}.
Player 2 (the responder) must then find a radius \delta > 0 such that whenever x is within \delta of a (but not equal to a), the value f(x) is within \varepsilon of L.
If Player 2 can always win — no matter how small the \varepsilon that Player 1 chooses — then the limit is L.
Limit (epsilon-delta)
\lim_{x \to a} f(x) = L means: for every \varepsilon > 0, there exists a \delta > 0 such that

0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon.
Read this carefully. The |x - a| < \delta part says "x is within \delta of a." The 0 < |x - a| part excludes x = a itself — the limit only cares about nearby points, not the point itself. The |f(x) - L| < \varepsilon part says "f(x) is within \varepsilon of L."
An epsilon-delta verification
Verify that \lim_{x \to 4} (2x + 1) = 9 using the epsilon-delta definition.
You need: given any \varepsilon > 0, find a \delta > 0 such that 0 < |x - 4| < \delta implies |(2x + 1) - 9| < \varepsilon.
Simplify what you need: |(2x + 1) - 9| = |2x - 8| = 2|x - 4|.
So you need 2|x - 4| < \varepsilon, which means |x - 4| < \varepsilon/2.
Choose \delta = \varepsilon/2. Then whenever 0 < |x - 4| < \delta:

|(2x + 1) - 9| = 2|x - 4| < 2\delta = \varepsilon.
For any \varepsilon > 0, the choice \delta = \varepsilon/2 works. The limit is confirmed: \lim_{x \to 4}(2x + 1) = 9.
This is the simplest possible epsilon-delta proof, but it shows the pattern: start with what you need (|f(x) - L| < \varepsilon), simplify it until it looks like |x - a| < \text{something}, and then choose \delta to be that something. For a linear function, the algebra is one line. For more complicated functions, the chain of inequalities is longer, but the structure is identical.
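The game itself can be played numerically. The sketch below is a randomized spot-check of the proof above, not a proof: it fixes \delta = \varepsilon/2, samples many x inside the \delta-window, and confirms f(x) lands inside the \varepsilon-window every time.

```python
import random

# Spot-check the epsilon-delta game for lim_{x -> 4} (2x + 1) = 9
# with the responder's choice delta = eps / 2.

def f(x):
    return 2 * x + 1

random.seed(0)
for eps in (0.5, 0.01, 1e-6):
    delta = eps / 2
    for _ in range(1000):
        # sample x strictly inside the delta-window around 4
        x = 4 + random.uniform(-delta, delta) * 0.999
        if x == 4:
            continue  # the definition excludes x = a itself
        assert abs(f(x) - 9) < eps, (eps, x)
print("every sampled x satisfied |f(x) - 9| < eps")
```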
Why the formal definition matters
You might wonder why anyone would bother with \varepsilon and \delta when the intuitive definition seems perfectly clear. The reason is that intuition can mislead you on edge cases. Is \lim_{x \to 0} x\sin(1/x) = 0? Your intuition might hesitate — \sin(1/x) oscillates wildly, and multiplying by x makes the oscillations smaller but does not eliminate them.
The epsilon-delta definition settles it cleanly: |x \sin(1/x) - 0| = |x||\sin(1/x)| \leq |x|. So for any \varepsilon, choose \delta = \varepsilon. Then |x| < \delta implies |x\sin(1/x)| < \varepsilon. Done — the limit is 0.
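The inequality |x \sin(1/x)| \leq |x| that drives the proof is easy to watch in action. A small sketch (the sampling points are arbitrary choices of ours):

```python
import math

# The squeeze |x * sin(1/x)| <= |x| means delta = eps works:
# shrinking x forces the product toward 0 no matter how sin(1/x) oscillates.

def f(x):
    return x * math.sin(1 / x)

for x in (0.1, -0.01, 0.001, -1e-5, 1e-8):
    assert abs(f(x)) <= abs(x)   # the bound that makes the proof work
    print(f"x = {x}, x*sin(1/x) = {f(x):.3e}")
```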
Without the formal definition, you would be arguing by hand-waving. With it, you have a proof that no one can dispute. That is the power of making "as close as you like" precise.
Where this leads next
The limit is the foundation on which the rest of calculus is built. Every major concept in calculus — continuity, the derivative, the integral — is defined using limits. The most direct next steps:
- Derivative — the slope of a tangent line, defined as the limit of the slope of secant lines. This is where limits become useful.
- Algebra of Limits — the rules for computing limits of sums, products, quotients, and compositions of functions, so you can evaluate limits without going back to the definition every time.
- Standard Limits — the classical results like \lim_{x \to 0} \frac{\sin x}{x} = 1 that serve as building blocks for harder limit problems.
- Continuity — what happens when the limit does equal the function value, and the consequences of that happy coincidence.
- Infinite Limits and Limits at Infinity — what happens when x \to \infty or when f(x) \to \infty, and how the epsilon-delta framework extends to handle those cases.