Here is a tactical rule for simplifying any algebraic expression. The moment your eye lands on a minus sign sitting in front of a bracket, or a number multiplied into a bracket, stop scanning the rest of the expression. Do the distribution first. Do not combine anything, do not cancel anything, do not substitute anything. Kill the bracket, then look at what is left.
This is not a matter of personal style, and it is not "usually a good idea." It is a recognition pattern. −(...) and k(...) are structural signals that mean the bracket is pending — it has not been resolved into loose terms yet — and almost every other algebraic move silently assumes the bracket has already been resolved. Attempting those other moves with a pending bracket still on the page is the single largest source of sign errors in school algebra.
Why this beats every other first move
Leaving the bracket intact makes every later step fragile. The pending bracket acts as a wall around the terms inside it, and almost every algebraic operation needs to see those terms as individuals — not as a huddle.
Consider 3x - (x + 2) + 5. It is tempting to notice the 3x and the x and try to combine them straight away. But those two x terms are not next to each other in any meaningful sense: there is a bracket between them, and that bracket is changing the sign of everything inside. If you write

3x - (x + 2) + 5 = 2x + 2 + 5

you have "combined" 3x and x as if the minus sign in front of the bracket only applied to the x and nothing else. The +2 inside has quietly kept its plus sign, and the expression's value has changed. Check at x = 0: the original is -2 + 5 = 3, the "simplified" version is 2 + 5 = 7. Two different numbers, same expression, silently.
Distribute first, and the trap cannot spring. 3x - (x + 2) + 5 = 3x - x - 2 + 5. Now you have four loose terms and no walls, and 3x - x = 2x combines safely: 2x - 2 + 5 = 2x + 3. Check at x = 0: 3. Same number. The distribute-first rule makes the rest of the work mechanical.
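The check-at-a-value test is easy to automate. A minimal Python sketch (the names are illustrative) comparing the original expression, the trapped shortcut, and the distribute-first result:

```python
# Compare the original expression against both "simplifications" at several values.
original = lambda x: 3 * x - (x + 2) + 5
wrong    = lambda x: 2 * x + 2 + 5   # bracket ignored: 3x and x combined too early
correct  = lambda x: 2 * x + 3       # distribute first, then combine

for x in [0, 1, -2, 10]:
    assert original(x) == correct(x)   # agrees at every value

assert original(0) != wrong(0)         # 3 vs 7: the shortcut changed the expression
```

Two expressions that disagree at even one value are not the same expression, which is why a single spot-check is enough to catch the error.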
The same pattern repeats for every other operation. Factoring assumes the terms are already loose, so a common factor can be spotted. Substituting a numeric value is safe with a bracket still in place, but anything you want to do with the expression symbolically (rearrange, collect, factor, apply an identity) expects the brackets gone.
The −(...) case
A minus sign in front of a bracket is shorthand for -1 multiplied by the bracket. Nothing exotic is happening: you are just multiplying by -1, which flips the sign of every term inside. The full distributive law applies:

-(a + b) = (-1) \cdot a + (-1) \cdot b = -a - b

Every term gets a sign flip. Not just the first one.
Three worked instances:
- -(x + 3) = -x - 3. The x becomes -x; the +3 becomes -3.
- -(x - 3) = -x + 3. The x becomes -x; the -3 becomes +3. The two minus signs meet and produce a plus.
- -(2x - 5y + 7) = -2x + 5y - 7. Three terms, three sign flips. The 2x becomes -2x, the -5y becomes +5y, the +7 becomes -7.
The single most common error here is flipping only the first term. Students write -(x - 3) = -x - 3, leaving the second term alone. That is wrong. Check at x = 0: the real value is -(-3) = 3, but -x - 3 = -3. The minus sign applies to the whole bracket, which means it applies to every term inside the bracket.
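The same numeric check exposes the flip-only-the-first-term error. A short Python sketch contrasting the full flip with the partial one:

```python
# Spot-check -(x - 3): the correct full flip versus the first-term-only error.
full_flip  = lambda x: -x + 3   # every term flipped
first_only = lambda x: -x - 3   # common error: only the x flipped

for x in [0, 1, -4, 7]:
    assert -(x - 3) == full_flip(x)   # matches the real value everywhere

# At x = 0 the error is already visible: real value 3, wrong value -3.
assert first_only(0) != -(0 - 3)
```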
The k·(...) case
A literal coefficient in front of a bracket works the same way. Every term inside gets multiplied by the coefficient, including its sign.
Two worked instances:
- 3(x - 2) = 3x - 6. Both terms are multiplied by 3: 3 \cdot x = 3x, 3 \cdot (-2) = -6.
- -4(2x + 5) = -8x - 20. Both terms are multiplied by -4: -4 \cdot 2x = -8x, -4 \cdot 5 = -20.
The two-variable reminder: a(b + c) = ab + ac. This is the plain distributive law from Operations and Properties, and it works exactly the same whether the outside factor is a number, a variable, or a combination. The point of the "distribute first" rule is to apply this law the instant you see the structure, not later.
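The law a(b + c) = ab + ac can be mirrored in code. A minimal sketch, with an illustrative term representation of my own (not any standard library), that multiplies every term inside a bracket by the outside scalar:

```python
# Represent a bracket's contents as (name, coefficient) pairs; ("", c) is a constant.
# distribute() multiplies every term by k, sign included: a(b + c) = ab + ac.
def distribute(k, terms):
    return [(name, k * coeff) for name, coeff in terms]

# 3(x - 2) -> 3x - 6
assert distribute(3, [("x", 1), ("", -2)]) == [("x", 3), ("", -6)]
# -4(2x + 5) -> -8x - 20
assert distribute(-4, [("x", 2), ("", 5)]) == [("x", -8), ("", -20)]
```

The point of the representation is that the sign lives inside each coefficient, so the multiplication cannot "miss" a term the way a scanning eye can.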
The nested case
Brackets inside brackets are handled the same way, just recursively. Find the innermost minus or coefficient that is glued to a bracket, distribute that one, then look at what the next level out now contains and repeat.
Take 2[3 - (x - 4)]. The outermost structure is 2[\ldots], but the square bracket contains another pending bracket, -(x - 4). Work from the inside out: first resolve the inner -(x - 4) = -x + 4, because you cannot safely distribute the 2 until the square brackets contain only loose terms.
Now the expression is 2[3 - x + 4]. The inside of the square bracket is three loose terms, which you can tidy by combining 3 + 4 = 7: 2[7 - x]. Now distribute the 2: 14 - 2x. Done.
The general recipe: find the deepest pending bracket that has a minus or coefficient glued to it, distribute that one, then zoom out one level and repeat. Each step takes you from a nested structure to a flatter one, until nothing is left to distribute.
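Each flattening step can be verified the same way the article checks single rewrites: evaluate every intermediate form at a few values and confirm they all agree. A Python sketch for 2[3 - (x - 4)]:

```python
# Every rewriting step of 2[3 - (x - 4)], from nested to flat.
steps = [
    lambda x: 2 * (3 - (x - 4)),   # original, nested
    lambda x: 2 * (3 - x + 4),     # inner -( ) distributed
    lambda x: 2 * (7 - x),         # loose terms combined
    lambda x: 14 - 2 * x,          # outer 2 distributed
]

for x in [0, 1, -3, 10]:
    values = [f(x) for f in steps]
    assert len(set(values)) == 1   # every step agrees at this value
```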
The exception — when NOT to distribute first
There is one situation where distributing first is the wrong move: when the bracket is about to be squared, or multiplied by another bracket. In those cases, what looks like "distributing" is actually a different operation — the binomial expansion, or the FOIL pattern for multiplying two binomials.
Take (x + 3)^2. You cannot square the bracket by squaring each term individually: that is exactly the (a+b)^2 = a^2 + b^2 mistake from the main Algebraic Expressions article. The correct move is (x + 3)^2 = (x + 3)(x + 3) = x^2 + 6x + 9, and you get there by multiplying the two binomials term by term, not by "distributing the square."
Similarly (x + 3)(x - 2) is not distribution. It is a product of two binomials, which expands to four terms by the FOIL pattern: x^2 - 2x + 3x - 6 = x^2 + x - 6.
So the clean version of the rule is: distribute first when the factor outside the bracket is a single term (a number, a variable, or a signed scalar). Do not distribute when the factor outside is itself a binomial or a power. Scalars distribute; binomials expand. Those are two different operations, and spotting which one the expression calls for is itself a recognition skill.
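The distinction is easy to test numerically. A short Python sketch contrasting the correct binomial expansion with the a^2 + b^2 mistake:

```python
# Scalars distribute; binomials expand. "Distributing the square" is wrong.
expand_sq  = lambda x: x**2 + 6 * x + 9   # correct: (x + 3)(x + 3) term by term
a2_plus_b2 = lambda x: x**2 + 9           # the (a+b)^2 = a^2 + b^2 mistake
foil       = lambda x: x**2 + x - 6       # (x + 3)(x - 2) expanded

for x in [1, 2, -3, 10]:
    assert (x + 3) ** 2 == expand_sq(x)
    assert (x + 3) * (x - 2) == foil(x)

assert (1 + 3) ** 2 != a2_plus_b2(1)      # 16 vs 10: the mistake shows immediately
```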
Worked example
Simplify 4 - 3(x - 2) + 2(x + 1).
Two brackets, both with a coefficient glued to them. Distribute both, then combine like terms.
Step 1. Distribute -3 into (x - 2), giving -3x + 6.
Why: the coefficient is -3, not +3 — the minus sign in front is part of it. Multiply each term inside: -3 \cdot x = -3x, and -3 \cdot (-2) = +6. Both signs, both terms.
Step 2. Distribute 2 into (x + 1), giving 2x + 2.
Why: 2 \cdot x = 2x, 2 \cdot 1 = 2. Nothing tricky — the coefficient is positive, so no sign flips.
Step 3. Rewrite the whole expression with no brackets: 4 - 3x + 6 + 2x + 2.
Why: replace each bracketed piece with its distributed form. The 4 at the start was never inside a bracket, so it rides along unchanged.
Step 4. Group like terms and combine: (-3x + 2x) + (4 + 6 + 2).
Why: the x terms give -3x + 2x = -x; the constants give 4 + 6 + 2 = 12.
Result. 4 - 3(x - 2) + 2(x + 1) = -x + 12.
Check at x = 1: original is 4 - 3(-1) + 2(2) = 4 + 3 + 4 = 11, simplified is -1 + 12 = 11. They match.
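The single-value check can be broadened. A one-line Python sketch confirming the simplification at every integer in a range, not just x = 1:

```python
# Verify 4 - 3(x - 2) + 2(x + 1) = -x + 12 at many values.
original   = lambda x: 4 - 3 * (x - 2) + 2 * (x + 1)
simplified = lambda x: -x + 12

assert all(original(x) == simplified(x) for x in range(-50, 51))
```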
Notice that Steps 1 and 2 both happened before a single like term was combined. That is the whole point of the rule: every bracket gets killed before anything else happens. Once you commit to that order, the remaining work is always just "collect and add."
Close
A large fraction of school-algebra arithmetic errors are sign errors on distribution — the minus sign gets lost, or the coefficient multiplies only the first term, or someone tries to combine across a bracket that has not been resolved. One habit shuts most of these down: recognise −(...) or k(...) the moment it appears, and distribute it right then, before looking at anything else.
Recognise the pattern. Act immediately. The rest of the simplification takes care of itself.