There is one habit that separates students who compute 1776 \bmod 7 on paper in ten seconds from students who fill half a page with long multiplication before giving up: reduce each factor mod n before you multiply, not after. It sounds obvious once stated, but the reflex has to be trained. Most school students, asked to compute 37 \cdot 48 \bmod 7, will instinctively compute 37 \cdot 48 = 1776 first and only then divide by 7. That is the slow, error-prone path. The fast path is to notice that 37 \equiv 2 and 48 \equiv 6 mod 7, so the product is 2 \cdot 6 = 12 \equiv 5. No number in the entire computation ever exceeds 48.

This article is about building that reflex so deeply that you never compute a large intermediate product again.

The rule that makes it legal

The whole habit rests on one identity:

(a \cdot b) \bmod n \;=\; \big((a \bmod n) \cdot (b \bmod n)\big) \bmod n.

In words: you can replace any factor by its remainder mod n without changing the final remainder. The rule extends to any number of factors and to any combination of additions and multiplications. So

(a \cdot b \cdot c) \bmod n \;=\; \big((a \bmod n) \cdot (b \bmod n) \cdot (c \bmod n)\big) \bmod n,

and you can reduce mod n at every step — after each multiplication, or after each factor is introduced, or both. The final answer never changes.
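The identity is easy to spot-check by brute force before trusting it. A minimal sketch in Python (the ranges are arbitrary, chosen small enough to run instantly):

```python
# Exhaustively verify (a * b) % n == ((a % n) * (b % n)) % n
# over a small grid. A spot check, not a proof.
for n in range(1, 25):
    for a in range(60):
        for b in range(60):
            assert (a * b) % n == ((a % n) * (b % n)) % n
print("identity holds for all tested a, b, n")
```

The assertion never fires, in line with the algebraic proof below.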

Why: write a = qn + r_a and b = q'n + r_b where r_a = a \bmod n and r_b = b \bmod n. Then ab = (qn + r_a)(q'n + r_b) = qq'n^2 + q n r_b + q' n r_a + r_a r_b. The first three terms are all multiples of n, so they vanish mod n. What survives is r_a r_b \bmod n — which is the reduced product.

The two pipelines, side by side

Here is the contrast that should burn into your memory.

[Figure: Multiply-then-reduce versus reduce-then-multiply. Two pipelines for 37 · 48 mod 7, shown side by side. Top, multiply first and reduce later (slow): 37 · 48 = 1776, a giant intermediate; then 1776 ÷ 7 = 253 r 5, giving 5. Bottom, reduce first and multiply small (fast): 37 mod 7 = 2 and 48 mod 7 = 6; then 2 · 6 = 12 and 12 mod 7 = 5. Every intermediate in the bottom pipeline stays below 49. Same final answer.]
Both pipelines land at 5. The upper one manufactures a four-digit intermediate that must be long-divided by 7. The lower one never leaves the universe of two-digit numbers.

The two outputs are identical because the rule guarantees it. The work is not identical: one path demands a multi-digit multiplication and a long division, and is far more prone to arithmetic slips.
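The two pipelines fit in two one-line functions. A sketch in Python (the function names are mine, chosen to match the labels above):

```python
def multiply_then_reduce(a, b, n):
    # Slow path: build the full product, then take one big remainder.
    return (a * b) % n

def reduce_then_multiply(a, b, n):
    # Fast path: shrink each factor first, so the product stays small.
    return ((a % n) * (b % n)) % n

print(multiply_then_reduce(37, 48, 7))   # 5
print(reduce_then_multiply(37, 48, 7))   # 5
```

To a computer working with machine-sized numbers the difference is invisible; the point of writing both is to see that the fast path's intermediate (12) is tiny while the slow path's (1776) is not.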

A three-factor example: 23 \cdot 47 \cdot 89 \bmod 11

Watch what reduce-early does to a product that would otherwise balloon into six digits.

Without the habit: 23 \cdot 47 = 1081, then 1081 \cdot 89 = 96{,}209, then 96{,}209 \div 11 = 8746 remainder 3. You just did two multi-digit multiplications and a long division. One slip anywhere gives the wrong answer, and you have no way to spot-check.

With the habit:

23 \equiv 1 \pmod{11}, \quad 47 \equiv 3 \pmod{11}, \quad 89 \equiv 1 \pmod{11}.

Why each reduction: 23 = 2 \cdot 11 + 1, so 23 \equiv 1. Similarly 47 = 4 \cdot 11 + 3, and 89 = 8 \cdot 11 + 1.

Multiply the reduced residues:

23 \cdot 47 \cdot 89 \equiv 1 \cdot 3 \cdot 1 \equiv 3 \pmod{11}.

Done. No number in the entire derivation exceeded 89. You can check it mentally in under fifteen seconds.
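The same derivation as a three-line loop, reducing each factor and the running product as it goes (a sketch; `product` is just the running residue):

```python
n = 11
product = 1
for factor in (23, 47, 89):
    # Reduce the factor first, then reduce the running product.
    product = (product * (factor % n)) % n
print(product)  # 3
```

Tracing it by hand reproduces the derivation above: the running residue goes 1, then 3, then 3.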

Reduce after every multiplication, not just at the start

When the product has many factors, it pays to reduce between multiplications too. Suppose you want 6 \cdot 8 \cdot 9 \cdot 7 \bmod 5. You could reduce each factor first — 6 \equiv 1, 8 \equiv 3, 9 \equiv 4, 7 \equiv 2 — but then 1 \cdot 3 \cdot 4 \cdot 2 = 24 is still two digits. Smarter: combine and reduce as you go.

1 \cdot 3 = 3, \quad 3 \cdot 4 = 12 \equiv 2, \quad 2 \cdot 2 = 4 \pmod 5.

Every intermediate remainder stays in \{0, 1, 2, 3, 4\}. This is the full discipline: reduce early, reduce often. After every factor, after every multiplication, whenever a number starts growing, drop it back into the range [0, n).
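The full discipline fits in a short helper. A minimal sketch (`product_mod` is an illustrative name, not a library function):

```python
def product_mod(factors, n):
    """Multiply the factors mod n, reducing after every step so the
    running value never leaves the range [0, n)."""
    result = 1
    for f in factors:
        result = (result * (f % n)) % n
    return result

print(product_mod([6, 8, 9, 7], 5))  # 4
```

The inner line reduces twice per step, once on the incoming factor and once on the product, which is exactly "reduce early, reduce often" as code.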

Why this matters for computers too

This habit is not only about pencil speed. Inside every cryptographic library on the planet, exactly this principle keeps the numbers tractable. When your browser negotiates a TLS handshake, it computes products mod a 2048-bit modulus. The factors are themselves 2048-bit numbers; if you multiplied all of them first and then reduced at the end, the intermediate would grow to millions of bits and the computation would grind to a halt. Instead, every multiplication is followed immediately by a reduction mod n, keeping the working value bounded near 2048 bits. The mathematics you are learning to do by hand — reduce-then-multiply — is exactly the algorithm the hardware runs.

In every modular-exponentiation routine you will ever write or see, the inner loop looks like result = (result * base) mod n. The mod n is there every iteration, not at the end. That is the same rule as this article — applied inside a loop.
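A deliberately naive sketch of such a routine, one multiplication per iteration with nothing clever, just to show where the reduction sits (`slow_mod_exp` is my name for it):

```python
def slow_mod_exp(base, exponent, n):
    # Naive exponentiation by repeated multiplication, but with the
    # reduction applied on every iteration, so the working value
    # never exceeds (n - 1) * (base % n).
    result = 1
    base %= n
    for _ in range(exponent):
        result = (result * base) % n
    return result

print(slow_mod_exp(2, 100, 7))  # 2, same as Python's built-in pow(2, 100, 7)
```

This loop takes 100 iterations for 2^{100}; repeated squaring, discussed next, cuts that to about 7. But the placement of the mod n is identical in both.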

How this connects to repeated squaring

Once you have internalised "reduce every factor, reduce every intermediate," the next tool — computing huge powers like 2^{100} \bmod 7 by repeated squaring — is a direct consequence. Repeated squaring computes a^2, a^4, a^8, a^{16}, \dots one step at a time, reducing mod n after every squaring. Without the reduce-early habit, each squared number doubles in size, and by a^{32} you are drowning in digits. With it, the working value never exceeds n^2 — no matter how large the exponent. The technique you learned here is the foundation for the technique in that article.
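A sketch of repeated squaring with the reduce-early habit built in (`mod_exp` is an illustrative name; Python's built-in three-argument pow(base, exponent, n) does the same job in practice):

```python
def mod_exp(base, exponent, n):
    """Right-to-left square-and-multiply. The % n after every
    multiplication keeps the working value below n ** 2."""
    result = 1
    base %= n
    while exponent > 0:
        if exponent & 1:                   # this bit of the exponent is set
            result = (result * base) % n
        base = (base * base) % n           # square, then reduce immediately
        exponent >>= 1
    return result

print(mod_exp(2, 100, 7))  # 2
```

Delete either % n and the function still returns the right answer for small inputs, but the intermediates start doubling in size, which is exactly the drowning-in-digits failure described above.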

One-line takeaway

In modular arithmetic, never let a number escape the range [0, n) if you can help it. Reduce each factor mod n before multiplying, and reduce every intermediate product mod n as soon as it appears. The final answer is the same, but the work stays small, your chance of arithmetic slip plummets, and the method scales from three-digit exam problems all the way up to RSA.

Related: Modular Arithmetic · How to Compute 2^{100} \bmod 7 by Hand · Last-Digit Problems: Switch to Mod 10 · mod Operator vs Congruence Relation