Start with 2. Double it. Double again. Double again. By step 10 you already have 1024 — a number everyone knows because computer memory keeps running into it. By step 20 you have about a million. By step 30 you have about a billion. By step 60, you are approaching the number of grains of sand on every beach on Earth.
That is 2^n. It is the simplest possible exponential growth, and it is the one that breaks your intuition most brutally. Linear growth adds the same amount at every step. Exponential growth multiplies by the same factor at every step — and multiplication, compounded, runs away from you in a way that addition never does.
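The contrast is easy to see in a few lines of Python. A minimal sketch; the step size of 10 for the linear sequence is an arbitrary choice:

```python
# Linear growth adds a fixed amount per step; exponential multiplies by a fixed factor.
linear = [10 * n for n in range(31)]        # 0, 10, 20, ...  add 10 each step
exponential = [2 ** n for n in range(31)]   # 1, 2, 4, ...    double each step

print(linear[5], exponential[5])    # 50 vs 32: the linear sequence is still ahead
print(linear[30], exponential[30])  # 300 vs 1073741824: it is not ahead any more
```

Any linear sequence, whatever its slope, eventually loses to any exponential one with a factor above 1; only the crossover point moves.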
The tower, step by step
Here is the tower of 2s up to n = 30. Each bar is twice as tall as the one before it. By the time you reach the right end of the graph, the bar has completely disappeared off the top — not because the graph is too small, but because there is no graph large enough.
The shape of the curve is what most people get wrong on their first encounter. It looks flat near the origin, as if doubling were a lazy, slow process for small n. It is not. Each step is still doubling the previous one; the reason the first few steps look flat is that the later values are so big that the early ones look like specks on the same axis.
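The speck effect is easy to check numerically. A quick sketch that measures every fifth bar against the tallest one:

```python
# Every fifth bar of the tower, as a fraction of the tallest bar (n = 30).
top = 2 ** 30
for n in range(0, 31, 5):
    print(f"n={n:2d}  2^n={2 ** n:>13,}  fraction of tallest: {2 ** n / top:.9f}")
# Even n = 15, halfway along the axis, is only about 0.00003 of the
# final bar -- which is why the left half of the chart looks flat.
```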
The numbers in a table
Why a table, not a formula: when growth is this aggressive, spelling out the values makes the scale-jump concrete in a way the formula 2^n hides.
| n | 2^n | In words |
|---|---|---|
| 0 | 1 | one |
| 5 | 32 | thirty-two |
| 10 | 1{,}024 | about a thousand |
| 15 | 32{,}768 | thirty-two thousand |
| 20 | 1{,}048{,}576 | about a million |
| 25 | 33{,}554{,}432 | thirty-three million |
| 30 | 1{,}073{,}741{,}824 | about a billion |
Every increase of 10 in n multiplies the value by 2^{10} = 1024 \approx 1000. That is why the values in the table step up by a factor of roughly a thousand every two rows: n jumps by 10, so the value gains about three decimal digits. This is the engineer's rule of thumb: 2^{10} is about 10^3, so 2^{20} is about 10^6, and 2^{30} is about 10^9.
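The rule of thumb is easy to verify directly; a quick check, nothing more:

```python
# 2^(10k) vs 10^(3k): the ratio drifts upward slowly but stays close to 1.
for k in range(1, 4):
    exact = 2 ** (10 * k)
    approx = 10 ** (3 * k)
    print(f"2^{10 * k} = {exact:,}  vs  10^{3 * k} = {approx:,}  ratio {exact / approx:.3f}")
```

The ratio is 1.024 per block of ten, so the approximation slowly overshoots: by 2^{100} it is off by a factor of about 1.27.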
The chessboard legend
The oldest story about exponential growth is this. A sage invents the game of chess and presents it to a king. The king, delighted, asks the sage to name his reward. The sage says: "One grain of rice on the first square, two on the second, four on the third, doubling each time, until the 64th square."
The king laughs — that sounds cheap. Then his treasurer runs the sum. The total is 2^{64} - 1, which is about 1.8 \times 10^{19} grains. More rice than has ever been grown in the entire history of agriculture. The kingdom cannot pay.
Why the total is 2^{64} - 1: the sum 1 + 2 + 4 + \dots + 2^{63} is a geometric series, and the standard trick — subtracting 1 + 2 + 4 + \dots + 2^{63} from 2 + 4 + 8 + \dots + 2^{64} — leaves 2^{64} - 1. You can see it more directly: each partial sum 1 + 2 + \dots + 2^{k} = 2^{k+1} - 1 by induction.
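Both the closed form and the partial-sum identity can be checked by brute force. A short sketch:

```python
# Total grains on the chessboard: squares 1..64 hold 2^0 .. 2^63 grains.
total = sum(2 ** k for k in range(64))
assert total == 2 ** 64 - 1
print(f"{total:,}")  # 18,446,744,073,709,551,615 -- about 1.8e19

# The partial-sum identity 1 + 2 + ... + 2^k = 2^(k+1) - 1, for every k:
for k in range(64):
    assert sum(2 ** i for i in range(k + 1)) == 2 ** (k + 1) - 1
```

Python's integers are arbitrary-precision, so 2^{64} poses no overflow problem here; in a fixed-width language this exact computation is where a 64-bit counter gives out.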
The point of the legend is not the specific number. It is that the doubling that looked cheap for the first twenty squares blew up uncontrollably in the last twenty. Every exponential process has this character. The first half of the run looks tame. The second half is apocalyptic.
Why this matters for computer science
Every Indian student who has touched a computer has bumped into the powers of 2. The reason is that computers store numbers in binary, and the sizes that come up naturally are 2^n.
- 2^8 = 256: the number of values in one byte. That is why image channels run from 0 to 255.
- 2^{10} = 1024: one "kilobyte" in the loose computer-memory sense; strictly, one "kibibyte".
- 2^{16} = 65{,}536: the number of distinct values of a 16-bit integer. Older games and file formats often have this as a hard limit.
- 2^{32} \approx 4.3 billion: the number of distinct values of a 32-bit integer. Also the number of distinct IPv4 addresses, which is why the world is running out of them.
- 2^{64} \approx 1.8 \times 10^{19}: the number of distinct values of a 64-bit integer. Big enough to count the grains of rice on the chessboard.
When you run out of 16-bit addresses, upgrading to 17 bits does not solve much: it gets you only 2 \times as many. Upgrading to 32 bits gets you 2^{16} \times as many, sixty-five thousand times as many. Doubling the exponent is an entirely different creature from doubling the value.
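The contrast is worth making concrete. A minimal sketch; `values` is a throwaway helper, not a standard function:

```python
# Number of distinct values an n-bit unsigned integer can represent.
def values(bits):
    return 2 ** bits

assert values(17) == 2 * values(16)          # one extra bit: merely twice as many
assert values(32) == 2 ** 16 * values(16)    # doubling the bit count: 65,536x as many
print(values(16), values(32))  # 65536 vs 4294967296
```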
The reflex
When you see a doubling process — bacteria in a petri dish, compound interest, a chain letter, a viral share, radioactive buildup, anything that grows by a fixed multiplier per step — write it as 2^n (or r^n more generally) and expect the second half of the run to dominate the first. The "slow start, explosive finish" is not a quirk of any particular situation; it is the geometry of the function itself. Every exponential curve has it.
For the opposite behaviour — halving instead of doubling — the same geometry plays out in reverse, and you get the exponential decay that shows up in radioactive half-lives, drug clearance in the bloodstream, and cooling bodies. The exponent in 2^n runs both ways if you allow negative n, and that is the bridge to negative exponents.
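Negative n runs the same machinery backwards; a minimal sketch:

```python
# Halving instead of doubling: 2^(-n) for the first few n.
for n in range(6):
    print(f"2^-{n} = {2.0 ** -n}")
# Ten halvings shrink the value by the same factor of 1024
# that ten doublings grew it by.
assert 2.0 ** -10 == 1 / 1024
```

These values are exact in floating point, since negative powers of 2 are precisely the numbers binary floats represent without rounding.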
Related: Exponents and Powers · Bacteria Doubling Every Hour — Until the Dish Runs Out · Why a^0 = 1 — The Staircase That Halves Down to One · Tile-View Proof of the Three Core Exponent Laws