In short

A system of linear equations is consistent if it has at least one solution (exactly one, or infinitely many) and inconsistent if it has none. The rank of a matrix — the number of nonzero rows after reducing to echelon form — is the tool that decides which case you are in. For the system AX = B: if \text{rank}(A) = \text{rank}([A \mid B]) = n (the number of unknowns), there is a unique solution. If \text{rank}(A) = \text{rank}([A \mid B]) < n, there are infinitely many. If \text{rank}(A) < \text{rank}([A \mid B]), there are none.

Three shopkeepers in a small town each sell the same three items — pens, notebooks, and erasers — in different bundles. A customer records the total cost of each shopkeeper's bundle:

2x + y + z = 10
x + 2y + z = 9
x + y + 2z = 11

Here x, y, z are the prices of a pen, notebook, and eraser. Three equations, three unknowns. From your earlier work with matrices, you know to write this as AX = B and check whether \det A \neq 0. If it is nonzero, A^{-1} exists and you can compute X = A^{-1}B. Problem solved.
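This invertible case is easy to check numerically. A minimal sketch, assuming NumPy is available (the library choice is ours, not the article's):

```python
import numpy as np

# Coefficient matrix and bundle totals from the shopkeeper system.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])
B = np.array([10.0, 9.0, 11.0])

print(np.linalg.det(A))    # nonzero (here 4), so A is invertible
X = np.linalg.solve(A, B)  # numerically preferable to forming A^{-1} explicitly
print(X)                   # the prices: approximately (2.5, 1.5, 3.5)
```

Note that `np.linalg.solve` performs elimination directly rather than computing the inverse, which is both faster and more accurate.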

But what happens when \det A = 0? The previous article on systems of linear equations left this question partly open: the system might have infinitely many solutions, or it might have no solutions at all. The determinant alone cannot tell you which.

That is the gap this article fills. The tool you need is called the rank of a matrix, and it settles every case cleanly.

Echelon form: organising the matrix

Before defining rank, you need a way to simplify a matrix into a standard shape — the same way you simplify an equation before solving it. That standard shape is called row echelon form.

Row echelon form

A matrix is in row echelon form if:

  1. All rows consisting entirely of zeros are at the bottom.
  2. In each nonzero row, the first nonzero entry (called the pivot or leading entry) is to the right of the pivot in the row above.
  3. Every entry below a pivot is zero.

Here is what this looks like concretely. The matrix

\begin{bmatrix} 2 & 1 & 3 \\ 0 & 4 & -1 \\ 0 & 0 & 5 \end{bmatrix}

is in echelon form — the pivots are 2, 4, 5, and each one sits strictly to the right and below the previous one. It looks like a staircase descending from left to right.

The matrix

\begin{bmatrix} 1 & 3 & 2 \\ 0 & 0 & 7 \\ 0 & 0 & 0 \end{bmatrix}

is also in echelon form — the pivots are 1 and 7, the all-zero row is at the bottom, and each pivot is to the right of the one above it.

You reach echelon form by applying elementary row operations — the same operations you already know from computing inverses and solving systems:

  1. Swap two rows (R_i \leftrightarrow R_j).
  2. Multiply a row by a nonzero constant (R_i \to kR_i, k \neq 0).
  3. Add a multiple of one row to another (R_i \to R_i + kR_j).

These operations do not change the solution set of the system — they are rearrangements that preserve all the information. The process of applying them systematically to reach echelon form is called Gaussian elimination.
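That the solution set survives row operations can be confirmed numerically. A small sketch, assuming NumPy, applying one operation of each kind to the augmented matrix of the shopkeeper system and re-solving:

```python
import numpy as np

# Augmented matrix [A | B] of the shopkeeper system.
M = np.array([[2.0, 1.0, 1.0, 10.0],
              [1.0, 2.0, 1.0,  9.0],
              [1.0, 1.0, 2.0, 11.0]])

before = np.linalg.solve(M[:, :3], M[:, 3])

# One elementary row operation of each kind:
M[[0, 1]] = M[[1, 0]]      # swap R1 and R2
M[1] = 2.0 * M[1]          # scale R2 by a nonzero constant
M[2] = M[2] - 0.5 * M[1]   # add a multiple of R2 to R3

after = np.linalg.solve(M[:, :3], M[:, 3])
print(before, after)       # the same solution both times
```

The transformed system looks different on paper but pins down exactly the same point.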

[Figure: the staircase pattern of echelon form — three rows, with pivots in columns 1, 2, and 3 and zeros below each pivot.]
The staircase pattern of row echelon form. Each pivot sits strictly to the right and below the previous one. Entries marked * can be anything; entries below each pivot are zero. The number of nonzero rows — the number of "steps" in the staircase — is the rank.

Rank of a matrix

Rank

The rank of a matrix A, written \text{rank}(A) or \rho(A), is the number of nonzero rows in any row echelon form of A.

The rank does not depend on which echelon form you arrive at — different sequences of row operations may produce different echelon forms, but they all have the same number of nonzero rows.

Take a 3 \times 3 matrix. If its echelon form has three nonzero rows, the rank is 3. If one row vanishes (becomes all zeros), the rank is 2. If two rows vanish, the rank is 1.

The rank captures how many "independent" equations you really have. If you start with three equations but one of them turns out to be a combination of the other two, the echelon form will show this: that dependent equation becomes a row of zeros. The rank drops from 3 to 2, telling you that you really only have two constraints on three unknowns — which means you cannot pin down a unique answer.

Computing rank: a worked reduction

Take the matrix

A = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \\ 1 & 3 & 5 \end{bmatrix}

Step 1. Eliminate below the first pivot. Apply R_2 \to R_2 - 2R_1 and R_3 \to R_3 - R_1:

\begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 0 \\ 0 & 1 & 2 \end{bmatrix}

Row 2 has vanished entirely — the second equation was just twice the first.

Step 2. Swap R_2 and R_3 to put the zero row at the bottom:

\begin{bmatrix} 1 & 2 & 3 \\ 0 & 1 & 2 \\ 0 & 0 & 0 \end{bmatrix}

This is in echelon form. There are 2 nonzero rows, so \text{rank}(A) = 2.
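The same answer can be checked with NumPy (an assumption of ours — the article works by hand). `matrix_rank` uses a singular-value decomposition internally rather than Gaussian elimination, but it counts the same number of independent rows:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [2, 4, 6],
              [1, 3, 5]])

# Row 2 is twice row 1, so only two rows are independent.
print(np.linalg.matrix_rank(A))  # 2
```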

The consistency theorem

Now you have the language to state the complete rule. Given a system AX = B with n unknowns, form the augmented matrix [A \mid B] by attaching the column B to the right of A.

Conditions for consistency (Rouché–Capelli theorem)

For the system AX = B with n unknowns:

  1. Unique solution: \text{rank}(A) = \text{rank}([A \mid B]) = n.
  2. Infinitely many solutions: \text{rank}(A) = \text{rank}([A \mid B]) < n.
  3. No solution (inconsistent): \text{rank}(A) < \text{rank}([A \mid B]).

The logic behind these three cases is direct.

Case 1 means every equation is independent and every unknown is pinned down. The echelon form has a pivot in every column — you can back-substitute and get exactly one value for each unknown.

Case 2 means some equations are redundant (they are combinations of others), so you have fewer constraints than unknowns. The "extra" unknowns become free variables — they can take any value, and the other unknowns adjust accordingly. This gives infinitely many solutions, parametrised by the free variables.

Case 3 is the strange one. The rank of [A \mid B] being larger than the rank of A means that the augmented matrix has a nonzero row of the form [0 \; 0 \; \cdots \; 0 \mid k] with k \neq 0. That row says 0 = k — a contradiction. No values of the unknowns can satisfy it. The system is inconsistent.
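The whole three-way test fits in a few lines of code. A sketch assuming NumPy; the helper name `classify` is ours, not a library function:

```python
import numpy as np

def classify(A, B):
    """Apply the Rouché–Capelli conditions to AX = B."""
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float).reshape(-1, 1)
    n = A.shape[1]                                   # number of unknowns
    rank_A = np.linalg.matrix_rank(A)
    rank_aug = np.linalg.matrix_rank(np.hstack([A, B]))
    if rank_A < rank_aug:
        return "inconsistent"                        # Case 3: a 0 = k row exists
    return "unique" if rank_A == n else "infinite"   # Cases 1 and 2

print(classify([[2, 1, 1], [1, 2, 1], [1, 1, 2]], [10, 9, 11]))  # unique
```

The function mirrors the flow of the theorem: compare the two ranks first, and only then compare the common rank against n.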

[Figure: the three cases of system consistency — unique, infinitely many, and inconsistent — illustrated with lines in the plane.]
For two equations in two unknowns, the three cases correspond to three pictures. **Unique:** the lines cross at one point. **Infinite:** the lines overlap (or the system has fewer independent equations than unknowns). **Inconsistent:** the lines are parallel — they never meet, and no point lies on all of them.

Worked examples of all three cases

Example 1: A system with a unique solution

Solve:

x + y + z = 6
2x + 3y + z = 14
x + 2y + 3z = 14

Step 1. Write the augmented matrix.

[A \mid B] = \begin{bmatrix} 1 & 1 & 1 & \mid & 6 \\ 2 & 3 & 1 & \mid & 14 \\ 1 & 2 & 3 & \mid & 14 \end{bmatrix}

Why: putting the system into augmented-matrix form lets you apply row operations to all four columns simultaneously, keeping the equations in sync.

Step 2. Apply R_2 \to R_2 - 2R_1 and R_3 \to R_3 - R_1:

\begin{bmatrix} 1 & 1 & 1 & \mid & 6 \\ 0 & 1 & -1 & \mid & 2 \\ 0 & 1 & 2 & \mid & 8 \end{bmatrix}

Why: eliminating the x-terms from rows 2 and 3 starts the staircase. The first pivot is now sitting alone in column 1.

Step 3. Apply R_3 \to R_3 - R_2:

\begin{bmatrix} 1 & 1 & 1 & \mid & 6 \\ 0 & 1 & -1 & \mid & 2 \\ 0 & 0 & 3 & \mid & 6 \end{bmatrix}

Why: this completes the echelon form. Three nonzero rows, three unknowns.

Step 4. Read off the ranks: \text{rank}(A) = 3, \text{rank}([A \mid B]) = 3, and n = 3. The condition \text{rank}(A) = \text{rank}([A \mid B]) = n is met. Unique solution.

Back-substitute: from row 3, 3z = 6, so z = 2. From row 2, y - 2 = 2, so y = 4. From row 1, x + 4 + 2 = 6, so x = 0.

Result: (x, y, z) = (0, 4, 2).

Setting $z = 2$ reduces the three equations to three lines in the $xy$-plane. All three lines meet at the single point $(0, 4)$ — confirming the unique solution. The red dot is exactly where the staircase of pivots pointed.

The three pivots — one in each column — locked every unknown into a single value. No freedom, no contradiction. That is what \text{rank}(A) = n means geometrically: n independent constraints pinning down n unknowns.
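A numerical cross-check of this example, assuming NumPy — both ranks equal n, and the solver returns the same point as back-substitution:

```python
import numpy as np

A = np.array([[1, 1, 1],
              [2, 3, 1],
              [1, 2, 3]], dtype=float)
B = np.array([6, 14, 14], dtype=float)

aug = np.hstack([A, B.reshape(-1, 1)])
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(aug))  # 3 3
print(np.linalg.solve(A, B))                                 # x = 0, y = 4, z = 2
```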

Example 2: An inconsistent system (no solution)

Determine whether the following system is consistent:

x + 2y + 3z = 4
2x + 4y + 6z = 12
x + y + z = 3

Step 1. Write the augmented matrix.

[A \mid B] = \begin{bmatrix} 1 & 2 & 3 & \mid & 4 \\ 2 & 4 & 6 & \mid & 12 \\ 1 & 1 & 1 & \mid & 3 \end{bmatrix}

Why: you need both A and [A \mid B] in echelon form to compare ranks.

Step 2. Apply R_2 \to R_2 - 2R_1 and R_3 \to R_3 - R_1:

\begin{bmatrix} 1 & 2 & 3 & \mid & 4 \\ 0 & 0 & 0 & \mid & 4 \\ 0 & -1 & -2 & \mid & -1 \end{bmatrix}

Why: the second row of A was exactly twice the first, so the A-entries vanish. But the right-hand side did not: 12 - 2 \times 4 = 4 \neq 0. That surviving 4 is a warning sign.

Step 3. Swap R_2 and R_3:

\begin{bmatrix} 1 & 2 & 3 & \mid & 4 \\ 0 & -1 & -2 & \mid & -1 \\ 0 & 0 & 0 & \mid & 4 \end{bmatrix}

Why: putting the nonzero row above the zero row puts the matrix in echelon form.

Step 4. Read off the ranks. In the coefficient part (the first three columns), there are 2 nonzero rows, so \text{rank}(A) = 2. In the full augmented matrix, row 3 reads [0 \;\; 0 \;\; 0 \mid 4] — the row is nonzero because of the 4 on the right, so \text{rank}([A \mid B]) = 3.

Since \text{rank}(A) = 2 < 3 = \text{rank}([A \mid B]), the system is inconsistent.

Result: No solution exists.

[Figure: the contradiction row — an augmented matrix in echelon form whose bottom row reads 0 0 0 | 4, i.e. the equation 0 = 4.]
The red-highlighted row says $0x + 0y + 0z = 4$, which simplifies to $0 = 4$ — a contradiction. This is the signature of an inconsistent system: the left side of the equation vanishes, but the right side does not. No choice of $x$, $y$, $z$ can make zero equal four.

The second equation (2x + 4y + 6z = 12) looked like it should be compatible with the first (x + 2y + 3z = 4) because the coefficients are proportional — the left side of equation 2 is just twice the left side of equation 1. But the right side is 12, not 2 \times 4 = 8. The equations demand that the same expression simultaneously equal 4 and 6. That is why no solution exists.
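The rank gap shows up immediately in code. A sketch assuming NumPy — note that `matrix_rank` never raises an error here; it simply reports the mismatch that signals inconsistency:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [2, 4, 6],
              [1, 1, 1]], dtype=float)
B = np.array([4, 12, 3], dtype=float)

aug = np.hstack([A, B.reshape(-1, 1)])
print(np.linalg.matrix_rank(A))    # 2
print(np.linalg.matrix_rank(aug))  # 3 — strictly larger, so no solution exists
```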

The special case: homogeneous systems

A system AX = 0 — where the right-hand side is all zeros — is called a homogeneous system. These systems have a useful property that non-homogeneous systems lack:

A homogeneous system is always consistent. The trivial solution x_1 = x_2 = \cdots = x_n = 0 always works. The only question is whether there are also non-trivial solutions — solutions where at least one variable is nonzero.

The consistency theorem simplifies. Since B = 0, the augmented matrix [A \mid 0] can never produce a row of the form [0 \; 0 \; \cdots \; 0 \mid k] with k \neq 0 — the last column is all zeros. So \text{rank}(A) always equals \text{rank}([A \mid 0]), and Case 3 (inconsistency) is impossible. What remains:

  1. If \text{rank}(A) = n, the trivial solution is the only solution.
  2. If \text{rank}(A) < n, there are infinitely many solutions, including non-trivial ones.

A particularly important consequence: if the number of equations is fewer than the number of unknowns, then \text{rank}(A) cannot exceed the number of equations, so \text{rank}(A) < n and the homogeneous system has non-trivial solutions. The same conclusion holds for a square system whose matrix is singular.

Example: a homogeneous system with non-trivial solutions

Take the system

x + 2y + 3z = 0
2x + 4y + 6z = 0
3x + 6y + 9z = 0

Every equation here is a multiple of the first: equation 2 is 2 \times equation 1, equation 3 is 3 \times equation 1. The augmented matrix reduces in one step:

\begin{bmatrix} 1 & 2 & 3 & \mid & 0 \\ 0 & 0 & 0 & \mid & 0 \\ 0 & 0 & 0 & \mid & 0 \end{bmatrix}

The rank is 1, with n = 3 unknowns. So there are 3 - 1 = 2 free variables. Set y = s and z = t (where s and t are any real numbers), and from row 1: x = -2s - 3t.

The solution set is

\begin{bmatrix} x \\ y \\ z \end{bmatrix} = s\begin{bmatrix} -2 \\ 1 \\ 0 \end{bmatrix} + t\begin{bmatrix} -3 \\ 0 \\ 1 \end{bmatrix}

This is a plane through the origin in three-dimensional space. The two vectors (-2, 1, 0) and (-3, 0, 1) span this plane, and every point on it satisfies all three equations.

A slice of the solution set of the homogeneous system in the $xy$-plane (setting $t = 0$). The solutions lie along a line through the origin — the direction $(-2, 1, 0)$. The trivial solution $(0,0,0)$ sits at the origin. Every point on this line is a valid solution.
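The null space basis can be computed symbolically. A sketch assuming SymPy is available (chosen here because it returns the exact rational vectors from the derivation, not floating-point approximations):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [3, 6, 9]])

print(A.rank())        # 1
for v in A.nullspace():
    print(v.T)         # basis vectors; here (-2, 1, 0) and (-3, 0, 1)
```

Each returned vector satisfies AX = 0 exactly, and the two of them span the solution plane.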

Non-homogeneous systems with infinitely many solutions

When the system AX = B (with B \neq 0) has infinitely many solutions, the solution set has a specific structure: it is a particular solution plus the general solution of the associated homogeneous system AX = 0.

Take the system

x + y + z = 3
2x + 2y + 2z = 6

The second equation is twice the first — redundant. The augmented matrix reduces to

\begin{bmatrix} 1 & 1 & 1 & \mid & 3 \\ 0 & 0 & 0 & \mid & 0 \end{bmatrix}

Rank of A is 1, rank of [A \mid B] is 1, n = 3. So \text{rank}(A) = \text{rank}([A \mid B]) = 1 < 3 = n. Infinitely many solutions, with 3 - 1 = 2 free variables.

Set y = s, z = t. Then x = 3 - s - t.

\begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 3 \\ 0 \\ 0 \end{bmatrix} + s\begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix} + t\begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}

The first vector (3, 0, 0) is a particular solution (any one specific solution works). The remaining part is the general solution of x + y + z = 0 — the homogeneous version. Geometrically, the solution set is a plane in 3D space, but shifted away from the origin by the particular solution.
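This "particular plus homogeneous" structure is easy to verify by sampling the plane. A sketch assuming NumPy — the parameter values are arbitrary:

```python
import numpy as np

A = np.array([[1.0, 1.0, 1.0],
              [2.0, 2.0, 2.0]])
B = np.array([3.0, 6.0])

particular = np.array([3.0, 0.0, 0.0])
v1 = np.array([-1.0, 1.0, 0.0])   # homogeneous directions:
v2 = np.array([-1.0, 0.0, 1.0])   # solutions of x + y + z = 0

# Every point particular + s*v1 + t*v2 satisfies AX = B:
for s, t in [(0, 0), (1.5, -2.0), (-7.0, 3.0)]:
    X = particular + s * v1 + t * v2
    assert np.allclose(A @ X, B)
print("all sampled points on the plane satisfy the system")
```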

The rank–determinant connection

For a square matrix A of order n, the rank gives you an alternative way to state a fact you already know from determinants:

\text{rank}(A) = n \iff \det A \neq 0 \iff A \text{ is invertible}

If \text{rank}(A) < n, then \det A = 0 and A is singular. The echelon form has at least one zero row, which means at least one row of the original matrix was a linear combination of the others — the rows are not independent.

For non-square systems (more equations than unknowns, or fewer), the determinant is not defined, but the rank still works. The rank is the more general tool; the determinant is a special case that applies only to square matrices.
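The equivalence \text{rank}(A) = n \iff \det A \neq 0 can be seen side by side on a full-rank and a rank-deficient matrix. A sketch assuming NumPy:

```python
import numpy as np

full = np.array([[2.0, 1.0, 1.0],      # the invertible shopkeeper matrix
                 [1.0, 2.0, 1.0],
                 [1.0, 1.0, 2.0]])
deficient = np.array([[1.0, 2.0, 3.0],  # row 2 = 2 * row 1
                      [2.0, 4.0, 6.0],
                      [1.0, 3.0, 5.0]])

for M in (full, deficient):
    print(np.linalg.matrix_rank(M), np.linalg.det(M))
# rank 3 pairs with a nonzero determinant; rank 2 pairs with det ≈ 0
```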

[Figure: decision flowchart — compute ρ(A) and ρ([A|B]); if they differ, the system is inconsistent; if they agree, compare the common rank with n to separate unique from infinitely many solutions.]
The decision process for system consistency. First compare the ranks. If they differ, stop — no solution. If they agree, compare the common rank to the number of unknowns $n$.

Going deeper

If you came here to learn how to check whether a system has a solution and how many, you have it — you can stop here. The rest of this section covers the formal relationship between rank and solutions in more detail, the connection to the column space and null space, and a complete worked example with three equations and three unknowns that has infinitely many solutions.

Rank and the structure of solutions

The number of free variables in the solution set of AX = B (when it is consistent) is exactly n - \text{rank}(A). This number is sometimes called the nullity of A, and the equation

\text{rank}(A) + \text{nullity}(A) = n

is called the rank–nullity theorem. It says that the unknowns divide into two groups: those pinned down by the equations (as many as the rank) and those free to roam (the nullity). Every extra independent equation removes one degree of freedom.
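The rank–nullity identity can be checked directly. A sketch assuming SymPy, using the 3 × 3 matrix from the next worked example:

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 7],
            [3, 6, 10]])

n = A.cols
rank = A.rank()
nullity = len(A.nullspace())   # dimension of the solution space of AX = 0
print(rank, nullity)           # 2 1
assert rank + nullity == n     # the rank–nullity theorem
```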

A complete 3 \times 3 case with infinitely many solutions

Solve the system

x + 2y + 3z = 5
2x + 4y + 7z = 11
3x + 6y + 10z = 16

Form the augmented matrix and reduce:

\begin{bmatrix} 1 & 2 & 3 & \mid & 5 \\ 2 & 4 & 7 & \mid & 11 \\ 3 & 6 & 10 & \mid & 16 \end{bmatrix}

R_2 \to R_2 - 2R_1, R_3 \to R_3 - 3R_1:

\begin{bmatrix} 1 & 2 & 3 & \mid & 5 \\ 0 & 0 & 1 & \mid & 1 \\ 0 & 0 & 1 & \mid & 1 \end{bmatrix}

R_3 \to R_3 - R_2:

\begin{bmatrix} 1 & 2 & 3 & \mid & 5 \\ 0 & 0 & 1 & \mid & 1 \\ 0 & 0 & 0 & \mid & 0 \end{bmatrix}

\text{rank}(A) = 2, \text{rank}([A \mid B]) = 2, n = 3. So the system is consistent with 3 - 2 = 1 free variable.

Column 2 has no pivot, so y is the free variable. Set y = t.

From row 2: z = 1.

From row 1: x + 2t + 3 = 5, so x = 2 - 2t.

The solution is

\begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 2 \\ 0 \\ 1 \end{bmatrix} + t\begin{bmatrix} -2 \\ 1 \\ 0 \end{bmatrix}, \quad t \in \mathbb{R}

Geometrically, this is a line in three-dimensional space — a one-parameter family of solutions, each one valid.
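The same parametric line falls out of a symbolic solver. A sketch assuming SymPy; its `linsolve` expresses the free variable using the symbol y itself, where the hand computation used the parameter t:

```python
from sympy import Matrix, linsolve, symbols

x, y, z = symbols('x y z')
A = Matrix([[1, 2, 3],
            [2, 4, 7],
            [3, 6, 10]])
B = Matrix([5, 11, 16])

sol = linsolve((A, B), x, y, z)
print(sol)   # a one-parameter family: x = 2 - 2y, y free, z = 1
```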

Using the determinant as a quick first test

For a square system AX = B with n equations in n unknowns:

  1. If \det A \neq 0, then \text{rank}(A) = n: unique solution, no rank computation needed.
  2. If \det A = 0, then \text{rank}(A) < n: either infinitely many solutions or none. Compare \text{rank}(A) with \text{rank}([A \mid B]) to decide which.

The determinant is a shortcut for Case 1. For Cases 2 and 3, you need the full rank comparison.

Where this leads next

You now have a complete toolkit for deciding the consistency of any linear system and describing its solution set. The next set of ideas extends this in several directions.