In short
A projective measurement is a set of orthogonal projectors \{P_m\} that sum to the identity, \sum_m P_m = I. Given an input state |\psi\rangle, the probability of outcome m is the Born rule, p(m) = \langle \psi | P_m | \psi\rangle. After the measurement, the state collapses to the projected, renormalised component: |\psi\rangle \mapsto P_m |\psi\rangle / \sqrt{p(m)}. Measurement is the only non-unitary operation in the entire quantum postulate system — it is irreversible, it destroys phase information, and it is where quantum becomes classical. One measurement yields one bit per qubit; everything else about |\psi\rangle is gone the instant you read the outcome.
You have just spent several chapters building a state. Complex amplitudes, normalisation, inner products, gates that rotate the state into elaborate superpositions — every piece of it governed by the crisp, reversible rule U^\dagger U = I. Then you measure the qubit.
You get a single bit. A 0 or a 1. And the state you so carefully prepared is gone.
This is not a bug in the formalism. It is the rule. Every quantum computation you will ever do ends in a measurement, because a quantum computer's job is to produce classical output — numbers, answers, bits you can email. The step that converts a quantum superposition into a classical bit is projective measurement, and it is the only move in the entire quantum postulate system that is not unitary. It is the one event that deletes information, the one event that is irreversible, the one event where "quantum" becomes "classical."
This chapter tells you exactly what happens in that step. What the probabilities are. What the post-measurement state is. Why you cannot undo it. And why, despite its destructiveness, measurement is the machine that actually extracts answers from quantum algorithms.
Setup — what a projective measurement needs
The ingredients of a projective measurement are embarrassingly simple.
Pick an orthonormal basis of your state space — call it \{|m\rangle\}, where m runs over some index set (for a qubit, m \in \{0, 1\}). From each basis vector, build a projector:

P_m = |m\rangle\langle m|.
The object |m\rangle\langle m| is the outer product — the column |m\rangle times the row \langle m| — which gives back a matrix. For the computational basis on a qubit,

P_0 = |0\rangle\langle 0| = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \qquad P_1 = |1\rangle\langle 1| = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}.
Why these are called "projectors": P_m has the defining property P_m^2 = P_m. Applied twice, it does the same thing as applied once — like the shadow operation in 3D space, where "project onto the floor" applied twice is the same as "project onto the floor" applied once. The shadow is already on the floor; projecting it again doesn't move it.
The projectors of a projective measurement have two structural properties you should write on your palm:
- Orthogonality: P_m P_{m'} = 0 when m \ne m'. Different outcomes correspond to different, non-overlapping subspaces.
- Completeness: \sum_m P_m = I. The projectors together account for the entire state space — every state is "somewhere" in the decomposition.
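These properties are easy to check numerically. A minimal numpy sketch (the array names are ours, for illustration only):

```python
import numpy as np

# Computational-basis vectors as column arrays.
ket0 = np.array([[1.0], [0.0]])
ket1 = np.array([[0.0], [1.0]])

# Projectors P_m = |m><m|: the column |m> times the row <m|.
P0 = ket0 @ ket0.conj().T
P1 = ket1 @ ket1.conj().T

assert np.allclose(P0 @ P0, P0)                # idempotence: P^2 = P
assert np.allclose(P0 @ P1, np.zeros((2, 2)))  # orthogonality: P0 P1 = 0
assert np.allclose(P0 + P1, np.eye(2))         # completeness: sum to identity
```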
The Born rule — probabilities from amplitudes
Given |\psi\rangle and the projectors \{P_m\}, the probability of measurement outcome m is

p(m) = \langle \psi | P_m | \psi \rangle.
This is the Born rule, first stated by Max Born in 1926, and it is the bridge between the abstract quantum formalism and the classical statistics you actually observe in the lab.
For the computational basis on a qubit, write |\psi\rangle = \alpha|0\rangle + \beta|1\rangle and compute:

p(0) = \langle \psi | P_0 | \psi \rangle = \langle \psi | 0 \rangle \langle 0 | \psi \rangle = |\langle 0 | \psi \rangle|^2 = |\alpha|^2, \qquad p(1) = |\beta|^2.
Why \langle \psi | 0 \rangle \langle 0 | \psi \rangle = |\langle 0 | \psi \rangle|^2: the two factors are \langle \psi | 0 \rangle and \langle 0 | \psi \rangle, which are complex conjugates of each other (because \langle a|b\rangle = \overline{\langle b|a\rangle}). A complex number times its conjugate is the squared modulus.
So the probability of getting outcome m is the squared modulus of the m-th amplitude. This is "the squared amplitudes are the probabilities" in one formal line.
And crucially: the probabilities always sum to 1:

\sum_m p(m) = \sum_m \langle \psi | P_m | \psi \rangle = \langle \psi | \Big( \sum_m P_m \Big) | \psi \rangle = \langle \psi | \psi \rangle = 1,
where the last step uses the fact that |\psi\rangle is a unit vector. This is not a lucky coincidence — it is why the completeness condition \sum_m P_m = I was imposed in the first place. Completeness is what makes the Born rule produce a valid probability distribution.
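A quick numerical check of the Born rule and the sum-to-1 property, on an arbitrary normalised state carrying a complex phase (the helper name `born` is ours, not from any library):

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
P0 = np.outer(ket0, ket0.conj())
P1 = np.outer(ket1, ket1.conj())

# A normalised state alpha|0> + beta|1>; the phase on beta will not matter.
alpha = np.sqrt(0.7)
beta = np.sqrt(0.3) * np.exp(1j * 0.4)
psi = alpha * ket0 + beta * ket1

def born(P, psi):
    """Born rule p(m) = <psi| P_m |psi> (real up to float noise)."""
    return float(np.real(psi.conj() @ P @ psi))

p0, p1 = born(P0, psi), born(P1, psi)
assert np.isclose(p0, 0.7) and np.isclose(p1, 0.3)  # squared moduli
assert np.isclose(p0 + p1, 1.0)  # completeness => a valid distribution
```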
Born's 1926 paper was the first time anyone wrote down p = |\psi|^2 as an interpretation of the quantum amplitude, rather than just a mathematical object. Before him, Schrödinger had introduced the wave function but was not sure what it meant physically; Born's insight that |\psi|^2 is a probability density is the interpretation every quantum physicist has used since. It also earned him the 1954 Nobel Prize in Physics — almost three decades after the paper, because the interpretation took that long for the community to accept over competing ideas.
Collapse — the post-measurement state
Given that outcome m happened, what is the state now?

|\psi\rangle \;\mapsto\; \frac{P_m |\psi\rangle}{\sqrt{p(m)}}.
Three things are happening in that formula. Apply the projector P_m to |\psi\rangle to pick out the component in the |m\rangle direction. Divide by \sqrt{p(m)} to renormalise, because the projected vector has shrunk and you need to restore the unit-vector condition. The result is what the qubit's state is the instant after the measurement.
For the computational basis, this simplifies dramatically. If outcome 0 occurred:

\frac{P_0 |\psi\rangle}{\sqrt{p(0)}} = \frac{\alpha |0\rangle}{|\alpha|} = e^{i\theta} |0\rangle,
where e^{i\theta} = \alpha / |\alpha| is a global phase (it has modulus 1 but is, in general, a nontrivial phase). Physically the state is |0\rangle — the global phase is irrelevant.
Why the global phase drops out: \alpha / |\alpha| is a unit complex number, a pure phase. Multiplying a state by a pure phase does not change any measurement probability or any expectation value, so the physical state is the same as |0\rangle. You can safely write "the post-measurement state is |0\rangle" without loss.
Likewise, if outcome 1 occurred, the post-measurement state is |1\rangle (up to global phase). So for computational-basis measurement, the post-measurement state is just the basis vector corresponding to the outcome you observed. The superposition is gone. The complex amplitudes \alpha and \beta are gone. All that is left is one definite eigenstate, and one bit of classical information that says which one.
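The collapse rule in the same numpy sketch form: project, renormalise, and check that the result is |0\rangle up to a global phase.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
P0 = np.outer(ket0, ket0)

# alpha carries a phase, so the projected state is e^{i theta}|0>, not |0> exactly.
alpha = np.sqrt(0.7) * np.exp(1j * 1.2)
beta = np.sqrt(0.3)
psi = alpha * ket0 + beta * ket1

p0 = np.real(psi.conj() @ P0 @ psi)
post = (P0 @ psi) / np.sqrt(p0)        # |psi> -> P_0|psi> / sqrt(p(0))

assert np.isclose(np.linalg.norm(post), 1.0)     # renormalised to a unit vector
assert np.isclose(abs(ket0.conj() @ post), 1.0)  # |<0|post>| = 1: it is |0> up to phase
```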
What the collapse means operationally
"Collapse" sounds dramatic. The operational content is quieter: the formula for the state has been updated. Every subsequent gate, every subsequent measurement, every subsequent calculation now treats the qubit as if it is |m\rangle — because that is what it is, as far as any future experiment can tell.
In particular:
- Repeated measurement in the same basis gives the same outcome. If you measure and get 0, then immediately measure again in the computational basis, you get 0 again with probability 1. The state is |0\rangle, and a projective measurement on |0\rangle in the \{|0\rangle, |1\rangle\} basis deterministically yields 0.
- Measurement in a different basis can give a new random outcome. If after collapsing to |0\rangle you now measure in the X basis \{|+\rangle, |-\rangle\}, you get + or - with probability 1/2 each — because |0\rangle = (|+\rangle + |-\rangle)/\sqrt{2} is a 50-50 superposition in the X basis.
- The original amplitudes \alpha, \beta cannot be recovered from the outcome alone. A single outcome is one bit; it cannot encode two complex numbers.
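All three facts can be simulated directly. A toy measurement routine (ours, not a library API), with the simplification that the collapsed state is returned as the basis vector itself (correct up to global phase):

```python
import numpy as np

rng = np.random.default_rng(0)
ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
ketp = (ket0 + ket1) / np.sqrt(2)
ketm = (ket0 - ket1) / np.sqrt(2)

def measure(psi, basis):
    """Projective measurement: sample an outcome, return (index, collapsed state)."""
    probs = np.array([abs(b.conj() @ psi) ** 2 for b in basis])
    m = rng.choice(len(basis), p=probs / probs.sum())
    return m, basis[m]  # post-measurement state, up to global phase

psi = (ket0 + 1j * ket1) / np.sqrt(2)
m1, psi = measure(psi, [ket0, ket1])  # random: 0 or 1 with probability 1/2
m2, psi = measure(psi, [ket0, ket1])  # same basis again: same outcome
assert m1 == m2

# X-basis measurement on the collapsed Z eigenstate: a fresh 50-50 coin.
flips = sum(measure(psi, [ketp, ketm])[0] for _ in range(2000))
assert 850 < flips < 1150  # expect ~1000, standard deviation ~22
```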
Whether "collapse" is a real physical event (Copenhagen interpretation) or merely an update of your knowledge (the many-worlds and relational-QM interpretations handle this differently) is a topic philosophers still argue about. The mathematics is the same either way, and for the purposes of quantum computing — where all that matters is predicting measurement outcomes — the operational rule |\psi\rangle \mapsto P_m |\psi\rangle / \sqrt{p(m)} is all you need.
Why measurement is not unitary
Look at a projector: P_m^2 = P_m. Applying it twice is the same as applying it once. But for a unitary, U^\dagger U = I — applying a unitary and then its dagger gives back the identity. Does P_m satisfy P_m^\dagger P_m = I? For a projector, P_m^\dagger = P_m (projectors are Hermitian), so P_m^\dagger P_m = P_m^2 = P_m — and P_m \ne I unless P_m is the identity (which would make the measurement trivial). So projectors are not unitary.
Two deeper consequences of this algebraic fact:
Non-injectivity. A unitary is a bijection — every input has a unique output and every output can be traced back to a unique input. A projector squashes a continuum of distinct inputs down to one output: every state \alpha|0\rangle + \beta|1\rangle with \alpha \ne 0, whatever its \beta, collapses to the same post-measurement state |0\rangle when outcome 0 happens. Many different input states produce the same post-measurement state. From the post-measurement state alone, you cannot reconstruct what came before. Information has been destroyed.
Irreversibility. There is no "reverse measurement" operation V with V \cdot \text{measure} = \text{identity}. In the formalism of unitary evolution, you could always apply U^\dagger to undo U. There is no P_m^\dagger whose action is "un-project" — because the projected-away part is genuinely gone and cannot be retrieved by any further operation on the qubit alone.
Connection to no-cloning. The no-cloning theorem (Wootters & Zurek, 1982) says you cannot duplicate an unknown quantum state. If you could (a) clone |\psi\rangle to produce two copies, (b) measure one of them, and (c) somehow un-measure it to recover |\psi\rangle, you would have both the classical outcome and the original state. This is impossible in quantum mechanics, and the scheme is blocked twice over: cloning is forbidden at step (a), and irreversibility rules out step (c). Either barrier alone would be enough to stop it. Both are structural facts about the formalism.
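The algebra in this section can be verified in a few lines: a unitary passes the U^\dagger U = I test, a projector fails it, and two different inputs collapse to the same output.

```python
import numpy as np

P0 = np.array([[1.0, 0.0], [0.0, 0.0]])               # projector onto |0>
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # Hadamard, a unitary

assert np.allclose(H.conj().T @ H, np.eye(2))  # unitary: reversible by H†
assert np.allclose(P0.conj().T @ P0, P0)       # projector: P†P = P^2 = P
assert not np.allclose(P0, np.eye(2))          # ... and P != I: not unitary

# Non-injectivity: two distinct states, identical renormalised projection.
psi1 = np.array([np.sqrt(0.7), np.sqrt(0.3)])
psi2 = np.array([np.sqrt(0.2), -1j * np.sqrt(0.8)])
post1 = P0 @ psi1 / np.linalg.norm(P0 @ psi1)
post2 = P0 @ psi2 / np.linalg.norm(P0 @ psi2)
assert np.allclose(post1, post2)  # both are |0>: the inputs are unrecoverable
```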
Measurement in other bases
The Born rule is not specific to the computational basis. Any orthonormal basis of the state space defines a valid projective measurement. Some bases you will meet constantly:
- Computational basis \{|0\rangle, |1\rangle\} — the default; what hardware measures.
- X basis \{|+\rangle, |-\rangle\} — the Hadamard eigenbasis; measures the qubit's phase information.
- Y basis \{|+i\rangle, |-i\rangle\} — the other complex-phase basis.
For an X-basis measurement, the projectors are P_+ = |+\rangle\langle +| and P_- = |-\rangle\langle -|. The probability of outcome + on a state |\psi\rangle is

p(+) = \langle \psi | P_+ | \psi \rangle = |\langle + | \psi \rangle|^2,
and the post-measurement state (if + occurred) is |+\rangle, up to global phase.
Example. Apply X-basis measurement to |0\rangle. Since |0\rangle = (|+\rangle + |-\rangle)/\sqrt{2}, the amplitudes in the X basis are both 1/\sqrt{2}. The probabilities are |1/\sqrt{2}|^2 = 1/2 for each outcome. Measuring |0\rangle in the X basis is a fair coin flip — even though |0\rangle gave outcome 0 deterministically in the computational basis. The same state gives very different statistics in different bases. This is one of the defining features of quantum mechanics.
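The example, checked numerically: the same state, two bases, very different statistics.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)
ketm = np.array([1.0, -1.0]) / np.sqrt(2)

# Z basis: |0> gives outcome 0 with certainty.
assert np.isclose(abs(ket0 @ ket0) ** 2, 1.0)

# X basis: the very same |0> is a fair coin flip.
assert np.isclose(abs(ketp @ ket0) ** 2, 0.5)
assert np.isclose(abs(ketm @ ket0) ** 2, 0.5)
```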
Implementing arbitrary-basis measurement via gates
Real quantum hardware typically only has the physical infrastructure for computational-basis measurement — it can detect whether a superconducting qubit is in its ground state or its excited state, and that is the Z-basis, period. How do you measure in a different basis on such hardware?
Rotate the state first, then measure. To measure in the X basis, apply a Hadamard to rotate the X basis onto the Z basis, then do a computational-basis measurement. Concretely: "measuring |\psi\rangle in the X basis" is implemented as "measuring H|\psi\rangle in the computational basis." The Hadamard pre-rotation is the software side; the hardware only does one kind of measurement. This pattern — apply a unitary, then measure in the computational basis — covers every projective measurement you will ever need.
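A sketch of the pattern, checking that the X-basis Born probabilities on |\psi\rangle equal the Z-basis probabilities on H|\psi\rangle (the particular state below is an arbitrary choice):

```python
import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
ketp, ketm = H @ ket0, H @ ket1   # |+> = H|0>, |-> = H|1>

# A generic state with a relative phase.
psi = np.array([np.sqrt(0.6), np.sqrt(0.4) * np.exp(1j * 0.9)])

# "Measure psi in X" is implemented as "measure H psi in Z".
p_plus_direct = abs(ketp.conj() @ psi) ** 2
p_plus_rotated = abs(ket0.conj() @ (H @ psi)) ** 2
assert np.isclose(p_plus_direct, p_plus_rotated)

p_minus_direct = abs(ketm.conj() @ psi) ** 2
p_minus_rotated = abs(ket1.conj() @ (H @ psi)) ** 2
assert np.isclose(p_minus_direct, p_minus_rotated)
```

This works because \langle + | \psi \rangle = \langle 0 | H | \psi \rangle (H is Hermitian, so H^\dagger = H).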
Repeated measurements and statistics
One measurement gives you one bit. To learn anything more about |\psi\rangle than a single bit, you need many runs — prepare the state, measure, record the outcome, repeat. This is called "shot-based" measurement on quantum hardware. Every number a quantum computer reports, from the output of Shor's algorithm to a simple sampling experiment, ultimately comes from histograms over many shots.
Say |\psi\rangle = \alpha|0\rangle + \beta|1\rangle with |\alpha|^2 = 0.7 and |\beta|^2 = 0.3. Run the circuit 1000 times. On average you will see 0 about 700 times and 1 about 300 times. The statistical uncertainty is \sqrt{1000 \cdot 0.7 \cdot 0.3} \approx 14 in the count of zeros — so you would expect 700 \pm 14 zeros, giving you the probability to about \pm 0.014. To pin down p(0) to three decimal places, you need around 100{,}000 shots.
This is a real engineering constraint. Quantum hardware today runs circuits at typical repetition rates of maybe 10^4 shots per second. A million shots is a couple of minutes. Estimating probabilities finer than \sim 10^{-3} starts to take meaningful wall time — and this is one reason quantum-classical hybrid algorithms (like variational quantum eigensolvers) care so much about sample efficiency.
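The shot arithmetic can be simulated with ordinary pseudorandomness standing in for quantum randomness (the "true" probability below is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(42)
p0_true = 0.7  # assumed true probability of outcome 0

for shots in (1000, 100_000):
    outcomes = rng.random(shots) < p0_true   # one Bernoulli draw per shot
    p0_hat = outcomes.mean()                 # histogram-based estimate
    sigma = np.sqrt(p0_true * (1 - p0_true) / shots)  # standard error
    assert abs(p0_hat - p0_true) < 5 * sigma  # lands within a few sigma

# sigma ~ 0.014 at 1000 shots, ~ 0.0014 at 100,000 shots:
# precision improves only as 1/sqrt(shots).
```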
The Indian National Quantum Mission (2023, ₹6000 crore) funds hardware at IITs and TIFR, and the shot-statistics bottleneck applies equally to those machines. Sampling every quantum result takes time; the cleverness of an algorithm is often measured in how few shots it needs per useful bit.
The no-communication theorem — an important aside
You may have heard that entangled qubits can "communicate faster than light" because measuring one instantly affects the other. The phrase is wrong, and it is worth stating this clearly here, because projective measurement is the tool used in Einstein's original "spooky action at a distance" argument.
When two qubits are entangled (say, in a Bell state \tfrac{1}{\sqrt{2}}(|00\rangle + |11\rangle)) and you measure one of them, the measurement outcome on the other is correlated with your outcome — if yours is 0, theirs will also be 0 when they measure. But here is the crucial point: you cannot use this to send information. Your outcome is random (50-50), and the other party's outcome is also random (50-50), with the same statistics regardless of whether or when you measure. The correlation shows up only when you compare outcomes afterwards, which requires a classical (light-speed or slower) communication channel.
This is the no-communication theorem (Ghirardi, Rimini, Weber and independently others, around 1980). It is a consequence of the linearity of quantum mechanics plus the Born rule, and it is what makes quantum mechanics consistent with special relativity. Entanglement is surprising; it is not a telegraph.
The proper treatment belongs in later chapters on entanglement and teleportation. For this chapter, just record that measurement outcomes on one qubit do not signal information to the holder of another qubit, however entangled they are.
Worked examples
Example 1: Computational-basis measurement on |+⟩
Measure |+\rangle = \tfrac{1}{\sqrt{2}}(|0\rangle + |1\rangle) in the computational basis. Give the probabilities and the post-measurement state for each outcome.
Step 1. Write the state with explicit amplitudes: |+\rangle = \alpha|0\rangle + \beta|1\rangle with \alpha = \beta = 1/\sqrt{2}. Why: the definition of |+\rangle gives the amplitudes directly — both are 1/\sqrt{2}.
Step 2. Apply the Born rule.

p(0) = |\alpha|^2 = \left|\tfrac{1}{\sqrt{2}}\right|^2 = \tfrac{1}{2}, \qquad p(1) = |\beta|^2 = \tfrac{1}{2}.
Why: in the computational basis, P_0 = |0\rangle\langle 0| gives p(0) = |\langle 0|\psi\rangle|^2 = |\alpha|^2, and similarly for P_1.
Step 3. Compute the post-measurement states.
If outcome 0 was obtained:

\frac{P_0 |+\rangle}{\sqrt{p(0)}} = \frac{\tfrac{1}{\sqrt{2}}\,|0\rangle}{\tfrac{1}{\sqrt{2}}} = |0\rangle.
Why: P_0 |+\rangle picks out the |0\rangle component — amplitude 1/\sqrt{2} times |0\rangle. Dividing by \sqrt{p(0)} = 1/\sqrt{2} normalises the result to a unit vector, which gives exactly |0\rangle.
If outcome 1 was obtained:

\frac{P_1 |+\rangle}{\sqrt{p(1)}} = \frac{\tfrac{1}{\sqrt{2}}\,|1\rangle}{\tfrac{1}{\sqrt{2}}} = |1\rangle.
Result. Measuring |+\rangle in the computational basis gives 0 or 1 with equal probability 1/2. The post-measurement state is |0\rangle or |1\rangle respectively — a complete basis vector, no superposition, no phase information. The original |+\rangle is gone.
Remark. Run this circuit 1000 times: expect about 500 zeros and 500 ones, with statistical fluctuation \pm\sqrt{1000 \cdot 0.5 \cdot 0.5} \approx \pm 16. This is a quantum coin flip — fair by construction, statistically indistinguishable from a very well-made classical coin.
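The whole example in code, including the 1000-shot experiment from the remark (simulated with a classical random generator):

```python
import numpy as np

rng = np.random.default_rng(1)
ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)

# Born rule: both probabilities are exactly 1/2.
p0 = abs(ket0 @ plus) ** 2
p1 = abs(ket1 @ plus) ** 2
assert np.isclose(p0, 0.5) and np.isclose(p1, 0.5)

# Post-measurement state for outcome 0: the projected, renormalised component.
P0 = np.outer(ket0, ket0)
assert np.allclose(P0 @ plus / np.sqrt(p0), ket0)

# 1000 shots: ~500 zeros, fluctuation ~16.
zeros = int(np.sum(rng.random(1000) < p0))
assert 400 < zeros < 600
```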
Example 2: X-basis measurement on the same |+⟩
Measure the same state |+\rangle in the X basis \{|+\rangle, |-\rangle\}. Give the probabilities and the post-measurement state.
Step 1. Identify the projectors.

P_+ = |+\rangle\langle +|, \qquad P_- = |-\rangle\langle -|.
Why: the X-basis measurement uses projectors built from the X-basis vectors. This is the same construction as the computational-basis measurement, just with different basis vectors.
Step 2. Apply the Born rule. The probability of outcome + is p(+) = |\langle + | + \rangle|^2.
Compute \langle + | + \rangle. This is the inner product of |+\rangle with itself, which is 1 (because |+\rangle is a unit vector). So p(+) = |1|^2 = 1.
And the probability of outcome - is p(-) = |\langle - | + \rangle|^2 = |0|^2 = 0 (because |+\rangle and |-\rangle are orthogonal). Why \langle - | + \rangle = 0: the X basis is orthonormal — two different basis vectors have inner product zero. You built |+\rangle and |-\rangle exactly to be orthogonal so that they could serve as a basis.
Step 3. Compute the post-measurement state. Since p(+) = 1, the outcome is always + (deterministically). The post-measurement state:

\frac{P_+ |+\rangle}{\sqrt{p(+)}} = \frac{|+\rangle}{1} = |+\rangle.
Result. Measuring |+\rangle in the X basis gives outcome + with probability 1. The state is unchanged.
Contrast with Example 1. The exact same state gave a random coin flip when measured in the Z basis but a deterministic outcome when measured in the X basis. The state |+\rangle is an eigenstate of the X-basis measurement — it has a definite value in that basis — but it is a superposition in the Z basis. "A qubit's value" is not a property of the qubit alone; it is a property of the qubit plus the basis you choose to measure in.
This is the lesson that trips up every first-time student of quantum mechanics, and it is the reason "measure the qubit" is ambiguous. Always say in what basis.
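The contrast between the two examples, condensed into one check: |+\rangle is deterministic in the X basis and a fair coin in the Z basis.

```python
import numpy as np

ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

# X basis: outcome + is certain, outcome - is impossible.
assert np.isclose(abs(plus @ plus) ** 2, 1.0)
assert np.isclose(abs(minus @ plus) ** 2, 0.0)

# Post-measurement state P_+|+> / sqrt(p(+)) is |+> again: unchanged.
Pp = np.outer(plus, plus)
assert np.allclose(Pp @ plus / 1.0, plus)

# Z basis on the very same state: a fair coin.
assert np.isclose(abs(ket0 @ plus) ** 2, 0.5)
assert np.isclose(abs(ket1 @ plus) ** 2, 0.5)
```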
Common confusions
- "Measurement is just reading the qubit." No. Reading a classical bit is passive — the bit is in some state, you inspect it, the bit keeps being in that state. Measuring a qubit is an active physical event that projects the state onto one of the basis vectors, destroying the rest of the state's information in the process. The word "read" invites the reader to imagine something gentle and non-disturbing; measurement is not that.
- "Measurement gives you the amplitude." No. Measurement gives you a classical outcome — a bit. The amplitudes \alpha, \beta are complex numbers you cannot access from any single measurement. You can estimate |\alpha|^2 and |\beta|^2 from many measurements (repeat the circuit, build a histogram), but you cannot learn the phase of \alpha or \beta this way. Phase information requires measurements in different bases — you have to rotate the state first. This is why quantum tomography (fully characterising an unknown state) requires measurements in multiple bases and many runs.
- "The collapse is a real physical event." Operationally, the post-measurement state really is |m\rangle, in the sense that every future experiment will treat it as such. Whether this reflects a genuine physical "collapse of the wave function" (Copenhagen-style) or just an update of your knowledge (the more epistemic interpretations) is an interpretive question — the mathematics is the same either way, and predictions about any future measurement are the same either way. Pick whichever interpretation helps you think; the algorithms work regardless.
- "Measurement is random because our instruments aren't precise enough." No. The randomness is fundamental in quantum mechanics for genuinely superposed states. No amount of better engineering can predict an individual outcome for a state like |+\rangle measured in the Z basis. This has been tested experimentally against every "hidden-variable" theory that tried to make measurement secretly deterministic. Bell's theorem (1964) and its experimental verifications (Aspect 1982, many others since) ruled out the most natural hidden-variable theories. Randomness in the Born rule is a real feature, not a bug.
- "You can distinguish |0\rangle and |+\rangle with one measurement." Not with a single measurement in any single basis. In the Z basis: |0\rangle always gives 0; |+\rangle gives 0 or 1 with probability 1/2. One measurement yielding 0 is consistent with either state. In the X basis: |0\rangle gives + or - with probability 1/2; |+\rangle always gives +. Again, one measurement yielding + is consistent with either. No single basis distinguishes |0\rangle and |+\rangle perfectly, because the two states are not orthogonal. Orthogonal states can be distinguished in one measurement (by choosing the right basis); non-orthogonal states cannot, ever. This is a precise theorem, and it has structural consequences for quantum cryptography (BB84, QKD).
- "Global phase is observable because it is in the state vector." No. Multiplying |\psi\rangle by e^{i\theta} changes no measurement probability and no expectation value. The overall phase is a "gauge" freedom — a formal knob on the state vector that has no physical content. What matters is relative phases between different components of a superposition, which do affect measurement outcomes in other bases (through interference). Distinguish global from relative carefully.
Going deeper
Everything you need to do quantum computing is in the first half of this chapter: projectors, the Born rule, collapse to the projected component, irreversibility. The rest of this section goes deeper: the general measurement formalism beyond projective (POVMs), a sketch of why the Born rule is the only consistent probability rule, the quantum Zeno effect, and the interpretive zoo.
The general measurement formalism — POVMs
Projective measurement is a special case of a more general structure. A generalised measurement on a quantum system is a set of operators \{M_m\} (one per outcome) satisfying \sum_m M_m^\dagger M_m = I. The probability of outcome m is p(m) = \langle \psi | M_m^\dagger M_m | \psi\rangle, and the post-measurement state is M_m |\psi\rangle / \sqrt{p(m)}.
When each M_m is a projector (and the projectors are orthogonal), this reduces to projective measurement. When the M_m are not projectors, you get a more general family — POVMs (positive operator-valued measures). POVMs let you do measurements that are not achievable by any projective measurement, including:
- Minimum-error discrimination between non-orthogonal states (choosing the measurement that minimises the average probability of guessing wrong).
- Unambiguous state discrimination (Ivanovic-Dieks-Peres), where an explicit "inconclusive" outcome is allowed so that the conclusive outcomes are never wrong; this needs three outcomes on a single qubit, more than any projective measurement on a qubit can provide.
- Weak measurement (getting partial information at the cost of perturbing the state less than a projective measurement would).
Projective measurement is the version you will use in nearly every quantum algorithm. POVMs are the framework you need when characterising quantum detectors, analysing protocols like BB84, and optimising measurement strategies.
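As a concrete instance, here is a sketch of the Ivanovic-Dieks-Peres unambiguous-discrimination POVM for the non-orthogonal pair \{|0\rangle, |+\rangle\}. The scaling constant c = 1/(1 + |\langle 0|+\rangle|) is the standard optimal choice; the element names are ours.

```python
import numpy as np

ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

# Optimal scaling for unambiguous discrimination: c = 1/(1 + |<0|+>|).
c = 1 / (1 + 1 / np.sqrt(2))
E_not0 = c * np.outer(ket1, ket1)       # fires only if the state was NOT |0>
E_notp = c * np.outer(minus, minus)     # fires only if the state was NOT |+>
E_dunno = np.eye(2) - E_not0 - E_notp   # the inconclusive outcome

# Valid POVM: positive semidefinite elements summing to the identity.
for E in (E_not0, E_notp, E_dunno):
    assert np.min(np.linalg.eigvalsh(E)) > -1e-9
assert np.allclose(E_not0 + E_notp + E_dunno, np.eye(2))

# Never wrong: each conclusive outcome has probability zero on the state
# it would rule out.
assert np.isclose(ket0 @ E_not0 @ ket0, 0.0)
assert np.isclose(plus @ E_notp @ plus, 0.0)
# Conclusive with probability 1 - |<0|+>| ~ 0.29 on each input state.
assert np.isclose(plus @ E_not0 @ plus, 1 - 1 / np.sqrt(2))
```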
The Born rule from Gleason's theorem
Why p(m) = \langle \psi | P_m | \psi\rangle, and not some other function of the state and the projector?
Gleason's theorem (1957) says: for a Hilbert space of dimension \ge 3, the only probability assignment on projectors that is consistent with the algebra (additivity over orthogonal projectors, probability 1 on the identity) is the trace rule p(P) = \text{Tr}(\rho P) — which, for pure states, is exactly the Born rule \langle \psi | P | \psi\rangle.
The theorem is deep. It says the Born rule is not an extra postulate bolted on to make the theory predictive — it is the unique probability assignment consistent with the rest of the structure. Assume states live in Hilbert space, assume measurements are projectors, assume probabilities are additive over orthogonal outcomes; Gleason's theorem forces |\langle m | \psi \rangle|^2 out of those three inputs and nothing else.
(The theorem fails in 2 dimensions, which is an amusing technicality — a single qubit has 2-dimensional Hilbert space. But extensions and alternative derivations, including Deutsch's many-worlds argument and Zurek's envariance derivation, recover the Born rule in 2D as well.)
The quantum Zeno effect
If you measure a quantum system frequently enough, you can freeze its evolution. This is the quantum Zeno effect (Misra and Sudarshan, 1977).
Concretely: suppose a qubit is about to evolve continuously from |0\rangle to |1\rangle under some unitary, on a timescale T. If you measure it in the computational basis at time \delta t \ll T, the probability of finding it still in |0\rangle is roughly 1 - O(\delta t^2 / T^2) — a tiny deviation. The measurement collapses it back to |0\rangle. Do this N times in rapid succession with \delta t = T/N: the probability of still being in |0\rangle at time T is (1 - O(1/N^2))^N \to 1 as N \to \infty.
So: if you watch the pot often enough, it never boils. The effect has been experimentally demonstrated (Itano et al., 1990; Fischer et al., 2001). It is a direct consequence of the collapse rule plus the quadratic-in-time initial deviation of unitary evolution (the so-called "quadratic short-time behaviour" that comes from \cos(Ht/\hbar) \approx 1 - (Ht/\hbar)^2/2).
The Zeno effect is not just a curiosity. In quantum error correction, frequent syndrome measurements freeze the state against slow-drifting errors — a deliberately engineered application of the same physics.
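A back-of-the-envelope version of the Zeno calculation, modelling the drive as a rotation that takes |0\rangle fully to |1\rangle over time T, so the survival amplitude between measurements spaced T/N apart is \cos(\pi/2N):

```python
import numpy as np

def p_survive(N):
    """Probability the qubit is still |0> at time T after N equally spaced
    measurements, for a drive that fully flips it over time T.
    Per-step survival probability: cos^2(pi / (2N))."""
    return np.cos(np.pi / (2 * N)) ** (2 * N)

probs = [p_survive(N) for N in (1, 10, 100, 10_000)]
assert np.isclose(probs[0], 0.0, atol=1e-12)         # one measurement at T: flipped
assert all(a < b for a, b in zip(probs, probs[1:]))  # more watching, less boiling
assert probs[-1] > 0.999                             # N -> infinity: frozen
```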
Interpretations of collapse
The mathematics of measurement is settled. The interpretation — what is actually happening when the state "collapses" — is not. A few of the major options, with no endorsement:
- Copenhagen (Bohr, Heisenberg; orthodox since the 1920s). There is a fundamental distinction between "classical" and "quantum" domains. Measurement is the interaction of a quantum system with a classical apparatus, at which point the state genuinely collapses. The boundary between quantum and classical is left fuzzy.
- Many-worlds (Everett, 1957). The universal wave function never collapses; it just decoheres. At a measurement, the universe branches into multiple parallel branches, one per outcome, and "you" are in one of them. No randomness is fundamental; the apparent randomness is the observer's ignorance about which branch they are in.
- Consistent histories / decoherent histories (Griffiths, Gell-Mann, Hartle). Collapse is replaced by classically decoherent sequences of projectors. You compute probabilities of histories, not instantaneous collapses.
- Relational quantum mechanics (Rovelli, 1996). Every observation is relative to some observer; there is no universal state of a system, only relations between observers. Collapse is a feature of the relation, not an absolute event.
- Objective collapse models (GRW, Penrose). The state does physically collapse, but the collapse is caused by a genuine modification of the Schrödinger equation — either at random times (GRW) or when gravitational self-energy becomes too large (Penrose). These are empirically testable: they predict slightly non-unitary dynamics on large superpositions, which experimenters are actively looking for.
For the purposes of this track — quantum computing, where the job is to predict measurement outcomes — all of these give the same answers. The interpretive choice does not affect any calculation you will do. But being aware that there is a choice matters for honest writing about the field.
Where this leads next
- The Bloch sphere — a geometric picture where measurement becomes "projection onto an axis," making the Z-basis-vs-X-basis distinction immediate.
- Unitary evolution — the other half of postulate 2; everything that is not measurement.
- The Hadamard gate — used to implement X-basis measurement on Z-basis hardware.
- Measurement in arbitrary bases — the full treatment of how to measure in any basis by prepending the right unitary.
- Outer products and projectors — the algebra of |m\rangle\langle m|, which this chapter relies on throughout.
- Part 13 on density matrices — the proper formalism for mixed states and open-system measurement; lifts everything here to a more general setting.
References
- Max Born, Zur Quantenmechanik der Stoßvorgänge (1926) — the paper where |\psi|^2 first becomes a probability. The Wikipedia article on the Born rule has a summary with the original citation.
- Nielsen and Chuang, Quantum Computation and Quantum Information (2010), §2.2.3 and §2.2.5 — the definitive reference on projective and general measurement. Cambridge University Press.
- John Preskill, Lecture Notes on Quantum Computation, Ch. 2 (measurement, POVMs, Gleason's theorem) — theory.caltech.edu/~preskill/ph229.
- Wikipedia, Measurement in quantum mechanics — projectors, the Born rule, and collapse in encyclopaedic form.
- Wikipedia, Wave function collapse — the collapse postulate and its interpretive status.
- Qiskit Textbook, Measurement — hands-on measurement in the Qiskit framework, with live histograms.