In short
Code-based cryptography — invented by Robert McEliece in 1978 — rests on the hardness of decoding a general linear error-correcting code. Forty-eight years of public cryptanalysis have failed to break it. Its one flaw: public keys of hundreds of kilobytes to a megabyte, orders of magnitude larger than RSA or Kyber. Classic McEliece is a NIST Round 4 finalist for conservative applications where key size is acceptable; HQC (another code-based KEM) was selected by NIST in March 2025 as a backup to ML-KEM. Isogeny-based cryptography, by contrast, promised tiny keys and elegant mathematics — until July 2022, when Wouter Castryck and Thomas Decru published a purely classical attack that broke SIKE, a NIST Round 4 finalist, on a single laptop in about an hour. The Castryck-Decru paper killed the scheme overnight and eliminated an entire branch of post-quantum candidates from NIST. One isogeny variant, CSIDH, uses a different group structure and survives — but runs thousands of times slower than a lattice KEM, so it occupies a niche rather than a mainstream slot. The lesson of 2022 echoes through every PQC discussion: schemes need decades of cryptanalysis to build confidence, elegant new mathematics can collapse in an afternoon, and a conservative portfolio with mathematical diversity is worth more than a single beautiful algorithm. NIST's current portfolio — lattices (Kyber, Dilithium, FALCON) for mainline, SPHINCS+ for hash-based insurance, HQC and Classic McEliece for code-based diversity — is shaped directly by what happened to SIKE.
You have read the post-quantum chapter. You know that NIST standardised ML-KEM (Kyber) for key exchange and three signature schemes, two of which rest on lattices. You have probably asked the obvious question: if lattices are so good, why did NIST run a competition with sixty-nine submissions for eight years? The answer is that NIST wanted a portfolio. A single cryptographic family — however trusted — is a single point of failure. If somebody finds a fast lattice-sieving algorithm in 2032, every NIST lattice standard fails together, and the internet's encryption falls in a day. So the competition reached for diversity: hash functions as a conservative backup, codes as an even older backup, isogenies as a promising new direction.
This chapter is about the two families that did not make it to the main stage — code-based cryptography, which did not win because its keys are too large, and isogeny-based cryptography, which did not win because one of its flagship schemes was catastrophically broken in July 2022. Together they tell you two stories the lattice chapter could not. The first is that the oldest public-key cryptosystem still believed secure is a code-based one from 1978, which has outlived RSA in conceptual age and will probably outlive it in deployed age. The second is what happens when an elegant, young cryptographic scheme gets attacked in the open — how fast confidence can collapse when the underlying assumption turns out to admit a polynomial-time classical algorithm nobody had found yet.
Code-based cryptography — the oldest surviving public-key idea
RSA was published in 1977. One year later, Robert McEliece — then at the Jet Propulsion Laboratory — proposed a different public-key scheme based on a completely different hard problem: decoding a random linear error-correcting code. Factoring looked powerful; coding theory looked old-fashioned. Almost nobody paid attention to McEliece's scheme for two decades. Meanwhile, RSA took over the internet.
Now it is 2026. Shor's algorithm breaks RSA in polynomial time on a future quantum computer. Classic McEliece, as the scheme has come to be called, is not broken by Shor, Grover, or any known algorithm. It has survived forty-eight years of open cryptanalysis without a structural weakness. Every post-quantum standardisation body that meets today takes it seriously.
The scheme — a picture before the formalism
Forget the algebra for a moment. Here is what Alice does if she wants to receive messages from Bob using McEliece.
- Alice picks a particular error-correcting code \mathcal{C} — a binary Goppa code — that she knows how to decode efficiently. This is her secret.
- She scrambles the code. She picks a random permutation of the code's columns and a random invertible linear map applied on the left, and applies both. The scrambled code is mathematically equivalent to the original (same error-correcting power) but looks like a random linear code.
- The scrambled code — specifically, its generator matrix G' = SGP — is Alice's public key. She publishes it.
- To encrypt a message m for Alice, Bob treats m as a codeword input, multiplies by G' to produce a valid (scrambled) codeword, and then adds a random error vector e of weight exactly t — that is, exactly t bit-flips in random positions.
- The ciphertext is c = mG' + e. Bob sends it.
- Alice undoes the permutation, decodes \mathcal{C} (which she can do because she picked a decodable code), then un-scrambles the left-multiplication. She recovers m.
Anyone intercepting c without the secret sees a random-looking linear code with a random error pattern added. Decoding a general linear code with t errors is NP-hard (Berlekamp-McEliece-van Tilborg, 1978) and is believed to remain hard for quantum computers.
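The pipeline above can be sketched end-to-end with a toy stand-in: the Hamming(7,4) code in place of a Goppa code (it corrects t = 1 error), with the scrambler S and the permutation fixed as constants for readability. This is a minimal illustration of the mechanics, not a secure construction.

```python
# Toy McEliece over the Hamming(7,4) code, which corrects t = 1 error.
# Real Classic McEliece uses binary Goppa codes with n ~ 7000, t ~ 119;
# here S and the permutation are fixed constants for readability.

def mat_mul(A, B):                       # matrix product over GF(2)
    return [[sum(a * b for a, b in zip(row, col)) % 2
             for col in zip(*B)] for row in A]

def mat_inv(M):                          # Gauss-Jordan inverse over GF(2)
    n = len(M)
    A = [row[:] + [int(i == j) for j in range(n)] for i, row in enumerate(M)]
    for c in range(n):
        p = next(r for r in range(c, n) if A[r][c])
        A[c], A[p] = A[p], A[c]
        for r in range(n):
            if r != c and A[r][c]:
                A[r] = [x ^ y for x, y in zip(A[r], A[c])]
    return [row[n:] for row in A]

# Hamming(7,4) in systematic form: G = [I | A], parity check H = [A^T | I]
A_part = [[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]]
G = [[int(i == j) for j in range(4)] + A_part[i] for i in range(4)]
H = [[A_part[r][c] for r in range(4)] + [int(c == j) for j in range(3)]
     for c in range(3)]

S = [[1, 1, 0, 0], [0, 1, 1, 0], [0, 0, 1, 1], [0, 0, 0, 1]]  # secret scrambler
perm = [6, 4, 0, 3, 1, 5, 2]                                  # secret permutation
P = [[int(perm[i] == j) for j in range(7)] for i in range(7)]
G_pub = mat_mul(mat_mul(S, G), P)        # public key: G' = SGP

def encrypt(m, e):                       # c = mG' + e, with e of weight t = 1
    return [ci ^ ei for ci, ei in zip(mat_mul([m], G_pub)[0], e)]

def decrypt(c):
    v = [c[perm[i]] for i in range(7)]   # undo the permutation P
    s = [sum(h * x for h, x in zip(row, v)) % 2 for row in H]
    if any(s):                           # syndrome = column of H at error position
        v[[list(col) for col in zip(*H)].index(s)] ^= 1
    return mat_mul([v[:4]], mat_inv(S))[0]   # v[:4] = mS; unscramble with S^-1

m, e = [1, 0, 1, 1], [0, 0, 0, 0, 1, 0, 0]
assert decrypt(encrypt(m, e)) == m       # recovers m despite the injected error
```

Decryption follows Alice's three moves exactly: undo P, decode the structured code by syndrome lookup, then undo S.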
The trapdoor — what makes Goppa codes special
The choice of code matters intensely. Alice needs a code that:
- is efficient to decode when you know its structure (so Alice can decrypt fast),
- looks indistinguishable from a random linear code once its columns are permuted and a left-invertible S is applied (so the attacker cannot tell Alice's code from a random one), and
- can correct t errors, where t is chosen large enough that random decoding of a code with t errors is infeasible.
The original McEliece choice was binary Goppa codes — a family due to V. D. Goppa (1970) with an efficient decoder (Patterson's algorithm, quadratic time in the code length) and an algebraic structure that has proven resistant to distinguishing attacks for decades. Alternative codes — Reed-Solomon, Reed-Muller, LDPC, MDPC, algebraic geometry codes — have all been tried; several have been broken (e.g. Sidelnikov-Shestakov 1992 broke the Niederreiter scheme using Reed-Solomon codes). Classic McEliece, the current NIST submission, stays with binary Goppa because they are the ones with the longest track record.
Why Goppa codes survive when Reed-Solomon doesn't: a Reed-Solomon code has such a rigid algebraic structure that even after scrambling, attackers found ways to reverse-engineer the underlying polynomial structure from the public generator matrix. Binary Goppa codes are defined over a binary extension field with a more complex support set, and after scrambling their generator matrices have withstood every known distinguishing attack. The resistance is empirical rather than proved — the scheme rests on this empirical confidence.
The key-size problem
Here is the bad news. A McEliece public key is a k \times n generator matrix over \mathbb{F}_2, where n is the code length and k is its dimension. For the Classic McEliece parameter set mceliece6960119 (NIST security category 5):
- n = 6960, k = 5413, t = 119
- Public-key bit-size ≈ k (n - k) \approx 5413 \times 1547 \approx 8.4 \times 10^6 bits ≈ 1.05 megabytes.
For mceliece8192128 (the largest parameter set):
- n = 8192, k = 6528, t = 128
- Public key ≈ 1.36 megabytes.
Compare this to:
- RSA-2048 public key: 256 bytes
- ECDSA-P256 public key: 33 bytes (compressed point)
- Kyber-768 (ML-KEM-768) public key: 1184 bytes
McEliece's public key is roughly 1000× larger than Kyber's and four orders of magnitude larger than ECC. For a TLS handshake where the public key is transmitted on every connection, this is a dealbreaker. For a long-lived certificate stored once and reused for a decade, it is acceptable. For an HSM that stores ten public keys total, it is fine.
Ciphertext size is modest — around 200 bytes — and encryption/decryption are fast (microseconds on modern hardware once the key is loaded). The pathology is specifically the public key.
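The size arithmetic above is easy to reproduce — a quick sketch using the parameter values quoted in this section:

```python
# Public-key and ciphertext sizes implied by the Classic McEliece parameters.
params = {
    "mceliece6960119": dict(n=6960, k=5413, t=119),
    "mceliece8192128": dict(n=8192, k=6528, t=128),
}
for name, p in params.items():
    pk_bits = p["k"] * (p["n"] - p["k"])    # systematic generator matrix part
    ct_bytes = -(-p["n"] // 8)              # ciphertext is one length-n bit vector
    print(f"{name}: public key {pk_bits / 8 / 1e6:.2f} MB, "
          f"ciphertext {ct_bytes} bytes")
# mceliece6960119: public key 1.05 MB, ciphertext 870 bytes
# mceliece8192128: public key 1.36 MB, ciphertext 1024 bytes
```

The megabyte-scale key and the sub-kilobyte ciphertext both fall straight out of k(n - k) and n.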
HQC — a code-based KEM with smaller keys
In March 2025 NIST selected HQC (Hamming Quasi-Cyclic) as a fifth standard — a code-based KEM as backup to Kyber. HQC uses quasi-cyclic codes (structured codes with rotational symmetry) to get public keys of about 2.2-7.2 kB and ciphertexts about 4.5-14 kB — larger than Kyber but a thousand times smaller than Classic McEliece. Security reduction is to the Syndrome Decoding Problem on quasi-cyclic codes (QCSD), a narrower assumption than general decoding but one that has held up for over a decade.
HQC's value in the portfolio is precisely the same as SPHINCS+'s: diversity. If a future attack structure-breaks lattice cryptography, HQC gives NIST and its users a KEM to fall back to whose hardness assumption is unrelated to lattices.
Isogeny-based cryptography — and the 2022 collapse
The second family is more dramatic. In 2011, David Jao and Luca De Feo proposed a key-exchange protocol called SIDH — Supersingular Isogeny Diffie-Hellman — based on a construction using isogenies between supersingular elliptic curves. For a decade it was the rising star of post-quantum cryptography: tiny public keys (hundreds of bytes), elegant mathematics from number theory, strong academic enthusiasm. Its KEM variant SIKE became a Round 4 finalist at NIST.
Then, on 30 July 2022, Wouter Castryck and Thomas Decru of KU Leuven posted a paper titled An efficient key recovery attack on SIDH. It broke SIKE — the Round 4 NIST finalist — in about an hour, on a single laptop, using only classical computation. The scheme was withdrawn from NIST consideration within weeks, and an entire branch of the post-quantum tree was pruned away.
Isogenies — the gentlest possible sketch
An elliptic curve is the set of points (x, y) satisfying an equation like y^2 = x^3 + ax + b over some field. You already meet elliptic curves in ECDSA and ECDH, the classical signature and key-exchange schemes that protect most of today's internet.
An isogeny \phi : E_1 \to E_2 between two elliptic curves is a map that (a) sends points of E_1 to points of E_2, (b) respects the group structure on both curves, and (c) is a rational function (a ratio of polynomials). You can think of an isogeny as a "structure-preserving morphism" between two curves — and interestingly, there can be many different isogenies from E_1 to curves of the same size.
What SIDH actually did
In SIDH, Alice and Bob each pick a secret short walk in an isogeny graph. They exchange the endpoints of their walks (public keys) along with some auxiliary information (specifically, the images of certain torsion points). Using each other's endpoints and the auxiliary data, each recovers a common curve at the "intersection" of their walks. That shared endpoint's j-invariant becomes their shared secret.
The key sizes were beautiful:
- SIKE-p434: public key 330 bytes, ciphertext 346 bytes, security approximately NIST level 1 (128-bit classical / 64-bit quantum).
- SIKE-p751: public key 564 bytes, ciphertext 596 bytes, security approximately NIST level 3.
Compare to lattice KEMs (1-2 kB) and code-based KEMs (kilobytes to megabytes). SIKE's keys were the smallest of any serious PQC candidate. For bandwidth-constrained applications — IoT, low-power radios, space — this was hugely attractive.
The security rested on the belief that inverting the SIDH walk was computationally hard. That belief held from 2011 to 2022.
The Castryck-Decru break
On 30 July 2022, Wouter Castryck and Thomas Decru at KU Leuven published a paper titled An efficient key recovery attack on SIDH on the IACR eprint archive. Within two weeks their attack code could recover SIKEp434 private keys in 62 minutes on a single laptop core, SIKEp503 in 2 hours, SIKEp751 in 21 hours. The attack was purely classical — no quantum computer required. Microsoft's SIKE Cryptographic Challenge offered cash prizes for breaking the scheme at various levels; all were claimed within days.
What went wrong? The attack exploited the auxiliary torsion-point information that SIDH had always published. In SIDH, Alice's public key includes not just the endpoint curve E_A but also the images \phi_A(P_B) and \phi_A(Q_B) of Bob's basis points under Alice's secret isogeny \phi_A. This auxiliary data was needed to enable the shared-secret computation.
Castryck and Decru observed that the torsion-point images, taken together with a deep theorem due to Kani (1997) on isogenies between abelian surfaces (products of elliptic curves), let one recover Alice's secret isogeny in polynomial time. The technical core of the attack is a construction that uses Alice's published torsion data to build a gluing of two elliptic curves into an abelian surface, then decomposes the surface in a way that reveals the secret isogeny's degree decomposition.
The underlying isogeny problem without torsion-point information — the pure isogeny path problem — remains believed hard. But SIDH's specific design, in which the torsion images had to be published, was broken. SIKE was dead.
CSIDH — the isogeny survivor
Not all isogeny-based cryptography died in July 2022. A different scheme, CSIDH (Commutative Supersingular Isogeny Diffie-Hellman, pronounced "sea-side"), proposed by Castryck-Lange-Martindale-Panny-Renes in 2018, uses a different kind of isogeny graph — one based on the action of a commutative class group on supersingular curves defined over \mathbb{F}_p rather than \mathbb{F}_{p^2}. CSIDH does not publish torsion-point information, so the Castryck-Decru attack does not apply.
CSIDH survives, but with a price: it is thousands of times slower than Kyber. A CSIDH-512 key exchange takes about 40 milliseconds of CPU time per operation; a Kyber-768 key exchange takes under 100 microseconds. For most mainstream TLS, 40 ms per connection is a showstopper. CSIDH remains interesting for very-specific applications where tiny keys matter more than per-handshake latency — or where certain protocol features (non-interactive key exchange in the random oracle model) are needed that lattice-based schemes don't naturally provide.
CSIDH is also still under cryptanalytic scrutiny. The best known quantum attack is Kuperberg's subexponential algorithm for the hidden shift problem (e^{O(\sqrt{\log p})} time), meaning CSIDH-512 provides lower quantum security than its classical level suggests. Getting to NIST category 1 security quantumly requires CSIDH-2048 or larger, pushing key-exchange time well over 100 ms. CSIDH lives; it thrives only in niches.
Worked example — McEliece parameters
Example 1: mceliece6960119 — what the parameters buy you
Setup. The Classic McEliece NIST submission defines several parameter sets. Take mceliece6960119: a binary Goppa code with length n = 6960, dimension k = 5413, error-correction capacity t = 119. What does this give?
Step 1 — Security. Decoding a random linear code with t errors has information-set-decoding (ISD) cost that grows roughly as \binom{n}{t} / \binom{n-k}{t} iterations, with refinements (Stern, MMT, BJMM) shaving factors in the exponent. Against the best known classical and quantum ISD algorithms, mceliece6960119 meets the NIST category 5 target — comparable to breaking AES-256.
Why t = 119 specifically: for a binary Goppa code with support in \mathbb{F}_{2^{13}}, the dimension is k = n - 13t, so here k = 6960 - 13 \times 119 = 5413. Pushing t higher would shrink the dimension and exceed what the Goppa decoder can reliably correct at this length; pushing t lower would leave fewer errors for the attacker to locate and make ISD easier. The NIST submission picks t at the value where decoding capability and ISD resistance balance at the target security level.
Step 2 — Public-key size. The public key is the systematic generator matrix's non-trivial part: k \times (n-k) bits = 5413 \times 1547 = 8{,}373{,}911 bits = 1{,}046{,}739 bytes \approx 1.05 MB.
Step 3 — Ciphertext size. A ciphertext is c \in \mathbb{F}_2^n of length n = 6960 bits \approx 870 bytes. (Small! The ciphertext is genuinely compact.)
Step 4 — Private-key size. The private key stores the Goppa polynomial, the support, and the scrambling matrices: roughly 14 kB total. Modest.
Step 5 — Performance on modern hardware. On an x86 core with the liboqs reference implementation:
- Key generation: ~100 ms (one-time, per key)
- Encapsulate: ~50 μs
- Decapsulate: ~150 μs
Encapsulate/decapsulate are fast; keygen is slow. Compare Kyber-768: ~40 μs keygen, ~30 μs encap, ~20 μs decap. Classic McEliece is slower on keygen by a factor of roughly 2500×.
Step 6 — What this is good for. A web server shipping a 1 MB public key on every TLS handshake is a terrible idea. A firmware-update or backup-encryption system where devices store one long-lived encapsulation key and reuse it for a decade is a good fit. An HSM in a bank's core with a handful of long-lived keys is also fine. mceliece6960119 is not a drop-in replacement for X25519; it is a specialised tool for specialised pipes.
Result. A PQC system with NIST category-5 security, 1.05 MB public keys, 870-byte ciphertexts, fast encap/decap, and forty-eight years of unbroken cryptanalysis. The NIST Round 4 ongoing evaluation leans toward including it as a "conservative option where bandwidth is unconstrained" — a place in the portfolio rather than a mainstream slot.
Worked example — why SIKE's torsion-point leak was fatal
Example 2: the Castryck-Decru attack in outline
Setup. SIKEp434, the Round 4 finalist, uses supersingular elliptic curves over \mathbb{F}_{p^2} where p = 2^{216} \cdot 3^{137} - 1. Alice picks a secret isogeny \phi_A : E_0 \to E_A of degree 2^{216}. Bob picks \phi_B : E_0 \to E_B of degree 3^{137}. Alice's public key is (E_A, \phi_A(P_B), \phi_A(Q_B)) — the endpoint curve and the images of Bob's basis points P_B, Q_B of order 3^{137} on E_0 under Alice's secret isogeny.
The attacker's task. Recover \phi_A (equivalently, recover Alice's private key) given (E_A, \phi_A(P_B), \phi_A(Q_B)).
Step 1 — Kani's theorem. Ernst Kani proved in 1997 a theorem about pairs of isogenies (\phi, \psi) between elliptic curves: if \deg \phi + \deg \psi = N for a specific N, then \phi and \psi glue into a single isogeny between abelian surfaces (two-dimensional generalisations of elliptic curves), and the glued isogeny has a predictable kernel structure.
Step 2 — Guessing a partial walk. Castryck and Decru's algorithm first guesses that Alice's secret isogeny can be written as \phi_A = \phi_2 \circ \phi_1, where \phi_1 is a very short starting segment (small degree 2^a for some small a). They enumerate all possible \phi_1 — there are only a few thousand.
Step 3 — Building an abelian surface. For each guessed \phi_1, they construct an auxiliary isogeny \psi from E_0, matched to the remaining degree 2^{216-a}, using Bob's points P_B, Q_B and the published images \phi_A(P_B), \phi_A(Q_B). Kani's theorem says the remaining segment of Alice's isogeny and \psi should glue into an isogeny of abelian surfaces with a specific kernel.
Step 4 — Checking consistency. For the correct guess \phi_1, the glued isogeny decomposes cleanly — the (2,2)-isogeny algorithm on abelian surfaces runs in polynomial time and outputs a consistent decomposition. For wrong guesses, the decomposition fails. So the attacker enumerates \phi_1, runs the abelian-surface algorithm, and wins when a decomposition succeeds.
Step 5 — Running time. Total work: O(2^a \times \text{polynomial}). For SIKEp434, a can be chosen small enough that the whole enumeration runs in about an hour on a single laptop core. For SIKEp751, about 21 hours. No quantum computer required.
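To see why this enumeration is laptop-scale, a back-of-envelope sketch; the two constants below are illustrative guesses chosen only to show the shape of the computation, not figures from the Castryck-Decru paper:

```python
# Order-of-magnitude cost of enumerating phi_1 guesses (Step 2), with one
# abelian-surface consistency check (Steps 3-4) per guess.
# Both constants are hypothetical, for illustration only.
guess_bits = 19          # assumed size of the phi_1 search space: 2^19 guesses
check_ms = 7.0           # assumed cost of one gluing/decomposition check
total_hours = (2 ** guess_bits) * check_ms / 1000 / 3600
print(f"~{total_hours:.1f} hours on one core")   # ~1.0 hours on one core
```

The exponential factor 2^a is small by design, and everything multiplying it is polynomial — which is why the break needed neither a cluster nor a quantum computer.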
Why this was invisible to eleven years of cryptanalysis: nobody had put the pieces together. Kani's theorem dates to 1997. SIDH was proposed in 2011. The attacker's chain — "use the published torsion-point images to build an abelian surface, then use Kani to decompose it" — required a specific imaginative leap that nobody made until Castryck and Decru. The theorem, the scheme, and the attack idea existed in different communities; nobody bridged them.
What this shows. Elegant schemes with published auxiliary data are dangerous. The torsion-point images were the "hole in the fence" — information the protocol had to publish for correctness, which an attacker eventually learned to turn into a winning move. Modern post-quantum design increasingly prefers schemes where auxiliary data is minimised (or proven to not leak), and the 2022 break is cited in nearly every subsequent PQC security analysis as the cautionary example.
What survives — NIST's post-2022 portfolio
The NIST PQC portfolio as of 2025-2026:
- Lattice-based: ML-KEM (Kyber) for key exchange, ML-DSA (Dilithium) and FN-DSA (FALCON) for signatures. The mainline.
- Hash-based: SLH-DSA (SPHINCS+) signatures. The conservative backup with hash-only assumptions.
- Code-based: HQC (2025 selection) as KEM backup to Kyber. Classic McEliece remains under Round 4 evaluation for conservative long-lived-key applications.
- Isogeny-based: none in NIST standards. CSIDH continues as a research-level alternative for specialised use cases.
The architecture is mathematical diversity as insurance. If lattices fall, HQC and Classic McEliece stay. If both lattices and codes fall, SPHINCS+ stays. If the hash assumption falls too, approximately every cryptographic system in daily use fails simultaneously and the crisis is bigger than PQC.
What India's NQM and CERT-In are tracking
India's post-quantum transition planning recognises the same diversity imperative. CERT-In's draft 2026 PQC guidance lists ML-KEM + ML-DSA as the mainstream targets for banking, Aadhaar, and UPI; notes SPHINCS+ for conservative long-horizon signatures in defence and diplomatic communications; and flags HQC and Classic McEliece as alternatives to track for niche high-assurance applications. IIT Madras's SeCCI centre publishes periodic cryptanalysis updates on lattice sieving algorithms, and IIT Kharagpur's Mukhopadhyay group has published side-channel evaluations of Classic McEliece on ARM Cortex-M4 implementations (2024) as part of the NQM's implementation-security pillar. The research community in India is small but genuinely active in keeping multiple PQC families alive.
Common confusions
- "Code-based cryptography means error correction — it's the same thing." No. Classic McEliece uses error-correcting codes (specifically, binary Goppa codes) but for cryptographic purposes — the hardness of decoding a random linear code is the security assumption, not the ability to correct errors in a noisy channel. You can think of it as "Alice publishes a decodable code disguised as a random one; the attacker can't tell which it is."
- "SIKE was dead on arrival — nobody should have trusted it." This is 20/20 hindsight. SIKE had eleven years of open cryptanalysis by a serious international community before the 2022 break. The lesson of SIKE is not that the community was careless — it is that new mathematical assumptions genuinely need decades of scrutiny, because the attack required bridging three separate fields (isogeny theory, abelian surfaces, and Kani's 1997 theorem) in a way nobody had done.
- "Code-based schemes are slow because their keys are huge." Actually they are fast at encap/decap (microseconds) but slow at keygen (hundreds of milliseconds) because the Goppa-code generator matrix has to be constructed. The key size is the problem, not computational speed. Bandwidth and storage are the constraints, not CPU.
- "Isogeny crypto is topology." It is the opposite. Isogenies are elliptic-curve morphisms — maps between solutions of cubic equations over finite fields, studied in number theory and algebraic geometry, not topology. If you hear "isogeny" in a post-quantum context, think elliptic curves and their maps, not topological spaces.
- "Since SIKE fell, all isogeny crypto is dead." No. CSIDH is structurally different (commutative group action, no torsion-point publication) and survives. Its performance is too slow for TLS but fine for niche use cases — non-interactive key exchange, certain oblivious-transfer constructions. Isogeny crypto is a smaller niche now, not a dead field.
- "Diversity is just academic overkill — NIST should have picked one winner." The 2022 SIKE break is the direct counter-argument. NIST's portfolio (lattice + hash + code) is engineered so that a single mathematical surprise does not kill every deployed PQC system at once. Diversity is the insurance premium against the next Castryck-Decru-class discovery.
Going deeper
If you understand that code-based cryptography (Classic McEliece) is the oldest unbroken public-key scheme — forty-eight years old with 1-MB keys — that NIST added HQC in 2025 as a code-based KEM backup, that isogeny-based SIKE was broken in 2022 by Castryck-Decru using classical computation and Kani's theorem, and that CSIDH survives but runs thousands of times slower than Kyber — you have chapter 161. The material below is for readers who want the McEliece decoding strategy (Patterson's algorithm), more on the structure of the 2022 break, HQC's quasi-cyclic codes, CSIDH's class-group action, and the deployment implications.
McEliece decoding — Patterson's algorithm in outline
Alice's decoding of a Goppa code uses Patterson's algorithm (N. Patterson, 1975), which runs in O(nt^2) bit operations for code length n and error capacity t. The rough steps, for a binary Goppa code with Goppa polynomial g(z) over \mathbb{F}_{2^m}:
- Compute the syndrome s(z) = \sum c_i / (z - \alpha_i) modulo g(z), where c_i is the received codeword's i-th bit and \alpha_i is the support element.
- Compute \tau(z) \equiv \sqrt{s(z)^{-1} + z} \pmod{g(z)}, then use the extended Euclidean algorithm to find low-degree polynomials a(z), b(z) with a(z) \equiv b(z)\,\tau(z) \pmod{g(z)}; the error-locator polynomial is \sigma(z) = a(z)^2 + z\,b(z)^2.
- The roots of the error-locator polynomial identify the error positions; flip those bits.
Without knowledge of the Goppa polynomial g(z) and the support \alpha_1, \ldots, \alpha_n, this is infeasible — and those are exactly what Alice's private key holds and the public key's scrambling hides.
Information-set decoding — the best generic attack
The best known generic algorithm for decoding a random linear code — used as the yardstick for McEliece security — is Information-Set Decoding (ISD). The basic idea (Prange 1962, improved by Stern 1989, MMT 2011, BJMM 2012): randomly pick k coordinates as an information set; assume they are error-free; invert the restricted generator matrix to recover the candidate message; check if the resulting error has weight \le t. Success probability is about \binom{n-t}{k} / \binom{n}{k} per attempt; expected runtime scales as \binom{n}{t} / \binom{n-k}{t} with various refinements. Against mceliece6960119, the best classical ISD attacks cost upwards of 2^{250} operations; Grover-style quantum speedups roughly halve the exponent — still far out of reach.
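The Prange iteration count can be evaluated directly with log-binomials — a rough sketch that counts information-set guesses only, ignoring the per-iteration linear algebra and the savings of the refined variants:

```python
# Expected number of Prange information-set guesses for mceliece6960119,
# via log2 of the binomial ratio C(n, t) / C(n - k, t).
from math import lgamma, log

def log2_binom(n, k):
    # log2 of C(n, k) computed via log-gamma, to avoid huge integers
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)) / log(2)

n, k, t = 6960, 5413, 119
log2_iters = log2_binom(n, t) - log2_binom(n - k, t)
print(f"~2^{log2_iters:.0f} information-set guesses")
```

The exponent lands in the mid-200s, consistent with the category-5 claim; refined ISD variants and quantum search shave the exponent but do not change the verdict.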
HQC's structure
HQC uses quasi-cyclic codes over \mathbb{F}_2 — codes whose parity-check matrix is composed of circulant blocks. The quasi-cyclic structure means the generator matrix is described by a single row (plus its cyclic shifts), reducing key size from the megabyte scale of Classic McEliece to the kilobyte scale. The underlying hard problem is Quasi-Cyclic Syndrome Decoding (QCSD). Security rests on QCSD being hard; the conservative concern is that quasi-cyclic codes, being more structured, might admit attacks that random codes do not — but after more than a decade of cryptanalysis, no such attack has been found at the parameter levels HQC uses.
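The storage saving is easy to see concretely: a circulant block is determined by its first row, so only that row needs to be published. A sketch with toy dimensions:

```python
# A circulant block is fully described by its first row: each subsequent row
# is a cyclic right-shift of the previous one. Quasi-cyclic keys publish only
# the first row, turning n*n bits of matrix into n bits of key material.
def circulant(first_row):
    n = len(first_row)
    return [first_row[n - i:] + first_row[:n - i] for i in range(n)]

block = circulant([1, 0, 1, 1, 0])
for row in block:
    print(row)
# first rows printed: [1, 0, 1, 1, 0] then [0, 1, 0, 1, 1], ...
```

This is the whole trade: structure buys compactness, at the cost of a stronger (quasi-cyclic) hardness assumption than the unstructured decoding problem behind Classic McEliece.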
CSIDH's class-group action
CSIDH operates in a commutative setting: the class group of an imaginary quadratic order acts on the set of supersingular elliptic curves defined over \mathbb{F}_p (rather than \mathbb{F}_{p^2} as in SIDH). A secret is an element of the class group; applying it to a public base curve gives a public key, and two parties can combine their secrets in either order to reach the same shared curve — just like classical Diffie-Hellman in an abelian group. No torsion-point information is published.
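The commutativity this relies on can be shown with the classical Diffie-Hellman analogue the paragraph mentions — exponentiation in \mathbb{Z}_p^* standing in for the class-group action on curves. The parameters below are toy values for the demo, not from any standard:

```python
# Toy demonstration of a commutative "action": pow(base, x, p) in Z_p^*.
# CSIDH analogy: base curve <-> base, class-group element <-> exponent,
# resulting curve <-> resulting group element. Toy parameters only.
p = 2**61 - 1                     # a Mersenne prime; fine for a demo
base = 5
alice_secret, bob_secret = 123456789, 987654321

alice_pub = pow(base, alice_secret, p)   # Alice acts on the public base
bob_pub = pow(base, bob_secret, p)       # Bob acts on the public base

# Applying the two secret actions in either order reaches the same value:
assert pow(bob_pub, alice_secret, p) == pow(alice_pub, bob_secret, p)
```

SIDH's isogeny walks do not commute this way — which is exactly why it needed the torsion-point auxiliary data that proved fatal, and why CSIDH can do without it.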
The best quantum attack is Kuperberg's subexponential algorithm for the dihedral hidden shift problem. CSIDH-512 provides only about 2^{58} quantum security; reaching NIST level 1 (2^{128} quantum) requires CSIDH-2048 or larger, which multiplies the per-operation cost by tens. CSIDH's niche is non-interactive key exchange for applications where Kyber's interactive structure is inconvenient — a real but small market.
The deployment story
- Classic McEliece: used in experimental deployments — early post-quantum VPN tunnels (Mullvad) and some HSM vendors — for long-lived encapsulation keys where the 1-MB public key is distributed once and amortised over many sessions. Not expected to appear in TLS.
- HQC: NIST standardisation expected 2026-2027. Likely to appear as a TLS hybrid mode alongside Kyber, providing an alternative KEM family for diversity.
- SIKE: withdrawn from NIST; no production deployments remain.
- CSIDH: used in academic protocols (blind signatures, verifiable delay functions) where non-interactivity matters; no mainstream TLS role.
Regulatory and procurement implications
The U.S. Department of Commerce's Bureau of Industry and Security and NIST's CSRC both list NIST-standardised PQC as the baseline for federal procurement. The European Cybersecurity Agency (ENISA) takes a similar view. India's CERT-In draft 2026 guidance tracks NIST's portfolio closely but adds the explicit requirement that any critical national infrastructure deployment must use a hybrid (classical + PQC) scheme through 2028 and must have a documented migration path to any HQC-adjacent or Classic-McEliece-adjacent alternative in case of future lattice breaks. The diversity imperative is written into the procurement language.
The cryptanalysis community's culture of open review
The 2022 SIKE break was possible only because SIKE's specification, implementations, and parameter sets were fully public. Castryck and Decru published their code along with the paper. Microsoft offered cash prizes to anyone breaking SIKE at various levels; all the prizes were claimed. The community's norm of open cryptanalysis — publish everything, attack everything, trust only what survives — is what discovered the weakness in time to prevent it from being deployed at scale. The lesson generalises: post-quantum confidence requires public code, public datasets, cash incentives for breaks, and international cryptanalytic teams. Black-box or proprietary PQC schemes should be viewed with significantly more suspicion than open ones.
Where this leads next
- Post-Quantum Cryptography: Why Now — the overall migration story, NIST's four signature standards, and India's NQM transition timeline.
- Lattice-Based Cryptography — the mainline PQC family: Kyber, Dilithium, FALCON, and the LWE problem at their core.
- Hash-Based Signatures — SPHINCS+ and the conservative hash-only alternative.
- Quantum Crypto Threat Model — which primitives break, which bend, and why Shor is qualitatively different.
- Factoring and RSA — the structural weakness that motivates the entire PQC migration.
References
- Robert J. McEliece, A public-key cryptosystem based on algebraic coding theory (1978) — NASA JPL DSN Progress Report 42-44.
- Wouter Castryck and Thomas Decru, An efficient key recovery attack on SIDH (2022) — eprint.iacr.org/2022/975.
- Classic McEliece team, Classic McEliece: conservative code-based cryptography — classic.mceliece.org.
- Wikipedia, McEliece cryptosystem.
- Wikipedia, Supersingular isogeny key exchange.
- NIST, Post-Quantum Cryptography Round 4 submissions — csrc.nist.gov/projects/post-quantum-cryptography/round-4-submissions.