The cryptography we use today to secure digital certificates relies on mathematical problems that classical computers cannot solve efficiently.
RSA relies on integer factorization. Given N = p * q, where p and q are large primes, finding p and q from N alone is computationally infeasible for classical machines. The best known classical algorithm, the General Number Field Sieve, runs in sub-exponential time: exp(((64/9)^(1/3) + o(1)) * (ln N)^(1/3) * (ln ln N)^(2/3)).
ECDSA relies on the Elliptic Curve Discrete Logarithm Problem. Given a point Q = k * G on a curve E over a finite field Fp, recovering the scalar k from Q and G is computationally hard. The best classical attack (Pollard's rho) runs in O(sqrt(n)) where n is the order of the group.
Shor's algorithm changes the equation entirely. On a sufficiently large, fault-tolerant quantum computer, both integer factorization and discrete logarithm are solvable in polynomial time: O((log N)^3). A 4096-bit RSA key that would take classical computers longer than the age of the universe to factor could be broken in hours.
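To make that gap concrete, here is a rough back-of-the-envelope sketch (my own estimate, not a benchmark) comparing the order of magnitude of the GNFS cost with Shor's polynomial cost:

```python
import math

def gnfs_log10_ops(bits: int) -> float:
    """Rough log10 of the GNFS operation count for an n-bit modulus N,
    using exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3))."""
    ln_n = bits * math.log(2)  # ln N for an n-bit modulus
    exponent = (64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
    return exponent / math.log(10)

def shor_log10_ops(bits: int) -> float:
    """Rough log10 of Shor's quantum cost, O((log2 N)^3)."""
    return 3 * math.log10(bits)

for bits in (2048, 3072, 4096):
    print(f"RSA-{bits}: GNFS ~10^{gnfs_log10_ops(bits):.0f} ops, "
          f"Shor ~10^{shor_log10_ops(bits):.0f} gates")
```

The classical cost grows sub-exponentially with the key size; the quantum cost stays polynomial, which is why simply doubling key lengths does not help.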
This is no longer a theoretical hypothesis. It is a timeline.
The countdown is public
In 2024, NIST finalized its first three post-quantum cryptography standards.
FIPS 203 (ML-KEM) for key encapsulation. FIPS 204 (ML-DSA, formerly Dilithium) for digital signatures. FIPS 205 (SLH-DSA, formerly SPHINCS+) for stateless signatures.
The NSA, for its part, had already published version 2.0 of its Commercial National Security Algorithm Suite (CNSA 2.0) in September 2022. The deadlines are explicit and category-specific.
By 2030, software signing and traditional networking equipment must exclusively use CNSA 2.0 algorithms. By 2033, web services, operating systems, and remaining systems must complete the transition. The ultimate goal is full quantum resistance across all national security systems by 2035.
These deadlines do not only concern the government sector. They define the trajectory for the entire ecosystem. European regulators (ANSSI in France, BSI in Germany) are following similar paths.
"Harvest now, decrypt later": the threat starts today
The classic argument is that quantum computers capable of breaking RSA-2048 do not exist yet. This is correct.
But the threat model does not begin the day the machine exists. It begins the day data is captured.
A state-level adversary or advanced actor can intercept and store encrypted traffic today. TLS sessions. VPNs. Sensitive communications. This data is incomprehensible now, but it is patient.
The day a quantum computer becomes operational, everything captured can be decrypted retroactively.
For organizations handling long-lived data (intellectual property, trade secrets, medical records, classified information), the vulnerability window is already open.
This risk has a name: "harvest now, decrypt later". And it does not concern the future. It concerns the present.
Certificates are at the center of the transition
When discussing the post-quantum transition, the first thought is usually about communication encryption. TLS, VPN, messaging.
But X.509 digital certificates are the foundation of this entire infrastructure. They carry public keys. They authenticate servers, users, machines. They sign code, documents, timestamps.
Every certificate contains a signature algorithm. If that algorithm becomes vulnerable, the certificate proves nothing.
The post-quantum transition for a PKI is not limited to "changing the CA's algorithm." It involves multiple dimensions.
Certificate authorities must be able to issue with new algorithms. Certificate templates must support new key sizes and new OIDs. Clients (browsers, applications, systems) must be able to validate these certificates. Trust chains must be rebuilt or extended.
And all of this must happen progressively, without breaking what already exists.
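The algorithm dependency is easy to see in practice. A sketch using the Python `cryptography` package (a throwaway self-signed ECDSA certificate is generated so the example is self-contained):

```python
import datetime

from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.x509.oid import NameOID, SignatureAlgorithmOID

# Build a throwaway self-signed ECDSA certificate for illustration.
key = ec.generate_private_key(ec.SECP256R1())
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "pqc-demo")])
now = datetime.datetime.now(datetime.timezone.utc)
cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=1))
    .sign(key, hashes.SHA256())
)

# The signature algorithm is baked into the certificate: if it is ever
# broken, the certificate proves nothing.
print(cert.signature_algorithm_oid.dotted_string)  # 1.2.840.10045.4.3.2
print(cert.signature_algorithm_oid == SignatureAlgorithmOID.ECDSA_WITH_SHA256)
```

A cryptographic inventory is, at its core, this lookup performed across every certificate in the organization.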
Three migration strategies, not one
The cryptographic community has identified several approaches for the transition. Their maturity varies. Some are standardized, others still in draft. Each addresses different constraints.
Classical certificates with PQC algorithms
The most direct approach. Replace RSA or ECDSA with ML-DSA or SLH-DSA in certificates. The X.509 structure remains identical, only the algorithm changes.
The advantage is conceptual simplicity. The drawback is that clients that do not yet support these algorithms will reject the certificate outright. It is a big-bang migration.
Related certificates (RFC 9763)
This approach issues two linked certificates for each identity. A classical certificate (RSA or ECDSA) and a post-quantum certificate (ML-DSA). The two are bound via a RelatedCertificate extension that contains a hash of the paired certificate, creating a verifiable cryptographic link.
The client chooses the certificate it can validate. If it supports PQC, it uses the post-quantum certificate. Otherwise, it falls back to the classical one. The transition is progressive and backward-compatible.
This approach requires that the PKI infrastructure can manage pairs of CAs (one classical and one PQC) and issue both certificates atomically.
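The binding check itself is simple in principle. A sketch of the idea, assuming a SHA-256 hash over the paired certificate's DER bytes (RFC 9763 defines the exact structure, and the placeholder bytes below stand in for real certificates):

```python
import hashlib

def related_link_ok(related_hash: bytes, paired_cert_der: bytes) -> bool:
    """Check the RelatedCertificate binding: the hash carried in one
    certificate must match the paired certificate's bytes.
    Sketch only: assumes SHA-256 over raw DER."""
    return hashlib.sha256(paired_cert_der).digest() == related_hash

# Hypothetical DER bytes standing in for the classical certificate.
classical_der = b"<classical certificate DER>"
link = hashlib.sha256(classical_der).digest()

print(related_link_ok(link, classical_der))       # True: the pair is bound
print(related_link_ok(bytes(32), classical_der))  # False: wrong hash
```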
Composite certificates (IETF draft)
A single certificate contains two keys and two signatures, one classical and one post-quantum, combined under a single composite OID.
The certificate is valid only if both signatures verify. Security is maximal: even if one of the two algorithms is broken, the other still protects.
Composite OIDs like id-MLDSA44-ECDSA-P256-SHA256 or id-MLDSA65-ECDSA-P384-SHA512 are currently in draft status at the IETF. They are not yet finalized and may evolve.
This approach is conceptually the most robust but also the least mature. It requires clients to understand composite OIDs, and the lack of final standardization currently limits production adoption. It remains relevant for pilots and R&D.
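The verification rule can be sketched as follows (the exact encoding of the composite signature value is defined in the draft; the verifier callbacks here are toy stand-ins):

```python
from typing import Callable

Verifier = Callable[[bytes, bytes], bool]  # (tbs_bytes, signature) -> bool

def verify_composite(tbs: bytes, sig_mldsa: bytes, sig_ecdsa: bytes,
                     verify_mldsa: Verifier, verify_ecdsa: Verifier) -> bool:
    """Both component signatures must verify over the same to-be-signed
    bytes, so a forger has to break BOTH algorithms."""
    return verify_mldsa(tbs, sig_mldsa) and verify_ecdsa(tbs, sig_ecdsa)

# Toy stand-ins for real ML-DSA / ECDSA verification:
always_ok = lambda tbs, sig: True
always_bad = lambda tbs, sig: False

print(verify_composite(b"tbs", b"s1", b"s2", always_ok, always_ok))   # True
print(verify_composite(b"tbs", b"s1", b"s2", always_ok, always_bad))  # False
```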
Alternative signatures (X.509 AltSignature extensions)
A fourth approach uses the altSignatureAlgorithm and altSignatureValue extensions defined in ITU-T X.509 (2019 edition). The certificate carries a primary signature with one algorithm (e.g. ECDSA) and embeds a second, alternative signature with another algorithm (e.g. ML-DSA) directly in the X.509 extensions.
Unlike composite certificates, AltSignature does not define a new combined OID. It reuses standard X.509 extension mechanisms. A client that understands the extensions validates both signatures. A client that does not simply ignores the extensions and validates the primary signature alone.
This makes AltSignature inherently backward-compatible: existing clients continue to work, while upgraded clients get the benefit of post-quantum verification.
However, this approach introduces a subtle risk. If a client validates only the primary (classical) signature and ignores the alternative signature entirely, the post-quantum protection is effectively absent. The certificate appears hybrid but provides only classical security. Worse, an attacker who can forge a classical signature could craft a certificate that passes validation on clients that only check the primary signature. The security guarantee depends entirely on the client implementation, which is harder to enforce across a heterogeneous ecosystem.
AltSignature is currently supported in some PKI tooling and is being evaluated by several standards bodies. It offers a pragmatic migration path but requires careful attention to client-side validation behavior.
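One way to close that gap is to make the client-side policy explicit. A minimal sketch of the decision logic (function and parameter names are mine):

```python
def hybrid_cert_accepted(classical_ok: bool, alt_present: bool, alt_ok: bool,
                         client_supports_pqc: bool) -> bool:
    """Validation policy for an AltSignature certificate.
    A PQC-aware client must REQUIRE the alternative signature to verify
    whenever it is present; silently ignoring a failing alt signature
    reopens the classical-forgery downgrade described above."""
    if client_supports_pqc and alt_present:
        return classical_ok and alt_ok
    # Legacy client: only the primary (classical) signature is checked.
    return classical_ok

print(hybrid_cert_accepted(True, True, True, True))    # True
print(hybrid_cert_accepted(True, True, False, True))   # False: downgrade blocked
print(hybrid_cert_accepted(True, True, False, False))  # True: legacy behavior
```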
The cryptographic inventory: non-negotiable first step
Before migrating anything, you need to know what you have.
How many certificates are active in the organization. What algorithms and key sizes are used. Which certificate authorities issued them. When they expire. What systems depend on them.
This cryptographic inventory seems trivial. In practice, it is rarely complete.
Certificates live in multiple places. Web servers, load balancers, databases, internal APIs, IoT devices, workstations, smartcards, HSMs, Java keystores, Kubernetes secrets.
Without an exhaustive inventory, any migration planning is approximate. You cannot prioritize what you do not know.
A complete cryptographic inventory answers a simple question: what percentage of our certificates uses quantum-vulnerable algorithms, and when do they expire?
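Once the signature OIDs have been collected, the classification step is straightforward. A sketch (the classical OIDs are the standard RSA/ECDSA signature OIDs; the ML-DSA OIDs are the NIST CSOR values; the inventory list is made-up sample data):

```python
# Standard RSA/ECDSA signature algorithm OIDs (quantum-vulnerable).
QUANTUM_VULNERABLE = {
    "1.2.840.113549.1.1.11",  # sha256WithRSAEncryption
    "1.2.840.10045.4.3.2",    # ecdsa-with-SHA256
    "1.2.840.10045.4.3.3",    # ecdsa-with-SHA384
}
# NIST CSOR OIDs for ML-DSA (post-quantum).
POST_QUANTUM = {
    "2.16.840.1.101.3.4.3.17",  # id-ml-dsa-44
    "2.16.840.1.101.3.4.3.18",  # id-ml-dsa-65
    "2.16.840.1.101.3.4.3.19",  # id-ml-dsa-87
}

def pqc_coverage(sig_oids: list[str]) -> float:
    """Percentage of certificates whose signature OID is quantum-vulnerable."""
    vulnerable = sum(oid in QUANTUM_VULNERABLE for oid in sig_oids)
    return 100 * vulnerable / max(len(sig_oids), 1)

# Made-up inventory: three RSA certificates, one ML-DSA-65 certificate.
inventory = ["1.2.840.113549.1.1.11"] * 3 + ["2.16.840.1.101.3.4.3.18"]
print(f"{pqc_coverage(inventory):.0f}% quantum-vulnerable")  # 75% quantum-vulnerable
```

Joining this with expiry dates then gives the migration planning view: which vulnerable certificates roll over naturally, and which need forced renewal.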
Crypto-agility: designing for change
Crypto-agility is the ability of an infrastructure to change its cryptographic algorithm without a major overhaul.
In theory, everyone agrees. In practice, very few PKI infrastructures are crypto-agile.
A CA that only supports RSA is not crypto-agile. A certificate template that enforces a single algorithm is not crypto-agile. An issuance workflow that assumes a fixed key format is not crypto-agile. A certificate lifecycle management (CLM) platform that cannot orchestrate multiple types of CAs is not crypto-agile.
Crypto-agility is built at multiple levels.
At the CA level, authorities must be able to issue with multiple algorithm families. At the CLM level, the platform must be able to route requests to the right CA based on the required algorithm. At the protocol level, ACME and EST must support new key types. At the inventory level, you need to visualize the algorithmic distribution and plan the migration.
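The CLM-level routing is easy to picture. A sketch, with hypothetical CA names and algorithm labels:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IssuingCA:
    name: str
    algorithms: frozenset

# Hypothetical CA fleet: one classical, one post-quantum.
CAS = [
    IssuingCA("corp-classical-ca", frozenset({"rsa-3072", "ecdsa-p256"})),
    IssuingCA("corp-pqc-ca", frozenset({"ml-dsa-65", "slh-dsa-sha2-128s"})),
]

def route_request(algorithm: str) -> IssuingCA:
    """Send the issuance request to the first CA that supports the
    requested algorithm; fail loudly rather than downgrade silently."""
    for ca in CAS:
        if algorithm in ca.algorithms:
            return ca
    raise ValueError(f"no CA supports {algorithm}")

print(route_request("ml-dsa-65").name)   # corp-pqc-ca
print(route_request("ecdsa-p256").name)  # corp-classical-ca
```

Adding a new algorithm family then means registering a new CA entry, not rewriting the issuance workflow.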
The ecosystem is not ready. And that is normal
The algorithms are standardized. But between a NIST standard and an ML-DSA certificate validated by a browser in production, there is an entire ecosystem that must evolve.
Today, the reality is as follows.
- OpenSSL 3.5 (released April 2025) includes native support for ML-KEM, ML-DSA, and SLH-DSA, with a hybrid ML-KEM key share enabled by default in TLS 1.3. However, most production systems still run older versions that require the OQS provider for any PQC support.
- Browsers support hybrid ML-KEM key exchange in TLS, but do not yet validate PQC signatures in X.509 certificates.
- HSMs from a few vendors are starting to support ML-DSA, often through specific firmware that is not widely deployed.
- Languages and frameworks (Java, Go, Python) have experimental libraries, not native support in their standard libraries.
- Network equipment, load balancers, and reverse proxies do not validate post-quantum signatures.
- Automation protocols like ACME and EST do not yet specify PQC profiles.
This gap between algorithm maturity and ecosystem maturity is normal. It has occurred for every previous cryptographic transition: SHA-1 to SHA-2, RSA-1024 to RSA-2048, TLS 1.0 to TLS 1.2.
This does not mean you should wait. It means the current phase is preparation, not mass deployment. Inventory, architect for crypto-agility, launch internal pilots on non-critical scopes. When the ecosystem is ready, organizations that have prepared their PKI will be able to switch. The others will have to do everything in a rush.
What the transition will concretely require
For PKI teams, the post-quantum transition will translate into concrete actions.
Create new certificate authorities with PQC algorithms, in parallel with existing classical CAs. Define certificate profiles that support new algorithms while maintaining backward compatibility. Set up a dual or composite issuance mechanism for cases where backward compatibility is needed. Adapt trust chains so clients can validate the new CAs. Automate renewal so the switchover happens progressively, certificate by certificate, as they expire. Monitor PQC coverage in the inventory to track migration progress.
This is not a one-time project. It is a progressive transformation that will span several years.
Mistakes to avoid
Several mistakes are predictable in this transition.
Waiting for standards to be "final". The first three FIPS are finalized. Composite OIDs are in progress but stable enough for pilots. Waiting for perfection means accumulating technical debt.
Treating the migration as a crypto-only project. It is an infrastructure project. It involves PKI teams, DevOps teams, security teams, and application teams.
Trying to migrate everything at once. The transition will be progressive. Start with the least critical internal certificates, validate the workflow, then extend.
Ignoring the inventory. Without a complete map of existing certificates, planning is impossible. This is the first step, non-negotiable.
Forgetting automation. The transition will multiply certificate types and workflows. Without ACME or EST to automate, the operational load will be unsustainable.
The right time to start
The question is not whether the post-quantum transition will happen. Standards are published. Timelines are set. Regulators are moving.
The question is whether your PKI is ready to absorb it.
It starts with three actions.
Map the cryptographic inventory of your active certificates. Assess the crypto-agility of your current PKI infrastructure. Launch a pilot on a limited scope with PQC or hybrid certificates.
Organizations that start now will have time to migrate progressively. The others will have to do it under pressure when regulatory deadlines arrive.
And under pressure, you do not build good architectures.
References
- FIPS 203: ML-KEM (NIST, August 2024)
- FIPS 204: ML-DSA (NIST, August 2024)
- FIPS 205: SLH-DSA (NIST, August 2024)
- RFC 9763: Related Certificates for Use in Multiple Authentications (IETF, 2025)
- Composite ML-DSA Internet-Draft (IETF LAMPS WG)
- CNSA 2.0 Algorithm Suite (NSA)
- ANSSI position on the post-quantum transition (ANSSI)
