Plans and timeline for post-quantum cryptography
Adrien Nader
adrien.nader at canonical.com
Thu Feb 27 15:53:35 UTC 2025
Hello,
Post-quantum cryptography (PQC) is only three words, but it is a very
wide topic. There are many algorithms and protocols, and they come with
different trade-offs compared to classical cryptography.
We want to enable users to experiment with PQC early on, and we have
received expressions of interest from various customers. We also want
to allow production workloads when relevant.
The big question is: with what? Which algorithms, which protocols,
which implementations?
NB: much of this is tentative, not a set of guarantees
# Motivations, challenges and landscape
Unfortunately, there is no perfect and simple answer to these questions.
The algorithms are fairly recent, research is still ongoing, and
implementations are new and sometimes still unoptimized.
Moreover, the new algorithms or protocols come with either larger
private keys, larger public keys, larger signatures, higher computing
requirements for signing, or higher computing requirements for
verification.
Or several of these at once, if not all.
So far, asymmetric cryptography appears to be the most affected, while
symmetric cryptography and hashes do not appear to be affected as much:
increasing key and output sizes should be enough for them.
Following a years-long competition, NIST has standardised one Key
Encapsulation Mechanism (ML-KEM, roughly similar to a key exchange) and
two digital signature algorithms (ML-DSA and SLH-DSA; a third, FN-DSA,
is expected). Implementations exist and are deployed, especially in the
web browsers and servers of tech giants.
The reason to move forward now is to protect against "harvest now,
decrypt later" schemes: there is no quantum computer capable of
breaking classical cryptography at the moment, but there will probably
be one in 20 or 30 years. At that time, lots of today's data will still
be relevant. This also explains why the current focus is on session
establishment.
Like everyone, we don't have a crystal ball and there will be bumps in
the road, but not moving would certainly be more problematic.
Deployments should favor hybrid crypto (i.e. PQ + non-PQ crypto), which
is what current deployments already do.
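As a rough sketch of what hybrid crypto means in practice: hybrid key
exchanges combine the shared secrets from a PQ KEM and a classical key
exchange into a single session secret through a key derivation step, so
that an attacker has to break both. The snippet below is illustrative
only, not any specific protocol; the function name and the use of plain
SHA-256 are my own simplification (real protocols such as TLS 1.3 use
their own HKDF-based key schedule).

```python
import hashlib
import os

def combine_shared_secrets(pq_secret: bytes, classical_secret: bytes) -> bytes:
    """Derive one session secret from both key exchanges.

    An attacker must recover BOTH inputs to compute the output, which
    is the point of hybrid deployments.
    """
    # Stand-in KDF: a real protocol would feed the concatenation into
    # its own key schedule (e.g. HKDF in TLS 1.3).
    return hashlib.sha256(pq_secret + classical_secret).digest()

# Stand-ins for secrets produced by e.g. ML-KEM and X25519.
pq_ss = os.urandom(32)
ec_ss = os.urandom(32)
session_secret = combine_shared_secrets(pq_ss, ec_ss)
print(len(session_secret))  # 32
```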
# Changes expected in 25.10
Ubuntu 25.10 will get improved support with openssl 3.5 which is
currently feature-frozen and will be an LTS version.
There is a recent openssl blog post which shows the development status
for various PQC algorithms:
https://openssl-library.org/post/2025-02-12-openssl-3.5-go-nogo/
I've talked about the choice of openssl versions in the past and there
is very good news: openssl is changing its release schedule, as
detailed in https://openssl-library.org/post/2025-02-20-openssl-3.5-lts/
I think this is great for everybody: a better match between release
cycles, less work, and more updates (I detailed the reasons last year
while advocating for the release schedule that has now been adopted:
https://github.com/openssl/general-policies/issues/60 ).
In practice, this means that:
- interim Ubuntu releases will have the latest openssl major versions
within six months (mostly because of how respective feature freezes
align)
- LTS Ubuntu releases will have the same openssl major version as the
interim Ubuntu release right before them since that will be an LTS
version too
I'm very happy and thankful to the openssl project for that schedule
change.
# Longer-term
## New competitions and upcoming standards
PQC is a fast-changing field. Algorithms may turn out to be broken, and
another NIST competition is ongoing to select additional algorithms
based on different mathematical problems, in order to avoid having all
our eggs in one basket. Moreover, some of these have better size or
performance characteristics than the ones currently selected. However,
they are newer and have not received as much scrutiny, which is why
they were not selected during the first competition. Only time will
tell what comes out of this other competition.
In any case, 26.04 and its openssl version will be released before
anything comes out of the second competition. While it is possible that
26.04 sees large feature SRUs, it's definitely not likely. Even
without thinking about SRUs for 26.04, 28.04 will also probably need to
stick to openssl 3.5 or 3.6.
João Gomes pointed out that the SHA-3 competition and the transitions
surrounding it (e.g. how other algorithms stopped being used) could
tell us how things may go for PQC. I don't have specific memories of
that period, but there is also an interesting parallel with SHA-3
having been selected not because SHA-2 was broken, but in order to have
an alternative if SHA-2 were to become broken.
## liboqs as a backup implementation of algorithms
There is also another implementation of PQC named liboqs. Upstream is
debating whether it is purely a research project or an implementation
fit for production (*), but one thing is sure: updating it doesn't have
the same constraints as updating openssl, and it can be used to provide
algorithms to openssl.
My long-term plan is to have liboqs alongside openssl in Ubuntu and use
liboqs to offer the algorithms that the version of openssl doesn't
handle rather than update openssl.
(*) I can't tell if this is a bias from the main developer, who seeks
perfection; as far as I'm concerned, software is rarely perfect from
the beginning and I appreciate when authors are open about the
limitations of the software they write and publish.
This will also need to involve oqs-provider, which implements the link
from liboqs to openssl. Upstream is not very happy about its current
state, but hopefully it can be improved.
Thanks to the openssl release schedule change, it is not very likely
that liboqs would be needed as a fallback for actual deployments.
It will however remain useful for testing new algorithms since it comes
with fewer deployment constraints.
Another possibility that João Gomes mentioned to me is to take
advantage of openssl's provider architecture to use a provider from
another version. This is probably a tough topic, but it is definitely
an interesting one.
## Cryptographic agility in released Ubuntu versions with crypto-config
Finally, there is also an interesting question about disabling
algorithms that lose trust without necessarily being fully broken.
Disabling algorithms is allowed in security updates but not in SRUs.
It is most likely that some of these will be broken: the field is too
new for everything to hold perfectly. Thus the question: what would be
the policy to disable algorithms that lose trust but are not yet
catastrophically broken?
For that part, I'm thinking that crypto-config will be useful. I
recently published its specification:
https://discourse.ubuntu.com/t/spec-crypto-config-a-framework-to-manage-crypto-related-configurations-system-wide/54265
It won't give policy answers but it should help on the technical
aspects. At its core, crypto-config uses profiles, including one named
`default` and others which will be `modern`, `legacy`, `fips`, ...
(names are not final; they are only meant to give an idea of their
purposes).
We could keep `default` unchanged, therefore not changing users'
experience compared to the current situation, but change `modern` so
that it keeps tracking the best practices of the time.
It is also possible to have profiles such as `modern-proposed` which
would contain the changes in advance so that people can test them more
easily.
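To illustrate the idea (purely hypothetical: the actual crypto-config
format and profile contents are defined in the spec linked above, and
the algorithm names below are made up), a profile is essentially a
per-algorithm policy, so `modern` can diverge from `default` without
touching it:

```python
# Hypothetical illustration only; not the real crypto-config format.
# Each profile maps algorithm names to whether they are allowed.
PROFILES = {
    "default": {"ml-kem-768": True, "x25519": True, "legacy-alg": True},
    # `modern` tracks best practices: the weakened algorithm is off.
    "modern":  {"ml-kem-768": True, "x25519": True, "legacy-alg": False},
}

def allowed(profile: str, algorithm: str) -> bool:
    """Return whether a given profile permits a given algorithm."""
    return PROFILES[profile].get(algorithm, False)

print(allowed("default", "legacy-alg"))  # True: users see no change
print(allowed("modern", "legacy-alg"))   # False: distrusted algorithm off
```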
Since the crypto-config specification is pretty long, I recommend
starting with the README.md file in the crypto-config repository:
https://github.com/canonical/crypto-config
PS: I mentioned tradeoffs earlier on. These are basically size and CPU
time.
Head over to the Post-Quantum signatures zoo:
https://pqshield.github.io/nist-sigs-zoo/ and look at the plot: the
blue dots in the bottom left are non-PQC signature algorithms. The
scale is logarithmic; some of the signatures are massive. You can also
scroll down and look at the performance metrics: almost all algorithms
are much slower than classical ones.
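To make the size trade-off concrete, here are approximate public-key
and signature sizes in bytes, based on the FIPS 204 (ML-DSA) parameter
sets and common classical algorithms. Treat these as ballpark values;
the zoo linked above has exact, up-to-date numbers.

```python
# algorithm: (public key bytes, signature bytes) -- approximate values
SIG_SIZES = {
    "Ed25519":    (32, 64),
    "ECDSA-P256": (65, 64),      # uncompressed public key
    "RSA-2048":   (256, 256),    # modulus size only, exponent omitted
    "ML-DSA-44":  (1312, 2420),  # FIPS 204, security category 2
    "ML-DSA-65":  (1952, 3309),  # FIPS 204, security category 3
}

for name, (pk, sig) in SIG_SIZES.items():
    print(f"{name:>10}: pk={pk:>4} B  sig={sig:>4} B")
```

Even the smallest standardised PQ signatures are more than an order of
magnitude larger than Ed25519's, which matters for handshakes and
certificate chains.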
--
Adrien Nader
More information about the Ubuntu-devel-discuss
mailing list