Podcast | PQC Standards Arrive! What You Need to Know — with NIST

They’re here! After a seven-year process, NIST has finally released the first standards for post-quantum cryptography. The industry is excited and optimistic about the future, but what does it mean for you and your organization? In an interview recorded right before the August 13 release, you’ll hear how we got here with the quantum computing threat, what standards were selected and what additional ciphers are coming in the future. You’ll also get a sense of how soon you may need to start implementing the new Key Encapsulation Mechanisms (KEMs) and Digital Signatures. Join host Konstantinos Karagiannis for a wide-ranging PQC primer chat with mathematician Dustin Moody from NIST.

Guest: Dustin Moody from NIST

The Post-Quantum World on Apple Podcasts

Quantum computing capabilities are exploding, causing disruption and opportunities, but many technology and business leaders don’t understand the impact quantum will have on their business. Protiviti is helping organizations get post-quantum ready. In our bi-weekly podcast series, The Post-Quantum World, Protiviti Associate Director and host Konstantinos Karagiannis is joined by quantum computing experts to discuss hot topics in quantum computing, including the business impact, benefits and threats of this exciting new capability.


Dustin Moody: I expect industry will largely follow what the government does. These algorithms will be required for government use, and industry likes to sell products to the government. Then, once you already have those products with these algorithms in place, and you know there’s a need for them — the harvest-now, decrypt-later threat, protecting against a quantum computer.

Konstantinos Karagiannis: They’re here. After a seven-year process, NIST has finally released the first standards for post-quantum cryptography. The industry is excited and optimistic about the future, but what does it mean for you and your organization? In an interview recorded right before the release, you’ll hear from NIST how we got here, what standards were selected and what additional ciphers are coming in the future. You’ll also get a sense of how soon you may need to start implementing the new KEMs and signatures.

We explain it all in this episode of The Post-Quantum World. I’m your host, Konstantinos Karagiannis. I lead Quantum Computing Services at Protiviti, where we’re helping companies prepare for the benefits and threats of this exploding field. I hope you’ll join each episode as we explore the technology and business impacts of this post-quantum era.

Our guest today is a mathematician at NIST, Dustin Moody. Welcome to the show.

Dustin Moody: Thanks. Glad to be here.

Konstantinos Karagiannis: We’ve talked numerous times on the show about the coming quantum threat, so listeners are aware of this. NIST ultimately decided to do something about it years ago. Maybe we could start with you walking us through a timeline of the NIST process for standardization and what key milestones have happened.

Dustin Moody: It’s been quite a while now, but back in around 2015 is when we started getting some momentum for this. We had an internal working group that was studying PQC and doing some research and going to the workshops in the field. But we held our own workshop in 2015 where we had academia, industry and government come together and give some talks. We had some discussions, and that got the ball rolling. A few months later, the NSA put out a statement about how they were looking to NIST to come up with some PQC standards. That further got interest from a lot of people — caught their attention. We officially announced we would be doing this competition-like process — everyone calls it the PQC competition.

About a year later, in February 2016, at PQ Crypto, which is the main workshop in the field, we got a lot of strong support from the crypto community, which was excited to have this venue where they could do a lot of research. They knew it was very high-impact. About a year later, we had this submission deadline, November 30, 2017. We had 82 submissions come in — 69 met the requirements. That kicked off the formal part, where we had the first round of algorithms being evaluated — some being attacked. We chose a smaller number to move on into a second round and then into a third round.

Along the way, we had workshops. We issued reports on how we selected the ones we did to advance on. But that did come to an end in July 2022, when we announced the four algorithms we would be standardizing as a result of the process. That was an exciting day. Within the next week or two, we should hopefully publish the final versions of the standard, which will be probably the biggest milestone of this whole project.

Konstantinos Karagiannis: As we’re recording this, the standards aren’t out yet, but by the time it gets posted, they probably will be. Enjoy, listener of the future. They’re with you, most likely.

Dustin Moody: It’s out of our hands. It goes up to the secretary of commerce, who has to sign things, and we can’t tell the secretary what to do. But we’re in the final stages, and it should be just a little bit longer, which now is in the past.

Konstantinos Karagiannis: There are no real surprises coming, probably.

Dustin Moody: No. We posted draft versions of the standards for the public to see. We got comments and feedback. We held a workshop two months ago outlining, “Here are the small changes we made as a result of that feedback.” We don’t want any surprises. It’s very much like the draft versions we posted.

Konstantinos Karagiannis: It’s, like, “Here’s this cipher you never heard of. Enjoy. Good luck with that.”

Dustin Moody: We don’t want that at all.

Konstantinos Karagiannis: NIST did a great job leading the charge here on post-quantum cryptography. What were some of the biggest challenges in the process? It’s intense. Cryptography is hard. That’s the only true thing anyone can ever say about cryptography — it’s hard. What kinds of challenges were involved with that?

Dustin Moody: There were a huge number of challenges. We could spend a long time talking about this. NIST has done some cryptography competitions like this in the past, but this one was way more complicated than when we did AES or when we did SHA-3. The field of post-quantum cryptography is still an active area of research, with new results that could be coming out. It’s coming from a variety of different mathematical backgrounds, not just one single area.

Cryptographically, there are things we had to deal with that we haven’t had to in the past. Some of the encryption algorithms have decryption failures: even if you implement everything correctly, decryption can occasionally fail, and protocols aren’t built to handle that. There were technical challenges like that.

We’re dealing with quantum computers, which don’t yet exist, but we have to try and protect against them, which makes it a challenge when you’re designing a cryptosystem to select parameters and say they’re going to defend against this machine that does not yet exist. We don’t know how fast it will run. We don’t know how expensive it will be to operate. That was definitely a challenge, as well as how to define security against a quantum computer. We put forward some definitions at the start of the process. There’s no universally agreed-on way to say, “Here’s how much security is being provided by an algorithm.”

Throughout the process, we had algorithms that were broken, some late in the process. That’s part of the process. We expected that to happen. But a lot of people thought it was very notable that an algorithm could go four or five years under intense study and then be broken. They worried: Is that a problem? Can we trust the algorithms that are still in the process?

The PQC forum was certainly more challenging than we thought it might be. It’s a Google group we use for announcements. We would post things there; submitters could talk about their algorithms and ask questions. But as it’s a public forum, many people had strong opinions and personalities and would often disagree with each other. We tried to keep it a civil and polite conversation, but we can’t control everyone. That was a very challenging arena to manage as best as we could. Those are some of the big challenges we had. We knew it would be challenging, and it certainly was over the past several years.

Konstantinos Karagiannis: There were a few moments where everyone got worried, with Kyber particularly. There was that infamous alleged hack, which was weird because it wasn’t the final code — it was an implementation protected with high-order masking, and AI was used to attack that. That was controversial. Then, of course, there was the paper a little while ago claiming a quantum computer could break it, but that result fell apart: the claimed quantum attack on learning with errors didn’t hold up. There’s a lot going on there still.

Dustin Moody: That was something newer we’ve seen as well — the media was quick to jump on some of these things. A paper would be published, and instantly, there were stories about it — people asking, “Is this all in vain? Has PQC been broken?” We had to watch the media as well and make sure people understood what was fact and what was fiction. And we needed to give cryptographers time to study the results in these papers, which takes more than a day.

Konstantinos Karagiannis: It’s fascinating. You’ll see 300 articles the minute someone claims they broke Kyber. But when a paper comes out explaining how it’s not broken, I don’t think I see one article, not even one.

Dustin Moody: That’s not as exciting to post. That’s true.

Konstantinos Karagiannis: It’s, like, “It’s not a disaster. Next story.”

What’s a realistic timeline for widespread adoption here now that these standards are coming? What are some of the hurdles organizations are going to have to face? Legacy infrastructure is an obvious one.

Dustin Moody: From past cryptographic transitions we’ve seen, it’s never a very fast process. Even if an algorithm is known to be weak, it still takes years to get off that algorithm and migrate to something new. We expect that with the PQC migration, and to add to that, it’s more complicated. These algorithms are bigger: The key sizes and signature sizes are larger. The math involved is more complicated for people to understand, which increases the potential for bugs in implementations.

While we certainly want to see a rapid adoption, we know it will take time. A timeline that is probably reasonable is 10 to 15 years before we see wide adoption from industry and organizations around the world. That’s because it takes time to find where you’re using cryptography and which particular cryptosystems you’re using. You need to do an inventory of your systems, and then you need to devote the resources necessary to swap out old algorithms for the new ones. Just because cryptographers say it’s important doesn’t necessarily mean the president of the organization makes it a high priority. It takes time to get the message out, and it takes time to convince people that this is indeed what needs to happen to provide the security they need. We’ll get there, but it will be slow as we make the transition.

Konstantinos Karagiannis: I feel like a hybrid approach would be easier for a lot of people to swallow — the idea that the crypto they rely on is there. It’s just got a post-quantum wrapper around it. Of course, cloud providers are starting to play around with that, and some messaging apps and things. I wonder if that would make an easier adoption for organizations if there was some in-between period.

Dustin Moody: We’ve heard a lot of positive feedback from industry that they are in favor of a hybrid approach where you’re using one of our current classical algorithms and a new PQC algorithm that will be standardized. That certainly makes a lot of sense. Security-wise, you’ve got the best of both worlds. Performance-wise, you’ll take a little hit because you’re implementing two crypto systems and not just one.

Will that work for everyone? Maybe, maybe not. NIST is not going to require or enforce hybrid. We will leave it to organizations and applications to decide what’s best for them to do. We will certainly accommodate it so you can still get FIPS validation if you’re using a hybrid technique. We have some guidance coming out on that as well, and there is already a way you can do it today. Hybrid makes sense for many applications, but maybe not for everyone. We’ll leave it to the different groups to decide for themselves.
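To make the hybrid idea concrete, here is a minimal sketch in Python of hybrid key establishment: a classical X25519 exchange and an ML-KEM encapsulation each contribute a secret, and one session key is derived from both, so the result holds up as long as either primitive does. The X25519 and HKDF calls use the widely available `cryptography` package; the ML-KEM half is a stub standing in for whichever FIPS 203 implementation you actually deploy (an assumption on our part, since no particular library is named in the conversation).

```python
# A minimal sketch of hybrid key establishment. The classical half uses
# real X25519 from the `cryptography` package; the post-quantum half is
# a STUB that simulates the KEM interface so the sketch runs. Swap in a
# real ML-KEM (FIPS 203) implementation in practice.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def mlkem_encaps_stub():
    """Stand-in for ML-KEM-768 encapsulation: returns (ciphertext, shared_secret)."""
    return os.urandom(1088), os.urandom(32)  # byte lengths match ML-KEM-768


# --- classical half: X25519 key agreement ---
client_ecdh = X25519PrivateKey.generate()
server_ecdh = X25519PrivateKey.generate()
classical_secret = client_ecdh.exchange(server_ecdh.public_key())

# --- post-quantum half: ML-KEM encapsulation (stubbed here) ---
ciphertext, pq_secret = mlkem_encaps_stub()

# --- combine: the session key stays secure if EITHER input secret is ---
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid x25519+ml-kem-768 sketch",
).derive(classical_secret + pq_secret)
print(session_key.hex())
```

Concatenating both shared secrets into one key-derivation step is the basic pattern appearing in the hybrid key-exchange drafts being discussed for protocols like TLS.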

Konstantinos Karagiannis: Some time has passed since the May 2022 NSM-10 memorandum from the White House. Can you discuss what’s required of federal agencies once the NIST standards come out? It’s a tipping point — the standards come out, and you must do all these things. Of course, inventory was already due, so that’s already happening with federal agencies. We mentioned that. Can you talk about the steps involved there?

Dustin Moody: The White House put out two national security memos. One was NSM-8 and one was NSM-10, which you referenced. NSM-10, in particular, focused on crypto-agility. It said NIST needed to create a working group with industry. We’d already been doing that: As the memo was being written, we had our NCCoE migration-to-PQC working group, where we partnered with industry to work toward as seamless a migration as possible. CISA was tasked with coordinating with agencies and with critical infrastructure.

You mentioned the inventory. That was pointed out — agencies needed to do an inventory, and more guidance would be coming out on how to do that because it’s a very complicated thing. Ninety days after the standard is published, NIST has to give out a proposed timeline for deprecation. That won’t be a concrete timeline yet. It’ll still take time to get there. But within 90 days of the standard, we will put out some guidance on, “Here’s a rough outline of when you can expect to need to deprecate.” It won’t be sudden — it won’t be rushed.

The memo also said the NSA needed to provide guidance for national security systems, which they have put out: the Commercial National Security Algorithm Suite 2.0, with documents, FAQs and some timelines. This was followed up with a memo from the OMB that gave federal agencies more specifics and timelines — when they needed to designate a lead, what date the inventory needed to be done by, where to send it, and things like that. Those were the key actions agencies were required to take under these memos.

Konstantinos Karagiannis: That sounds, in a way, ancient now. This memo came out over two years ago. Do you think anything’s changed since then? We regularly hear about both advances and hiccups in achieving fault-tolerant quantum computers, and that’s what we’re talking about — like when you said earlier that quantum computers don’t exist, you meant, of course, the ones that are fault-tolerant. Would timelines for rollout be accelerated or slowed? Realistically, 2035 seems both far away and close, depending on your role.

Dustin Moody: That’s the goal that the U.S. government has put out as our target date. It’s a goal for transitioning, especially your high-assurance systems, your critical systems. That’s 11 years away, which seems like plenty of time. But as we talked about with the migration earlier, there will be many things that are not migrated at that point, so we won’t be completely done by then.

You asked if anything’s changed in the past two years. Progress has continued to be made on quantum computers, toward what’s called a cryptographically relevant quantum computer — one that would threaten current levels of security. But we haven’t seen any dramatic advances that would require us to start accelerating rapidly; as far as we can tell, progress has been steady. As for the timelines that have been coming out, like the 2035 one, the NSA has made it a little more fine-grained: If you’re doing code signing, you need to have it in place by this date; if it’s a software update, by whatever date they put. I don’t think too much has changed in the past two years, because we have not seen a need to accelerate beyond what we’d already been anticipating.

Konstantinos Karagiannis: Whenever people are making changes now to organizations, they’re doing it for a few reasons. One is that they’re worried about harvest-now, decrypt-later attacks — the idea that this data is being stored and, in the future, it’ll be relevant. Not so much credit card data, but definitely state secrets or whatever. The other is the sense that the industry will eventually force it upon you.

Let’s be real: People are told what to do. With this concept of deprecation you talked about, there will be consequences once ciphers get close to being deprecated, if you’re keeping up to date. We can expect to see this maybe even in PCI, in how they handle the DSS; version 4.0 already mentions this idea of looking for cryptographic vulnerabilities and monitoring regularly. Post-quantum is going to be lumped into that, given time. Do you think the deprecation you hinted at will not be sudden? What kind of timelines should people expect there? That’s the real difference maker. Once deprecation appears, it’s everybody’s problem, whether you believe quantum computing will succeed or not.

Dustin Moody: From this viewpoint, when we put out our timelines (the date you need to deprecate by, when RSA or Diffie–Hellman become legacy), I don’t think it will be sudden. It certainly won’t be for a few years, because we know this transition is going to take time. When we put out that 90-day guidance here in a little bit, I still don’t know if we’ll even have firm dates baked into it. But deprecation will not come for a number of years, because we know how long it takes to transition. This isn’t going to catch anyone by surprise. There will be sufficient time.

Will that mean everyone will have deprecated by that date? Hopefully. But we know not everyone always makes these deadlines. There will probably be mechanisms in place, as you alluded to: industry working groups have best practices that you need to be following. If security vulnerabilities or hacks occur because you’re not using best-practice cryptography, you could be held liable in those situations as well. There will be motivations to help you deprecate and migrate to the new algorithms, but it will still be pretty slow.

Konstantinos Karagiannis: Do you know how these required actions are going to impact the private sector? So far, we’ve been talking about how the White House has been signaling what federal agencies have to do. Do you expect private-sector regulators to basically cut and paste? Will it be, “Whatever the White House said for federal, we’re going to do the same thing for private”? Will they just lift-and-shift that idea?

Dustin Moody: I’m no expert on this, but I expect industry will largely follow what the government does. These algorithms will be required for government use, and industry likes to sell products to the government. Then, once you already have those products with these algorithms in place and you know there’s a need for them — we’re not migrating to them just for fun; there’s the harvest-now, decrypt-later threat — every organization is going to migrate to these over time to protect against quantum computers. I expect industry will use the same standards the government is using, and that both in the United States and abroad, we’ll see large adoption of these algorithms.

Konstantinos Karagiannis: You brought up a great point: It’s not just that they’ll follow the government so they don’t have to do their own research; they also want to sell stuff to the government. It’s driven by a real-world need to be in sync sooner.

Standardizing cryptography is no small feat. The N in NIST is for “national,” but this is an international problem. It’s the whole world that’s going to be facing this. Can you elaborate on any international collaboration in this process and how global adoption might be ensured? It’s, like, “What’s Germany doing?” NIST isn’t Germany. Can you tell us how that all works?

Dustin Moody: This is definitely an international problem. A quantum computer would attack cryptography no matter where you are. As NIST organized this process, we knew very much that we would want and expect international cooperation. The people designing the candidate algorithms that were sent in were a very international group of submitters — a lot from Europe, a lot from Asia, a lot from the United States and Canada. The cryptographers are worldwide; they’re international. As they were evaluating, workshops were being held all over. We’ve got a very international audience.

It was very nice to see, though, to a large degree, that national bodies and other international standards organizations didn’t want to fracture the effort that was going on with NIST leading the way. People had confidence that what was going to come out of our process would be strong algorithms. We communicated and collaborated with many of them throughout the process, and they agreed to participate in this process and wait and see what came out of it.

We talked with the IETF, we talked with ETSI — the European Telecommunications Standards Institute — ISO. We talked to many national bodies — the equivalents of NIST — in Canada, the UK, Germany, France, Japan, South Korea and I’m sure others as well, to keep them up-to-date on the process, answer any questions, and they could also update us on their viewpoint and how they were proceeding. Interestingly to us, China was even participating a little bit. Cryptographically, they always come out with their own standards, which is fine. Every country can do that. But they had researchers that submitted algorithms. They hosted one of the PQ Crypto workshops back in 2019. That was cool to see. They are doing their own internal standardization. The algorithms they selected are very similar to the ones that came out of our process as well.

But now that the standards have been announced, as we continue to talk with them, many of these agencies and standards organizations are going to be using the algorithms that were in the NIST process. Sometimes they’ll use ones we didn’t select. For example, Germany likes FrodoKEM and Classic McEliece, which were both in the third round. Classic McEliece is still in the fourth round, and we could still standardize it. And that’s just fine. If Germany wants to use those algorithms, we have no issues with that. They’re strong, secure algorithms.

Pretty much everyone likes Kyber and Dilithium and Falcon, which were the main three we selected. We expect to see those the most widely used. We see in ISO that they’re going to be standardized — in the IETF as well. We do expect a lot of global adoption because when you have a small number of algorithms, that is good for interoperability, which facilitates commerce, and that’s what we hear a lot. The feedback from industry is, they want that interoperability.

Konstantinos Karagiannis: This idea of future rounds, is it possible that as time goes on, we’re going to start seeing some of the other approaches slip in, like, all of a sudden, here’s now a multivariate, here’s a code-based, here’s an isogeny, whatever. Are we going to start seeing some other things added to the mix in the future? Will that confuse folks?

Dustin Moody: We will have new algorithms being added in. We selected the main ones, which we expect to be the primary ones used for the next few decades. We do still have a few algorithms in the fourth round, and we are going to select one or two within a few months. They are all KEMs, so that will add to our key-encapsulation portfolio to complement Kyber. We also have what’s called the on-ramp, the additional digital-signature standardization project, where we are considering more signatures. We’re at the very beginning of that process, but in four or five years, we could select one or two signatures to standardize from that as well. They could be based on multivariate or isogeny or some other families. We’ll continue to see standardization occur.

Cryptographers always adapt. If there are any attacks on algorithms that have been standardized, we need to have backup algorithms ready to go. We’re always looking for the latest research. If there’s a new solution discovered that’s significantly better, of course we’d be interested in that. Standardization will definitely continue. It is possible that confuses some people — waves of standardization. They might think they need to wait for the latest, greatest, newest algorithm. You don’t need to do that. You can roll with Kyber and Dilithium as the main two, and you should be good to go.

Konstantinos Karagiannis: Of course, like you said, there’s key encapsulation, where you’re sending something, and then you have your signatures. We’re going to be getting different types of math in each of those. Of course, Kyber is lattice-based, and that’s probably a loaded question, but lattices are near and dear to my heart. I’m speaking about them at DEF CON a few days after we record this. But that brings up a point: You might like lattices. You might like other approaches. How do you think people will make their selections? Do you think they’ll get down to the performance aspects? Do you think they’ll just be benchmarking and choosing?

Dustin Moody: We wanted to give them that capability if desired. There will be more than one algorithm standardized as a signature, and there will be more than one KEM. For some applications, there could be one algorithm that’s slightly better. For example, Falcon has smaller signatures than Dilithium by a factor of four or five, and for some applications you need that small signature. The trade-off is that Falcon’s implementation is much more complex: You need floating-point operations, and that might not work for your application. There are different trade-offs. If we end up selecting a multivariate signature as part of the on-ramp, or an isogeny signature, those have even smaller signatures. That could be a factor in your choice.

On the other hand, sometimes people want to be very conservative with security, and performance isn’t the main driver. Then you can choose the parameter sets in a higher security category, or algorithms viewed as more conservative, such as SPHINCS+; people agree that is a very conservative signature design. If we end up selecting Classic McEliece, another KEM that’s widely viewed as very conservative even though it has large public keys, that could be the right choice for your application. With that said, while you can make choices, the vast majority will be fine just using Kyber and Dilithium as the main two algorithms.
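For a rough sense of the size trade-offs Moody describes, here is a small reference sketch in Python. The byte counts are approximate figures from the round-3 submission documents (illustrative only; check the final FIPS publications for normative numbers):

```python
# Approximate sizes, in bytes, for the trade-offs discussed above.
# Figures are from the round-3 submissions; treat as illustrative.
SIGNATURE_TRADEOFFS = {
    # scheme: (public key, signature, note)
    "Falcon-512":    ( 897,  666, "smallest signatures; needs floating point"),
    "Dilithium2":    (1312, 2420, "ML-DSA; the primary recommended signature"),
    "SPHINCS+-128s": (  32, 7856, "hash-based; very conservative design"),
}

KEM_TRADEOFFS = {
    # scheme: (public key, ciphertext, note)
    "Kyber-768":               (  1184, 1088, "ML-KEM; the efficient default"),
    "Classic-McEliece-348864": (261120,   96, "huge keys, tiny ciphertexts; conservative"),
}
```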

Konstantinos Karagiannis: That’s a good point to keep hitting home, because that’ll be the first question a lot of people ask: “What is the future looking like for 90% of businesses and other users?” Kyber has some promising performance numbers. It scales well and seems to hold up even at the larger security levels: as the number of dimensions goes up in lattices, performance stays strong. I don’t think anyone will have any real problems with that. Of course, hardware always gets better. We’re building toward the future. This is one of those great moments where folks can buy new hardware for the organization’s new systems to begin their path toward PQC. I have to believe that what they buy into will be able to handle this with ease.

Dustin Moody: Kyber is very efficient. Key sizes and ciphertext sizes are a little bigger than what we’re used to, but for most applications they’re still small enough that it probably won’t make an impact. Your choice might just be which parameter set of Kyber to use. We have Categories 1, 3 and 5: Category 1 has a smaller security margin and is more efficient. That might be where you need to make your choices. We recommend Category 3 as the default; it provides a good security margin and is still very fast. But if you need to go a little quicker, you can go down to Category 1, or, on the other side, if you want more security, you can go up to Category 5.
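Here is what that parameter-set choice looks like in practice, as a minimal Python sketch using the ML-KEM sizes from FIPS 203. The `pick_param_set` helper is a hypothetical illustration of choosing the smallest set that meets a required security category, not part of any standard API:

```python
# ML-KEM parameter sets from FIPS 203:
# name -> (NIST security category, encapsulation-key bytes, ciphertext bytes)
ML_KEM_PARAMS = {
    "ML-KEM-512":  (1,  800,  768),   # smaller security margin, most efficient
    "ML-KEM-768":  (3, 1184, 1088),   # the recommended default
    "ML-KEM-1024": (5, 1568, 1568),   # largest security margin
}

def pick_param_set(min_category: int = 3) -> str:
    """Return the smallest parameter set meeting the required security category."""
    for name, (category, _ek_bytes, _ct_bytes) in ML_KEM_PARAMS.items():
        if category >= min_category:
            return name
    raise ValueError("no ML-KEM parameter set satisfies that category")

print(pick_param_set())   # ML-KEM-768: the Category 3 default
print(pick_param_set(5))  # ML-KEM-1024: when you want the extra margin
```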

Konstantinos Karagiannis: Very good point. Finding the first perfect post-quantum algorithm might be a moving target — how could we say what that is? Can these standards be future-proofed while allowing for agility? Is it just like moving through levels, like you said, because new advances are going to emerge? What does that look like? It’s hard to know. Are we going to begin this process all over again one day?

Dustin Moody: There is no perfect post-quantum algorithm. Back at the beginning, when we started this, we were looking at what we use today, and we were looking at the possible candidates that were going to be submitted. There was no perfect drop-in replacement that’s just as small, just as fast, with security just as good. We knew there were always going to be trade-offs we’d have to make as we standardized these algorithms. We try to future-proof them by making sure we have cryptographic diversity: We have things based on lattices, and we have SPHINCS+, which is based on hash-based cryptography. And we’re going to be selecting something from the fourth round that’s based on code-based cryptography.

If there is an attack discovered on one of these, we have another algorithm you can turn to. We will continually be keeping an eye on research so that if there are advances, we have other algorithms ready to go. That’s part of the reason, for example, we have the on-ramp going on with the additional digital signatures. We want something not based on lattices. We’d seen that research was advancing for digital signatures, so we kicked off another process to consider candidates. We have multivariate, we have isogeny, we have code-based, we have some lattice-based, and we have some based on what’s called MPC-in-the-head, which is a very exciting and promising field.

The way to stay future-proof is just to always have your eye on this and make changes as you go. Cryptography is never going to be static. There are always going to be people breaking algorithms, finding new attacks. We have to have algorithms ready to go.

Konstantinos Karagiannis: That’s well said. This whole process was groundbreaking. We’re not just modifying an algorithm or something along the way, or going from TLS 1 to 1.1. This is a complete redesign and rethink of what we do in the future.

Are there any key takeaways or lessons from this process that you can apply to future standards development — anything that surprised you?

Dustin Moody: Some things maybe surprised me. Some things maybe surprised others. We knew as the process went on that algorithms would get broken. We expected that in each round of the process, but some people were very surprised by that. We think people should know that’s just how the process works: You design an algorithm — you put it out there for people to evaluate. The strongest ones will survive, but it sometimes takes time. It takes years before we can have confidence in the security of an algorithm. That’s one reason we need the focus of the entire cryptographic community.

If NIST had decided to run this process with just our own team of 15 very smart people working on it, that wouldn’t be enough eyes on these algorithms. You need the focus of everyone, especially when you’ve got lattices, isogenies and multivariate systems; mathematicians work in different fields. Then, when you look at the implementations, you’ve got to have computer scientists involved, and people who know quantum algorithms. It’s very much a community effort to ensure that we have security.

One lesson I definitely learned along the way that I probably didn’t expect — and I should have expected it — was, you can’t please everyone. Whenever we would make a decision, there were some people who would agree with it, but we would hear very loudly sometimes from other people that we’d made the wrong choice. Or, if you go to the PQC forum, there are plenty of different opinions offered there. You can’t please everyone, but if you’ve pleased most people, you’re probably doing OK. Overall, the lesson I take away is, this sort of cryptographic process, it works very well. You harness the attention from the international crypto community and take a number of years to study algorithms, and what comes out of it are algorithms that we have confidence in and that people trust and will implement in their products.

Konstantinos Karagiannis: That sounds like a perfect place to close this conversation. Thanks for all the great work your team is doing. I’m looking forward to seeing everything that ensues now that these are pretty much out in the world.

Dustin Moody: Thanks for talking.

Konstantinos Karagiannis: Now, it’s time for Coherence, the quantum executive summary, where I take a moment to highlight some of the business impacts we discussed today in case things got too nerdy at times. Let’s recap.

On August 13, 2024, NIST released its first standards for post-quantum cryptography. They’ve been a long time coming. Cryptography is hard, but the hope is that these ciphers will help organizations prepare for the eventual arrival of cryptographically relevant quantum computers, or CRQCs. We can officially say the first batch includes ML-KEM (formerly Kyber) as a key encapsulation mechanism, as well as ML-DSA (formerly Dilithium) and SLH-DSA (formerly SPHINCS+) for digital signatures. Most applications will likely use ML-KEM and ML-DSA, which are both lattice-based.

NIST is also working on a fourth round with future KEMs, plus a separate signature on-ramp, covering other approaches, including isogeny, multivariate and code-based schemes. This doesn’t mean you can’t use the current standards — only that more options will be available as we build the cipher suite of the future. Different strength levels are available for ML-KEM today based on performance and security requirements. We also expect numerous organizations to choose a hybrid approach, combining, say, ML-KEM with a classical key exchange such as ECDH. If an issue arises with ML-KEM, you’re no worse off that way.

The long process NIST went through could, in many ways, have been even longer. NIST learned a lot from past standardization efforts, and these new offerings have been heavily vetted. NIST will also release guidance within three months on the deprecation of ciphers that are not post-quantum-safe, but there should be generous dates involved to allow for migration over the next few years. Still, harvest-now, decrypt-later attacks are a real thing, and it’s never too soon to start protecting critical communications. Secrets with a long shelf life need at least hybrid protection as quickly as possible.

While NIST is U.S.-based and the White House NSM-10 memorandum calls for federal agencies to start the migration process, we expect the private sector and other countries to largely follow suit, or at least be compatible with the NIST standards. U.S. businesses want to do business with the government, and everyone wants to do business with the U.S. Other countries will add ciphers, and NIST will add some too in those next rounds, leading to a hopefully quantum-safe future long before the big ones, CRQCs, arrive.
That does it for this episode. Thanks to Dustin Moody for joining to discuss NIST’s PQC standards, and thank you for listening. If you enjoyed the show, please subscribe to Protiviti’s The Post-Quantum World, and leave a review to help others find us. Be sure to follow me on all socials @KonstantHacker. You’ll find links there to what we’re doing in Quantum Computing Services at Protiviti. You can also DM me questions or suggestions for what you’d like to hear on the show. For more information on our quantum services, check out Protiviti.com, or follow Protiviti Tech on Twitter and LinkedIn. Until next time, be kind, and stay quantum-curious.
