The NSA, NIST and the AMS

Among the many disturbing aspects of the behavior of the NSA revealed by the Snowden documents, the most controversial one directly relevant to mathematicians was the story of the NSA’s involvement in a flawed NIST cryptography standard (for more see here and here). The New York Times reported:

Classified N.S.A. memos appear to confirm that the fatal weakness, discovered by two Microsoft cryptographers in 2007, was engineered by the agency. The N.S.A. wrote the standard and aggressively pushed it on the international group, privately calling the effort “a challenge in finesse.”

The standard was based on the mathematics of elliptic curves, so this is a clearly identifiable case where mathematicians seem to have been involved in using their expertise to subvert the group tasked with producing high-quality cryptography. A big question this raises is what the NIST will do about it. In April they removed the dubious algorithm from their standards, and published the public comments (many of which were highly critical) on a draft statement about their development process.

At the same time a panel of experts was convened to examine what had gone wrong in this case, and this panel has (on a very short time-scale) just produced its report (associated news stories here, here and here). The rules of how such panels are set up evidently require that each panelist provide an individual report, rather than attempt to have a consensus version. The new NIST document gives these reports together with minutes of the meetings where the panelists were provided with information. It seems that the NSA provided no information at all as part of this process, and they remain unwilling to answer any questions about their actions.

Appendix E contains the individual reports. These include, from Edward Felten:

The bottom line is that NIST failed to exercise independent judgment but instead deferred extensively to NSA with regard to DUAL_EC. After DUAL_EC was proposed, two major red-flags emerged. Either one should have caused NIST to remove DUAL_EC from the standard, but in both cases NIST deferred to NSA requests to keep DUAL_EC… at the time NIST had nobody on staff with expertise in elliptic curves. NSA’s vastly superior expertise on elliptic curves led NIST to defer to NSA regarding DUAL_EC, while NIST people spent more of their limited time on other parts of the standard that were closer to their expertise.

From Bart Preneel:

There is no doubt that the inclusion of Dual EC DRBG in SP 800-90A was a serious mistake…
The explanations provided by NIST are plausible, but it seems that not all decisions in the standardization process of SP 800-90A are properly documented; moreover, we did not have access to the source documents. This means that it is impossible to decide whether this mistake involved in addition to clever manipulation of the standards processes by NSA also some form of pressure on the technical and/or management staff of NIST. It is also not clear whether there would be any traces of such pressure in documents. Without access to the documents, it is also difficult to decide whether or not NIST has deliberately weakened Dual EC DRBG…

However, it seems that NSA (with its dual role) seems to be prepared to weaken US government standards in order to facilitate its SIGINT role. This undermines the credibility of NIST and prevents NIST reaching its full potential in the area of cryptographic standards. In view of this, the interface between NSA and NIST and the role of the NSA should be made much more precise, requiring an update to the Memorandum of Understanding. At the very least, the terms “consult”, “coordination” and “work closely” should be clarified. Ideally, NIST should no longer be required to coordinate with NSA. There should be a public record of each input or comment by NSA on standards or guidelines under development by NIST.

From Ronald Rivest (the “R” in “RSA”):

Recent revelations and technical review support the hypothesis that, nonetheless, the NSA has been caught with “its hands in the cookie jar” with respect to the development of the Dual-EC-DRBG standard. It seems highly likely that this standard was designed by the NSA to explicitly leak users’ key information to the NSA (and to no one else). The Dual-EC-DRBG standard apparently (and I would suggest, almost certainly) contains a “back-door” enabling the NSA to have surreptitious access. The back-door is somewhat clever in that the standard is not designed to be “weak” (enabling other foreign adversaries to perhaps exploit the weakness as well) but “custom” (only the creator (NSA) of the magical P,Q parameters in the standard will have such access).

[Recommendation]
NIST (and the public) should know whether there are any other current NIST cryptographic standards that would not be acceptable as standards if everyone knew what the NSA knows about them. These standards should be identified and scheduled for early replacement. If NSA refuses to answer such an inquiry, then any standard developed with significant NSA input should be assumed to be “tainted,” unless it possesses a verifiable proof of security acceptable to the larger cryptographic community. Such tainted standards should be scheduled for early replacement.

One way this goes beyond the now-withdrawn NIST standard is that the committee also looked at other current NIST standards now in wide use, which in at least one other case depend upon a specific choice of elliptic curves made by the NSA, with no explanation provided of how the choice was made. In particular, Rivest recommends changing the ECDSA standard in FIPS 186 because of this problem.
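To make concrete the kind of “custom” weakness Rivest describes, here is a rough sketch of the observation made public in 2007 by the two Microsoft researchers (Dan Shumow and Niels Ferguson). Everything below is illustrative only: a toy curve over F_17 rather than the real NIST P-256 curve, made-up values for the constants and the seed, and no output truncation (the real standard drops some output bits, so a real attacker has to check a modest number of candidate points). The essential point survives the simplifications: if whoever published the constants P and Q knows a scalar e with P = e·Q, then a single block of output lets them recover the generator’s internal state and predict everything it produces afterwards.

    # Toy model of the alleged Dual-EC-DRBG back-door (NOT the real standard):
    # tiny curve y^2 = x^3 + 2x + 2 over F_17, illustrative constants, no truncation.
    p, a, b = 17, 2, 2

    def inv(n):
        return pow(n, p - 2, p)              # modular inverse (p is prime)

    def add(P1, P2):
        # Affine-coordinate point addition; None plays the role of the point at infinity.
        if P1 is None: return P2
        if P2 is None: return P1
        (x1, y1), (x2, y2) = P1, P2
        if x1 == x2 and (y1 + y2) % p == 0:
            return None
        lam = ((3 * x1 * x1 + a) * inv(2 * y1) if P1 == P2
               else (y2 - y1) * inv(x2 - x1)) % p
        x3 = (lam * lam - x1 - x2) % p
        return (x3, (lam * (x1 - x3) - y1) % p)

    def mul(k, pt):
        # Double-and-add scalar multiplication.
        acc = None
        while k:
            if k & 1:
                acc = add(acc, pt)
            pt = add(pt, pt)
            k >>= 1
        return acc

    Q = (5, 1)         # public constant
    e = 7              # the secret: whoever chose the constants took P = e*Q
    P = mul(e, Q)      # published with no explanation of where it came from

    def drbg_step(s):
        s_next = mul(s, P)[0]                  # next internal state
        return s_next, mul(s_next, Q)[0]       # (state, published "random" output)

    s1, out1 = drbg_step(9)                    # the generator runs two steps from seed 9
    s2, out2 = drbg_step(s1)

    # The attacker sees only out1 but knows e: lift out1 back to a point R = +/- s1*Q
    # by solving the curve equation for y, then x(e*R) = x(s1*P) = s2, the next state.
    for y in range(p):
        if (y * y - (out1 ** 3 + a * out1 + b)) % p == 0:
            s2_guess = mul(e, (out1, y))[0]    # recovered internal state
            predicted = mul(s2_guess, Q)[0]    # predicted next output
            print("actual next output:", out2, "| attacker's prediction:", predicted)
            break

The “weakness” is invisible from the outside: unless you can compute the discrete logarithm relating P and Q yourself, you cannot tell whether anyone actually knows such an e, which is exactly the ambiguity the arguments below about a “proven weakness” turn on.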

For a detailed outline of the history of the Dual-EC-DRBG standard, see here. Note in particular that this states that in 2004 when the author asked where the Dual-EC-DRBG elliptic curves came from, the response he got was “NSA had told not to talk about it.”

Also this week, the AMS Notices contains a piece by Richard George, a mathematician who worked at the NSA for 41 years before recently retiring. Presumably this was vetted by the NSA, and is a reasonably accurate version of the case they want to make to the public. Personally I’d describe the whole thing as outrageous, for a long list of reasons, but here I’ll just focus on what it says about Dual-EC-DRBG, since it now seems likely that it is all we will ever get from the NSA about this. It says:

I have never heard of any proven weakness in a cryptographic algorithm that’s linked to NSA; just innuendo.

The reaction from a commenter here (publicly anonymous, but self-identified to me) was:

As a member of a standards committee involved in the removal of the mentioned algorithm from a standard, none of the members believe the “innuendo” theory, and all believe it was deliberately weakened.

Reading it carefully (and I think it was written very carefully…), note that George never directly denies that the NSA back-doored Dual-EC-DRBG; he just claims there is no “proven weakness”. In other words, since how they chose the elliptic curves is a classified secret, no one can prove anything about how this was done. All the public has is the Snowden documents, which aren’t “proof”. This is highly reminiscent of the US government’s continuing success at keeping legal challenges to NSA actions out of the courts, even when what is at issue are actions that everyone agrees happened, on the grounds that the plaintiffs can’t “prove” that they happened, since they are classified. Snowden’s release of documents may yet allow some of these cases to come to court, just as they were the one thing capable of getting the NIST to acknowledge the Dual-EC-DRBG problem.

I hope that there will be more response to the NSA issue from the math community than there has been so far. In particular, Rivest’s call for removing from NIST standards any NSA-supplied material that the NSA refuses to explain should be endorsed. The innuendo from George is that the NSA may be refusing to explain because they used superior technology to choose better, more secure elliptic curves. If this is the case, I don’t see why an official statement to that effect, from the NSA director, under oath, cannot be provided.

On the many other issues the George article raises, I hope that the AMS Notices will see some appropriate responses in the future. Comments here should be restricted to the NIST/NSA story, with those from anyone knowledgeable about this story most encouraged.

Update: The NIST has made available on its website the materials provided to the panel looking into this.

One remarkable thing about the panel’s investigation is that the NSA evidently refused to participate, in particular refusing to make anyone available to answer questions at the panel’s main meeting on May 29 (see page 12 of the report). This appears to be in violation of the Memorandum of Understanding that governs the NIST/NSA relationship, which explicitly states that “The NSA shall … Be responsive to NIST requests for assistance including, but not limited to, all matters related to cryptographic algorithms and cryptographic techniques, information security, and cybersecurity.” All evidence I’ve seen is that the NSA sees itself as above any need to ever justify any of its actions. I can’t see any possible argument as to why they did not have an obligation to participate in the work of this committee.


Update: A new development in this story is a letter from Congressman Grayson to Director of National Intelligence James Clapper asking exactly the right questions about what happened at the NIST. It will be interesting to see if a member of Congress can get anything out of the NSA beyond the usual stonewalling.


23 Responses to The NSA, NIST and the AMS

  1. QM says:

    “The innuendo from George is that the NSA may be refusing to explain because they used superior technology to choose better, more secure elliptic curves. If this is the case I don’t see why an official statement to that effect, from the NSA director, under oath, cannot be provided.”

    Is it in the NSA’s interest to leave this an open question? If they chose a better curve, and say so, do their targets gain confidence, and are encouraged to use it more? If they say it’s back-doored, do their targets abandon it? Uncertainty surely works in their favor here, as usual.

  2. Peter Woit says:

    QM,
    In this case, no one with any interest in the security of their communications is intentionally using Dual-EC-DRBG, the potentially backdoored standard, anymore.

    I think you’re right that they want uncertainty, about this and about everything else they may or may not be doing. Life is much easier if you never have to justify anything you do.

  3. jeremy78 says:

    How do these memos confirm that the NSA did anything? I keep seeing the phrase “appear to confirm” in all of these articles. Notice the word “appear” as a qualifier. Why don’t they reveal what it is exactly in these memos that confirms all of this? Because this vagueness is the exact same thing the NSA is doing, but you seem alright with it because it confirms your previously held worldview. Why aren’t you demanding more clarification from them?

  4. Peter Woit says:

    jeremy78,
    You and Richard George are right that nothing is “proved”: the NY Times reporters only claimed that memos “appear to confirm” that the NSA introduced a back-door. The quote the NY Times gives, that “Eventually, NSA became the sole editor” of the standard, is pretty much confirmed by the NIST description of what happened back then, but it proves nothing about a back-door.

    The importance of the Snowden memos is not that they revealed exactly what the NSA did, but that they caused people to go back and look carefully at how the standard was developed. Like George, you’re completely ignoring the facts that are the main evidence for a problem. I’m not making vague claims but linking to the new NIST report and quoting it. The bottom line is that the experts who have looked into this mostly think there is an NSA back-door.

    Is this proven? No, and it never will be as long as the NSA keeps classified the method used to produce the elliptic curves. I really only see three possibilities:
    1. The method was something innocuous, some random choice.
    2. The method used something classified to get something more secure than random, so they need to protect this.
    3. The method was designed to weaken the standard so that they could attack it.
    If what happened is 3., it makes sense the NSA would behave exactly as it has. If it was 1. or 2., there is no reason they could not simply make a public statement to that effect (giving the details in case 1, keeping those secret in case 2.).

  5. jeremy78 says:

    You say that you’re not using the Snowden memos as evidence, but that is clearly false. Many of those experts refer to “revelations” which show a clear bias going in.

    Maybe it’s just me, but I take an evidence based view of the world. Regardless of what experts think, I look at the evidence. Right now there is no direct evidence, which you agree with.

    I thought the evidence based approach was the purpose of this blog. If we go with your current view, then perhaps those string theorists are right. They have no evidence, but they’re experts so their opinions must be right in spite of the lack of the evidence. I don’t know. Perhaps, you think in complex social situations, we should default to the experts till we have direct evidence, but I think that is a mistake.

    Also, let’s be honest. Even if the NSA did exactly what you said they should do, I doubt you’d believe them or that it would change anything. It’s hard to prove a negative.

  6. jeremy78 says:

    Sorry for the double post, but I just wanted to point out the similarities of this situation to string theory and the multiverse hype.
    1. Untestable hypothesis-“Is this proven? No, and it never will be as long as the NSA keeps classified the method used to produce the elliptic curves.”
    2. Trusting experts without question-“The bottom line is that the experts who have looked into this mostly think there is an NSA back-door.”
    3. No direct evidence -“proves nothing about a back-door”

  7. Matt says:

    I think this is far less cut and dry than mathematicians and the media are making it out to be. The NSA is a gigantic organization, and we give it a lot of credit to think these conspiracy theories are so nicely carried out. I’m not saying it is impossible.

    I went to a talk given by one of the 2 Microsoft people that discovered the flaw. He is wholly unconvinced that it was intentionally done by the NSA, which I found pretty convincing. No one really wants to hear that side though.

  8. Peter Woit says:

    jeremy78
    I’ve now written several long blog postings about this, with many links to detailed documentation and the evidence I’m aware of. It seems to me to provide a rather strong case that the NSA was up to something in this standards process other than providing the best possible encryption. This evidence is not incontrovertible. There’s doubtless a lot more to the story which is known only to those at the NSA. If you want to take the attitude that only incontrovertible, direct evidence will convince you that there’s a problem at the NSA, that’s your choice. Arguments about the multiverse are almost all a complete waste of time, and I don’t think they provide insight into this topic.

    If the NSA provided a plausible explanation of the source of the elliptic curves, that definitely could change my mind. I’m not someone who believes that NSA officials regularly lie to the public. On the whole the most damaging accusations from Snowden have not been denied.

    Matt,
    I don’t think this is a story about a “conspiracy”, it’s a story about an organization with a professed agenda (getting access to encrypted internet communications) pursuing that agenda. If you read the NIST report I linked to and look at its authors, I don’t think you can say this is a case where “no one wants to hear the NSA’s side”. The NSA was specifically asked to meet with the committee for that purpose and refused to do so (see page 12).

  9. Peter Woit says:

    All,
    Enough trolling and arguing with trolls, I’ve just deleted a bunch of it. If you’ve got something knowledgeable to say I’d love to hear it, if you want to have a dumb argument, please go elsewhere.

  10. Pawl says:

    I would call attention to Preneel’s mention of possible pressure involved in the decision-making process, in conjunction with a pattern in the leaked documents, saying that the NSA should “leverage” decisions, access to information, and so on. What sort of leverage, exactly, has been applied? To whom?

  11. Tom Leinster says:

    It seems to me that Richard George’s statement is demonstrably false, independently of political opinion or evidence from the Snowden papers. Here’s what he wrote again:

    I have never heard of any proven weakness in a cryptographic algorithm that’s linked to NSA; just innuendo.

    This says nothing about intent. It’s the flat-out assertion that no cryptographic algorithm linked with the NSA has a proven weakness. (Or at least, no weakness that George has heard of — but we can safely assume that he has heard of this one.)

    In his February Notices article, Tom Hales notes: “It is a matter of public fact that the NSA was tightly involved in the writing of the standard” (NIST SP800-90A). So, George is asserting in particular that this standard has no proven weakness. But Ferguson and Shumow, on whose work Hales’s article is based, conclude that

    The prediction resistance of this [pseudorandom number generator] is dependent on solving one instance of the elliptic curve discrete log problem.

    And as I understand it, that’s a serious weakness even if you don’t believe the NSA did anything sneaky. In that case, George’s suggestion that no cryptographic algorithm linked to the NSA has a proven weakness is simply false — as a scientific fact, regardless of politics.

    So I can’t see how to escape the conclusion that the NSA was, at the very least, incompetent. It pushed through a cryptographic standard with a serious vulnerability. Even if you trust the NSA politically and ignore its stated intention to “insert vulnerabilities into commercial cryptosystems”, this episode provides a reason not to trust it with designing the world’s cryptographic standards.

  12. Peter Woit says:

    Tom,
    I think George would argue that’s not a “proven weakness”. My reading of Ferguson/Shumow is that they are pointing out that whoever chose P and Q could do it in such a way that they know e, and thus have a backdoor. The NSA chose P and Q and refuses to discuss how they chose them, so NSA defenders can claim that maybe they chose them randomly, or in a way that used secret sauce to make things even more secure. My understanding is that one unusual thing about this is that, unless you find e yourself (which is supposedly impossible), you can’t prove that there is a backdoor. From the outside world’s point of view, “this could be backdoored in a way that only the NSA could know” is a weakness, but from the NSA’s point of view, it’s not.

    What George is basically saying is “you can’t prove we did anything”, not “we didn’t do anything”, and I think the difference is intentional.

  13. Tom Leinster says:

    If George doesn’t call Ferguson and Shumow’s discovery a “proven weakness” in the protocol then he’s using words in a different way from ordinary human beings. This is of course a stock-in-trade of intelligence agencies issuing carefully-worded denials, but none the better for it. Any ordinary person would say that if a cryptographic system is untrusted (as this one now most certainly is), that’s a weakness: who wants a system they can’t trust?

    In the long NIST report that you link to, Edward Felten (professor of computer science at Princeton, specializing in computer security) defines a “pure trapdoor” as one in which “NSA is the only party who can possibly exploit it”. He continues:

    [A]ll of the suspected NSA trapdoors in NIST standards are impure trapdoors. In every case discussed below, if the suspected trapdoor does exist, its existence reduces the security of users against attack by other adversaries, including organized crime groups or foreign intelligence services.

    This includes the caveat “if the suspected trapdoor does exist”. But whether or not it does, the possibility that it might is a weakness in itself.

  14. Peter Woit says:

    Tom,
    Still, the NSA’s point of view is that for all you know they chose the numbers randomly and there is no trapdoor, and no weakness. So, you have no proof (or any hope of getting one, unless a mathematician version of Snowden appears…) that there is a trapdoor and thus a weakness in the standard. I think it highly likely that George and whoever vetted his piece at the NSA are well aware of the Ferguson/Shumow argument, and they chose their language carefully so that they could claim what they said was true. Yes, it’s highly misleading, but we know that they see misleading people about what they do as part of their mission.

    My initial reading of this was “this guy George must be some senile incompetent claiming to be unaware of the Dual-EC-DRBG story”, but I changed my mind after re-reading carefully.

  15. Tom Leinster says:

    I agree, it’s carefully worded, but I still think it’s plain wrong. If he’d said that there’s no “proven backdoor”, then I wouldn’t be able to contradict him. But he said that there’s no “proven weakness”, and a backdoor isn’t the only kind of weakness a cryptographic system can have.

    John Kelsey of NIST wrote this about choosing parameters for Dual EC:

    * It is possible to choose (P, Q) so that you know a backdoor […]

    * It is also possible to choose (P, Q) so that you can prove you don’t know a backdoor.

    Let’s call the system unbackdoorable if (P, Q) has been chosen in the second way, and backdoorable otherwise. A backdoorable system may or may not actually be backdoored. Still, for a system to be backdoorable is a serious weakness, because it makes it untrustworthy, even if you don’t know that a backdoor actually exists.

    As I understand it — and I hope someone will correct me if I’m wrong — it’s a matter of black-and-white fact that this particular system is backdoorable. That, then, is the proven weakness. And that’s why this system has been abandoned in a hurry: because we know it’s backdoorable, even though we don’t know whether it’s backdoored.

    Of course, this whole conversation that I started is in some sense absurd, since it deliberately makes no assumptions about the intentions of the NSA, despite all the documentary evidence in front of us.

  16. Peter Woit says:

    Tom,
    My reading of the same document you’re linking to is that nothing is known about how the NSA chose (P,Q). All it says is that on 10/27/04 he was told that (P,Q) “could be generated randomly, but that NSA had told not to talk about it” and in Oct. 05 NSA said they “generated (P,Q) in a secure, classified way”. (A huge mystery here is why NIST then decided to include in the standard a way people could generate their own (P,Q), but not mention that if you didn’t do that, you had to trust the NSA’s (P,Q) not to be backdoored. The current answer from NIST is basically “we were incompetent and made the mistake of trusting the NSA”.)

    So, as long as the NSA stonewalls about how they chose (P,Q), George can claim that it can’t be proven that they did it in a backdoorable way. Maybe they did it in an unbackdoorable way. The story about DES that George tells refers to the fact that the NSA in that case knew how to choose parameters in a better, more secure way than what was publicly known, and suspicions they were putting in a backdoor turned out to be wrong: they were actually making things more secure (of course at the same time they were insisting on a key length short enough for them to break by brute force…). He’s telling that story to raise the possibility that this is what happened here, that NSA chose (P,Q) by some highly secure method they can’t talk about.

    My take is that if they had done that they would have consistently just said so, which would not have revealed anything classified. In particular they would have shown up at the NIST panel meeting and defended what they did back in 04/05, if only by reading a vetted statement saying “we chose (P,Q) in a secure, classified manner that we cannot reveal to you, we do not have a back-door”. That they are not doing this, instead letting their relationship with NIST be poisoned to the point of collapse, seems to me to be the dog that didn’t bark here.
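    To spell out what doing it “in an unbackdoorable way” could have looked like: the standard trick is to derive the point from a public seed by hashing, so that nobody gets to pick it freely and hence nobody plausibly knows a discrete-log relation between P and Q. Here is a toy sketch in that spirit, with a made-up seed and the same tiny curve as the sketch in the post above; the real procedure would of course target P-256 and is not what is shown here.

        import hashlib

        # Toy "verifiably random" derivation of a curve point's x-coordinate,
        # using try-and-increment hashing on the tiny curve y^2 = x^3 + 2x + 2 over F_17.
        p, a, b = 17, 2, 2
        seed = b"some public nothing-up-my-sleeve string"    # hypothetical seed

        def curve_has_point_with_x(x):
            rhs = (x ** 3 + a * x + b) % p
            return rhs == 0 or pow(rhs, (p - 1) // 2, p) == 1    # Euler's criterion

        counter = 0
        while True:
            digest = hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
            x = int.from_bytes(digest, "big") % p
            if curve_has_point_with_x(x):
                break
            counter += 1

        print("seed:", seed, "counter:", counter, "derived x-coordinate:", x)
        # Anyone can re-run the derivation from (seed, counter) and get the same
        # x-coordinate. If Q were derived like this, rather than chosen freely,
        # nobody would plausibly know a scalar e relating P and Q (on a real-sized
        # curve, where computing that discrete log is infeasible; on this toy
        # curve it obviously is not).

    Publishing something like a (seed, counter) pair alongside the constants would have been cheap, which is part of why the refusal to say anything at all about where (P,Q) came from looks so odd.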

  17. ScentOfViolets says:

    Isn’t the proper course of action pretty much a no-brainer? If someone wants me to invest funds in their business venture, it’s on them to show me that there’s a reasonable expectation that it will succeed; not on me to show that it won’t. Similarly, given the costs of bad crypto, why should I use it if it’s not known to be secure? IOW, default burden of proof requirements should be on the system’s defenders to prove that the crypto is trustworthy, not on other people to prove that it’s untrustworthy. The statements released by the NSA seem deliberately crafted to get people to buy into the reversal of those requirements in the hope that this will forestall any significant revision of the current standard. If this isn’t evidence of bad faith — particularly since the costs of a backdoor to the community at large are far larger than the benefits which accrue only to the NSA — I don’t know what is.

  18. Rob Fawcett says:

    Is another perspective to ask not even whether a back-door was implemented, but how credible it is that the NSA did not internally have insight into the possibility of such a back-door? Appreciation of the possibility would necessitate accounting for the integrity of the (P,Q) generation. After all, encryption is terminally compromised once the credible, practicable possibility of an existing back-door is established, irrespective of proof of successful decryption. The NSA need not have flagged any insight; they need only have accounted for the (P,Q) generation to make the standard robust.
    I for one can but snort at the notion that they knew of the potential back-door, nobly abjured use of it but chose to obscure the (P,Q) derivation anyway, thereby ultimately dooming the standard from the start.

  19. Mike Sharples says:

    Why are people so hung up on what is and isn’t proven? This is not an exercise in mathematical certainty; it is an evaluation of risk. There is a risk that the standard is compromised. Obfuscation by the NSA does nothing to remove that risk (quite the contrary). Therefore the standard should be considered unsafe.

  20. Chris W. says:

    See the latest post from Scott Aaronson:

    The issue reached a comical extreme last October, when Adi Shamir, the “S” in RSA, Turing Award winner, and foreign member of the US National Academy of Sciences, was prevented from entering the US to speak at a “History of Cryptology” conference sponsored by the National Security Agency. According to Shamir’s open letter detailing the incident, not even his friends at the NSA, or the president of the NAS, were able to grease the bureaucracy at the State Department for him.

    With the NIST episode as context, what could this mean?

  21. Peter Woit says:

    Mike Sharples,
    The discussion about what can be “proven” was purely about the question of whether the AMS Notices piece defending the NSA was a bald-faced lie or not. Of course there is plenty of evidence with respect to the NSA’s activities that does not rise to the level of “proof”, and people should act accordingly.

    Chris W.,
    I think this is just one more example of the out-of-control US security establishment, of which you can easily find a hundred more. It should also remind people that when trying to figure out what has happened in stories like the NIST/NSA one, you can’t use the argument “the NSA can’t possibly have done X because that would be irrational, stupid, and damage US interests”.

  22. Nick P says:

    I agree with ScentofViolets. The burden of proof, even by NSA standards (eg Common Criteria), is on the person delivering a piece of tech to show that it’s trustworthy. There’s many ways to do this. There’s also the lifecycle requirement that, once problems are found, you explain and/or fix them. NSA is a world-class crypto organization that’s tied to numerous problematic submissions in crypto standards, even amateurish mistakes their own evaluations look for. Whether incompetence or malice, they can’t be trusted to deliver good crypto.

    Another angle is “reasonable suspicion” vs “proof.” The BULLRUN slides clearly say they’re trying to manipulate standards and commercial products. That their own highly protected document says this is proof that they do it. At this point, a submission of theirs automatically has a reasonable suspicion that it might be backdoored. That’s good enough for rejecting it. If we find a potential backdoor and they are who submitted it, then there’s reasonable suspicion that they put it there due to the other proof (BULLRUN). If they refuse to even talk about it, even more reasonable suspicion to think they might be guilty. I maintain that the BULLRUN slides alone are enough proof that their submissions should be considered backdoored unless proven otherwise as they have capability, intent, and history. And untrusted until proven otherwise is the default in INFOSEC on top of that.

    (Note: That said, this particular backdoor didn’t bother me as much as most people. The problem with NSA is their *mandated* mission of collecting everything, along with difficulties they encounter. Subversion is their solution with many making us weaker. Yet, contrary to the claim by Felten, this particular subversion was a win-win as they tried to provide us protection while creating a way for only them to get in. If they’re going to sneak in backdoors, I’d rather it be something like this than a buffer overflow or weak backdoor like that found in a recent FPGA.)

    Still, there’s enough people doubting their negative impact on security that I decided to look at the big picture to identify evidence they weaken it. The main problem I noted was “backdoor.” The truth is that any flaw that lets them circumvent a security policy is a backdoor in practice. All they have to do is encourage flaws, not tell people about existing flaws, or introduce flaws. They can do this at many levels. So, I wrote an essay on Schneier’s blog (main spot I post) showing the many ways NSA’s actions provably weaken our security, both intentionally and incidentally:
    https://www.schneier.com/blog/archives/2014/03/friday_squid_bl_420.html#c5226750

  23. Nick P says:

    I forgot to add a link to our smoking gun in all these discussions with key points conveniently highlighted:

    http://www.nytimes.com/interactive/2013/09/05/us/documents-reveal-nsa-campaign-against-encryption.html?ref=us&_r=0
