"All we can do is tell people that NIST are the ones in the room
making the decisions, but if you don't believe us, there's no way
you could verify that without being inside NIST" says Moody.
There's our problem - right there!
If a body as important as NIST is not so utterly transparent that any
random interested person cannot comb through every meeting, memo, and
coffee break conversation then it needs disbanding and replacing with
something that properly serves the public.
We have bastardised technology to create a world of panopticonic
surveillance, and then misused it by scrutinising the private lives of
simple citizens.
This is arse-backwards. If monitoring and auditing technology has any
legitimate use the only people who should morally (though willingly)
give-up some of their privacy are those that serve in public, in our
parliaments, councils, congress, government agencies and standards
bodies.
> All we can do is tell people
No. You can prove it, and if you cannot, step aside for leadership better suited to serving the public interest.
I'm now picturing a slightly different government to ours, where the oversight bodies have the additional function of making sure officials don't talk to one another outside of recorded meetings under the public's eye.
It seems like a huge burden. It is the kind of thing, though, that in a parallel universe would make total sense: our representatives should be beholden to us.
Many governments do have laws like this, called "sunshine laws". Enforcing them can be difficult though, and often enough they fail to achieve the transparency that is their goal while also substantially hindering process.
Modern panopticon level surveillance is not deployed through weakening encryption. It's just built right into platforms and apps and people willfully install it because it gives them free services and addictive social media feeds.
You don't need to weaken encryption to spy on people. You just have to give them a dancing bunny and to see the dancing bunny they must say yes to "allow access to contacts" and "allow access to camera" and "allow access to microphone" and "allow access to documents" and ...
For the higher-brow version replace dancing bunny with free service.
In addition the more we adopt and make use of cloud command and control architectures the more surveilled we become, because it becomes trivial for anyone with access to the cloud provider's internals to tap everyone's behavior. This could be done with or without the knowledge of the provider itself. The more such services we use the more data points we are surrendering, and these can be aggregated to provide quite a lot of information about us in near-real-time.
> In addition the more we adopt and make use of cloud command and control architectures the more surveilled we become, because it becomes trivial for anyone with access to the cloud provider's internals to tap everyone's behavior.
Every US government office - at both the state and federal levels - has record-keeping requirements. It’s the reason we can submit FOIA requests that return with data from the 1930s.
I think everyone here is pretty clear how they would ethically view such a thing, but view it from NIST's (/ NSA's) perspective for the sake of argument. Maybe there's a specific threat where NIST (or presumably the NSA) believes it has a mandate to insert a backdoor.
In order to successfully do this, NIST needs to maintain a very large bank of social capital and industry trust that it can spend on very narrow issues.
But over the years there have been enough strange things (Dual EC DRBG being the most notorious) that that trust, at least when it comes to crypto design, simply isn't there. My perception is that newer ECC standards promoted by NIST have been trusted substantially less than AES was when it was released, and I can think of a number of major issues over the years that would lead to this distrust.
The inevitable outcome is that NIST loses much of its influence on the industry, which certainly is not in its own interest.
Everyone also discounts the other reason NIST (with NSA behind the scenes) might be shifty -- they know of a mathematical or computational exploit class that no one else does.
And therefore want to do things-which-seem-pointless-to-everyone-else to an algorithm to guard against it.
Without disclosing what "it" is.
Everyone's quick to jump to the "NSA is weakening algorithms" explanation, but there's both historical and practical precedent for the strengthening alternative.
After all, if the US government and military use a NIST-standardized algorithm too... how is using one with known flaws good for the NSA? They have a dual mission.
>I think everyone here is pretty clear how they would ethically view such a thing, but view it from NIST's (/ NSA's) perspective for the sake of argument. Maybe there's a specific threat where NIST (or presumably the NSA) believes it has a mandate to insert a backdoor.
That's an incredibly charitable version of their point of view. How's this for their POV: They're angry that they can't see every single piece of communications, and they think they can get away with weakening encryption because nobody can stop them legally (because the proof is classified), and nobody's going to stop them by any other avenue either.
> view it from NIST's (/ NSA's) perspective for the sake of argument. Maybe there's a specific threat where NIST (or presumably the NSA) believes it has a mandate to insert a backdoor.
Without any /sarcasm tags I have to take that at face value, and frankly there are few words to fully describe what a colossally stupid idea (not your idea, I am sure) that is. Belief in containable backdoors is the height of naivety, and it plays recklessly fast and loose with everyone's personal security, our entire economy and national security.
That is to say, even taking Hollywood Terror Plots into consideration [0], I don't believe there is ever a "mandate to insert a backdoor".
> In order to successfully do this, NIST needs to maintain a very large bank of social capital and industry trust that it can spend on very narrow issues.
Having some "trust to burn" is great for lone operatives, undercover
mercs, double agents and crooks that John le Carre described as
fugitives living by the seat of expedient alliances and fast
goodbyes. Fine if you can disappear tomorrow, reinvent yourself and
pop up somewhere else anew.
But absolutely no use for institutions holding on to any hope for
permanence and the power that brings.
> The inevitable outcome is that NIST loses much of its influence on the industry, which certainly is not in its own interest.
Exactly this. And corrosion of institutional trust is a massive loss. Not for NIST or a bunch of corrupt academics who'd stop getting brown envelopes to stuff their pockets, but for the entire world.
But since you obliquely raise an interesting question... what is NIST's "interest" here?
Surely we're not saying that by spending trust "on very narrow issues" its ultimate ploy is to deceive, defect and double-cross everything the public believe it was created to protect? [1]
I'm all for the game, subterfuge and craft, but sometimes you just bump up against the brute reality of principles, and this is one of those cases. Backdoors always cost you more than you ever thought you'd save, and I've always assumed the people at a place like NIST are smart enough to know that.
[0] https://www.schneier.com/essays/archives/2005/09/terrorists_...
[1] https://cybershow.uk/episodes.php?id=16
The fact that NIST is not transparent is enough to assume that anything related to cryptography that NIST touches is compromised.
Frankly, I would assume any modern encryption is compromised by default - the gamble is just in who compromised it and how likely it would be that they want access to your data.
NIST standardized AES and SHA3, two designs nobody believes are compromised. The reason people trust AES and SHA3 is that they're the products of academic competitions that NIST refereed, rather than designs that NSA produced, as was the case with earlier standards. CRYSTALS-Kyber is, like AES and SHA3, the product of an academic competition that NIST simply refereed.
The American people - who are the only ones who matter - want to live in a superpower.
Everything America does is in service of maintaining its position as the hegemonic player. The US intelligence agencies have infiltrated every university and tech company since forever. It's their job.
If NIST is unwilling to be open about its processes, it sounds like it's time for academia to form its own crypto equivalent of NIST amongst universities: a body that puts out new cryptographic algorithms through a transparent process, traceable back to their birth, so that other cryptographers can look for holes.
Why aren't other participants in the competition --- most of them didn't win! --- saying the same thing? Why are the only two kinds of people making this argument a contest loser writing inscrutable 50,000 word manifestos and people on message boards who haven't followed any of the work in this field?
The article is a bit weird, so here's my summary of the situation, as someone in the security field:
- Bernstein, an extremely esteemed security researcher[0], published a long blog post last week[1] criticizing NIST's standardization process for new post-quantum crypto algorithms. He is focusing on the selection of Key Encapsulation Mechanisms (think TLS key exchange). Two big options are Kyber and NTRU (co-authored by Bernstein).
- His main complaint is that NIST is playing fast and loose with the selection process, and disqualified a fast NTRU variant because it barely missed a certain security threshold. The missing variant makes NTRU look slower and less flexible than it actually is.
- Meanwhile, NIST accepted a similar fast Kyber variant based on shaky assumptions. Bernstein argues at length that it doesn't meet the security threshold either and should be disqualified. Funnily, NIST used Bernstein's own research in a (seemingly) incorrect fashion to argue for Kyber's security.
- There's an air of impropriety, as if NIST was favoring one algorithm over the other for unknown reasons. And at the beginning of the post, Bernstein shows the results of his recent lawsuit to reveal more information about the internal NIST process: it seems that NIST and NSA met more often than previously thought.
My interpretation leans more towards NIST making an internal mistake in evaluating the algorithms, rather than NSA pushing its agenda. One could argue that Bernstein is sour that his algorithm might not be picked and is trying underhanded tactics. On the other hand, he does have an excellent reputation, and he argues convincingly that NIST made an important mistake and is not transparent enough.
[0] https://www.metzdowd.com/pipermail/cryptography/2016-March/0...
[1] https://blog.cr.yp.to/20231003-countcorrectly.html
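(For anyone unfamiliar with the jargon: a key encapsulation mechanism is a public-key primitive with three operations - key generation, encapsulation, decapsulation - that lets two parties agree on a shared secret. The sketch below shows that interface in Python, using an X25519 Diffie-Hellman exchange as a stand-in; Kyber and NTRU expose the same shape of API but are built on lattices, so this illustrates only the interface, not either candidate's internals.)

    # A minimal KEM-shaped interface, with X25519 Diffie-Hellman standing in
    # for a real lattice KEM. Requires the 'cryptography' package; illustrative only.
    import hashlib
    from cryptography.hazmat.primitives.asymmetric.x25519 import (
        X25519PrivateKey, X25519PublicKey)
    from cryptography.hazmat.primitives.serialization import (
        Encoding, PublicFormat)

    def keygen():
        sk = X25519PrivateKey.generate()
        return sk, sk.public_key()

    def encapsulate(recipient_pk):
        # Sender: fresh ephemeral key; the ephemeral public key is the
        # "ciphertext", the hashed DH output is the shared secret.
        esk = X25519PrivateKey.generate()
        shared = hashlib.sha256(esk.exchange(recipient_pk)).digest()
        ciphertext = esk.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
        return ciphertext, shared

    def decapsulate(sk, ciphertext):
        # Recipient: rebuild the ephemeral public key and re-derive the secret.
        epk = X25519PublicKey.from_public_bytes(ciphertext)
        return hashlib.sha256(sk.exchange(epk)).digest()

    sk, pk = keygen()
    ct, secret = encapsulate(pk)
    assert decapsulate(sk, ct) == secret

The whole Kyber-512 argument is about how much work it takes to recover that shared secret from the ciphertext alone.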
Why do you say this? The NSA has done this exact thing in the past[1], so why give them the benefit of the doubt this time?
[1] https://en.m.wikipedia.org/wiki/Dual_EC_DRBG
Because Dual_EC_DRBG was very heavy-handed. It was driven by the NSA itself (and based on a paper named "Kleptography"!); the backdoor was obvious; and they had to ~bribe~ monetarily incentivize companies to actually implement and use it.
Meanwhile, both NTRU and Kyber are lattice-based, and their designs came from honest attempts. To be an NSA effort, there would need to exist an exploitable flaw in Kyber, but not NTRU, known only to the NSA. And it's not like NTRU as a whole got disqualified; only the fastest variant did.
That's the problem with spy agencies, you never know what they are capable of. But if it was an NSA effort, it would be, by far, the most subtle one uncovered so far.
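(A rough sketch of why that backdoor was considered obvious once Shumow and Ferguson pointed it out in 2007: each step of the generator is a pair of elliptic-curve scalar multiplications by two fixed public points P and Q,

    s_new  = x(s * P)          state update
    output = x(s_new * Q)      output block, minus 16 truncated bits

and if whoever chose the points also knows a d with P = d*Q, then recovering the point R = s_new * Q from one output block, via a small brute force over the truncated bits, gives d*R = s_new * P, whose x-coordinate is the generator's next internal state - enough to predict all subsequent output.)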
On the other hand, DES is an example of where people were sure that NSA persuaded IBM to weaken it but, to quote Bruce Schneier, "It took the academic community two decades to figure out that the NSA 'tweaks' actually improved the security of DES". <https://www.cnet.com/news/privacy/saluting-the-data-encrypti...>
If “an academic is sour they didn’t receive credit” and “an academic wants to help the world” are both on the menu, you should always linger over the first possibility. Hell, I’m an academic and you should consider it in regards to this post.
I've been in rooms watching cryptographers trying to figure out what exactly it is Bernstein was saying with that blog post for the past week, and I do not believe that Matthew Sparkes at The New Scientist understands it any better than they do. Since Sparkes doesn't have any direct reporting from Bernstein, and nobody here cares about the NIST quotes, the right thing to do here is to treat this story as a dupe.
Whenever the topic of DJB vs NIST comes up, there are always people saying "this may look petty, but he has a spotless track record, so we have to trust him".
I want to push back on this a little by linking this Twitter thread:
https://nitter.net/FiloSottile/status/1555669786826244096
It shows that there's a pattern of Bernstein and his associates threatening fellow cryptographers.
It's entirely possible to be a brilliant cryptographer and also a petty person, those things aren't mutually exclusive.
I did not dig through all the links in that twitter thread, but the first few tweets are pretty misleading.
The tweets say DJB implied that scientists who submitted algorithms were bribed by the NSA. That's a complete misunderstanding of what DJB wrote: he argued that the NSA wouldn't need to bribe those scientists, because they hired the top experts in the field years ago, so it might be the case that they're so far ahead of what's being submitted that all they have to do is push NIST to pick an algorithm they know how to break.
Now, I have no knowledge of any of this, so I have no idea if DJB's argument is insanely paranoid like the author of the thread implies (with the GIF in the 3rd tweet). All I can see is that the author's claim is a gross mischaracterization of what DJB wrote.
Isn't paranoia an essential job requirement for cryptographers?
There is so much to unpack in that thread and its references. A lot of he said she said.
For example, one reference being used as evidence that djb is evil complains about being insulted that their employer (also djb's employer, presumed to be on djb's side) suggested seeing a company doctor after they had been on sick leave for a while. This is 100% standard practice in the Netherlands: the doctor is independent, not from the company itself, and keeps things confidential. It's how we resolve the conflict where you can't just claim you're sick for unspecified reasons indefinitely and continue to expect money, but the employer isn't entitled to know your medical dossier either. This lets you have medical confidentiality and long-term sick leave, where the employer can trust that appropriate action is being taken because they trust the impartial doctor to verify it.
This is brought up as part of the conflict between djb, the author, and the university they work for. And it isn't the only thing the author alludes to not knowing about while djb and others allegedly allowed abuse to happen. I believe most of what is written, but at the same time the problem is clearly being exacerbated by not using coworkers, friends, or even google/ddg to find out what legal system you've moved into. Djb even suggested they should take legal action, and HR offered arbitration, but the person declined both. So now the evidence amounts to their word on a blog, and the alleged perpetrators faced zero consequences.
As much as such references serve to convince me of djb=evil, they also convince me there may be more to the story than one side.
Obviously I've just highlighted one thing here; there's a lot more he-said-she-said going on elsewhere in the threads that could give one pause in believing one side verbatim, even if they're likely right in spirit.
I do believe he has increasingly argued in bad faith and alienated his peers to the point that they're (we're) unwilling to engage with him, which from the outside can look like his points are irrefutable.
Strong agree. I've heard Bernstein described before now as having "all the subtlety of The Incredible Hulk". Quite possibly there are some things he can get away with only because he's a brilliant cryptographer.
Designing curve25519 was, in terms of practical impact, an achievement I'd put in the same category as inventing RSA or Diffie-Hellman. Not because the ideas were new, but because they came together in a way that produces something that "just works" in practice, and you don't have to worry about invalid curve points and twist attacks and accidentally using the addition formula for a point doubling and many other things. The idea that instead of a framework where you can plug in your own parameter choices and some of them might be secure, you can just build a crypto library that does one thing well, was certainly new enough that no-one else seemed to be doing it at the time. The fact that when I need a key for real, most of the time I do `ssh-keygen -t ed25519` or the equivalent in other systems speaks for itself.
As does the fact that GitHub has deprecated the ssh-dss key type and recommends ed25519 as the default: in the contest between Ed25519 and DSA/ECDSA for digital signatures, Bernstein wins hands down and NIST has egg on its face. Although I have no proof of malice, I haven't yet heard a rational explanation for just how badly ECDSA mangles the Schnorr protocol, in exactly the way that means a lot of implementations end up with horrible security holes.
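(The best-known of those holes is nonce handling: Ed25519 derives its nonce deterministically from the key and message, while ECDSA leaves nonce generation to the implementer, and reusing a nonce across two signatures hands the private key to anyone watching. A toy Python sketch of just the algebra, with made-up values and a prime standing in for the group order rather than a real curve:)

    # ECDSA signatures satisfy  s = k^-1 * (z + r*d) mod n,  where d is the
    # private key, k a per-signature random nonce, z the message hash, and r
    # is derived from the point k*G. If k is reused for two messages, anyone
    # holding both signatures can solve for d. No curve arithmetic here,
    # just the modular algebra.
    n = 2**127 - 1          # a prime, playing the role of the group order
    d = 123456789           # "private key"
    k = 987654321           # nonce, wrongly reused below
    r = pow(7, k, n)        # stand-in for the x-coordinate of k*G
    z1, z2 = 1111, 2222     # hashes of two different messages

    def inv(x):
        return pow(x % n, -1, n)   # modular inverse (Python 3.8+)

    s1 = inv(k) * (z1 + r * d) % n
    s2 = inv(k) * (z2 + r * d) % n

    # Both signatures share the same visible r, so an observer solves:
    k_rec = (z1 - z2) % n * inv(s1 - s2) % n
    d_rec = (s1 * k_rec - z1) % n * inv(r) % n
    assert (k_rec, d_rec) == (k, d)

This is essentially what happened with the PlayStation 3 firmware-signing key and, later, with Android Bitcoin wallets that had a broken random number generator.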
And then there's the Snowden leaks and DUAL_EC. "The NSA has interfered with crypto standards in the past, reliable leaks show it was part of their mission statement, and they could be doing so again." is to me a statement backed up by plausible evidence that's very far from the usual conspiracy theories. This is not faked-moon-landings territory.
And I should also say, there are a lot of ways of being evil that to my knowledge no-one has ever accused Bernstein of: as far as I know, he's never been accused of raping or sexually assaulting anyone, nor has he said anything particularly racist or pushed any far-right ideology. He has been accused of insulting and occasionally threatening people who disagree with him on technical matters, but he's not what we usually mean by "bad/evil person, avoid if possible".
I'd say he has a fairly spotless track record in cryptographic protocol design, and a fairly stained one in interacting with other humans. When he's pushing back against design decisions that actually are stupid/evil, that's an asset; in lots of other cases it's not.
> I haven't yet heard a rational explanation for just how badly ECDSA mangles the Schnorr protocol in exactly the way that means a lot of implementations end up with horrible security holes.
I thought Schnorr was under patent for a while, so an open alternative was needed? Also, ECDSA does allow for recovery of the public key from the signature, which can be useful.
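(For reference, that recovery works because the ECDSA verification equation can be solved for the public key: given a signature (r, s) on a message hash z, the candidates are, roughly,

    Q = r^-1 * (s*R - z*G)

where G is the base point and R ranges over the few curve points whose x-coordinate is r; implementations usually attach a couple of recovery bits to pick the right candidate. Bitcoin's signed messages and Ethereum's ecrecover lean on exactly this trick.)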
Regrettably, while we are discussing the author's motives, we may inadvertently overlook the miscalculation by NIST. The crux of NIST's primary error lies in improperly multiplying two costs when they should have been added.
If this assertion holds true, it would be prudent to at least revisit and revise their draft!
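(To make the multiply-versus-add point concrete with deliberately made-up numbers: security levels are quoted as exponents, so combining two costs by multiplication adds the exponents, while combining them by addition barely moves the larger one. Which combination is appropriate for the memory-access costs in the Kyber-512 estimate is exactly what is in dispute; the figures below are illustrative only, not NIST's or Bernstein's.)

    import math

    # Hypothetical numbers: say an attack needs about 2^140 bit operations of
    # computation plus about 2^95 memory accesses costing roughly 2^25
    # bit-operation-equivalents each.
    compute = 2**140
    memory = 2**95 * 2**25               # total memory-access cost, about 2^120

    print(math.log2(compute * memory))   # 260.0  (multiplying the two costs)
    print(math.log2(compute + memory))   # ~140.0 (adding them)

    # Multiplication makes the attack look far harder than either component
    # alone; addition says the total is dominated by the bigger term. Against
    # a target of roughly 2^143 (the usual AES-128 keysearch yardstick), the
    # first combination passes comfortably and the second does not.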
Ironically, the style and substance of DJB's engagement with his peers and with NIST is likely to sour both against his claims[0], credible though they might be. DJB's impression of NIST "stonewalling" could very well be their reluctance to engage with an adversarial and increasingly deranged private citizen.
> “We disagree with his analysis,” says Dustin Moody at NIST. “It’s a question for which there isn’t scientific certainty and intelligent people can have different views. We respect Dan’s opinion, but don’t agree with what he says.”
That's great for a PopSci article, but I (and many others, I'm sure) would like to see the details of this analysis hashed out. DJB had his chance at making this happen, and blew it. However, that doesn't mean his questions[0] should go unanswered.
[0]: specifically talking about the calculation of the Kyber-512 security level here. Not his more conspiratorial claims.
> It’s a question for which there isn’t scientific certainty and intelligent people can have different views.
WTF is that? No, it's not a question where intelligent people can have different views. DJB is literally claiming the NSA is claiming something similar to "3 + 3 = 9". That claim is either correct or not.
There's a paywall in front of the article that I have no intention of dealing with after seeing what's in the comments. But a phrase like this (and I could find it before the paywall hits) is absolutely dishonest.
Nit: DJB is claiming that NIST is doing that, not NSA.
> making sure officials don't talk to one another outside of recorded meetings under the public's eye
Which has the net effect of decreasing the ability to reach nobody-is-happy compromises.
Which is something else people say they want.
I'm unconvinced that private, smoke-filled backrooms don't have an essential place as the grease that keeps things running well.
Spoke about this last night in London
https://www.youtube.com/watch?v=mcWIQALtOtg
If you can't do your work for the public under public scrutiny, you shouldn't.
"When his schemes won’t get picked by NIST, people will think it’s because they are not backdoored, and will point at the FOIA lawsuit as evidence."
> There's a paywall in front of the article
Turning JavaScript off gets around the paywall.
https://en.m.wikipedia.org/wiki/Bernstein_v._United_States
Qmail https://en.m.wikipedia.org/wiki/Qmail
Djbdns https://en.m.wikipedia.org/wiki/Djbdns
https://en.m.wikipedia.org/wiki/Daniel_J._Bernstein