Show 112: “Crypto Wars II” with Steve Bellovin and Matt Green

July 23, 2015

We thought the “crypto wars” were resolved in the late 1990s. But the introduction of encrypted devices, specifically the release of iOS 8, and the growing number of encrypted communication channels available through public services such as Facebook and Snapchat have resurfaced the debate. FBI Director Comey and other law enforcement groups are concerned about what they call “going dark” and are stressing the need for backdoor access (called “exceptional access”). But is this really a good idea? Didn’t we already fight this battle during the first crypto wars? Matthew Green and Steve Bellovin, two authors of the recently released Keys Under Doormats paper, discuss the dangerous ramifications of this request.


Transcript


Gary McGraw: Today I have two gurus with me: Matthew Green and Steve Bellovin. We’re going to interview two of the fifteen authors of Keys Under Doormats, an important policy and technology paper published on July 6, 2015. The other authors are Harold Abelson, Ross Anderson, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael Specter, and Daniel J. Weitzner.

Steve Bellovin, who was the guest for Silver Bullet episode 81, is a professor of computer science at Columbia University. He has also worked as a researcher at AT&T Labs and recently served as Chief Technologist of the Federal Trade Commission.

Matthew Green, who was the guest of episode 90 of Silver Bullet, is an assistant research professor at the Johns Hopkins Information Security Institute and a well-known practitioner of applied cryptography.

So thanks for joining me today.

Your important paper addresses the idea of including backdoors, or what’s called “exceptional access,” for law enforcement and government in crypto systems. Simply put, you find the idea untenable for many technical reasons. The paper appears during what’s been called “Crypto Wars II” and harkens back to an earlier 1997 paper that many of the same authors wrote during the first crypto wars. So let’s start with you, Steve. Who convened this group of authors, how did the writing process work, and who was added since 1997?

Steve: Danny Weitzner took the lead in getting this effort started. We all felt that something had to be done. We had a meeting, discussed our ideas, and came up with an outline. I actually hosted that meeting here at Columbia. We supplemented that with a few conference calls and, of course, lots of email. Different people were given different sections and wrote them. Danny and Mike Specter were the final editors putting it all together. We all commented, made little changes, and so on.

Gary: How long did that process take?

Steve: I think the total process took about two months. We would have liked a little more time, but we wanted to be relevant to what’s going on in Washington, and that set the clock. The effort actually started in late February, but really got moving in early March, and then we got into high gear in June.

Gary: The first paper, the one in 1997, had a very similar set of co-authors. Who was added this time?

Steve: Matt Green, Susan Landau, Mike Specter and I think Danny, who, for various organizational reasons, had to stay in the background the first time around.

Gary: Where did the idea of a crypto backdoor come from way back in the early days? What happened in ’97? Why won’t it die?

Steve: There were a whole pile of issues being fought over in the ’90s. The NSA had long been concerned about anybody else having access to good crypto. You can date that back to the apparent debate within the National Security Agency (NSA) around 1975 or so about how strong the Data Encryption Standard (DES) was going to be. Some people still believe that the 56-bit key length was a compromise between those who wanted strong crypto and those who wanted to continue to have at least some access to it.

The particular incident that ultimately triggered the 1997 report was the introduction of the “Clipper Chip,” a particular design for what became known as key escrow: a way for government to have access to the keys of encrypted communication. That idea was around in various forms for several years. We wrote that report, and a whole pile of other things went into the fight as well. Ultimately, in ’99, the government basically gave up, liberalized the crypto export rules, and effectively said, “OK, you guys win. We’re not going to do it. It’s bad for the economy and bad for the country.”

Gary: And then obviously something happened and FBI Director Comey and other law enforcement people decided to look into this notion of backdoors again, recently.

Steve: For the last several years, the FBI has been complaining about what they call the “going dark” problem. More and more forms of encrypted communication have become common, and there are just more and more forms of communication generally. Even when it’s not encrypted, they don’t necessarily know how to deal with it: everything from Facebook pages and text messages, of course, to things like Snapchat, voice communication channels, and multiplayer games. There are many different forms of communication, and they have made the case to Congress over the last five years that they’re “going dark.”

Gary: Is there evidence that that’s true? Or do you think that claim is a little overblown?

Steve: The evidence is, at best, weak. The wiretap reports that they release show very few encrypted conversations, and of those, even fewer where they can’t get access to the plaintext in some other way. On the other hand, there are two other factors:

  1. They claim that for certain forms of communication, they don’t even bother trying for a warrant, because they know it will be strongly encrypted and they can’t do anything with it.
  2. There has also been the growth of encrypted devices. This came to a head 6-8 months ago with the release of iOS 8, when Apple strongly encrypted more of the iPhone’s storage by default.

It came out in the hearing the day after our report was released that national security organizations like the NSA care about communications cryptography, local police departments care about device cryptography, and the FBI cares about both. So device cryptography was not the issue in the ’90s that it is now.

Gary: So Matthew, as a person who came to this between iterations, and was pretty young during the crypto wars’ first iteration, how did you get involved, and what do you think has changed? Why is this an issue that’s important to you?

Matthew: Yeah, I was pretty young during the first iteration. In fact, I came to it just as it was winding up. I started working at AT&T as a staff member around 1999, just during the period when the crypto wars were being won, and I thought that was the end of it and there would never be another crypto war in my lifetime. So it was very surprising to me around 2011/2012 when this notion of CALEA II (after the Communications Assistance for Law Enforcement Act) began to be floated, and people began discussing wiretapping devices beyond the traditional telephone network and dealing with encrypted data. That all died away in 2013 with the Snowden revelations. And then it came back. And it’s come back with kind of a vengeance.

My perspective is that the really big difference between what’s happening now and what happened in the 1990s is that, for the first time ever, people are actually using crypto. Not 1-2% of people, but 30-40%. Now that it’s on by default in all iPhones, for both device encryption and text messaging, this is a substantial difference. This is a qualitative difference in terms of what crypto means to law enforcement.

Gary: So that’s what you think is really driving the law enforcement angle?

Matthew: Yes. They’re not afraid of crypto as long as it’s a few goofballs using PGP (Pretty Good Privacy). They’re scared of it when it’s everybody with an iPhone.

Gary: Let me ask you this question too. (First, pretend this is not me asking.) If crypto is so mathematically sound, why do you say in the paper that unanticipated security flaws will pop up? And why can’t these just be fixed?

Matthew: We’re not very good at doing this stuff even without backdoors. If you look at the history of TLS (Transport Layer Security) over the last year or two, we had at least two major vulnerabilities where TLS, the protocol used to secure every web connection, just broke down completely, to the point that you could actually intercept connections. And, of course, we had Heartbleed last year. So we’re talking about a pretty bad situation before you even get to backdoors. Adding backdoors is what’s really scary to us.

Gary: So the thought is that security engineering is in such a poor state that it’s very unlikely that we’d get something right.

Matthew: That’s one of the biggest concerns for me.

Gary: Steve, do you have anything to add on that issue?

Steve: Yes, there are a couple of issues. Certainly, part of the issue with TLS is that implementation flaws are very common, and the more complicated the protocol, the harder the implementation is to get right. Our mathematical tools are not where we would like them to be. I want to harken back to the Needham-Schroeder protocol, the oldest cryptographic protocol in the open literature, published in 1978. At the very end of the paper they said, “Hey, you know, this stuff looks hard. We think that people are going to make a lot of mistakes doing it.” That was one of the more prescient comments I’d ever seen in a technical paper. And it’s interesting to recount the history of that protocol. They published it in ’78, and a few years later Denning and Sacco found a problem and proposed a fix. Some years after that, Needham and Schroeder looked at the fix and said, “Oops, you got it wrong. Here’s how to fix your fix.”

Then in 1995, Lowe said, “Oh, there’s another flaw, even more serious, in the original Needham-Schroeder protocol and one variant of it. And, by the way, in modern notation that protocol would be three messages long, and the flaw is so obvious I could explain it in five minutes to an undergraduate with minimal crypto background.”

And it went unnoticed from 1978 to 1995, and this is the oldest protocol in the open literature!
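
To make Lowe’s attack concrete, here is a minimal Python sketch (an editor’s illustration, not something from the paper or the podcast). It models public-key encryption abstractly as a tagged value that only the named recipient can open; every party and nonce name is invented for the example.

```python
# Sketch of Lowe's 1995 man-in-the-middle attack on the Needham-Schroeder
# public-key protocol. "Encryption" is modeled abstractly: a value
# encrypted to X can only be opened by X. All names are illustrative.

def enc(recipient, payload):
    """Model {payload} encrypted under recipient's public key."""
    return ("enc", recipient, payload)

def dec(recipient, ciphertext):
    tag, to, payload = ciphertext
    assert to == recipient, "only the intended recipient can decrypt"
    return payload

# Alice innocently starts a session with Mallory.
na = "nonce-A"
msg1 = enc("Mallory", (na, "Alice"))            # 1.  A -> M: {Na, A}_pkM

# Mallory decrypts it and replays the contents to Bob, posing as Alice.
msg1b = enc("Bob", dec("Mallory", msg1))        # 1'. M(A) -> B: {Na, A}_pkB

# Bob answers "Alice" with his own nonce, encrypted to Alice. Mallory
# cannot read this message, but can forward it unchanged.
na_rcvd, claimed_sender = dec("Bob", msg1b)
nb = "nonce-B"
msg2 = enc(claimed_sender, (na_rcvd, nb))       # 2.  B -> A: {Na, Nb}_pkA

# Alice sees her nonce come back, believes she is talking to Mallory,
# and helpfully returns Nb encrypted to Mallory.
_, nb_rcvd = dec("Alice", msg2)
msg3 = enc("Mallory", nb_rcvd)                  # 3.  A -> M: {Nb}_pkM

# Mallory re-encrypts Nb to Bob. Bob now believes he has authenticated
# Alice, while Mallory knows both nonces.
msg3b = enc("Bob", dec("Mallory", msg3))        # 3'. M(A) -> B: {Nb}_pkB
assert dec("Bob", msg3b) == nb
print("Bob accepts the run as coming from Alice; Mallory holds Na and Nb.")
```

Lowe’s fix was simply to include Bob’s identity in the second message so Alice can detect the relay; that one missing field is what sat unnoticed all those years.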

Gary: So, not only are we bad at building systems that don’t involve complicated mathematics, but complicated mathematics in protocols makes it even more difficult.

Steve: And I would add that the Needham-Schroeder protocol was formally verified. And the verification was simply wrong; it wasn’t a strong enough verification method.

Matthew: We’re constantly trying to figure out what it even means to formally verify something. The definitions are still changing, and it’s 2015. So it’s a moving target.

Gary: I wanted to ask you something, Matthew. You guys talk about “forward secrecy,” and that’s an important concept when it comes to communications cryptography. So what is forward secrecy, and why does it matter in this situation?

Matthew: With a lot of older encryption protocols, the idea is that I have one public key and anybody can encrypt to it. That’s great until I lose my laptop and my private key gets stolen, or I accidentally back my laptop up to the cloud and my private key ends up there. Then every encrypted email I’ve ever sent is basically readable by whoever gets that private key.

The idea of forward secrecy is that we fix that problem by deriving new keys for every message sent, or at least very regularly. So even if my key gets stolen at a certain time, my older emails are not going to be readable; we’ve thrown away those old keys. That’s built into a lot of modern communication systems, like the Signal protocol used in WhatsApp. Unfortunately, it’s very hard to combine that with these exceptional access key escrow ideas, because putting in a master backdoor breaks the whole idea.
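
Here is a minimal sketch of that ratcheting idea, assuming a simple SHA-256 hash chain. It illustrates the concept only and is not the actual Signal protocol; the labels and constants are invented for the example.

```python
# Minimal sketch of forward secrecy via a one-way key ratchet.
# Each message key is derived from the current chain key, and the chain
# is advanced with a one-way function, so a later compromise cannot
# recover earlier message keys.
import hashlib

def ratchet(chain_key: bytes) -> tuple[bytes, bytes]:
    """Derive (message_key, next_chain_key) from the current chain key."""
    message_key = hashlib.sha256(b"msg" + chain_key).digest()
    next_chain_key = hashlib.sha256(b"chain" + chain_key).digest()
    return message_key, next_chain_key

chain_key = b"\x00" * 32   # in practice, agreed via an initial key exchange
for i in range(3):
    message_key, chain_key = ratchet(chain_key)
    # ... encrypt message i under message_key, then delete message_key ...
    print(f"message {i}: key {message_key.hex()[:16]}...")

# Someone who steals chain_key *now* can derive future keys, but because
# SHA-256 is one-way, the earlier message keys are gone for good. A master
# escrow key would have to undo exactly this property.
```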

Gary: Steve, maybe you could explain that. Why does exceptional access make systems more complex and why does complexity even really matter?

Steve: The way I look at it, security is a systems problem. It’s not a property of any one component; it’s a property of everything put together. The old line about a chain being only as strong as its weakest link very much applies. So maybe I’ll cryptanalyze the algorithm (we have new results coming out on RC4, Rivest Cipher 4), or maybe I’ll break the protocol, or maybe I’ll break the code, or maybe I’ll break the way you’re actually using it.

When I write an academic paper, I don’t want something that’s trivial; I want something that’s academically elegant. When I’m out in the real world, I don’t really care. If I have to bribe somebody with a candy bar for their password, I’ll do that. If I have to cryptanalyze it, I’ll do that. There are just so many pieces. The standard cryptographic protocol, the kind we’ve been working on for 35 years, is a two-party protocol, and even that, as we’ve noted, is very hard to get right. TLS is a classic example of one that has been very troublesome in recent years. But with exceptional access, I suddenly have to add a third party, and I have to restrict who that third party is, only let them have access under certain circumstances, and only if they’re the right third party. That’s got to be in the protocol and it’s got to be in the code. And you can make really, really subtle mistakes or really, really stupid mistakes.

So there was a design in the late ’90s to add a so-called ADK (Additional Decryption Key) field to a certificate, so that when I sent an encrypted email, it would be encrypted to the recipient and also to somebody else. Well, it turns out they forgot to put the ADK field in the protected part of the certificate, which meant that anybody could substitute their own key. Therefore you would encrypt to your enemy as well as to your recipient.

Again, it’s a trivial sort of thing to fix once you know about it, but you’ve got to notice it in the first place. And these things can go unnoticed for years.
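
To see how a field left outside the signed region can be swapped, here is a minimal Python sketch. The certificate structure and field names are invented for illustration; this is not the real PGP certificate format.

```python
# Sketch of the ADK-style flaw: the Additional Decryption Key field sits
# *outside* the portion of the certificate covered by the signature, so
# anyone can substitute their own key without invalidating the signature.
import hashlib

def sign(signed_fields: dict) -> str:
    """Model a CA signature as a hash over the signed fields only."""
    blob = repr(sorted(signed_fields.items())).encode()
    return hashlib.sha256(blob).hexdigest()

def verify(cert: dict) -> bool:
    return cert["signature"] == sign(cert["signed"])

# The certificate as issued: the ADK is (incorrectly) unsigned metadata.
signed_part = {"subject": "alice@example.com", "public_key": "alice-pk"}
cert = {
    "signed": signed_part,
    "signature": sign(signed_part),
    "adk": "corporate-escrow-pk",   # not covered by the signature!
}

# An attacker swaps in their own additional decryption key...
cert["adk"] = "attacker-pk"
# ...and the certificate still verifies, so senders dutifully encrypt
# mail to the attacker as well as to the intended recipient.
assert verify(cert)
print("still verifies with attacker's ADK:", cert["adk"])
```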

Gary: I guess exceptional access also has a tendency to concentrate targets, because it’s one key for many possible communication channels, at least in the designs that have been talked about so far. Is that right, Matt?

Matthew: Yes. We have this problem of who’s going to hold that key. In the ’90s there were proposals to split keys and put the pieces in two different government organizations; you would need a warrant to get those two components. But then the question is, how do you recombine those shares, and does the place where you recombine them become a weak point?

There’s also the question of whether the vendor, Apple or Google, should hold one share of that key. Then how do they secure it? So there are many problems here that just haven’t been thought through very well.
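
For readers who want to see the mechanics, here is a minimal sketch of a two-of-two split, assuming a simple XOR scheme (the actual 1990s proposals varied). The catch Matthew raises is visible in the last step: wherever the shares are recombined, the full key exists in the clear.

```python
# Sketch of a two-of-two key split: the escrowed key is XOR-split between
# two agencies. Either share alone is a uniformly random string and
# reveals nothing about the key.
import os

master_key = os.urandom(32)

share_a = os.urandom(32)                                     # held by agency A
share_b = bytes(x ^ y for x, y in zip(master_key, share_a))  # held by agency B

# Recombination (after a warrant, in theory):
recovered = bytes(x ^ y for x, y in zip(share_a, share_b))
assert recovered == master_key

# 'recovered' now sits in the memory of whatever system joined the
# shares. That system, and the process around it, is the weak point.
```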

Steve: Yeah, and don’t neglect the process. Back in the days of Clipper, I was talking to a former prosecutor, and she was terrified of the Clipper Chip. You’d think she’d be the absolute perfect proponent for it, but no. She was a narcotics prosecutor, and she was convinced that some of the big drug gangs had the money, the resources, and the ruthlessness to get access to the key escrow database. They were going to extort their way in, bribe their way in, blackmail their way in. But they could do it; they were powerful enough, and they had the resources. She was terrified that they were going to use this to spy on law enforcement communications. How do you secure that? It’s not just the technical matters; it’s the process. How does a key escrow agency verify a request from a law enforcement agency it has never heard of, in some town it doesn’t know, which may be in another country? Does the request comply with U.S. law? All of this is part of the system, not just the over-the-wire components or the endpoints. It’s the escrow agency and everything that goes with it as well.

Gary: So Matthew, on Sunday the seemingly thoroughly confused editors of the Washington Post agreed with FBI Director Comey that backdoors are necessary. How can they be so confused? And isn’t this partially our fault in the technical community?

Matthew: I think Silicon Valley has a problem: they don’t devote enough resources to educating policymakers. They spend money in Washington, DC, but I’m not sure it does everything they think it’s doing. So part of the problem here is that the people who make decisions don’t really understand the consequences; they’re getting one distorted view. Another problem is that there’s a tendency to think, “Well, it can be done.” People tell them they’ve worked out a solution on a whiteboard that makes it easy, but they don’t understand the actual engineering challenges: everything Steve is describing, where you have these incredibly complex systems that have never existed before and have to be built. And only after we build them will we find out how vulnerable they are. So it’s easy to look at a theoretical model of this and say, “It seems easy.”

Gary: Why do you think the technology press is not able to understand the tech?

Matthew: The technology press has actually been doing a fairly good job, I think, of expressing this. The problem is they’re running into a brick wall. They’re writing articles, and there have been reports like ours that try to explain it well, but at the end of the day you have, for example, Director Comey saying, “Well, I’ve been told by experts that it’s very difficult or impossible, but essentially, I believe it’s possible.” It’s hard to educate about technology when you have people who are really motivated to make this happen.

Gary: Steve, you’ve spent some time in Washington at the FTC (Federal Trade Commission), and you’ve run into these policymakers yourself, directly. What do you have to add about people who may not understand the subtleties of the technology, or who hold positions where the technology doesn’t matter to them?

Steve: When you’re not in a field, it’s very easy to overlook, or not understand, or not realize the complexity of the field and the expertise it actually takes. It’s not just technology. Let’s talk about that other favorite thing in Washington: climate change. It has, of course, been debated for many, many years. But I’ll harken back to something I believe was said in the ’90s by former New Hampshire Governor John Sununu. He said, “I don’t really believe in this,” whipped up a simplified model on his own PC, ran it, and said, “No, that’s not a problem.” Why did he think a model he could put on his PC, when he’s not an expert in the field in any way, could represent the complexity of atmospheric dynamics and so on?

Or let’s take something else we all know about: antitrust. We all know what antitrust is, big, bad monopolies abusing their power. When I was at the FTC, I got to see up close and personal just how sophisticated and deep some of the analyses were in deciding whether or not there was an antitrust problem. They’re real mathematical analyses. Could I do them? No, absolutely not; I don’t have the economics background. But the Bureau of Economics at the FTC sure does. It’s not just a legal matter. It’s a very complicated, sophisticated economics matter that you just don’t see.

Gary: Why is it, then, that policymakers don’t make use of things like the economists at the FTC talking about antitrust, or scientists at NOAA or NASA talking about global warming, or the scientists who do very good work for the government on security, to talk about this key escrow stuff?

Steve: There’s a big component of wishful thinking. If you don’t want it to be true, it can’t be, especially when you can’t deeply understand why it should be. My personal yardstick for deciding whether or not somebody is competent is how they respond to unpleasant facts. The late Senator Moynihan put it this way: “You’re entitled to your own opinions; you’re not entitled to your own facts.” Start with the facts, and then say, “What is a response to these facts that’s compatible with my ideology or philosophy?” Not, “These facts don’t exist.” Maybe you need different resources. Maybe you need a different approach. OK, climate change: is the right approach regulation, cap and trade, or letting things happen anyway? All of those are ideological positions, but don’t tell me it’s not happening.

The same is true for crypto. If you don’t like the consequences of cryptography, how do you respond? If you don’t like the fact that we can’t build these golden-key, backdoor, exceptional access mechanisms securely, well, OK, fine: what do you do about it?

Matt Blaze, Susan Landau, and I (all three of us are authors of this paper), along with one of Matt’s students, Sandy Clark, did a couple of papers on lawful hacking, one of which, “Going Bright,” appeared in IEEE Security & Privacy. The idea: with a proper warrant, hack into computers to go around the crypto. There are certainly issues there, but to us it was better than weakening the security mechanisms and adding new vulnerabilities. That was our response. There are possibly other responses, but make sure your response is compatible with the facts as best you know them.

Matthew: I want to add just one thing to that. The people making these pleas do have a source of technical information, and, for better or for worse, that source is the National Security Agency. And the people at the NSA are telling them, yes, escrow can be done; it’s possible; they can build systems that do it. They’re essentially being asked by their political masters, up to the commander-in-chief, “We need this. Can you do it?”

Gary: Those guys are working at cross-purposes, though, because they’re charged with eavesdropping on communications and breaking systems, so it’s obvious why they would say you can do it even if you can’t.

Steve: Even if you give them full credit for doing the right thing and trying to secure U.S. systems, I think you end up with a group of people who maybe aren’t experienced at building systems. People in Silicon Valley making these decisions know what’s actually involved. If instead you have people at the National Security Agency saying, “Yes, it can be done; we worked it out on a whiteboard, so don’t let these scientists tell us it can’t be done,” I think that’s where you wind up with a disconnect. There are plenty of people at the NSA who understand the real-world implementation difficulties, because that’s their job too. They know how they break systems, and they do it by exploiting mistakes. Let’s harken back to history with the Enigma machine. I’ve seen a post-war declassified memo by the British saying, “Hey, if the Germans had used the Enigma correctly, we couldn’t have broken it.” They didn’t use it correctly, which was good for the world, but those were mistakes made in the use of what should have been a sound design.

Gary: Let me ask you both this question. Why wouldn’t criminals, terrorists, and the other bogeymen, the bad guys, just use real crypto without backdoors, crypto that’s already been published and is widely available now?

Steve: I think a lot of them will. The ones they’re most concerned about will roll their own and go around these requirements, so you’re not going to get the benefit against the enemies you really care about. On the other hand, a lot of people will take the easy way out. What’s off the shelf? What’s in the iPhone? What’s in Windows? What’s in the Android phone? I’ll turn it on. Justice Scalia is fond of saying, “We’ve never held there’s any law against taking advantage of stupid criminals.” That’s certainly a large part of it too. They will catch the low-hanging fruit, the stupid guys, but it won’t help against the ones they really care about.

If you want to look at it in a slightly more sophisticated way, they could say, “Well, crypto is really hard to do; if they’ve tried to roll their own, they’ll make their own mistakes too, and we can exploit those.” But I don’t think they think that far ahead.

Matthew: I also think that if you can keep the use of crypto down to 2% of email or any given medium, it’s a whole lot easier to do analysis and figure out who the bad guys are, just by looking at metadata. Whereas if 50% of people are using encryption, life gets more difficult.

Gary: So it’s just the law of large numbers? What, if any, impact did the OPM attack have on your thinking about this issue, Matt?

Matthew: I’m trying to be fair about this. I think what we learned from OPM is that the federal government’s IT infrastructure is not in great shape. It’s really not. To be a little bit fair, a master key would obviously, hopefully, be better secured, using hardware, than a big ol’ database driven by COBOL code. On the flip side, though, it’s a whole lot more difficult to steal a multi-gigabyte database than it is to steal a handful of relatively small keys. So I am not confident about the federal government’s, or at least the non-NSA portions of the federal government’s, ability to keep a secret like this.

Gary: What do you think Steve?

Steve: The OPM hack shows how hard it is to secure a large system. I think you could do a whole separate podcast on IT security in the federal government and its many weaknesses and failings, but that’s a separate issue. I’m not worried about the technical basis of storing one key that you never use; it’s when you have to use it that you get into trouble. We’ve seen no publicized leaks of the master keys for the NSA-designed and -operated secure telephone systems, the STU-III, the STEs, and so on. They know how to secure a key. But the whole process for gaining access to it and using it is also part of the system, and that’s where I worry. One key you can go lock up in an HSM (hardware security module) someplace. Using it is when you get into trouble.

Gary: So let’s wrap this up with an open-ended question. Steve, you first. What can we do to better educate the government, especially lawmakers and the executive branch, about security, security engineering, sound technology, crypto in the real world, and so forth?

Steve: We just have to keep trying and keep explaining that this really is hard. Just because Silicon Valley can produce marvelous things doesn’t mean it can do anything at all. Every time you see a blue screen of death, every time you’ve got to reboot your machine, every time you have to install a security patch, whether it’s Microsoft’s Patch Tuesday or anything else, that represents flaws that an enemy could exploit. The fact that we are constantly seeing these updates is a sign of just how bad our software is. I wrote more than 20 years ago that the real security problem was buggy software. It still is.

Gary: I believe you on that one. Matthew, how about you?

Matthew: One of the things that really struck me about this debate is how many different requirements there are and how they’ve backed everybody into a corner. There seems to be a requirement that Washington, DC not dictate cryptographic design to Silicon Valley. That seems reasonable. But simultaneously, there’s a set of requirements that plaintext be accessible very quickly in certain circumstances to fight crime. At the same time, President Obama says we need strong encryption, and the FBI says we need exceptional access. When you try to fit these all together, you end up in a situation that’s really hard. You either end up with Silicon Valley developing its own solutions with nobody supervising, in which case some companies will just do it badly, or, on the flip side, you end up with Washington, DC writing a design document and sending it to Google and Apple to implement, which is not pleasant either. I don’t know how they’re actually going to resolve that.

Gary: Well, thanks, you two, for talking about this for a while. I’d like to encourage everybody to download the paper, Keys Under Doormats, which is available all over the net, and read it. It’s a quick read, it’s thorough, and it’s really important. Most importantly, let your political representatives know that you care about this stuff and that we have to get it right. So thanks, you two.

McGraw: This has been a Silver Bullet security podcast with Gary McGraw. Silver Bullet is co-sponsored by Cigital and IEEE Security & Privacy magazine and syndicated by SearchSecurity. The May/June 2015 issue of IEEE S&P magazine focuses on diversity, crypto, and identity management. The issue also features our Silver Bullet interview with Bart Preneel of KU Leuven. Show links, notes, and an online discussion can be found on the Silver Bullet webpage at www.cigital.com/podcast. This is Gary McGraw.
