Show 121: Marty Hellman Discusses Cryptography and Nuclear Non-Proliferation

April 26, 2016

Martin E. Hellman is Professor Emeritus of Electrical Engineering at Stanford University. A graduate of New York University, Martin went on to earn both a Master’s degree and Ph.D. in Electrical Engineering from Stanford. He is the author of over 70 technical papers, holder of 12 U.S. patents, co-inventor of public key cryptography, and the 2015 Turing Award recipient. Listen as Gary interviews Martin about his cutting-edge career, involvement in the crypto wars, and his work with nuclear non-proliferation and risk management.



Gary McGraw: This is a Silver Bullet Security Podcast with Gary McGraw. I’m your host, Gary McGraw, CTO of Cigital and author of Software Security. This podcast series is co-sponsored by Cigital and IEEE Security and Privacy Magazine. This is the 121st in a series of interviews with security gurus, and I have a real guru here with me today, Marty Hellman. Hi Marty.

Marty Hellman: Hi, how you doing?

McGraw: I’m good. Martin E. Hellman, AKA Marty, is professor emeritus of electrical engineering at Stanford. Professor Hellman worked at IBM Watson and MIT before returning to Stanford in 1971. He holds a BE from NYU, and an MS and PhD from Stanford (hence the return), both in electrical engineering. Marty’s best known for his invention, with Diffie and Merkle, of public key cryptography. He has also been a longtime contributor to the computer privacy debate, and was a key player in the first crypto wars, which we’re going to talk about later. Marty has won too many awards to mention, but I will mention the most recent one, the 2015 Turing Award, sometimes called the Nobel Prize of computer science. Marty and his wife Dorothy are very active in nuclear non-proliferation and peace issues, and are writing a book about that now. They live on Stanford’s campus in Palo Alto. Their granddaughters, Zoey and Celeste, are now in college, and grandson Max plays the bassoon. So, thanks for joining me today.

Hellman: Thank you. Actually, I need to mention my daughters Sonia and Gretchen, otherwise they’ll get upset.

McGraw: Oh, it’s okay to skip a generation I think.

Hellman: Sonia’s a therapist, also working on making the world a better place by working with people; and Gretchen worked in information security and is now combining that with coaching.

McGraw: Oh, very nice. So I want to focus our conversation on two things today. Your work in cryptography, and your work in nuclear non-proliferation and risk management. Let’s start with crypto and move on to the second topic later. What got you interested in crypto in the mid ’60s? Can you tell us the story of how Kahn’s book and Horst Feistel’s ideas and Claude Shannon’s brilliance caught your attention?

Hellman: Sure, it’s always fun to remember that. So when I left Stanford in ’68 and went to IBM research for a year, two things happened. I went to my first information theory symposium, and that was the area I had done my PhD thesis in, information theory. The banquet speaker that year, January 1969, was David Kahn, a great historian of cryptography. A couple of years before, he had finished his best-selling book The Codebreakers, and that certainly put the idea in my head. Also, when I was working at IBM that year, I was in the same department as Horst Feistel, that’s F-E-I-S-T-E-L, who is widely regarded as the father of IBM’s cryptographic research effort that led to the Data Encryption Standard and really formed a foundation on which Whit, and Ralph, and I—and others (Rivest, Shamir, Adleman)—all built. So while I didn’t work on cryptography, I was exposed to it.

McGraw: It was floating around in the air.

Hellman: Yes, and also IBM was spending good money on it, which reinforced my belief that there was a commercial market for encryption, when in ’68, ’69 it didn’t seem that way to almost anyone else. And then I went to MIT, ’69 to ’71, for two years as an assistant professor. Peter Elias, now passed on, one of the original contributors to information theory who had worked with Claude Shannon, showed me Shannon’s then little-known 1949 paper relating information theory and cryptography.

McGraw: Yeah, fantastic.

Hellman: And I realized, I’m an information theorist, maybe I can do cryptography. So those were the three key events.

McGraw: Yeah, that’s great. And I guess when you really started working on it, there were countervailing winds, one might say.

Hellman: Yeah, well actually at first IBM was given a pretty free hand by NSA. Feistel had worked on government crypto and had clearances, and IBM also had huge contracts with NSA, so they cleared everything with them. At first there was an openness, but I think it was 1974 when Whit and I both independently showed up at IBM Yorktown Heights and found that a secrecy order had descended on them. So that was the beginning of the ill winds, as you call them.

McGraw: Yeah, or countervailing winds, I suppose.

Hellman: Well, countervailing, and then when there was a threat to throw me in jail, we regarded them as ill.

McGraw: Let’s get into that in a minute. So when did you connect up with Whit and Ralph?

Hellman: Well, let’s see, Whit first. Sometime in the spring or summer of ’74, I was at Yorktown talking with Alan Konheim and others. Konheim headed the math department, which was where cryptography was; he’s now at Santa Barbara. I had visited there several times before, but this time they were a little more mum. They had just had this secrecy order descend on them, and management was trying to get them to work on operating system security. They felt that cryptography had been solved, that there was nothing more to do. So they said, “We can’t tell you much, and also we’re being encouraged to work on other things.” So I left. Whit showed up probably a couple of months later, and Konheim told him roughly the same thing, but one additional thing.

He said, “You know, Marty Hellman was here a little while ago saying roughly the same things. When you get back out to Stanford,” because Whit had worked here at the AI lab, “you ought to look him up.” So it was in the fall of 1974 that Whit called me, and I set up a very short meeting, maybe half an hour, at most an hour, that ended at eleven o’clock that night.

McGraw: You know that’s exactly what Whit said in an earlier episode of Silver Bullet, we talked to him a few episodes ago.

Hellman: Yeah, it was a fantastic meeting. I’d been working in isolation, and while I kind of revel in doing things against the grain, it was nice to finally find someone else who saw things much the way I did. Ralph came somewhat later, Ralph Merkle. He was an undergraduate and then a master’s student at Berkeley, and he proposed the key distribution part of public key cryptography, not the digital signature part, and actually did it a little before Whit and me, but independently. We didn’t know of his work.

One of Whit’s friends put us in touch, and again, I could just see Ralph’s brilliance and appreciated what he was working on. So when he finished his master’s, I brought him to Stanford to do his Ph.D. There’s a neat little story with that, can I…

McGraw: Oh please, yeah, absolutely. That’s what this is for.

Hellman: When I mentioned to Ralph that, you know, if he wanted to, I’d love to have him come do his Ph.D. under me at Stanford, because no one at Berkeley appreciated what he was doing. In fact, he has a copy of the project proposal from CS244, a course he was taking and dropped out of after that. He had proposed developing public key cryptography as project one, and something much more mundane as project two, and the professor wrote comments on it.

McGraw: Yeah, and he said project two sounds more interesting.

Hellman: Yeah, but he also said, “Perhaps because your description of project one is so muddled.” And admittedly, it was so outside the mainstream of thought, and Ralph was young, I’ll forgive the professor. Anyway, I said to Ralph, “You know, if you’d like to come do your Ph.D. here, I’d love to have you.” And he said, “But I can’t afford it.” And I explained to him that the research assistantship I’d give him at Stanford would give him the same stipend he got at Berkeley as a TA, and the tuition would be paid for by the research assistantship. So that’s how Ralph came to Stanford.

McGraw: Well, that’s a great story, yeah. You know, like-minded people come together eventually, usually. So tell us about the first crypto wars. What were they about and how did we win? Let’s do that and let’s cover the clipper chip fight at the same time, which is sort of the second crypto wars, but everybody gets them all mushed together in their minds.

Hellman: Right, and now we’re in the third. So, you only want me to talk about how we won, not how we lost, because we also lost. I’ll tell you both.

McGraw: You can talk about both, how you won and lost, simultaneously.

Hellman: Okay. In March 1975, about six months after Whit showed up and started working with me here at Stanford, and I don’t think we yet knew about Ralph, the National Bureau of Standards, now NIST, the National Institute of Standards and Technology, put a proposal in the Federal Register for an encryption standard for commercial use. This is what’s now called the Data Encryption Standard, or DES. We had known it was coming.

Whit and I had been anxiously awaiting the announcement. We looked at it and quickly realized that the 56-bit key size was at best marginal and potentially disastrous, and with Moore’s Law and the passage of time would in fact become totally inadequate. So we initially thought this was a mistake, and we wrote letters—actually Whit did—to NBS pointing out the problem, to which they didn’t really respond. And so…

McGraw: Those letters somehow never arrived.

Hellman: Well, you know, we were naïve enough to think they actually wanted comments on it. We didn’t realize, as I now do, that once something’s in the Federal Register as a proposed standard, it’s really a de facto standard.

McGraw: It’s over, yeah. Yep.

Hellman: And after about four or five months, and certainly by six months, I realized we had a political fight on our hands, not a technical fight. And it was becoming clearer, and now it’s extremely clear, that the reduced key size was NSA’s doing, and that’s been admitted.

McGraw: Yeah, and even triple DES and the triple DES cracker that was built later by Kocher showed how right you guys were theoretically.

Hellman: Right, although Deep Crack, as that machine was called, only broke single DES, not triple DES.

McGraw: Oh, okay. I’m sorry, I thought it was triple DES.

Hellman: No, no. Triple DES is probably still okay. But single DES, that’s the thing: there was an easy way to make it better by tripling it, but that increased the cost by a factor of three. They could have done the same thing just by increasing the key size.
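The trade-off Hellman describes can be put in rough numbers. A minimal sketch, using standard security estimates (the specific variable names and the comparison to AES-128 are illustrative, not figures from the interview):

```python
# Back-of-the-envelope keyspace arithmetic for the DES key-size debate.

single_des = 2 ** 56       # keys a brute-force attacker must search

# Tripling DES triples the work per encrypted block, and meet-in-the-middle
# attacks cap its effective strength at roughly 112 bits, not 3 * 56 = 168.
triple_des_effective = 2 ** 112

# Simply lengthening the key, as AES-128 later did, buys a bigger margin
# without tripling the cost of every encryption.
aes_128 = 2 ** 128

print(f"single DES keyspace:   {single_des:.2e}")
print(f"triple DES effective:  {triple_des_effective:.2e}")
print(f"AES-128 keyspace:      {aes_128:.2e}")
```

The point of the comparison is Hellman’s: extra key bits grow attack cost exponentially, while tripling the cipher only grows defender cost linearly.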

McGraw: Right, right.

Hellman: So anyway, when we realized we had a political fight and actually started to fight it as one, we got David Kahn to write an op-ed in the New York Times on this, after he checked us out. And Gina Kolata covered it in Science magazine somewhat after that, and it really started to hit the fan. So the first element of the first crypto war was DES key size, with NBS, and really NSA speaking through NBS, saying that the key size was adequate for the purposes for which the standard was intended. We disagreed, because you might protect hundreds of millions of dollars’ worth of data with it. You know, we’re seeing things like that today.

McGraw: I guess the same philosophy of, “Well, we have this capability now and we’re going to go dark” was what was driving that poor tactic.

Hellman: Right, right, that’s the current, third crypto war, which I’ll get to in a second. So the first element was the key size issue. The second element came especially after we developed public key cryptography soon after that, when suddenly you could change your keys every minute if you wanted to. Key distribution before that involved sending registered letters or couriers, a very expensive process, whereas public key cryptography allows you to do it over the Internet.

McGraw: Yeah, an open channel.

Hellman: Right. It sounds impossible, but it does work. And so you could change keys much more frequently, and that put even more weight on DES’s 56-bit key size. There are different ways to look at it, by the way. We saw 56 bits as too small. The people in communications intelligence, signals intelligence, within NSA, who were listening in on foreign powers, and terrorists, and things like that, saw 56 bits as way too much. Because think about it: prior to that, most everything was unencrypted. They could search, even in those days, millions or billions of words for a dollar, looking for key words.

Once you have even a small barrier, even if it costs a dollar to break, to get a key, that’s a huge increase in cost.
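The idea Hellman keeps returning to, two parties agreeing on a secret key over a channel anyone can read, can be sketched with a toy Diffie-Hellman exchange. The parameters below are demonstration assumptions: the prime is only 64 bits, far too small for real security, and real systems use 2048-bit-plus primes or elliptic curves.

```python
# Toy Diffie-Hellman key agreement over an open channel.

import secrets

p = 2 ** 64 - 59   # a prime modulus (demonstration size only)
g = 5              # public base, assumed for illustration

def keypair():
    """Pick a random private exponent a and publish g^a mod p."""
    a = secrets.randbelow(p - 2) + 1
    return a, pow(g, a, p)

alice_priv, alice_pub = keypair()   # Alice sends alice_pub in the clear
bob_priv, bob_pub = keypair()       # Bob sends bob_pub in the clear

# Each side raises the other's public value to its own private exponent;
# both arrive at g^(a*b) mod p without ever transmitting a secret.
alice_shared = pow(bob_pub, alice_priv, p)
bob_shared = pow(alice_pub, bob_priv, p)

assert alice_shared == bob_shared
```

An eavesdropper sees p, g, and both public values, but recovering the shared key from those is the discrete logarithm problem, which is exactly what makes frequent rekeying over an open channel practical.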

McGraw: Yeah, absolutely, it changes the economics.

Hellman: Right. And so the first crypto war was about two things, the key size of DES and the freedom for us to publish our papers, both on key size of DES and public key cryptography.

McGraw: And you were, and they threatened to jail you over that work?

Hellman: Well, NSA never did so openly. Especially in those days, NSA always talked in code. Everybody talked in code.

McGraw: That’s right, that’s back when they still didn’t officially exist.

Hellman: Right. No Such Agency, Never Say Anything. But it was also the IEEE. The IEEE got a threatening letter from a guy who worked at NSA, written from his home address with no indication he worked there, but Whit was able to verify that he did. He sent the letter to the IEEE as a concerned IEEE member, saying, “It bothers me that the IEEE is breaking the law by publishing papers in certain areas.” He never said exactly what, and he never mentioned me by name. But he listed something like six journal issues, and I had a paper in all but one. So that’s code in itself. Secondly, the IEEE responded—

McGraw: And they, I’m sorry, but they were using the ITAR justification, right?

Hellman: Right, the International Traffic in Arms Regulations. It’s illegal to export an F-16 without a license, obviously. It’s illegal to export the plans without an export license, and anything cryptographic was regarded as an implement of war. So by publishing our papers, his position was that we were violating the International Traffic in Arms Regulations, and I forget whether it was 5 or 10 years in jail that he cited as the punishment.

McGraw: Yeah. You know, the thought of you and also Whit as arms dealers is somewhat ironic.

Hellman: Yes it is. So anyway, when the IEEE responded to him, they copied me. But they didn’t copy me as “Martin Hellman, troublemaker,” which, you know, they use code too. They sent it to me because I was on the Board of Governors of the Information Theory Group, which had published several of the papers. But they didn’t send it to all the governors. So that’s what I mean that people tend to talk in code. That was the first crypto war.

McGraw: And you got past that by just doing it, right?

Hellman: Well, two things. We lost on key size: DES stayed at 56 bits. I mean, we proposed triple DES and a lot of people used that, but a lot of people didn’t. So we really lost on key size until AES, the current Advanced Encryption Standard, came out about 15 years ago.

McGraw: When I got started in this stuff, in using applied cryptography, we were using triple DES on smart cards at the time.

Hellman: We lost the key size issue until later, but we did win in the long run on that, because not only did the key size go up with AES, they adopted it in the way we said DES should have been adopted: a transparent, open adoption process with critiques, and not just the algorithm coming full-blown from the brow of Zeus.

McGraw: Yeah, so that’s good. But then later they made a step backwards when they put out crappy elliptic curves to use in the ECC.

Hellman: Step backward from our point of view, forward from their point of view. But the thing we did win on was the freedom to publish. So we won that battle.


McGraw: Right, and you did that by presenting a paper at Cornell in place of your students who had done some of the work, because you were in a better position to take the heat.

Hellman: Yeah. Well, when I got this letter from J.A. Meyer, that was the name of the guy who wrote to the IEEE, I took it to Stanford’s general counsel for two reasons. The IEEE had responded that they were well aware of the ITAR, but that they had always regarded it as the author’s and the author’s institution’s responsibility to make sure they weren’t in violation. So Stanford was potentially liable, and I had to bring it to them. Plus, if I was prosecuted, I wanted to make sure I had Stanford’s financial backing, because defending yourself can bankrupt you.

McGraw: Yeah, no doubt.

Hellman: So I had a meeting with John Schwartz, who was Stanford’s general counsel. We had one meeting and then he says, “Well let me review it.” He came back a few days or a week later and he said—I’ll never forget this conversation—he said, “It’s my legal opinion that if the ITAR are construed broadly enough to cover a publication of your papers, it’s unconstitutional. But,” he said, “I’ve got to warn you, the only way to settle this is in a court case. So if you’re prosecuted, we will defend you. If you’re convicted, we’ll appeal. But again, I’ve got to warn you, if all appeals are exhausted, we can’t go to jail for you.”

McGraw: Well that, it’s a real act of bravery to do what you did. Seriously. And it’s important for people to realize that sometimes you have to stick with your principles.

Hellman: Yeah, it’s funny. I have a friend who was a Marine Captain during the Iraq War, and I was corresponding with him as he was going on deployments. He said, “People keep telling me how brave I am,” he said, “I just signed up for Naval ROTC so I didn’t have to pay tuition.” And I said, “I understand, because a lot of people tell me I was courageous or brave or whatever to do this, but you know, I just kind of fell into it.”

McGraw: That’s great.

Hellman: And once NBS and NSA are lying to you, as they were at that point, then they’re starting a fight with me from my perspective. Although, I imagine they saw it the other way. It just was a natural thing. I couldn’t let it drop.

McGraw: So let’s talk about clipper chip really briefly, and then your view of the current crypto war.

Hellman: Okay. So clipper chip and key escrow, of which clipper chip was an instance, were in the mid ’90s. The idea here is that every telephone that has encryption, every computer that has encryption, will have a master key built into it that allows, if you get the master key, you can get any session key, any key used for a conversation, for example. And the master keys would be escrowed, stored by an escrow authority under careful lock and key. But if there was a court order, then the master key would be given to the FBI, or local law enforcement, or NSA so they could listen in.

McGraw: And we know how well the government stores records…OPM.

Hellman: Well, that is a problem. Yeah, the Office of Personnel Management is an example. And there was a big fight over this; this was the second crypto war. It wasn’t solely about that, but a large part of the charge Congress gave a National Research Council committee concerned this, and I served on that committee in the mid ’90s. We had a former Attorney General, Benjamin Civiletti, representing the FBI and law enforcement’s interests. We had Ann Caracristi, former Deputy Director of NSA, representing their interests. And we reached a unanimous conclusion.

McGraw: Was Brian Snow on that panel or not?

Hellman: No, he was not. One of the big issues was key escrow and the clipper chip, and we just couldn’t see how to make it work. And so in our final report, called CRISIS: Cryptography’s Role in Securing the Information Society, which is freely available online on the National Research Council’s website—

McGraw: As all their publications are, yeah.

Hellman: Right, in PDF. We recommended that the government experiment with key escrow for its own purposes, and that if it could figure out how to solve the problems we couldn’t see how to solve (we didn’t necessarily say all of them), it then come back to us. And they never did. One of the problems is: who holds the keys internationally?

McGraw: Of course. Yeah. There’s no ultimate jurisdiction today, really.

Hellman: Right. And as you pointed out, it creates a huge target, this database of master keys, even if you could solve the first problem.

McGraw: Absolutely. Building yourself an Achilles’ heel on purpose.

Hellman: Exactly. And the third crypto war, which we’re in now, is very similar. It’s almost a repeat of the second. It’s almost like people forgot what happened 20 years ago.

McGraw: It is truly amazing to me, because I was around during the second crypto war, and it’s like it never happened.

Hellman: Right. The Apple case has now been dropped, but I did sign onto an amicus curiae brief (a friend of the court brief) that the EFF (the Electronic Frontier Foundation) put together, backing Apple’s position. And I have an op-ed in The Hill, a kind of DC-centric newsletter, with Ron Rivest, whom everybody listening to this will know, the leader of MIT’s cryptographic effort that revolutionized cryptography; me, the leader of Stanford’s group that did the same; and the two technical Ph.D.s in Congress: Jerry McNerney, a mathematician, and Bill Foster, a physicist. So we had the two key people from that era. The four of us wrote an op-ed, which you can find online, arguing, “Hey, this is like a repeat of the second crypto war and we need to get smarter.”

We’ll be right back after this message.

If you like what you’re hearing on Silver Bullet, make sure to check out my other projects online, where you can find writings, videos, and even original music.

McGraw: So, you know today the WhatsApp people decided to add end-to-end crypto to their product set. And they turned it on. So there are two billion more crypto users starting today.

Hellman: Yeah, well this is one of the things, like Dianne Feinstein is my Senator, and she’s on the intelligence committee and—

McGraw: She’s getting ready to introduce a bill which is going to be a mess, I’m sure.

Hellman: Yeah, and I like her on most things, but this one she’s wrong on. You can’t repeal mathematics, you can’t—I mean if you make it illegal for the manufacturers like Apple, then there will be app developers who will do it.

McGraw: Absolutely. Genie’s out of the bottle, so it’s even further out now that two billion more people are using end-to-end crypto. So, let’s end up this section of the talk. Congrats on your recent Turing Award, by the way. That’s just fantastic.

Hellman: Thank you.

McGraw: We’re all very proud. I was surprised that you hadn’t gotten it yet, as I told you at Paul’s thing. So, would you mind telling the story about when the ACM called you up and you learned you’d won? Because I love that, it’s a very charming story.

Hellman: So, let’s see. I’m an electrical engineer rather than a computer scientist, although there’s obviously a lot of overlap. So when I got the call from the ACM, it must have been very late February, just a few days before the announcement was made (they sped it up because of the RSA Conference). They told me I’d won the ACM Turing Award, congratulations, and I was very pleased. I knew it was their top award, but I didn’t realize there was a million dollars connected with it, maybe partly because that had only been true since the year before. So this was very nice, but my wife and I were kind of half joking about what we’re going to do with another plaque.

McGraw: I love it. The plaque closet’s full.

Hellman: Well, actually, I’m in my study, and I’ve got them stacked up next to my bookcase so that I can see some of them, because there’s just not enough wall space. But the next day I’m talking to one of the public relations guys at Stanford, and they’re making a bigger deal of this than I would have expected. And he says to me (he’s Australian), “So Marty, this may sound crass, but you’re going to be asked this. What are you going to do with all the money?” And I say, “What money?” And then I find out Google has made this a million dollar prize. I go in and talk with my wife about it, and very quickly we decided this came at a great time, both the publicity and the money, to help push the book we’re working on, about reducing the divorce rate, ending needless wars, and getting rid of the nuclear threat.

McGraw: Yeah, that’s a perfect segue, so let’s talk about that for a while. Nuclear non-proliferation, or as you deftly put it, “reducing nuclear risk to acceptable levels.” What got you started working on that issue?

Hellman: Oh, very simple: 1980, my wife Dorothy. Three word answer. We’d been married in ’67, 1967, so this was 13 years into our marriage: two kids, a house we couldn’t afford for quite a while, a normal life. Our marriage was in trouble; we didn’t know it at the time because we didn’t take time to look at it. But Dorothy had enough intuitive sense to be looking for catalysts. She was working at Touche Ross, now Deloitte & Touche, as an auditor, and one of the partners was involved in an organization that seemed crazy from our then point of view, concerned with environmental degradation. He invited Dorothy and me to a weekend seminar on the bigger issues of life. The group always worked at two levels: the macro level was initially the environment; the micro level has always been, if you’re married, making peace in your marriage if you don’t have it already, and I know very few people who do. And when Reagan became President, we realized the greatest environmental threat of all was a nuclear war.

McGraw: Yeah, kind of screws up the planet for a few thousands of years.

Hellman: Yeah, maybe forever, yes. And then we researched the problem. It was a very unusual group; people would call it a peace group, but it was something different. It was a human potential, human growth group, because the real problem is that our technological progress has outpaced our maturation as a species.

McGraw: I totally agree with that. And I think that the pace of technology, especially technology adoption, has moved to be even quicker in the last hundred years. It’s sort of shockingly fast for us as a species.

Hellman: Absolutely. And so the problem isn’t nuclear weapons, or genetic engineering, or artificial intelligence potentially 20 years out. The real problem is the chasm between that god-like physical power through technology and our (at best) irresponsible adolescent behavior as a species. And so the bad news is if we don’t grow up we’re going to destroy ourselves. The good news is we have to grow up. And we’ll stop making a lot of mistakes. We’ll stop getting into needless wars. We will deal with the environment and take a long term perspective. So that’s how I got involved.

McGraw: Tell us your TNT vest analogy. I think that’s a really apt analogy.

Hellman: Yeah. So almost nobody’s concerned with nuclear weapons these days. They think of it as a problem of the past, but there are still about 15,000 nuclear weapons in the world. We’ve got about six or seven thousand in this country alone. People act as if there’s no risk, but the way I put it is: imagine a guy wearing a TNT vest were to walk into the room where you are now, but you knew he wasn’t a suicide bomber. He says, “Nothing to worry about. I don’t have the button for setting this off. There are two buttons in very safe hands, so there’s nothing to worry about. One’s in D.C. with President Obama and one’s in Moscow with President Putin, so just sit down and relax.” You would get out of that room as fast as you could. “Oh, by the way, there are buttons in Paris, and London, and Beijing, and Pyongyang, and the terrorists are trying to get them. But don’t worry, just sit here.” We’d still get out of that room as fast as we could. What I ask with this analogy is: just because we can’t see the weapons controlled by the real buttons, why have we as a species, as a society, sat here complacently for decades, assuming that because the earth’s explosive vest has not yet gone off, it never will?

McGraw: Right, right. So let’s talk a little bit about the risk of nuclear terrorism. Do you think that that is changing, recently?

Hellman: Well, certainly over the last 20 years it has become much greater, and I think that’s partly a matter of terrorists realizing they could do it. But the risk of nuclear war is actually the greater risk, contrary to what the President has said, contrary to what Colin Powell has said. There’s a very simple argument. A nuclear war would kill at least a billion people, 10 to the 9th. A nuclear terrorist incident would kill at most 10 to the 5th, 100,000.

That ratio is 10,000 to 1. So nuclear war could only be the lesser risk if it were 10,000 or more times less likely than nuclear terrorism, and that just seems implausible.
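Hellman’s argument is just a ratio of the two bounding estimates he cites; a quick check of the arithmetic:

```python
# Hellman's bounding estimates from the interview.
nuclear_war_deaths = 10 ** 9      # "at least a billion people"
nuclear_terror_deaths = 10 ** 5   # "at most 10 to the 5th, 100,000"

# Nuclear war is the lesser expected risk only if it is at least this
# many times less likely than nuclear terrorism.
required_likelihood_ratio = nuclear_war_deaths // nuclear_terror_deaths
print(required_likelihood_ratio)  # 10000
```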

McGraw: Yeah, mathematically speaking. Most people don’t think of risk in terms of math, which is strange, but they don’t.

Hellman: Right. And actually, the elevator pitch for this, as we call it in Silicon Valley, is: even if we could expect nuclear deterrence, threatening to destroy civilization in an effort to preserve the peace, to work for 500 years before it failed and we destroyed ourselves, which seems optimistic by the way, that’s equivalent to playing Russian roulette with the life of a newborn child.

McGraw: It’s like one in ten, right?

Hellman: Well, in Russian roulette the risk is one in six, and one-sixth of 500 years is about 83 years, roughly that child’s expected lifetime. So why are we not looking at this? In fact, I was on a call this morning with a senator’s aide, trying to get Congress to authorize a National Research Council study, and nobody’s very interested. There are some people in Congress who are interested, but it’s impossible to get anything through.
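The Russian-roulette arithmetic works out as Hellman says; the 500-year figure is his assumed (and, as he notes, optimistic) time to failure:

```python
# One chamber in six fires; if deterrence fails on the order of once per
# 500 years, each ~83-year human lifetime is roughly one pull of the trigger.
assumed_years_before_failure = 500
chambers = 6

years_per_chamber = assumed_years_before_failure / chambers
print(round(years_per_chamber))  # 83
```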

McGraw: Yeah. Well, our political system is in a quagmire currently. So let’s talk about the book that you’re writing with Dorothy now. The title is A New Map for Relationships: Creating True Love at Home & Peace on the Planet. What’s the book about? Why are you writing it? How’s it coming along, and so on?

Hellman: Sure, first a plug. If you go to my Stanford web page and look for “blog,” you’ll find I have some blog posts on it.

McGraw: There’ll be a link associated with this issue when we put it up.

Hellman: Ah, okay. So why are we writing it? There are a couple of reasons. One is that almost no one is interested in nuclear weapons, or war and peace. They express mild interest, but no one really gets involved; they feel it’s too big an issue: what difference can I make? Whereas, if the first step to solving those problems is to produce greater peace in our marriages and other relationships, who but you can take that step? And that’s how it worked for us.

We worked on both problems at the same time, and we reached a point where we have not had a single fight in 10 or 15 years. Now, I didn’t think that was possible; I have to give Dorothy credit for that vision. It was a 20-year process to go from fighting all the time, from my perspective, to never fighting. We had to learn a lot of things, but working on both the global threats and the insanity of the fights we got into in our marriage, the same fights over and over again, actually sped up both processes.

McGraw: So the personal is informing your views of the international? Is that—

Hellman: And vice versa.

McGraw: And vice versa.

Hellman: Yeah, so for example—

McGraw: Well, who’s the North Korea of your marriage?

Hellman: Neither of us, thank God. Otherwise we’d be divorced or one of us would be dead. You know, North Korea, as horrible and despicable as that regime is, actually has a good track record when it comes to nuclear weapons agreements, contrary to what the press tells you. Again, on my blog I’ve got a lot of coverage of that, with hard facts. Sieg Hecker, who was Director of Los Alamos for over a decade and is a colleague and friend of mine now, has written and lectured on this, and I rely heavily on what he says.

He’s been to North Korea seven times on track two diplomacy. The easy way to summarize it is what happened at the end of a guest lecture he gave, about four years ago, in a seminar I was teaching on “Nuclear Weapons: Risk and Hope.” He gave a guest lecture on North Korea, and at the end a student asked him, “Professor Hecker, from what you’ve told us today, it sounds like North Korea would have absolutely no nuclear weapons today if President Bush hadn’t done what he did in 2002.” That was the logical consequence of what he told us. We don’t know for sure, because you can’t run parallel universe experiments—

McGraw: No, it’s a counterfactual, but it’s worth thinking about.

Hellman: Yeah. One little piece of evidence I’ll throw out: North Korea had a reactor that would have made 10 bombs’ worth of plutonium a year, and it was only a year or two away from completion in 1994 when the Agreed Framework was signed. As part of the Agreed Framework, they stopped construction of that reactor, and it got so badly rusted that the last time Hecker was in North Korea, in 2010, he saw them dismantling it with cranes because it was going to fall down. So it’s bad that they have nuclear weapons today, but they’d have over a hundred if it weren’t for the Agreed Framework, and they’d probably have none if President Bush hadn’t thrown it out.

McGraw: I think that’s worth talking about. These frameworks, like SALT and SALT II and the other agreements we worked out with the Russians during the Cold War, actually worked. And the agreement that we just put in place with the Iranians is very similar in nature.

Hellman: Absolutely. Yeah, so actually I found out I was one of the nation’s top 29 nuclear weapons experts when the New York Times called me that.

McGraw: Well, congratulations on that too.

Hellman: And I am probably the world’s expert on nuclear risk because almost no one else works on it. Dick Garwin, who is one of the world’s top nuclear weapons experts, Edward Teller credited him with being the key—

McGraw: Isn’t he the guy who just retired yesterday, or last week?

Hellman: Oh, I haven’t seen that. He may have.

McGraw: Yeah. There was a big piece on him in the Post, which was great, and it talked about his work in all the frameworks.

Hellman: Yeah, and he was one of the key people who made the first H-bomb work. But unlike Teller, he then worked on arms control. And Dick Garwin, whom I admire and trust—Fermi, who was his Ph.D. advisor, called him the one true genius he ever met.

McGraw: Wow. That’s not faint praise.

Hellman: No, and I’ve heard similar things from other people. Dick is just amazing. Anyway, he asked me if I’d sign this letter in support of the Iran nuclear agreement. I agreed with it when I read it, I trust him, and so I signed it. And so you really have to look at the alternative to this agreement.

McGraw: Yeah. Well, I want to turn this slightly broader, towards all the listeners here, and ask you about technologists and engineers, not to mention software developers, and some notion of an ethical background or an understandings of the philosophy of the things that they’re building. Don’t you think that’s something that we’re not doing a very good job teaching these days?

Hellman: I absolutely agree. And it’s a very hard thing to teach. I have a section in my book called “The Devil on My Shoulder,” about when I was trying to decide whether to go public with the DES key size controversy. Two guys from NSA flew out and told me, “You’re wrong, but please shut up. If you keep talking this way, you’re going to cause grave harm to national security.” This was just before we went public.

And I sat down, I think it was that night, to figure out the right thing to do. And the idea popped into my head, “Forget about what’s right and wrong. Run with it. You’ve got a tiger by the tail.” At the time, I thought I had dealt with that devil on my shoulder, so to speak, and concluded that the right thing to do was to go public. But then I watched a documentary about the Manhattan Project, The Day After Trinity. The people who worked on that project were asked, “What was your motivation?” They all said Nazi Germany: if Hitler got the bomb first, it would be horrible. And then they were asked, “So when Germany was defeated and Japan was our only adversary, why did you keep working?” And they don’t know—

McGraw: Momentum.

Hellman: Not just momentum. I think they fooled themselves, and I realized I had fooled myself. I thought I had dealt with the devil on my shoulder, but I had done what I think they did and what I think most people do. They figured out what they wanted to do, which was to work on this project. They had socially acceptable reasons that they could admit, like Hitler, and socially unacceptable reasons that they hid, even from their own conscious minds: Is my brain powerful enough to destroy a city? Could I be the war hero and have the girls fall at my feet instead of the football quarterback? And I had similar things with DES. Watching that film, I vowed that I would never fool myself again. I had not caused the same kind of damage as they had, but I could see where I could have.

McGraw: Yeah. And I think that’s something worth some conversations in school. The people who build our modern systems don’t actually have to grapple with these things, and they ought to have to.

Hellman: Right. And I think it needs to be done with case studies like the one I just gave you. Because something similar happened maybe five or ten years later. RSA Data Security wasn’t paying us royalties on Stanford’s patent, and they ended up selling their company for 250 million dollars while we made almost nothing on our patent. And someone came to me, a guy named Lou Morris, the President of Cylink at the time, and said, “You help me get an exclusive license to Stanford’s patents and we’ll get those RSA bastards by the balls.” He was a scrappy little Jewish guy from Philadelphia. And I didn’t want to go with Cylink if it was for revenge, because I’d made this vow that I wouldn’t fool myself. But I was so pissed at them at the time that I couldn’t be sure I wasn’t fooling myself.

I went to Dorothy, and she said, “Oh, simple solution. Niels Reimers, the head of technology licensing at Stanford, has the same business interests you do, but he doesn’t have the emotional involvement in this. So let him make the decision.” He decided we should go with Cylink, and that way I know I didn’t fool myself. The good news, by the way, is that I’m friends with RSA and Jim Bidzos these days. We put that behind us; that was then.

McGraw: Well, this has been absolutely fascinating. I think I could talk to you for hours, but I’m not sure people would listen to the whole episode.

Hellman: Yeah. No, we should stop here. Okay, Gary. Well thank you very much.

McGraw: So let me ask you one other question, Marty. So how long has it been since you’ve been soaring?

Hellman: Oh, last August. So I have 2700 hours in gliders, which is a lot.

McGraw: Holy cow.

Hellman: I had a motor glider, which helped me do that, but I sold it six years ago, and I’ve only been up two or three times in those six years. So I don’t really soar that much anymore.

McGraw: Did you ever fly an aerobatic plane?

Hellman: I never piloted an aerobatic plane. I almost got sick in one when another guy did aerobatics.

McGraw: Me too. Even though I was told what was going to happen, my body rebelled against the whole phenomenon.

Well that sounds like a really cool hobby.

Hellman: Yeah. Oh, and actually my Stanford webpage has a link to soaring, where you can see a great flight that I did, something like a six-hour flight with maybe 45 minutes of engine time.

McGraw: That is so awesome. So how many thermals did you catch during that flight?

Hellman: Oh, lots. And sometimes you don’t even need thermals. There are times when you can dolphin fly: when you get under a cloud street, which is a line of clouds with a lot of lift under them, you just slow down in the lift and speed up in the sink, without stopping to circle. When you’re circling, your forward velocity is zero, so by not circling you can make great time. So, it was a great day that day.
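The advantage Hellman describes can be made concrete with some arithmetic. All the numbers below are hypothetical, chosen only to illustrate the idea; the interview gives no figures. The sketch compares dolphin flying along a cloud street (slow in lift, fast in sink, never stopping) with classic climb-and-glide, where time spent circling contributes zero forward progress.

```python
# Hypothetical cloud-street leg: alternating 2 km stretches of lift and sink.
# Dolphin flying: slow to 80 km/h in lift, speed up to 140 km/h in sink,
# never stopping to circle (assume the street's lift keeps altitude constant).
lift_km, sink_km = 2.0, 2.0
dolphin_hours = lift_km / 80 + sink_km / 140
dolphin_speed = (lift_km + sink_km) / dolphin_hours

# Classic climb-and-glide over the same 4 km: glide at a steady 110 km/h,
# then spend 36 seconds circling in a thermal to regain altitude
# (zero forward progress while circling).
glide_hours = (lift_km + sink_km) / 110
circle_hours = 36 / 3600
classic_speed = (lift_km + sink_km) / (glide_hours + circle_hours)

print(f"dolphin flying:  {dolphin_speed:.0f} km/h average")
print(f"climb-and-glide: {classic_speed:.0f} km/h average")
```

With these made-up figures the dolphin-flying average works out noticeably higher than climb-and-glide, even though its cruise speeds straddle the glide speed, because none of its time is spent at zero forward velocity.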

McGraw: Oh cool, cool. Well thanks a lot for this conversation. It’s been absolutely fascinating.

Hellman: Well, I’ve enjoyed it too. Thanks very much, Gary.

McGraw: This has been a Silver Bullet Security Podcast with Gary McGraw. Silver Bullet is co-sponsored by Cigital and IEEE Security and Privacy Magazine, and syndicated by SearchSecurity. The January/February issue of IEEE S&P Magazine is all about software security, privacy, safety, and dependability. The issue also includes our interview with Mudge, AKA Peiter Zatko.

Make sure to watch episode 120 of Silver Bullet, a video featuring me, interviewed by Marcus Ranum, which is about 10 years in a row of Silver Bullet. This is the 10 years plus one, Marty. Show links, notes, and an online discussion can be found on the Silver Bullet webpage, at This is Gary McGraw.
