Show 120: Silver Bullet Celebrates 10 Years! Marcus Ranum Interviews Gary McGraw

March 30, 2016

To celebrate 10 straight years of the monthly Silver Bullet Security Podcast, we’re flipping the mic. During the past decade, Dr. Gary McGraw has interviewed some of the security industry’s most influential gurus. A globally recognized authority on security and software, he is the Vice President of Security Technology at Synopsys and the author of eight bestselling books on software security—and for the 120th Silver Bullet interview, he’s not the one asking the questions. In this landmark episode, firewall inventor Marcus Ranum takes on the role of Silver Bullet host to interview Gary on a variety of topics including evolutionary biology and security, the Internet of Things, hard-core cyber insurgency, advisory board work, software security, tinfoil hats, the surveillance state, and more. Watch Marcus and Gary celebrate a decade of Silver Bullet in this special video edition.

Listen to Podcast

Transcript


Marcus Ranum: This is a Silver Bullet Podcast with Gary McGraw. I’m your stunt-double Gary for this session, Marcus Ranum. This podcast series is co-sponsored by Cigital and IEEE Security and Privacy Magazine, where a portion of this interview will appear in print. Today we’re celebrating the 120th episode of Silver Bullet, exactly one decade of shows delivered on a monthly basis—120 in a row—by producing a video in the Shenandoah Valley of Virginia, where there are Black Hawk helicopters and weird flying reptiles. This is the 120th in a series of interviews with security gurus, and I’m pleased to have Silver Bullet’s usual host with me today. Hi Gary.

Gary McGraw: Hi Marcus. Thanks for doing this.

Ranum: Gary McGraw is the CTO of Cigital, Inc., a software security consulting firm with headquarters in the Washington D.C. area and 13 offices throughout the world. He’s a globally recognized authority on security and software, and the author of eight bestselling books on this topic. Dr. McGraw has also written over 100 peer-reviewed scientific publications and authors a periodic security column for SearchSecurity. Besides serving as a strategic counselor for top businesses and IT executives, Gary’s on the advisory boards for Max Financial, Intrepid, and RavenWhite. His dual PhD is in Cognitive Science and Computer Science from Indiana University where he serves on the Dean’s Advisory Council for the School of Informatics. Gary lives on the Shenandoah River with his wife, Amy, and his son, Eli. His son, Jack, is a student at NYU.

So, you’re one of the founders of the field of software security. How did you get into that?

McGraw: You know, I was a programming languages weenie, besides being a cognitive science guy, when I was in grad school. And so, when Java came out I was excited that there was this programming language for the web and got interested in the Java thing. It turned out to be much more like C++ than it was like a functional programming language like Scheme. They were making all these claims about security, so we started breaking Java.

Ranum: So you got that excited about Java?

McGraw: I got excited by breaking Java and wishing that it were better, and wondering why those guys who built it made some of the decisions they made from a programming languages perspective when they built it.

Ranum: Did you believe the marketing hype that Java was a secure programming language? I mean, it was designed for elevator control systems if I recall. Then, Sun Microsystems needed a web programming language and went... Java!

McGraw: It was called Oak in the early days. It was actually for set-top box cable systems.

Ranum: Oh. I thought it was for elevators.

McGraw: It’s all the same. It was very closely based on P-Code from the 70s.

Ranum: Yeah. Programming languages are just programming languages.

McGraw: So, when it came out and they were saying “everything you build in Java is secure; Java is secure,” it made us wonder what the heck they were talking about. So, we broke that a lot. I did a lot of work with Ed Felten from Princeton and his guys who were in grad school at the time. After doing that work, it made me wonder why really good people screwed this all up when it came to software security. Like, if Bill Joy and Guy Steele can’t get it right, what chance do mere mortals have?

Ranum: It seems like the software industry, the way it’s structured right now, is vastly more rewarding for people who are part of the problem rather than people who are part of the solution.

McGraw: Isn’t that a great irony?

Ranum: Yeah. So, you know, if you actually try to produce good software, you’re going to be later in the release cycle and the other guys are going to have five million customers before you’re able to release anything.

McGraw: There’s that, and there’s also the idea of being incented to put bugs in that you later take out. Like that one Dilbert. Remember when Wally was actually coding one day?

Ranum: “I just coded myself a new minivan.”

McGraw: Exactly. That one.

Ranum: If Netscape had waited three releases to release the browser, they would have never been what they were. And, as soon as we saw the Netscape IPO—I mean, where were you when Netscape went public?

McGraw: I was actually still at Indiana when Netscape went public. And, you know, Netscape really didn’t have Java in it in the beginning.

Ranum: No. It isn’t a Java problem. Kent Landfield and I were in the data center at Sterling Software the day the IPO happened, and we watched that ticker go through the roof. It was the beginning of the 20-year beta test.

McGraw: Yeah. You knew the world was going to change. Well, now we have DevOps where we can make that cycle even tighter.

Ranum: Yes. Agile, right? As opposed to awkward. I kind of like awkward.

McGraw: Agile is when you run away from your code that you just wrote as quickly as possible.

Ranum: Right. So, you’ve been on advisory boards for many companies. I’ve been on some of those as well. We kind of imagine it’s—I don’t know what people out there imagine it’s like, but why don’t you tell us a little bit about what being on an advisory board is. What do you do?

McGraw: Part of it is you’re supposed to be a third set of eyes and kind of a person-from-Mars perspective for the company that you’re giving advice to. Usually, they’ll put on a formal presentation maybe about a new design or a new product idea or some sort of strategic move that they’re going to make as a company. The advisory board weighs in on that—helps to get the flaws out of the design and talks to the engineering team directly. Sometimes it even takes a look at code, which we did in the very early days at Fortify. And, just generally provides outside counsel that is highly trustworthy.

Ranum: Well I think it’s really interesting, when you’re in the sausage factory and you’re actually packing that sausage and someone comes from the outside and goes “did you think about doing this?” and you go “No!” You know, we’ve all had those moments.

McGraw: It really happens when you’re coding, right!

Ranum: Absolutely.

McGraw: You know, you’ve got this bug and you can’t figure out the bug. Your friends call you for dinner because you’re supposed to meet them for pizza, and they’re like “where the heck are you?” And then, they finally come by your office and you’re still working on this bug and it’s driving you crazy. The guy who comes into your office just says “oh, why’d you do that?” And you go “Ah! That was the bug! That was it.”

Ranum: I remember when Andrew Lambeth and I were at V-One, we spent an entire day trying to figure out why this one time-conversion routine that he’d written, and I’d reviewed, kept coming up with the wrong time. Finally someone said “did you take into account daylight saving time?”

McGraw: Of course.

Ranum: It should have been suspicious that it was exactly one hour off.

Ok. So, the Internet of half-baked Things is upon us. The other day I found this little wonderful item on Amazon. [hands McGraw some paper]

McGraw: Bluetooth 7000 dental professional toothbrush.

Ranum: Talks Bluetooth to the base. And, you know, Bluetooth isn’t the problem; it’s a local network. But how does the base get the data to the cloud? The base of this toothbrush has got an IP address, it’s DHCP-ing an address on your network, and it’s talking to some website. Personal information about how often you brush your teeth, but more likely your YouTube password, is also going to be out there on that site.

McGraw: Yeah.

Ranum: The thing is, people look at this and they go “well, there’s no way that a toothbrush can actually hurt people, so we don’t have to worry about securing a toothbrush.”

McGraw: But they don’t realize that it’s taking a resource that is a finite quantity.

Ranum: Correct.

McGraw: And that’s the key thing to think about while you’re designing these things.

Ranum: It also means if I’m out on the Internet and I can get access to your toothbrush records, I can tell when you’re on vacation because you’re not home brushing your teeth.

McGraw: You’re actually in St. Croix brushing your teeth.

Ranum: And I know it’s time to break into your house and steal your toothbrush.

McGraw: Of course.

Ranum: So, how do you see this playing out? Are you going to be doing application whitelisting on your toothbrush? What if somebody finds a buffer overrun in your toothbrush? Or, more to the point, what if you’re running a website and you find yourself under DDoS attack from 10,000 toothbrushes?

McGraw: Software updates for everything. I think there’s some possibility that because these devices are so small, we might be able to write better code because it’s going to be smaller, tighter code. But, that might be truly insane.

Ranum: I don’t think that’s going to happen because it’s going to be running Debian inside when somebody just said “whoa, here’s an operating system.”

McGraw: Yeah, that’s what happened to a lot of these embedded systems, including some car chips, as you know. The difference between a Ford F-350 and a Ford F-150 is a bunch of EEPROM settings. And so, you might get a chip that’s super cheap because there are bajillions of 8088 chips sitting around in a warehouse, for example. And, it’s cheaper to put that into your dishwasher than it is to store it in a warehouse. So you end up having way too much computing capacity in your toothbrush, in your dishwasher, and everywhere else.

Ranum: You know, the other big problem with this is that when you’re dealing with a physical device that gets shipped through Amazon or whatever, you kind of have to get the software pretty much right the first time because you can’t de-stock it and re-ship it when you’ve shipped a million lightbulbs. It’s not the same thing.

McGraw: Well, you know, even if it were the same thing, the same thing doesn’t work too well, say, in the mobile space. If you think about Sony, for example, or Sony Mobile, which has to support all these devices going back five years, you know that that kernel and all the code that’s associated with Android isn’t being updated to keep things current. As a consumer company, those guys are sort of on the spot to make sure that that phone, even if it’s five years old, still works.

Ranum: And here’s the great part—you know that if they’re shipping that using some version of a Linux, or BSD, or whatever kernel, it’s got a half-baked version of IPv6 in it, and it’s enabled. All it needs is the right packet sent to it and it’s going to turn into a brick on you.

McGraw: Isn’t that nice.

Ranum: I know. So, what are some of the most screwed up things, without naming any names, of the Internet of half-baked Things that you’ve seen?

McGraw: Well, you know, the most screwed up things—the things that worry me the most—(I’m going to turn this one on you)—are when the code really matters. Like, control code for a vehicle, for an airplane, or nuclear power plant control code. You can’t screw that up. You can’t say “oops, I’ve made a mistake; let me just DevOpsy that later” because you may create a 30,000 year problem if you screw up the code.

We’ve learned a lot of lessons from high-assurance land, some of which we have been bringing over the years into nonsense software land (where we exist, you know—banks and everything else, consumer devices). And now it’s trickling down to even sillier things like toothbrushes. But we can still borrow these good ideas from high-assurance software and apply some of them while we’re building these things.

Ranum: So, what do you think about Charles Perrow’s Normal Accident Theory? I mean, does Normal Accident Theory apply when you start talking about a toothbrush? Do we have a situation where you could have your toothbrushes exhaust your DHCP address pool and cause a reactor failure someplace else? I mean, is that our future?

McGraw: It might be because everything is ridiculously interconnected. So, if you’re dumping noise from all of your toothbrushes onto the net, and a really important message needs to get by but the toothbrush traffic is too heavy, what are you going to do?

Ranum: All of the toothbrushes are updating themselves at exactly the wrong time and crash the…! You know, I think Perrow has got a point. By the way, don’t ever read that book while you’re landing in an airplane.

McGraw: [laughs] I know. Well look, here’s kind of a funny thing, when you’re in computer science school, on the first day of class, you don’t get the bejesus scared out of you. If you’re in mechanical engineering class, you go and you see that bridge shake itself to death, and they go “this could be your bridge” and you’re 20 and you’re going “oh my god I don’t want to make a bridge that falls into the ocean.”  You know?

Ranum: Right. It could be your legacy.

McGraw: Exactly. In computer science school, it’s kind of like, “oh, let’s write ‘hello world,’ you can build anything you want.”

Ranum: But you know, we do kind of get that. Eric Allman used to come up to me at USENIX and go “why do you hate me? What did I ever do to you?” And I’d say “you didn’t do anything to me, you just gave us sendmail, syslog….”

McGraw: [laughs] sendmail. Yeah.

Ranum: I mean either sendmail or syslog would have been enough of a legacy in computer security.

McGraw: My USENIX story like that involves Larry Wall. It’s like, you know, I was sitting on a panel with him one time and he said “Perl is fantastic because there are six ways to do everything.” I was right after him and I said “Perl really sucks from a security perspective because there are six ways to do everything and you only thought of blocking four of them.”

Ranum: Ok, so obviously this is going to get a lot worse before it gets better. At least I think it’s going to get a lot worse.

McGraw: I’m optimistic about the whole thing—you know that.

Ranum: Are you?

McGraw: Yeah.

Ranum: Years ago I was saying that the only thing that would get the world community to take this problem seriously was a “software Chernobyl.”

McGraw: Right.

Ranum: And I’m afraid that, you know—I don’t want to be a Cassandra, but it seems like eventually it’s going to be inevitable that there’s going to be some disaster.

McGraw: There have been some, you know. The Ariane 5 rocket, the Therac-25 that burned a bunch of people to death with radiation. We do know that life-critical systems have to be done better. And I actually think that we’re making progress—even in building consumer-grade commercial systems, we’ve been doing a better job over the last 20 years. So, it feels to me like the trend is in the right direction, not the wrong direction. Although, you might argue, “well, we’re growing so fast that it will never catch up.”

Ranum: I’m not sure I’m as concerned with the growth factor as with governments uncorking this new genie: now we can screw you up by causing you economic damage. By making your new reactor no longer function, we can cost you millions of dollars.

McGraw: Or slow down your nuclear weapons program by making your centrifuges fail to work properly.

Ranum: Exactly. Which of course is a violation of the Geneva Conventions to do because you’re not supposed to mess with nuclear power—

McGraw: Turns out war doesn’t follow the rules all the time, Marcus.

Ranum: As it turns out, war never follows rules. And that’s where, you know, I always like to introduce the concept of a weapon of privilege in cyberwar. A weapon of privilege is one that I can use against you but if you even dream about using it against me you better wake up and apologize.

McGraw: Yeah. Well, there are a lot of those because we have a nuclear deterrent in the United States as well, so you’ve got to factor that in. You know, something could go kinetic. But, I think one of the important things to realize about cyber warfare in this whole thing is that the best way to avoid it is to engineer ourselves stuff that’s really much better and way more expensive, and basically outspend our adversaries building stuff properly. That’s the deterrent. The deterrent is that our engineering is great and yours isn’t so great. Go.

Ranum: Yeah. I agree. You could actually deny your enemy the ability to have a seat at the battlefield. You make it so much more expensive for them, which is basically what’s happened with nuclear weapons around the world: they made it so expensive to play in that space that the cost alone ought to deter pretty much anybody rational. Although, one of the popular things for people to say is “it’s irrational to want to have nuclear weapons,” but actually it’s profoundly rational to want to have nuclear weapons.

McGraw: Yeah. Well, in a mutually assured destruction situation, that’s true. But, I think cyberwar is complicated economically because the cost of building, say, a disruptive thing that was destructive of some information-based important piece of life, like, I don’t know, the water system, or something like that, or the power control systems, is cheap. I mean, developing something to screw that up might cost 10 million dollars, not 10 billion dollars. And the number of countries out there that have even a six billion dollar budget for defense is pretty big.

Ranum: You know, if you’re an individual who is motivated, you can manipulate a government’s response to something simply by saying “I’m going to make it so expensive for you to collect my taxes, that it’s no longer worth collecting my taxes.”

McGraw: Right. I remember talking with you about that in Amsterdam about five years ago, wasn’t it?

Ranum: Yeah.

McGraw: We were walking in the rain, remember that?

Ranum: Yeah. And you said that that was some really morally repellent stuff I was working on. And, you’re one of the reasons why I never published that. I look at the anonymoids and generally, I’m kind of sympathetic to what they’re doing.

So, on the doctrines of cyber insurgency stuff, I made you this tin foil hat that you can wear.

McGraw: Thanks, I think. Let me put it on. [puts on tin foil hat]

Ranum: I mean, this is goofy, but there’s the surveillance state as it’s currently growing, where we’ve got the government basically asserting its electronic domain over much of what we’re doing. But, you know, if you actually wanted to launch attacks against the government on a five to 10 year life cycle, instead of this basic smash-and-grab stuff that we’re seeing, how do you prevent somebody from embedding themselves in the development organizations producing software that’s running the federal government?

McGraw: So, the Tea Party insurgency becomes actually a bunch of dev guys.

Ranum: It would be the new version. I mean, imagine if—you know this. Like, if you and I wanted to put together a project to take the IRS offline, we’d get three network administrators, two systems administrators, and five programmers and they’d never know what hit them. It would just be a five year long effort. I think the big issue on cyber insurgency is that governments don’t really think on an effective 10 year time cycle.

McGraw: I think you’re right, but I think that some organizations do, and I know for a fact that some financial services organizations are thinking about that sort of thing. Redoing everything.

Ranum: The people who run lotteries?

McGraw: Not the people who run lotteries. Financial services. You know, the people who run big banks and move around bits that count as money.

Ranum: What’s the end game here? If you were responsible for a government agency that was particularly critical, how do you prevent somebody who thinks on a 10 year assault cycle from embedding programmers in the development process?

McGraw: I mean, you don’t even need to do that. Are you kidding? The database is sitting there in the clear at the OPM. So, all you’ve got to do is just go grab it. I think that you’re worried about a possible world that’s worth thinking about, but it’s also way more effort than it takes to take a government down today.

Ranum: I definitely agree with that, but the way I like to think about strategy is—

McGraw: I’m wondering why I have my tin foil hat on though.

Ranum: It’s because I was talking about the surveillance state.

McGraw: Oh. You were talking in the clear and I was good. But, I didn’t actually say it.

Ranum: You probably didn’t hear it.

McGraw: [laughs]

Ranum: See, the tin foil hat’s working. But, I mean, here’s the problem. Suppose I’m someone—we hear consistently that the NSA and the surveillance state have been inside the algorithm development and have mooted SSL from the beginning. And some of us knew this was going to happen. Right? What if somebody was inside the NSA’s code development or inside the CIA’s code development and mooted the NSA or CIA’s code development? And in that situation, I think a five year or 10 year program cycle for that attack is more realistic.

McGraw: And, you know that sort of thing is happening. I mean, you have to build something from first principles including the compilers and the chip fab stuff and the—everything.

Ranum: Yet they run Windows.

McGraw: Some parts of them run Windows. So the question is whether there’s really true compartmentalization in some areas. And one would hope the answer is yes. But, we don’t really know. But you know what, Marcus, if you think about society and how we’ve sort of kludged our way along as a species, just barely surviving, we’ve been doing it for 30,000 years. It’s not like the end of the world has come because there’s a new domain now. It’s just the same crappy kludges that we’ve always had because it’s evolutionary pressure on this one planet that we’re talking about.

Ranum: You know, I was at Source Boston, and I think—

McGraw: I’m taking my hat off.

Ranum: Yeah, you can take that off now. Let’s get back to being a little more serious. You remember my evolutionary biologist friend, Robin? I took her to Source Boston and one of the keynote speakers was talking about evolutionary models in security. In the question period, she stood up and said “I am an actual evolutionary biologist and I’ve got to tell you, for evolutionary algorithms to work, you have to be willing to suffer billions of casualties.”

McGraw: Exactly. You need to have generations wiped out. That’s the way the genome moves because we’re talking about genomes moving.

Ranum: And we’re not prepared to suffer that in computing. What happens when an entire operating system’s tree dies? Or, is that what is happening to Java? That branch of programming is dead. At this point you’d be crazy to put any investment into writing more Java code.

McGraw: There’s an awful lot of people still writing enterprise Java code to do things.

Ranum: There are people writing Fortran too.

McGraw: Yeah. I don’t think any old code ever dies. That’s one of the really terrible things about programming: if you wrote it 20 years ago, it’s still out there somewhere, and somebody’s going to find it and run it. And, it has your name on it!

Here’s the best part. So, all these vendors that are creating security mechanisms with millions of lines of new code think that they’re building more security to put in front of the broken stuff, but they’re actually increasing the pile of broken stuff and the attack surface at the same time. So, if you realize that the cloud is somebody else’s computer, and you realize that the security mechanisms are somebody else’s code, it makes you think twice about the way we’re doing some parts of computer security.

Ranum: But, I mean, it seems to me like the federal government is outsourcing running a lot of stuff to beltway bandits that write the code.

McGraw: Who don’t know anything about software security. If you think about the vast middle market, the people who are system administrators for mom-and-pop shops, or whatever, or just cousin Joe who knows something about Windows, those people are probably better replaced by the engineers at Google, or Amazon, or Microsoft.

Ranum: Absolutely.

McGraw: So, in some sense, that movement to the cloud can be a good thing from a security perspective, even while for some people who do know what they’re doing it can be a bad thing. It really depends on where you sit and how you want to operate your enterprise.

Ranum: Well, it’s a basic business problem. You decide whether to change your own oil in your car or whether to let somebody else do it for you. And then, the other question is, if you know how to change your own oil, you’ve got a better chance of telling if the guy didn’t do it right.

McGraw: Yeah, that’s a good point.

Ranum: If you know nothing about how oil changes work and you drive away and there’s oil splashing out the bottom of your oil pan, you might just go “huh.”

McGraw: Well, then you become an anti-vaxxer. And you think that science is bad even though you like your electricity.

Ranum: That’s right.

Ok, so you play violin, you mix drinks, you raise goats, you live on a farm where birds poop on your interviewers. But, what else do you do?  What are some other fun things that you do just to keep yourself busy?

McGraw: Well, you know the music thing is really important, but you mentioned that. The party thing is important. What else do I do? I read a lot. I like to read fiction.

Ranum: What are you reading right now?

McGraw: I just started this book and I can’t recall the name of it, but the book I read right before that is a new one by a woman who just graduated from NYU named Julia Pierpont. I think it’s called 10,000 pieces, or 10,000 things (the actual title is Among the Ten Thousand Things). Something like that. It’s fantastic to see a young person writing with such insight about human existence. That’s really cool. So, it kind of refreshes my feeling about humanity not being a terrible horrible disaster when you read the thinking that some of these people put into their fiction. That makes me happy.

Ranum: Are you an optimist or a pessimist?

McGraw: I am a radical moderate neutralist.

Ranum: So, you’re like one of those radical lefto-rightists?

McGraw: Lefto-rightists. Yeah. Sort of. I mean, I’m a liberal but I own a bunch of guns which you and I have shot many times.

Ranum: Well, owning guns doesn’t make you conservative.

McGraw: I know, but that’s what the gun people think, generally speaking.

Ranum: This has been a Silver Bullet Security Podcast with Gary McGraw, hosted today by stunt-double Gary, Marcus Ranum. Silver Bullet is co-sponsored by Cigital and IEEE Security and Privacy Magazine and syndicated by InformIT.
