Gary McGraw: This is a Silver Bullet Security Podcast with Gary McGraw. I’m your host, Gary McGraw, CTO of Cigital and author of Software Security. This podcast series is co-sponsored by Cigital and IEEE Security and Privacy Magazine. This is the 124th show in a series of interviews with security gurus, and I’m super pleased to have with me today, Lance Cottrell.
Lance Cottrell: Hey, Gary. How’s it going?
McGraw: Good. Dr. Lance Cottrell is Chief Scientist at Ntrepid where he works on the Passages product. Lance founded Anonymizer, Inc. in 1995 and took it to exit in 2008. He’s been at the cutting edge of Internet privacy, anonymity, and security for over 20 years. Lance is also active with startups and is on the board of the North Bay Angels and a mentor for SoCo Nexus Sprout. Lance has a BA in Physics from UC Santa Cruz and an MA and PhD in Physics from UC San Diego. He lives in Sonoma County. Thanks for joining us today.
Cottrell: Thanks, and almost a PhD in physics. Actually, I started my company with the idea of finishing the PhD in my spare time, so I have most of the thesis in a box.
McGraw: You got an ABD, huh?
Cottrell: I have an ABD, exactly.
McGraw: Okay, well we’ll take that back. You founded Anonymizer, Inc. in November 1995 when I was just getting started in computer security myself. Why did you found it?
Cottrell: I really got started because of the whole second crypto wars. The first crypto wars were a little too early for me, but when the Clipper Chip came along, I was active on the Internet because physicists were the people who really had access to the Internet. I had a Sun SPARCstation on my desk that directly connected to the backbone. I got interested in these privacy and security issues, and so I joined the cypherpunks and started building and operating Internet privacy tools. I wrote an anonymous remailer called Mixmaster and that got really exciting to me; rather more exciting than my thesis was and I realized—
McGraw: What did your thesis advisor say about that?
Cottrell: Well, he was a little disappointed. But, it’s hard to hold someone back when their passion is really pulling them. There were 10 people in the world who really cared about the outcome of my thesis, and frankly, I had already told all of them what the answer was. Whereas, with this, I was getting tens of thousands of people using the software I was writing, and I was excited about the issue. I realized that the open source tools I was building were never going to be usable by the public. The cypherpunk tools’ first instruction was always ‘run this makefile on your Linux command line’ which sort of ruled my mother out.
McGraw: Your mother should have been geekier, Lance.
Cottrell: Well, my father could have managed it.
McGraw: What was the biggest technical challenge that faced the Anonymizer work?
McGraw: That was right at the time when (Ed) Felten and I were writing about Java security and we coined the term ‘malicious mobile code’ which was causing that problem, I suppose.
McGraw: What are your philosophical musings about anonymity and security in their interaction given your 20-year involvement in all this stuff?
Cottrell: I think anonymity is an interesting case. You often see a lot of people who feel like there may be a place for anonymity, but that, fundamentally, the world should be identified. I kind of take the opposite tack, which is: that which does not have to be identified should probably support anonymity. In many cases, a lot of the things that people attribute to anonymity, like saying people are cruel on the Internet because they’re anonymous, have more to do with the fact that there’s not an effective reputation mechanism and it’s not face to face. It’s a distant thing. You don’t see the person flinch when you say something. And I think if you look at Facebook and the harassment that takes place there, largely in true name, you’ll see that anonymity is likely not the magic problem.
McGraw: Yeah. Sometimes you need anonymity for some transactions too, and sometimes you don’t. But it seems to be the case that generally speaking, security can be the enemy of anonymity when you take a naïve approach to security.
Cottrell: Exactly. I think a lot of people have taken sort of identity and authentication and conflated them, and they don’t necessarily have to be conflated. What I care about if I’m exchanging an ongoing dialogue with someone is, generally, that I’m talking to the same person—not necessarily that I know that that was the name on their birth certificate. Of course, we’ve seen that blow up in Facebook’s face when they’re trying to deal with transgender people who want to use some other name.
McGraw: That’s right. It’s those crazy wrinkles. But you know, all those systems are ripe for abuse in the sense that you can create an identity and then fold it when it gets to be a pain, create a new identity, and just keep hopping along.
Cottrell: I think that’s actually a really interesting case because we do a terrible job of managing reputation online. That is—the difference between a Facebook account that was created yesterday and a Facebook account that’s been robustly operated for years in an upstanding way—it’s not easy to tell the difference. So it enables that kind of hopping. I think that’s less a problem of identification than it is of how you manage a system to allow you to build reputation. This is an idea that has been discussed heavily for years in science fiction. Vernor Vinge talked about reputation and identity back in the early 80s. Ender’s Game has a robust discussion of how you would take pseudonymous identities and build them up. And yet, somehow, we’ve never actually bothered to build any of those ideas into the social networks that we’ve created. I think that’s a failure of imagination rather than of impossible security problems.
McGraw: Right. Yeah. That’s probably true. We’ve also made some interesting forward progress in anonymity and privacy. I’m interested to know what you think about the Tor network.
Cottrell: The Tor network is really an interesting case, and it’s almost as old as my systems. They took a radically different tack. Their idea is: how do you not have to trust any central authority? They are radically opposed to investing anyone with the keys to the kingdom, which I can respect. But the problem is that, at least as they’ve implemented it, any volunteer can become part of the network, and you’re then vulnerable to a set of attacks by any of those volunteers.
McGraw: Exactly. Well, that’s what the FBI in some sense did, you know, looking at the exit node stuff.
Cottrell: Exactly. And you have to assume that some large fraction of exit nodes are run by people who do not have your best interests at heart by whatever definition you want to choose for that. In fact, it’s kind of surprising that there are as many legitimate, altruistic people running these nodes as there are, because there’s a huge amount of incentive for other groups to be running lots and lots of these things.
McGraw: These days, everybody believes that’s what’s happened.
Cottrell: Yeah. I think that’s probably a fair assumption. If I were running an intelligence service, I would certainly own a whole lot of Tor nodes just to spy on them, yeah.
McGraw: Thomas Rid wrote a great paper on Tor and the dark web. There are some obviously unseemly aspects of all human societies and the dark web is one of those. What’s your reaction to the kind of stuff that shows up in these anonymizing systems?
Cottrell: That’s an interesting problem. You get these black markets taking place. At some level, I think a lot of the issues that we get upset about on the Internet are really just recapitulating the same things that happen in the real world. But, when you put the words ‘on the Internet’ at the end of it, everyone gets sort of disproportionately excited about it.
McGraw: I think that’s true. For example, if you look at the newspaper coverage from the 1800s, there used to be “telephone murders,” where “the telephone” played an (often minor) role in a murder. And so that was news, so maybe it’s just that.
Cottrell: I think that’s certainly part of it. I remember people would talk about my anonymity products and say, “How can you ethically allow people to send anonymous email?” Back then, I would say, “Well you know, on most street corners, there are these blue anonymous communication boxes.”
McGraw: Right. Well, but they are not really anonymous. I mean, the postal service actually scans all those addresses and keeps them in a giant database.
Cottrell: Sure. But you don’t have to put a return address on it, and you can lie about it, and it’s easy to drive across town. They’re pretty robust for ransom notes and things like that. Of course, there’re classified ads in the newspaper that have been used forever that you can place quite anonymously and, you know, numbers stations, and all of these other things which, again, have been recapitulated on the Internet. If you remember Usenet, there’s a group called alt.anonymous.messages which is largely filled with blocks of ciphertext and nothing else.
McGraw: I actually used NN because I liked the idea of “no news is good news” back in those days. But that was a long time ago.
Let’s talk about attribution, which is sort of related. It’s kind of the flip side of this problem. So, when you’re trying to do attribution, where are we these days? Can we do it?
Cottrell: I tend to think that we—and you see these reports that say, “We know that this hack was caused by this group with this sort of attribution.” My personal feeling is that’s really pretty weak, unless there’s some really unusual zero-day exploit that has some code fragments that you’re confident haven’t been reused. But these things, as soon as they get out there, they start getting repurposed, and false flagging is so easy.
McGraw: False flagging seems like the most obvious way to defeat that kind of attribution.
Cottrell: Exactly. If you know people are going to suspect, say, the Russians for a certain activity, if you false flag as Russia, you’d probably stop the investigation right there because people found what they expected to find and they’re not going to dig much deeper. The anonymity tools are easily robust enough, I think, to prevent any technical traceback in almost all cases. So, now you’re looking into more subtle things like language analysis, which is surprisingly powerful and effective. We built a tool for that to recognize authors and do author identification in our company, just as a little side project. I had a lot of fun because back on the cypherpunk mailing list, I had maintained over a couple of years a pseudonymous identity in addition to my true name activity. And so I challenged my team to spot me.
McGraw: And did they?
Cottrell: Oh yeah. They tagged me instantly.
McGraw: It stuck out like a sore thumb.
Cottrell: I was shocked. I was not thinking that they were going to be as successful as they were, but they came back and said, “This is the top of the list.” And I was like, “Oh yeah, that was me.”
McGraw: Was it that you always agreed with yourself in flame wars?
Cottrell: No. In fact, I made a real point of not doing that. It was the way I write. It was looking at n-grams of my writing patterns. And it turns out that’s actually really difficult to hide.
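The n-gram technique Cottrell describes can be sketched in a few lines. This is a minimal, hypothetical illustration, not Ntrepid’s actual tool: it builds character n-gram frequency profiles for each candidate author and ranks them by cosine similarity to an unknown text. All names and sample texts are invented.

```python
from collections import Counter
import math

def ngram_profile(text, n=3):
    """Relative-frequency profile of the character n-grams in a text."""
    text = text.lower()
    grams = [text[i:i + n] for i in range(len(text) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

def cosine(p, q):
    """Cosine similarity between two sparse frequency profiles."""
    dot = sum(p[g] * q.get(g, 0.0) for g in p)
    norm = (math.sqrt(sum(v * v for v in p.values()))
            * math.sqrt(sum(v * v for v in q.values())))
    return dot / norm if norm else 0.0

def rank_authors(unknown, candidates):
    """Rank candidate authors by profile similarity to the unknown text."""
    u = ngram_profile(unknown)
    scores = {name: cosine(u, ngram_profile(sample))
              for name, sample in candidates.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

Real stylometric systems add many more features (function words, punctuation, syntax), but as Cottrell notes, even raw character n-grams of a writing pattern are surprisingly hard to disguise.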
McGraw: So let’s talk about two attribution issues in particular. The Sony hack. What do you think about the attribution job there?
Cottrell: I don’t have access to the raw data. It seems an awfully convenient story, and it’s a nice story, but it doesn’t feel that compelling to me. I certainly don’t feel like that case has been made.
McGraw: Right. And then I guess the one that’s all in the news the last couple of days, the alleged DNC hack carried out by the Russians.
Cottrell: That seems even weaker to me. And it also appears that there was probably more than one hack. When a company has been pierced three or four times, it becomes even more difficult to understand the attribution of which of those players is doing which bit of damage that you’re seeing coming out the other side.
McGraw: Yeah, apparently the CrowdStrike guys said that those two groups had not even known the other people were in.
Cottrell: I mean, the funny thing is, in general, many of these organizations, especially political organizations, are so weakly secured—Internet security is just not even on their radar—that the two or three hackers that we know of, it’s probably exactly that: it’s that we know of. This may have an interesting effect, though, on that whole crypto security debate. The law enforcement guys have been saying, “Only bad guys have things to hide. We want to make sure that, you know, you don’t have too much crypto, you don’t have end-to-end. We want to make sure there’s backdoors. We want to make sure (back to the Clipper Chip) there’s going to be some magic key that we’re going to protect in some way.” As a victim of the OPM breach, I’m somewhat suspicious of that.
McGraw: They’re not very good at that key protection part. The Chinese have my fingerprints too.
Cottrell: Yeah. Exactly. So then when this starts hitting home, that, “Oh, I’ve been breached. All of my emails got hacked,” we may start seeing some of the politicians saying, “You know what? This crypto thing—that might have been a good thing if all of this was not available on the servers.”
McGraw: That’s a little bit technically inaccurate, though, because it wasn’t really the network that was being monitored. In these APT situations, you have access to the entire machine and the file system. It was the endpoint that was actually hacked, which means that end-to-end encryption wouldn’t really help.
Cottrell: No. And I haven’t seen—I actually have not looked through the dumps to see. Is it just, you know, the data from a single endpoint or were they able to actually pull, say, the entire mail server’s contents down? You’re right. That would certainly make a difference in the impact, and it’s much harder to defend against. If someone gets your endpoint, you’re in a pretty difficult situation.
McGraw: They haven’t dumped all the gigabytes of stuff they apparently have yet from what I understand. But one interesting thing is, it’s clear that some of the files had actually been tampered with at least in the first release. There were metadata tags in some of the files that were in Russian, for example. And in the second release, those had disappeared.
Cottrell: These guys are learning, certainly with Sony, and with this as well, how to make these orchestrated PR/propaganda events. It’s no longer “everything drops at once”; it’s metered out, it’s dripped. And why not, then, start introducing new information, if you can do 90% of the damage with something that’s made up, even if you get caught? A lie goes around the world before the truth gets its pants on.
McGraw: There you go.
Cottrell: It can be very effective for what—but then you start to get into a very conspiracy theory kind of mindset about this.
McGraw: So, WikiLeaks. Force for good? Force for evil? Relevant? Irrelevant? What do you think?
Cottrell: I think probably, net-net, it has done a lot of good. But some of the picking and choosing is a little problematic. And I think, now that it’s been around for a while, we’re starting to see these targeted hacks where people are leaking with a very strong agenda. And we may now start to see leak wars, where a bunch of liberal hackers start trying to hack all of Trump’s and the RNC’s stuff so they can counter-leak, and back and forth. It’s an interesting problem. It’s similar to the whole Snowden issue. I have a lot of problems with what he did, but I’m also kind of glad we know about some of the stuff that he revealed.
McGraw: Yeah. I guess if you have leaks that are being editorialized by the technology that’s supposed to just publish them without comment, in some sense, that leads to the leak war situation, and, you know, you end up with WikiLeaks and MyRealWikiLeaks and whatever. There’ll be competing sources of information, and all of a sudden they will have an editorial stance.
Cottrell: Well, and then you’ve got, with some of the leaks about Afghanistan, people saying, “You should have censored, for example, the names of civilians in the country so that they don’t get killed.” That’s certainly a fair moral argument. But once you start introducing the question of editing the content, then bright lines tend to be the only thing you can enforce. This is when I realized, back with Anonymizer, that we spent a lot of time thinking about what if someone bad uses our anonymity tools, you know? How do you enable the good purposes without the bad purposes? And the conclusion we finally came to was: if the data exists, I can be forced to hand it over for reasons that I would have a problem with. So we ended up architecting the system from the ground up. It was intentional, architected ignorance. We had no way of knowing the data, so we could not be compelled to provide it.
McGraw: In the days of national security letters, that’s pretty essential.
Cottrell: Absolutely right. Because the government can compel a lot of things. But so far it looks like—and the Apple decision was one I watched really closely—they can’t compel you to re-architect your systems to enable information gathering that they would like to have, absent a law, because clearly CALEA did exactly that. It forced the phone companies to modify their phone switches to enable certain kinds of intercept. And so far, none of that’s really applied to the kind of services I run. And that’s a huge tension, because once it’s in there, of course, it’s global. We think of it in terms of “what about US law enforcement?” And you may hold whatever position you hold about US law enforcement, but, in general, you know, they’re fairly upstanding. They do a good job. They’re not evil. But that’s certainly not true of law enforcement everywhere in the world. These things have huge collateral damage effects in countries that are much less upstanding about their law enforcement and attention to the rule of law and so forth.
McGraw: Yeah. An example of that from the WikiLeaks case is what’s going on with some of the people that were outed in various WikiLeaks documents surrounding Turkey. In particular, lots of women that were activists had, you know, accidentally been outed, sideswiped by WikiLeaks. And it seems that, according to the people in Turkey, at least, the government is using that information to round people up.
Cottrell: Right. Exactly what you would expect to happen. This data is valuable to someone, and it was released with one purpose. But once it’s out there, you have no control of the purpose to which it’s put. I think that’s one of the moral complications of WikiLeaks: how do you draw that line? And they’ve largely taken the bright line, “We’re going to release everything,” but I don’t think they’ve taken full moral responsibility for the implications of that.
McGraw: Yeah, and the drip certainly calls into question this kind of moral purity idea.
Cottrell: Exactly right. When you turn from being a “Here’s the data we have” to an orchestrated PR campaign that’s carefully timed around political events, you know, “We’re gonna drip this out during the Democratic convention,” well that’s not accidental timing, right?
McGraw: Right. We’ll be right back after this message.
If you like what you’re hearing on Silver Bullet, make sure to check out my other projects on garymcgraw.com. There you can find writings, videos, and even original music.
McGraw: So let’s switch gears a little bit. You’ve been working on secure Web browsing now for a few years. Where does that stand? And then what impact does mobile security and geolocation have on your thinking about secure Web browsing?
Cottrell: We started thinking about browser security because we had fairly good solutions to securing most of the other paths into our own business, and we originally were looking at this internally. You can sit and scan all the attachments. You can look at sources and emails pretty well. You can restrict and train people not to put thumb drives in their computers, but the Web is one of those things that you have to touch. It can contain really any kind of content and it’s real time. You don’t have time to run detonators and to execute the code in a sandbox or something like that. And browsers are gigantic and hairy beasts with bazillions of lines of code, so they’re going to have enormous numbers of vulnerabilities. So that’s where we focused. And we quickly realized that actually writing a browser that wasn’t going to be full of holes was going to be impossible. So we focused on the virtualization and isolation approach: putting a conventional browser in a virtual machine, hardening the heck out of that machine, and then making sure it can’t talk to the local desktop, which works really well on the desktop.
The mobile situation is much more complicated, especially because with Apple, they really lock down the kind of things you can do, and it’s a catch-22. On the one hand, that hugely improves security, but on the other hand, it hugely limits the kind of security tools you can build.
McGraw: Right. Let’s talk about the implications of geolocation too when it comes to the mobile stuff. You’re basically putting a browser on a phone and then it’s got access to various other information that may not be coming from your PC all the time.
Cottrell: That’s right. We’ve all gotten really trained to carry around these location devices that are tracking us at all times. And God knows I’ve been giving enough to Nintendo and friends recently with catching all the Pokémon.
McGraw: Are you just catching them on your driveway?
Cottrell: That’s right. Just walking to and from the mailbox, so no competition. But you’re right. The amount of location information that we’re giving out, and to huge numbers of people, right? So the phone company knows where you are. The phone manufacturer and Google know where you are. Each one of these other apps knows where you are. The websites are all requesting information. And a lot of the browsers will, at least, alert you and say, “Do you want to provide this information to this person at this time?” Even laptops have geo information, because most of the high-resolution geo information is actually coming from Wi-Fi locations. Just looking at what Wi-Fi nodes are in sight, even if you’re not logged into them, provides very high accuracy information about your location.
McGraw: So assuming we get a handle on this secure Web browsing for boxes that are mostly stationary laptops, then we have a much harder problem with mobile devices I think.
Cottrell: Yeah. Exactly. I think virtualization—and what we’ve done is, inside the virtual machine, we VPN the traffic out to one of our nodes and then bounce it out, so the IP address is not associated with you. And we scrub out all the identifiers, because things like cookies and super cookies, which are much harder to delete, provide really effective tracking. There are so many ways of tracking, like the browser fingerprinting that Panopticlick demonstrates, and now canvas fingerprinting, where they’re actually fingerprinting the video hardware and firmware in your computer, which provides yet more bits of data. Virtualization helps with that in that you can make sure the fingerprints all look exactly the same, and that’s a pretty big win. So that, you know, you’re now identifiable as using a tool but are indistinguishable from everyone else using the tool.
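The “everyone looks identical” property he describes can be pictured as hashing the attributes a fingerprinting script can observe. In this toy sketch (the attribute names and values are invented for illustration, not drawn from any real product), ordinary users differ in hardware and fonts and so get distinct fingerprints, while users of a hardened, virtualized browser all present one pinned profile.

```python
import hashlib
import json

def fingerprint(attrs):
    """Stable hash of the browser attributes a tracking script can observe."""
    canonical = json.dumps(attrs, sort_keys=True)  # order-independent encoding
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Two ordinary users: slightly different hardware/fonts -> distinct fingerprints.
user_a = {"ua": "Firefox/102", "screen": "2560x1440", "fonts": 212, "canvas": "a91f"}
user_b = {"ua": "Firefox/102", "screen": "1920x1080", "fonts": 187, "canvas": "07c3"}

# Users of a virtualized browser: the VM pins every observable attribute,
# so all of them hash to the same value and blend into one crowd.
VM_PROFILE = {"ua": "Firefox/102", "screen": "1920x1080", "fonts": 64, "canvas": "0000"}
vm_user_a = fingerprint(VM_PROFILE)
vm_user_b = fingerprint(VM_PROFILE)
```

The design point is the one Cottrell makes: you are still identifiable as a user of the tool, but indistinguishable from every other user of it.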
McGraw: I guess that’s good. The Crowds solution from way back when in the 90s that AT&T did required a crowd. If you have a mix network and there’re only two people on it, it’s not very powerful.
Cottrell: Exactly. That’s the interesting thing about anonymity tools: you have to have a large number of users. Your anonymity is entirely dependent on the size of that anonymity set at any given moment. And getting back to Tor, one of the problems they’ve got is that it’s real time, which allows all sorts of other attacks, as opposed to, say, email, which is store and forward and allows much better protection of identity, because you can’t then watch the individual variations of each channel. There have been attacks on crypto where people were able to reconstruct voice communications by sniffing an encrypted link and just looking at the bandwidth variations.
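The real-time versus store-and-forward distinction Cottrell draws can be illustrated with a toy traffic-analysis simulation. The traffic model and numbers below are made up for illustration: a real-time relay preserves each user’s per-second bandwidth shape, so an observer watching both sides can link entry flows to exit flows by correlation; a batching mix flattens that signal.

```python
import random

def correlate(xs, ys):
    """Pearson correlation between two equal-length traffic traces."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

random.seed(1)
# Two users with distinctive per-second bandwidth patterns entering a relay.
inflows = {u: [random.randint(1, 100) for _ in range(60)] for u in ("alice", "bob")}

# Real-time relay: output closely tracks input (only small jitter),
# so each user's bandwidth shape survives the relay.
realtime_out = {u: [b + random.randint(-3, 3) for b in trace]
                for u, trace in inflows.items()}

# The observer links each exit flow to the entry flow it correlates with best.
linked = {u: max(inflows, key=lambda v: correlate(inflows[v], out))
          for u, out in realtime_out.items()}

# Store-and-forward mix: traffic leaves in fixed-size batches, so the
# per-user variation the correlation attack depends on is gone.
batched_out = {u: [50] * 60 for u in inflows}
```

With the real-time relay the observer links every flow correctly; against the constant batched output, the correlation carries no signal at all, which is the protection store-and-forward buys.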
McGraw: You started way back with the Mixmaster remailer that you mentioned before. So the last thing I want to talk about is: have we made any progress with email privacy and security, do you think?
Cottrell: You know, no. It’s one of those things. Email security is terrible. The tools are still awkward to use. They don’t integrate well with the clients. S/MIME is built into everyone’s email clients now, and almost no one uses it. It still leaks metadata: it leaks the subject line, it leaks to and from and all the traffic. But it would be better than nothing, and yet still almost no one uses it.
McGraw: Well, I had to re-up my cert Saturday, which I did, and I haven’t gotten around to populating all my devices with the new crypto yet, which I need to do. It’s going to take me an hour to do them all.
Cottrell: Exactly. It’s not easy. And you know, you take this seriously, but your average person, there’s no way they’re going to spend an hour working out how to export and import the certs to all of their different devices. It’s this huge, hairy mess. And I think it’s going to have to be provider-side support where, like the Web, SSL was built into every browser. And we know there’s all kinds of problems with SSL, and there’re weaknesses, and there’re attacks, and there’re issues with the certificate authorities, but it’s still a thousand percent better than unencrypted HTTP. And the same thing, I think, is true here: we understand how to build things that are fairly secure and fairly anonymous, but what we’re doing is, you know, a tenth of a percent of that. And so anything we can do would be a huge step forward.
McGraw: I guess one of the challenges is that the companies that are providing the free email services that everybody uses, I’m thinking Microsoft, and Google, and Yahoo, or Verizon, as the case may be. Those guys, the reason their service is free is because you are the product. You can have targeted ads based on the content of your email. And if the mail’s encrypted, that makes it that much harder to do.
Cottrell: Absolutely. I think it’s interesting that they don’t even give an option to pay. And maybe they feel like it would send a bad message and give people a bad taste in their mouth. But I am just hungry for an option from many of these providers to say, “Can I give you 50 bucks a year or whatever, and you stop reading what I’m doing, and make everything encrypted, and keep it secure and manage that?” I could, I suppose, run my own mail server, but I know I’m not going to maintain that mail server at my house with the love, and care, and patching, and security that it deserves, because I’m busy doing other things. That’s not my job.
McGraw: Right. Interesting. So let’s talk about some of those other things that you do. What is the best batch of Pinot Noir that’s ever come from your Sonoma grapes?
Cottrell: I’ve only been making wine out of my own grapes now for two years. The 14s are in bottle. They’re tasting pretty good, but I’m learning what I’m doing. The 15s are tasting amazing, and I’ll probably be bottling them in maybe a month. The grapes outside my window have just gone through veraison. They’ve just turned from green to red. It looks like a pretty big harvest, so I think 2016 may be a good year and a reasonable quantity.
McGraw: That’s much better than running mail servers, Lance.
Cottrell: That’s right. The grapes outside the window are a distraction from patching the server in the basement.
McGraw: Cool. Well, thanks for your time today. It’s been really interesting.
Cottrell: Thanks very much. It was great talking to you, Gary.
McGraw: This has been a Silver Bullet Security Podcast with Gary McGraw. Silver Bullet is co-sponsored by Cigital and IEEE Security and Privacy Magazine, and syndicated by Search Security. The May-June issue of IEEE S&P focuses on the economics of cyber security. It also features our interview with Jacob West, in which we discuss the IEEE Center for Secure Design.
Check out the video we shot to celebrate episode 120 of Silver Bullet: 10 years of the Silver Bullet Security Podcast. You can find that on the net.