Show 146: Nicholas Weaver discusses network security, botnets, and cryptocurrency

Nicholas Weaver joined ICSI as a postdoctoral fellow in 2003. The following year he was hired as a senior staff researcher, and he continues to conduct research on network security and measurement, worms, botnets, and other internet-scale attacks. He received his bachelor’s degree in astrophysics and computer science from UC Berkeley in 1995 and his Ph.D. in computer science from Berkeley in 2003, and he continues to teach courses there. Although his dissertation work involved FPGA architectures, he has been focused on computer security since 2001. Dr. Weaver lives in Berkeley.


Listen as Gary and Nicholas discuss the Spectre vulnerability, botnet attacks, research tech transfer, cryptocurrencies and blockchain technology, and more.

Listen to Podcast

Transcript

Gary McGraw: This is a Silver Bullet Security Podcast with Gary McGraw. I’m your host, Gary McGraw, vice president of security technology at Synopsys and author of “Software Security.” This podcast series is co-sponsored by Synopsys and IEEE Security & Privacy magazine, where a portion of this interview will appear in print. For more, see www.computer.org/security and www.synopsys.com/silverbullet. This is the 146th in a series of interviews with security gurus, and I’m super pleased to have today with me Nick Weaver. Hi, Nick.

 

Nick Weaver: Hello.

 

Gary: Nick Weaver is a staff researcher at ICSI. He also teaches courses at Berkeley. Nick joined ICSI in 2003 as a postdoc after earning a Ph.D. in computer science from UC Berkeley. Nick’s research focuses on network security, worms, botnets, and other internet-scale attacks. He also works on network measurement. Dr. Weaver holds a B.A. in astrophysics and computer science. His thesis work was on FPGA architectures, but he’s focused on computer security since 2001. Nick lives in Berkeley. Thanks for joining us today.

 

Nick: Thank you very much.

 

Gary: So I started my own foray into computer security in 1995, when Java came out. And you yourself have watched the field evolve for around 20 years. Do you think we’re making any progress in the field?

 

Nick: I think we’re making some progress. So programming languages started getting better with Java, especially around when Java added lambda but called it anonymous inner classes.

 

Gary: Yep.

 

Nick: And recently, I’ve been really impressed with Go as a programming language. And we’re now at the point where there’s no excuse to not do something new in a safe language, just because the safe languages are now so much easier than the alternative.

 

Gary: Yeah, I agree with that perspective, but then you have JavaScript. So what about that?

 

Nick: Well, the web browser is an abomination of an attack surface, but it has gotten better. So if you look at, like, Chrome, Chrome is going with a very high level of isolation. And so even though JavaScript is designed around this notion of some pretty lightweight security barriers people can break down, Chrome has then wrapped that up in layer upon layer of process isolation. And I think it’s a testament to their security architecture that if you go into a super paranoid mode—which unfortunately breaks printing—the Spectre-type branch prediction side channels no longer work.

 

Gary: Yeah, that is impressive. So in some sense, we’ve made good progress on real things that you can use, but we need to get people to use them a little better maybe.

 

Nick: Yes. But the progress has been substantial. Or you look at like the iPhone. The iPhone is a really solid device that has this really nice cryptographic and security engineering, top to bottom, that I can go out and buy for $700 or even $300. A device where a full zero-day remote code execution path is $1.5 million? That’s huge.

 

Gary: That is huge. So research in academia and high science changed perceptibly to include software security and programming languages stuff in the late ’90s. What other big changes in the way science is done in computer security have you seen?

 

Nick: I think one of the nice things that we’re starting to see is an emphasis on how hardware and faults interact with software security. So Meltdown was a bug. It was “Let’s allow speculative execution across a security barrier and deliver traps late.” But Spectre is a side channel, so the Spectre-type attacks are cache side channels. And I think we’re coming to a very interesting realization that we can’t actually do a security barrier across a cache. We have to flush every single cache. And that’s a big thing that has developed based on a lot of people doing very interesting hardware analysis.

 

Gary: Yeah, I agree with your assessment there. The funny thing is if you went back in time and talked to the engineers about that boundary, they, I believe, thought they were designing an efficiency boundary and not a trust boundary. And it evolved into a trust boundary over time. So in some sense, we’re more sophisticated in our understanding of security, but also the requirements were different 20 years ago, when those designs were being done.

 

Nick: Not necessarily. Even 20 years ago, the user space / kernel distinction was always a trust boundary.

 

Gary: But the caching stuff and the efficiency stuff was driven by the fact that clock speed was the way we used to sell chips back then. I mean, I remember buying maybe a 486 with 4.7 megahertz, and that was a big deal.

 

Nick: Yeah, and the interesting thing is if you look at the history of computer architecture, starting actually around seven, eight years ago, we hit a brick wall on performance for single thread. And if you look at the 20 years prior to that, the big gains were out-of-order speculative execution and caches, caches, caches.

 

Gary: Time to rethink those thoughts.

 

Nick: No, it’s just that caches are incompatible with security barriers. So it just means we can’t do cheap security barriers. It’s just isolation barriers are not cheap and never will be cheap.

 

Gary: Right, right. Well put. So, ICSI is a nonprofit computer science research center. How is ICSI funded? Do you guys write grants to support your research? How does that work?

 

Nick: Yeah, it’s almost entirely grant-funded. So as a researcher at ICSI, I’m very project- and grant-focused. And this is why I’m doing more lecturing at Berkeley, is because as a lecturer, I don’t need to worry about research grants.

 

Gary: Yeah, I started out in soft-money research land myself a long, long time ago. What are your views on tech transfer of basic research of the sort done at ICSI out into the world?

 

Nick: Well, we tend to be very focused on that as a side consequence. As a research lab, we like building things that work. So for example, the Bro Network Security Monitor was developed at ICSI, and that’s being commercialized right now. Ten years ago, there was the eXtensible Open Router Project, and there was a significant attempt to tech-transfer that.

 

There are also systems we’ve ended up building that have monetization models that don’t match industry but are productized. So the Netalyzr network analysis tool—which we originally wrote in Java in the web browser and which now runs on Android phones—we keep that running because it pays us in research results. So we’re able to turn the service into publications, and therefore we have a monetization strategy that couldn’t actually work out in the real world but works for us, and we end up supporting a large number of users that way.

 

Gary: Great, that’s good stuff. My own personal experience with tech transfer is it takes about a decade to move something from, you know, fundamental research in science out into the world as a product, depending on the model that you’re following. But do you think that that timeframe is about right, or have you experienced different timeframes?

 

Nick: Well, I’ve experienced short and long. I’ve watched the Bro Project take basically 15-plus years as basically a user and observer, while with Netalyzr, we basically productized it from day one because the product was how we did the research.

 

Gary: Right, yeah, yeah. That’s a good turn on the model. I like that. Let’s turn to a different topic. You and I seem to share the same skeptical stance when it comes to cryptocurrencies and blockchain. So can you briefly give us a synopsis of your recent “Burn It With Fire” webinar, please?

 

Nick: So I’ve come to this basically after five-plus years of watching the field and occasionally publishing on it. And what it comes down to is there’s actually three totally separate concepts. There’s the concept of the cryptocurrencies themselves, there’s the concept of the public blockchains, and then there’s the concept of the private or permissioned blockchains.

 

Now, let’s start with the latter. What is a private or permissioned blockchain? Just simply an append-only data structure with a limited number of authorized writers, a.k.a. a Git archive. There’s nothing fundamental in a private blockchain that hasn’t been understood in the field for 20-plus years. It’s just that it has a buzzword that causes idiots to throw money at the problem.
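The append-only, hash-chained structure Weaver describes can be sketched in a few lines of Python. This is a toy model for illustration only—the class, field names, and writer set are invented here; real systems use Git or a database rather than anything like this:

```python
import hashlib
import json

class AppendOnlyLog:
    """Toy 'private blockchain': an append-only, hash-chained log with a
    fixed set of authorized writers -- conceptually the same trick Git uses."""

    def __init__(self, writers):
        self.writers = set(writers)
        self.entries = []  # each entry records the hash of its predecessor

    def append(self, writer, payload):
        if writer not in self.writers:
            raise PermissionError(f"{writer} is not an authorized writer")
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = json.dumps({"writer": writer, "payload": payload, "prev": prev})
        digest = hashlib.sha256(body.encode()).hexdigest()
        self.entries.append({"body": body, "hash": digest, "prev": prev})

    def verify(self):
        """Tampering with any earlier entry breaks every later hash link."""
        prev = "0" * 64
        for e in self.entries:
            recomputed = hashlib.sha256(e["body"].encode()).hexdigest()
            if e["prev"] != prev or recomputed != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = AppendOnlyLog(writers={"alice", "bob"})
log.append("alice", "ship v1.0")
log.append("bob", "ship v1.1")
print(log.verify())  # True; edit any entry body and it becomes False
```

The point of the sketch is that none of this requires mining or consensus: authorized writers plus a hash chain already give tamper evidence, which is why Weaver calls the private-blockchain label a buzzword.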

 

So if you see a private or permissioned blockchain project, it means either one of two things. Either it’s a delusional piece of techno-utopianism, or somebody smart in IT knows that there are real problems with what data you store, how you access it, data provenance, and all this other stuff, and has bandied around this buzzword because idiots up in management will now throw money at him to solve the real interesting hard problems.

 

Gary: Well, that can be a good thing sometimes, I suppose, although it’s always better to be straightforward about what you’re working on. So that’s one of the three. What about the other two?

 

Nick: OK. So the public blockchains are a global data structure that anybody can append to, where the idea is there’s no centralized point of trust. Now, these systems are, let’s say, not actually distributed as advertised. So the Bitcoin blockchain is actually effectively controlled by only three entities. But in the attempt to be distributed, there’s this almost religious notion that distributed trust is somehow good in and of itself. The result is systems that are either grossly inefficient or insecure.

 

So the biggest tool that’s used for these systems is what’s called proof of work. And proof of work is best described as proof of waste. The idea is that for somebody to rewrite the history, they have to do as much useless work as was done to create the history in the first place. Now, this is great if you do a lot of useless work, except then it’s inefficient. If you make the system efficient so you don’t do a lot of useless work, you run into the problem of now you actually don’t have any real protection.
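The proof-of-work idea—rewriting history requires redoing as much useless work as creating it—can be sketched as a toy hash puzzle in Python. This is illustrative only; Bitcoin’s actual block format, difficulty encoding, and hash target differ, and the function and data here are invented for the example:

```python
import hashlib

def mine(block_data: bytes, difficulty: int) -> int:
    """Brute-force a nonce whose SHA-256 hash has `difficulty` leading zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# The "waste": on average 16**difficulty hashes per block. An attacker who wants
# to rewrite an old block must redo this search for it and every block after it.
nonce = mine(b"example block", 4)
digest = hashlib.sha256(b"example block" + nonce.to_bytes(8, "big")).hexdigest()
assert digest.startswith("0000")
```

Raising `difficulty` by one hex digit multiplies the expected work by 16, which is exactly the tradeoff Weaver describes: security scales with how much hashing you are willing to waste.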

 

And so, for example, Bitcoin, since the proof of work is paid for, ends up using as much power as New York City, and it’s just an obscene waste of energy. And at the same time, these distributed public append-only ledgers have only been useful for cryptocurrencies. And so now it’s time to address the elephant in the room, the notion of the cryptocurrency itself.

 

Gary: Right, back to one. Here we go.

 

Nick: So cryptocurrencies don’t actually work as currency. They are provably inferior and can never be superior to the alternatives for real-world payments unless you need what is known as censorship resistance. So if I want to transfer you 500 bucks by PayPal or Venmo or whatever, we have these trusted intermediaries called banks, and they make it relatively cheap. However, there’s a problem. If I want to transfer $500 to you for drugs or the like, these central authorities don’t like it. And the only way to do censorship-resistant transactions without a cryptocurrency is cash. And cash requires physical proximity and mass. A million dollars in U.S. dollars weighs 10 kilograms. That’s a considerable amount of stuff to be lugging around.

 

So what a cryptocurrency is, the idea is “Well, let’s do a direct peer-to-peer payment system so that there are no central intermediaries, but let’s do it electronic.” So this has been used quite practically for drug dealers, extortionists, fake hitmen, and all sorts of things like that. But if I want to do any payment that one of the central authorities will process, the cryptocurrencies provably don’t work.

 

So let’s say I want to buy a couch from Overstock.com using Bitcoin. I have to turn my dollars into Bitcoin, because I don’t want to hold Bitcoin while the price is jumping up and down. That conversion is expensive. Then I transfer the Bitcoin. That’s relatively cheap right now, but it’s been upwards of $30 in the past. And then the recipient on the other side has to convert the Bitcoin back into dollars. So you have these two mandatory currency-conversion steps for any real-world transaction. And even Overstock, the one public company that supposedly embraces cryptocurrency, only keeps a few hundred thousand dollars’ worth of cryptocurrency. They are converting to dollars, let alone whatever the other merchants are doing.

 

So cryptocurrencies do not work for legitimate purchases if you don’t believe in the cryptocurrency. But let us suppose you believe in the vision of the great Satoshi. Then you don’t want to use cryptocurrencies either, because they’re baked in with these monetary policies that are designed to be deflationary. And the first rule of a deflationary currency is never spend your deflationary currency, lest your pizza of regret that you bought for 10,000 Bitcoins is now worth more than the cube of money from “Breaking Bad.”

 

Gary: I love it. We’ll be right back after this message.

 

If you like what you’re hearing on Silver Bullet, make sure to check out my other projects on garymcgraw.com. There, you can find writings, videos, and even original music.

 

Gary: There’s one aspect of Bitcoin and cryptocurrency—actually just cryptocurrency—that I think people don’t understand, and it’s this notion of Tethers. Can you talk about that for a second?

 

Nick: There is a way to make a cryptocurrency work. You have to have an entity that takes dollars and gives you crypto-dollars at par and vice versa, that will take the crypto-dollars and return you dollars. This is called a bank, and these are called banknotes, and it’s recreating the 18th-century banking system. Now, this can work, but one of three things has to happen. Either you have regulation and enforce money laundering laws and everything else, in which case, you have a system that ends up being no cheaper, no more expensive than Visa or Venmo or anything else, so what’s the point?

 

Option number two is you can have what’s known as a wildcat bank. This is a bank that prints banknotes that are actually unbacked. And this is a term from 19th-century American banking.

 

And the third option is you can have a Liberty Reserve, where you actually do hold reserves, you actually will redeem your digital banknotes, but you don’t follow the money laundering laws—in which case you end up being a guest of the federal government for the next 15–20 years. And at the same time, the money that the average person had is basically tied up, temporarily or forever, when the Feds shut down the institution.

 

Tether is a specific cryptocurrency that promises to be backed by dollars—they promise that there’s this one-to-one ratio, where you give them dollars, they give you Tethers, and vice versa. The problem is this is almost certainly a wildcat bank, because they managed to produce some 2 billion Tethers in the space of a few months. And they are tied to a Bitcoin exchange that is otherwise cut off from banking, and it may have been the direct reason why the Bitcoin price shot up so much. The other possibility is they could be facilitating criminal money laundering, in which case those behind Tether are liable to be guests of the federal government.

 

This is, however, what actually enables most of the Bitcoin exchanges. Very few of the cryptocurrency exchanges are actually connected to the U.S. banking system. So you have Coinbase, you have Gemini, and you have Kraken—who obviously should actually be shut down for other reasons of criminal activity, but that’s neither here nor there. Pretty much all the rest of the exchanges, you can’t actually transfer money into or out of. And so those are where all the hundreds and hundreds of different cryptocurrencies are actually traded.

 

Tether has become this de facto reserve currency. If you look at, like, Bitcoin trading volume, most of it is actually on Tether-denominated exchanges and is not actually being exchanged for dollars. But these notional crypto-dollars may or may not be backed, and may or may not be a criminal enterprise. Yet the flow just seems to continue on, and it’s really actually surprised me that it’s lasted this long.

 

Gary: Yeah, it really is absolutely stunning, this stuff. Thanks, that was extremely helpful. I think a lot of people need to have their eyes opened on this stuff, and you’re one of the main people doing that.

 

Nick: Yeah, and I feel I have an obligation to. So there’s a selection bias in the cryptocurrency space that occurred over the past five to seven years. It mostly only attracted believers. Most people would look at it, go, “This is garbage,” and ignore it, or they’d become true believers. There was no economic model for somebody to go, “Oh, this is BS,” but keep looking at the field, except for just a few academics. And so I was in a position where I have a monetization model in dealing with cryptocurrency. I can turn the madness into papers. And so I kept looking at the field. And in the recent run-up, I’ve come to the conclusion that it’s no longer harm-limited to a small population of self-inflicted believers. It is spilling out into the regular public.

 

So let’s look at ransomware. About four or five years ago, we had the first ransomware epidemic. And I believe it was Giovanni Vigna who was able to somehow get control of one of these ransomware purveyors’ server infrastructure. At the time, you could pay with either Bitcoin or Green Dot MoneyPaks. And effectively, everybody paid with Green Dot because you could walk into 7-Eleven, buy a MoneyPak, and get your data back. Now, this ended up ceasing when the U.S. Treasury said, “Hey, Green Dot, it’s time to clean up your act on money laundering.” And now when you register a Green Dot card, you have to provide your Social Security number, and there are no longer gangs in Europe going around to ATMs harvesting Green Dot cards. This actually disrupted the ransomware epidemic for a while, and now it’s come back with Bitcoin only.

 

I don’t really care about the drug dealers. Silk Road was entertaining to watch. That Ross Ulbricht managed to lose so much money on fake hitmen was just simply an additional amusement. But the ransomware epidemic is affecting real people who actually are innocent victims. And fortunately, I think the cryptocurrency space can die with proper application of regulation because of how the regulations already are. But it’s become important for me to, I think, advocate for the need to clean up this space and that cryptocurrencies don’t provide benefit to society. They don’t provide benefit to all of us who aren’t interested in committing crimes. But they do enable these problems, and I think it’s important to speak out.

 

Another thing is the number of scams in the space is just incredible. So effectively, every initial coin offering these days should be called a scam, because it’s an unregistered security and wouldn’t even pass the laugh test on “Shark Tank.” But these are causing huge damage. Or we’ve got these people hyping smart contracts. And most of the cryptocurrency community seems intent on speedrunning 500 years of economic history in choosing their bad ideas, but smart contracts are actually a new bad idea.

 

So the idea behind a smart contract is that I write a program that’s not really a smart contract—it’s a finance bot. Because if it’s a contract, you have this exception-handling mechanism called a judge in the legal system. If I can walk up to a smart contract, say, “Give me all your money,” and it does, is that even theft? Well, it would be theft in the real world because we believe in justifying things and this exception-handling mechanism of the judge and jury and all that.

 

Smart contracts are instead “Let’s take the idea of a contract that’s standardized and written in a formal language—it’s called legalese—and instead rewrite it in a language that is uglier than JavaScript, that has all sorts of pitfalls for programmers, eliminate the exception-handling mechanism, and then require that the code be bug-free.”

 

Gary: Except for I can assure you that we’ve looked at the code, and it’s not bug-free.

 

Nick: Oh, and it’s so amusingly not bug-free. So I like to use three examples. The first is the DAO, the decentralized autonomous organization. The idea is “Let’s create a self-voting mutual fund for how we can invest our cryptocurrency in other projects.” Now, that there was actually nothing to invest in was neither here nor there, but basically 10% of all Ethereum at the time ended up in this basically self-creating, self-perpetuating, not-quite-a-Ponzi Ponzi scheme. And this was all fine and good until somebody noticed there was a reentrancy bug that allowed them to say, “Hey, DAO, I’m an investor. Give me all my money, and in the process, repeat the thing of ‘Hey, DAO, give me all my money.’” And because there was a transfer, then an update, and you could reentrantly call this code, it basically sucked all the money out.
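The transfer-then-update bug can be illustrated with a toy Python model. This is not real EVM or Solidity code; the class, names, and the recursion limit are invented for the sketch, but the ordering bug is the same one that drained the DAO:

```python
class VulnerableVault:
    """Toy illustration of the DAO's transfer-then-update reentrancy bug."""

    def __init__(self, balances):
        self.balances = dict(balances)
        self.paid_out = 0  # total value the vault has handed out

    def withdraw(self, who, receive_hook):
        amount = self.balances.get(who, 0)
        if amount == 0:
            return
        # Bug: the external call happens BEFORE the balance is zeroed,
        # so the hook can reenter withdraw() and be paid again.
        self.paid_out += amount
        receive_hook()
        self.balances[who] = 0  # the update happens too late

vault = VulnerableVault({"attacker": 10, "victim": 90})

depth = [0]
def reenter():
    if depth[0] < 5:        # reenter five times before letting calls unwind
        depth[0] += 1
        vault.withdraw("attacker", reenter)

vault.withdraw("attacker", reenter)
print(vault.paid_out)  # prints 60: six payouts on a 10-unit deposit
```

The fix, in any language, is to update the balance before making the external call (“checks-effects-interactions”), which is exactly the ordering the DAO got backwards.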

 

Gary: That’s a nice race condition. It’s not so bad.

 

Nick: Yeah, it’s a nice race condition. And the problem is, well, the money that was stolen was mostly belonging to the people who came up with Ethereum in the first place, and so they basically did a code release that changed it and undid history. So their notion that code is law and there’s no central authorities and no way to undo things was basically revealed to be a transparent lie when it’s their money on the line.

 

Gary: Exactly.

 

Nick: So that’s number one. Number two is the Proof of Weak Hands “fair” Ponzi scheme—smart contracts are actually good for writing Ponzi schemes, and people have. And so, like, Proof of Weak Hands 1.0, well, it collected several million bucks before one bug locked it up, so nobody could transfer any more money into it, and another bug allowed somebody to steal all the money in it. I think they’re up to 3.0 now. 3.0 has yet to have a fatal bug, but we’ll see how long that lasts.

 

And finally, there’s the Parity multisig wallet. So one of the problems of cryptocurrencies is you can’t actually store your cryptocurrency on an internet-connected computer, because if somebody gets onto your computer, they get your private key and steal all your money. We actually had this happen to us in the early days of Bitcoin. And if security researchers can’t use Bitcoin on an internet-connected computer, nobody can. So the idea is let’s make it a two-party check system. So we’ll have three private keys, and you have to use two of them to transfer the currency. Now, this gives you good controls if you can theoretically maintain at least two of your cryptographic keys.
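The two-of-three scheme described above can be sketched as a toy Python class. This is illustrative only—a real multisig wallet verifies cryptographic signatures on a transaction, not bare key names, and everything here is invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class MultisigWallet:
    """Toy 2-of-3 wallet: a transfer executes only once `threshold`
    distinct authorized keys have approved the same (recipient, amount)."""
    owners: frozenset
    threshold: int = 2
    balance: int = 100
    # (recipient, amount) -> set of keys that have approved it so far
    approvals: dict = field(default_factory=dict)

    def approve(self, key, to, amount):
        if key not in self.owners:
            raise PermissionError("not an owner")
        signers = self.approvals.setdefault((to, amount), set())
        signers.add(key)
        if len(signers) >= self.threshold and self.balance >= amount:
            self.balance -= amount
            del self.approvals[(to, amount)]
            return f"sent {amount} to {to}"
        return "pending"

w = MultisigWallet(owners=frozenset({"laptop", "phone", "paper-backup"}))
print(w.approve("laptop", "alice", 40))  # pending: one signature is not enough
print(w.approve("phone", "alice", 40))   # sent 40 to alice
```

The value of the scheme is exactly what Weaver says: compromising any single key (say, the one on the internet-connected laptop) is no longer enough to move the money.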

 

So some systems, like Bitcoin, offer multisig as a primitive. In Ethereum, it was built as a smart contract on top of everything else. And this was the Parity multisig wallet, which collected some hundreds of millions of dollars, including an ICO by the guy behind the Parity multisig wallet, until somebody noticed that there was a bug where you could go up to one of these wallets, say, “Hey, wallet, you belong to me. Hey, wallet, give me all your money,” and started cleaning these out. And the only reason this wasn’t a $150 million theft is somebody else noticed this was going on, stole all the money first, and then gave it back to the victims once the victims had upgraded their code.

 

Gary: Unbelievable.

 

Nick: Which gets better. So now there’s the upgraded wallet code. Now, in order for efficiency, everybody refers to the same wallet contract, and there was a bug in this contract. Some random loser came along and said, “Hey, contract, you belong to me now.” And the contract said, “Okey-doke, yeah, I do. OK. Oh, crap, this shouldn’t have happened.” “Hey, contract, kill yourself.” The contract committed suicide, and now $150 million worth of cryptocurrency is locked up and effectively inaccessible unless the central authorities that aren’t supposed to exist change the code to unlock this. But we’re not done yet. The pièce de résistance: The lead programmer and shining light behind this fiasco is the guy who invented the programming language in the first place.

 

Gary: Perfect.

 

Nick: The problem is these things are designed to be nonupgradable, but there are hacks that allow you to update them. So if your money is tied up in somebody else’s contract, because their contract is the service, you have a choice. Either that contract has to have been bug-free when created—not good—or that contract has to be upgradable, in which case you have to trust that they upgrade the contract properly and don’t cause damage in the process.

 

Gary: So you have a central authority again.

 

Nick: You have a central authority. So for example, there was a bug discovered in some of these smart contracts that run these ICOs, where somebody was able to create—what was it?—200 billion new tokens. And, well, the people in charge of that particular smart contract were able to undo the process. But that also means that if they can destroy the hack-created tokens, and you’re invested in them, they can destroy your tokens too if they feel like it.

 

Gary: Exactly. So you have to trust them.

 

Nick: You have to trust them. So this is the ultimate irony in all these systems, is their belief in this mantra that lack of trust and decentralization is good in and of itself, ignoring the huge advantages you get with just even the lightest smattering of centralized trust. Yet they end up building systems that aren’t even decentralized. So they build things that are orders of magnitude less efficient than they could be with just a smattering of centralized trust, but which have central authorities and aren’t distributed anyway.

 

Gary: I think their real design decision was “I would like to have all the trust belong to me.”

 

Nick: No, they truly believe, the cryptocurrency community truly believes in this idea of decentralization, that you should have to trust nobody.

 

Gary: They’re just bad at implementing it.

 

Nick: They don’t understand the costs involved in that, and they cannot seem to ever implement it that way, anyway.

 

Gary: Right. Well, this is incredibly interesting. I have a few other questions I want to ask you in like 22 seconds. So can we do a…

 

Nick: Lightning round.

 

Gary: A lightning round. OK, here we go. So in your research on botnets, have you noticed a switch from technical concerns to social and political concerns?

 

Nick: I haven’t noticed, but I haven’t been focusing on that question.

 

Gary: So you haven’t really thought about it?

 

Nick: No.

 

Gary: Because, you know, the Facebook record is kind of like the East German Stasi’s wet dream, if you think about it, which is not good. So you’ve been focused more on this other stuff instead?

 

Nick: Yeah.

 

Gary: What’s your view on hoarding zero-days?

 

Nick: Uncertain. It depends on who’s hoarding them, how dense they are. It’s a complicated essay in its own right.

 

Gary: OK, let’s turn to election security. So voter registration system weaknesses equal bad.

 

Nick: Equals catastrophic. I have scenarios where the Russian chaos monkeys could cause massive damage in the 2018 elections with targeted deregistration attacks.

 

Gary: We should expect that to happen.

 

Nick: Oh, we should count on it.

 

Gary: Yep. And then what was it like to work with Krebs on the PharmaLeaks stuff?

 

Nick: Oh, tons of fun. If you can ever co-author with Krebs, do it.

 

Gary: Krebs is a fun guy. All right. So the last thing is a very personal issue. You suffer from depression that’s treated by therapy and medication. And you talk about that, I think, so that others can benefit from the good aspects of treatment and therapy. Tell us a little bit about that.

 

Nick: So yeah, I’ve basically had in my life multiple depression meltdowns, and therapy and drugs saved my life twice as a student. And both times, after about a year, I’d just go off the medication, and a couple years later, the same thing would happen again. And just after the third incident, I basically realized, “Oop, I’m not repeating that mistake again.” And the other thing is I’ve observed that when I’m teaching students, I’ve been there. I’ve done that. I keep a copy of my transcript so that they can laugh at it. And one of the things I’ve realized is that every semester, I include in my first slide deck the notion that “Yes, I’ve been there. I’ve done that. This is not good. There’s help available. We have help.” And every semester, at least one student has proven that it’s been worthwhile, because they’ll come up to me afterwards.

 

Gary: That’s super important work, so thanks for doing that. I like people that talk about these things.

 

Nick: Oh, it’s the way to stop it. It’s, for some people, a medical concern. And don’t be embarrassed about medical concerns.

 

Gary: Exactly. Yep. Last question. So what is your favorite fiction book, or your favorite fiction book you’re reading at the moment? I just read “Exit West” by Mohsin Hamid, which was really, really great.

 

Nick: Let’s just say I’m a huge fan of “The Laundry Files.”

 

Gary: Excellent. Well, thanks a lot. This has been absolutely fascinating. I think we should have done maybe three Silver Bullets instead of just one. This has been a Silver Bullet Security Podcast with Gary McGraw. Silver Bullet is co-sponsored by Synopsys and IEEE Security & Privacy magazine and syndicated by Search Security. The March/April issue of IEEE S&P magazine features our interview with Bank of America CISO Craig Froelich. The issue is devoted to hacking without humans and covers the DARPA Cyber Grand Challenge, focused on both offense and defense. Show links, notes, and an online discussion can be found on the Silver Bullet webpage at www.synopsys.com/silverbullet. This is Gary McGraw.

 

 

 

