Show 131: Kate Pearce Discusses the Relationship Between Biology and Security

February 28, 2017

Kate Pearce is a Senior Security Consultant at Cisco within the Customer Solutions division. In her career, Kate approaches security from diverse perspectives encompassing defenders, builders, assessors, and attackers. Her approach blends business, academic, and assessment contexts with a clear focus on evidence-driven security approaches. Kate holds an MSc and a BSc in Computer Science from the University of Canterbury. A repatriated Kiwi, she currently lives in Wellington, New Zealand with her wife and cat.

Listen as Gary and Kate discuss the state of the software security industry, gender perspectives in the security space, the relationship between biology and security, and more.

Listen to Podcast


Gary McGraw: This is a Silver Bullet Security Podcast with Gary McGraw. I’m your host, Gary McGraw, Vice President of Security Technology at Synopsys and author of “Software Security.” This podcast series is cosponsored by Synopsys and IEEE Security and Privacy Magazine, where a portion of this interview will appear in print. This is the 131st in a series of interviews with security gurus, and I’m super pleased to have today with me Kate Pearce. Hi, Kate.


Kate Pearce: Hey.


McGraw: Kate Pearce is Senior Security Consultant at Cisco in the Customer Solutions division. In her security career, Kate comes at security from all perspectives, encompassing defenders, builders, assessors, and attackers. Kate’s diverse background includes time spent at Neohapsis and Cigna. Her approach blends business, academic, and assessment context with a clear focus on evidence-driven approaches. Kate’s a repatriated Kiwi living in New Zealand now with time served in the United States. Kate holds an MSc and a BSc from the University of Canterbury, and she lives in Wellington, New Zealand, with her wife and a cat. I presume only one cat. Is that correct?


Pearce: Only one cat. Yup.


McGraw: Only one cat so far. So thanks for joining us today, Kate.


Pearce: Yeah, thanks. A hundred thirty-one’s a lot.


McGraw: It is a lot. Once per month. And I usually don’t really miss the deadline. We’ve only backdated one or two episodes, but don’t tell any of my listeners, please.


Pearce: To consider me amongst the gurus, you know, you must be running out of mountains.


McGraw: No, everybody can be a guru. So tell us how you got started in security and, in particular, how your journey from EE to CE to CS to security and so on informs your approach.


Pearce: So this is really interesting because the way I describe it now, you know, the buzzword thing is sort of transistor through boardroom. But the way it works historically is, as you say, I went to university to do electrical engineering. That actually goes back to when I was in high school. I was doing very well in science and ended up going on a robotics summer course, basically. For a couple of days, you went down to the university and made, I think it was, a line-following robot.


McGraw: Nice. Was it with Fred Martin’s mini-boards from MIT?


Pearce: No, actually I’d have to look. It was a long time ago. That was nearly 20 years ago. But that got my interest in it, and I enjoyed it, and that got me started thinking, well, I like computers and I like building things, so how about I do electrical engineering? After a couple of years doing the usual exploration, where I looked into other avenues of life—I’d been working full-time at a supermarket and studying to be a preacher, but that’s a story for another podcast—I ended up at university doing electrical engineering. And overall, I really, really enjoyed it.


As a part of that, you do the software side and the programming side, as well as more of the hardware, the foundational mathematics and physics and things. And at the end of my first year, they introduced computer engineering as a new program. I was like, well, I enjoyed the software more than I enjoyed the hardware, because in my first couple of years at uni, I got quite frustrated with hardware’s invisible faults. You know, if a resistor’s burnt out, you can’t always tell. If the capacitor is dry...You can always inspect code in some capacity—you might have to dig—but with hardware you can get transient faults, and I decided that just wasn’t playing fair.


So I went and did computer engineering, which is, as you know, halfway between electrical engineering and computer science. I did that for a couple of years, and then I noticed another trend, which is I was doing really, really well and enjoying the software side, but when it came to the complex mathematics and the hardware side, I wasn’t enjoying it as much, and I wasn’t doing as well. So I’m like, why am I putting myself through a degree that’s half of two things if I hate one of the halves?


McGraw: So you switched over to computer science.


Pearce: So I switched over to computer science. Yes.


McGraw: And then how did security get folded into that?


Pearce: So one of the...well, a couple of the courses that were available to fill out the year to finish my degree were networking and security courses.


McGraw: Gotcha.


Pearce: I did those, and I built up the relationships and played with things and found that I really enjoyed the adversarial fight of it and the game theory part, where it’s not you against physics. It’s you against yourself, other individuals, and systems. And that was far more of a dynamic thing that I enjoyed.


McGraw: Yeah, that completely makes sense. Do you think that your background in the physics and hardware aspects of EE inform the way that you approach systems today? Did that change the way you view systems as a whole, or do you not really know?


Pearce: I would say it’s critical to doing what I do. And it’s not just about my understanding of the systems; it’s about my understanding of the mind-set of the people who build those levels—knowing how engineers are trained to think and analyze, knowing how one goes about building a system and how the different components and abstractions clamp together and build up. And it comes in at really bizarre times because, you know, things like how memory paging works at a very low level or how the architecture of caching works on a CPU—all of these different little things, sometimes they can be really, really critical to some bizarre behavior, you know, nine levels up the stack.


McGraw: Mm-hmm. Yeah, that’s very good.


Pearce: I say nine even though there’s only seven.


McGraw: Well, you know, whatever. The whole idea of calling something “application security” because it’s the seventh thing up the stack makes me giggle. So you’ve spent a lot of time as a security consultant in addition to working on staff in a corporation as a security person. And so I’m wondering which aspects of consulting life make you happy and how those contrast with, maybe, life in a corporation as a security person.


Pearce: Sure. I’ve also, during my consulting time, done a couple of embeds where I’ve been in organizations for long periods, so that also ties into it. And I guess when you’re in a corporation—any large organization—you have both more and less freedom. Because you may have ownership, but you’re usually far more constrained in your ability to take risks or speak truth to power. Additionally, the problems can be much longer term, and so you don’t tend to get as much variety.


So if you can pull off change, it’s more concrete. But if for some reason you’re unable to, you can be stuck just digging trenches, effectively, particularly in security, where you could be doing, you know, help desk work, or what essentially is help desk work, just because you’re the person who knows what they’re dealing with. And that is one of the reasons I like consulting personally, is I really like approaching things widely and with a wide variety, because I’m the sort of person who goes very wide and understands all the different things and how they fit together rather than deep, like quite a few security practitioners seem to end up.


Part of that was also a deliberate choice because I knew I would be moving back to New Zealand in about five years. And in a small market, I can’t afford to over-specialize. There are probably three malware analysts in New Zealand. Two of them work for the GCSB, and one probably works as a consultant covering the rest of the market.


McGraw: Yeah, and the other guy would rather be a cryptographer, because I know...


Pearce: Yeah, exactly. And so I needed to be very versatile. That versatility was also really valuable at an org like Neohapsis, because Neohapsis was a consultancy of about 50 consultants that was acquired by Cisco a couple of years ago.


McGraw: Yup, that I know. Run by my friend James.


Pearce: James is a good guy. He’s going on and doing wonderful things now. Yeah, in a small consultancy, you get approached to deal with some of the most bizarre and interesting problems. And yeah, you’ve got to be versatile because if there’s the person that does X and they’re on a job for six months, well, someone has to do it. And so the people who can spin up very, very quickly are really, really valuable. I guess that’s not just consulting.


McGraw: No, I totally see that. But there’s also that aspect, which I think you alluded to briefly, of since you’re hired to tell somebody the obvious when you’re a consultant, when you do it, they actually really enjoy that aspect of what you’re doing. You know, you’re brought in, in some sense, to figure out what the problem is and fix it, and so the proclivity that you have of doing a quick analysis and figuring out how to get something done really fits into consulting nicely, seems like.


Pearce: Yeah, absolutely. And a good consultant is even more than the technology and the getting things done. It’s, I guess the standard word is, customer satisfaction, but if they’re not happy, it doesn’t matter how good a job you did.


McGraw: Right. Yep.


Pearce: A saying I have—a little bit cynical—is “A good consultant gives the customer what they need and makes them think it’s what they’ve wanted all along.”


McGraw: Well, you know, we have 500 consultants here at Synopsys in the Software Integrity Group, and that’s definitely true. But also, you know, a lot of consulting has to do with figuring out what they want you to tell them—that is, they already know the answer; they just need someone else to tell them the answer, which is hilarious. Hilarious, but happens all the time.


Pearce: Yeah, and that gets particularly complicated, generally with early-career people, who don’t diagnose the problem correctly. The client called you in to tick a box or to deal with management risk, and you want to actually touch the system.


McGraw: Yeah. So let’s change gears. Knowing how systems break is a terribly important part of security engineering. But I’m not sure we can teach, say, developers how systems break in a useful manner. So what’s your view on what standard-issue developers need to know about exploiting software and breaking systems?


Pearce: I actually don’t know that developers need to know a lot about breaking systems specifically. I would argue it’s more about the general, the way they’ve built trust into their systems and the failure cases. The particular mechanisms of failure, outside of very specific circumstances, I think are less important than understanding your data flows and your trust flows. And in the end, the main, the biggest choice, I believe, any development shop can make right now is the choice of technologies, particularly with developers tending to jump on the latest hot thing.


McGraw: Absolutely. Yep. Couldn’t agree more with that. That’s right. So I used to think that we should try to teach developers how to exploit software, and that’s why I wrote that book “Exploiting Software.” And then it turned out that that was wrong, so I had to change my mind about that. But I do think you have to know a lot about how systems break and about exploiting software if you’re going to be a good, solid security analyst, obviously. So I’m interested to know how knowledge of attack informs your own current work right now.


Pearce: So knowledge of attack is really, really good for understanding what’s feasible when you’re looking at a system, because all too often—and this is tools as much as people in security—they hype up what is the latest thing or the thing that is the most dramatic rather than what’s realistic. And so, you know, you’ll see things like an actively exploitable thing that is hard to exploit versus an SSL issue that takes the computational power of AWS for a week. And they’ll be weighted the same way, and that’s just bizarre. And understanding how an attacker can actually come at something really, really helps the defender.


But at the same time, you know, look at the way we teach, say, driving, or even medicine. We teach people how to handle it when things start going awry. We don’t say, “Here’s how you crash a car without dying.” Or maybe we do—


McGraw: Maybe we should.


Pearce: But we don’t tell them to crash.


McGraw: We do crank up their insurance rates.


Pearce: But take medicine, you know. It’s “If someone is dying, here’s how you stop them dying.” It’s not so much how you accelerate the dying and how you stop that acceleration if someone else is doing it.


McGraw: Right, right, right. Yeah, that’s interesting.


Pearce: Adversarial is a bit different. But this ties more generally in to some things that I’ve been playing with the last couple of years. My partner is a microbiologist, and we have a lot of discussions. And I actually believe most security stuff will never be solved, but it’s not because of technology problems or even human problems. It’s purely about resource usage, short-term/long-term trade-offs, and evolutionary systems.


In biology, if you have an organism that’s really, really good at defending itself, it tends to put so much resource into that that it adapts more slowly or it breeds more slowly. You see the same sort of thing in organizations, where in certain verticals, high investment is good. But particularly in something like the startup space, what companies are selected for and cultivate is immediate revenue, not future risk. No one cares if you’re secure if you’re out of business.


McGraw: Yep. So you find that the trade-offs are different, and you also find a different amount of middle management, which is in some sense the inertia engine of a corporation. And inertia in both directions, by the way.


Pearce: Yeah, absolutely. And at the same time, you also get these misaligned incentives between the different levels. I don’t think that will ever change, you know. If someone’s going for their bonus, they’re going to do whatever it takes to attain that. That’s what they’re measured on. They’re going to ship. And you know what, maybe that’s the right choice, as much as security people like to say otherwise.


McGraw: Yeah, I have to agree with that. We’ll be right back after this message.


If you like what you’re hearing on Silver Bullet, make sure to check out my other projects on my website, where you can find writings, videos, and even original music.


So let’s talk about a term that bothers me a little bit. There’s this term “researcher” that’s pretty overloaded in security. I had a little rant in my ShmooCon keynote this year about that word. And in my view we need some scientific rigor in addition to hands-on work, and so I’d like the researchers to meet the researchers, in some sense. And since you’re married to a real scientist, you probably know where I’m coming from. I’m interested in your thoughts about that.


Pearce: Yeah, yeah. I must say being married to an astrobiologist with a Ph.D. certainly means that if I make claims, it’s like, “Where’s the evidence?”


McGraw: Yeah, right.


Pearce: “Citation needed. What’s the p-value of that?”


McGraw: That’s good, you know. We need more people that are married to real scientists out there in the world.


Pearce: A lot of security people are married to microbiologists, just as an aside. A lot of them. But no, you’re right, and that rigor is important. At the same time, we need to try and bring some of that quick iteration to the formal research field, because sometimes they research things and by the time they finish, the field has moved on. But so much of security is people calling themselves researchers when what they’ve effectively done is gone out into the garden, found a funny-looking bug, thrown it at their neighbor, who squawked, and then gone, “I found an awesome bug.”


McGraw: I love it. I think that’s exactly right.


Pearce: And it’s like, but what’s the long-term value?


McGraw: What about kingdom and phyla and order and, you know, all that crap? Yeah.


Pearce: What about taxonomy? What about root cause? “I just want to look pretty. My bug’s bigger than your bug.”


McGraw: So on the flip side, there is sort of security engineering of the Ross Anderson variety. And so what’s, in your view, the relationship between analysis on the one hand and construction on the other? I’m interested in this boundary, which I don’t really understand myself.


Pearce: That’s a really hard one. I would say it actually changes with the problem space. Analysis can come before build or construction. If you’re in a field where you’re looking at a very long life cycle for what you’re putting in place and a very low appetite for risk—aircraft is a classic example; automobile is a little bit lower—the analysis comes before construction, because those fields require very high assurance. But in fields where assurance is not a barrier to market entry—particularly, you know, websites and online services, the typical startup stuff—the analysis can come later, because it’s a barrier to stability or scaling rather than to entry.


So what’s the difference between them? That’s a really difficult one because if you take something like test-driven development, that sort of merges them together.


McGraw: Mm-hmm, on purpose. And if you look at some of the modern methodologies that are intentionally speeding things up and involving the customer in the loop, you get, you know, it is more like test-driven development. So I don’t know. It’s tricky. I like your answer. I think that nuanced answer in that it depends on the domain is probably mostly right.


Pearce: Yeah, and there’s a layer on top of that too, particularly with test-driven approaches. You’ve got to understand whether you have the right tests in place, which is itself a whole really difficult problem. It’s almost the halting problem again.


McGraw: Yeah, well, I mean, it’s amazing to me how little security practitioners know about testing. You know, when you were talking about EE and faults in the very beginning of this episode, I was just remembering sort of fondly and cringingly this book “Software Fault Injection” that I wrote a million years ago. And it’s just like, I wish that people knew a little bit more about testing and the science behind testing than they do today.


Pearce: Absolutely, and there’d be so much value in that, particularly things like coverage, you know, in the way that security testers—everyone has their favorite...I mean, what would it be? Probably between 20 and a couple of hundred vectors they use to test things for security faults. But they don’t tend to pick up the subtle things as well: logic bugs, state problems. They tend to worry about coverage of attack surface. And I would argue as a field we’re pretty good at that—we’re certainly better than we used to be—but when it comes to coverage of application state or state transitions, we’re really not as good as I would hope.


McGraw: I totally agree with that. I try to characterize or capture part of that with the distinction between bugs and flaws that I make. But I like the way that you said it with regard to attack surface versus, say, the dynamics of the machine’s internals. And distributed state and time is very tricky. We’ve barely scratched the surface of that one.


Pearce: Another way I explain it is the difference between unexpected behaviors and unanticipated behaviors. I’d say we’re far better at unexpected than unanticipated.


McGraw: Yeah, so Rumsfeld said that in a logically incredibly ridiculous sentence: We don’t know what we don’t know. But it turns out he was right, even for a crazy person.


Pearce: That actually—that affects security a lot. Something I’ve been pointing out recently to—well, actually in some research several years ago—we make a lot of decisions about what we don’t know on the basis of what we do. And the most simple example of this I have is actually authentication. So you have false positive, false negative, true positive, true negative, and we tend to make decisions about the one we’re worried about the most—the false accept—on the basis of the other three. The problem is we have no idea how many people we’re letting in that we don’t find.


McGraw: Exactly, yeah. So you sort of...I mean, it’s like looking for the keys under the light.


Pearce: Something like that. It’s just…I understand it, but when it’s not pure zero-sum, you have to be very, very careful that you’re not simply reinforcing what you expect, in the same way that companies that have metrics on how many things they find, well, if you stop looking, of course you’re going to find fewer bugs. And so perverse incentives and, yeah, just an incomplete knowledge without understanding what that means or the prior probability around that.


McGraw: Mm-hmm. And things like taxonomies can help us with that if we did them right.


Pearce: Definitely.


McGraw: So let’s get meta a little bit. As a practitioner, what’s your view of the state of software security writ large? And feel free to dive into particular subdomains like code review or architecture analysis if you want, but you only have two minutes.


Pearce: That’s difficult. I think software security is a lot better than it used to be when it comes to input validation. That was the main battle for longer than we needed it to be. When it comes to things involving good design and, as discussed earlier, state management and things like that, I would argue we’re worse than we were in the ’70s. If you look at the stuff published then, people would attempt things that are now considered out of scope.


McGraw: Yeah, Saltzer and Schroeder. It’s like we forgot all that stuff. It’s ridiculous.


Pearce: Yes, it’s the same thing. I look at something like virtualization, and we talked about it 10 years ago like it was a shiny new thing. Well, you know, you go back to Popek and Goldberg in 1974. It’s not new. And we keep repeating things we’ve done before. We’re repeating our problems with unmanaged languages and memory in IoT. We’re repeating the things we learned going from mainframe to decentralized and back to mainframe with the cloud. We’re repeating the things, like multi-tenancy, that we were dealing with in the ’50s, ’60s, and ’70s.


McGraw: Yes, so it’s going to—


Pearce: Every generation is repeating the mistakes of their forebears.


McGraw: Yup. That’s right. So I think that’s where the science kind of research can help, in the sense that we might be able to cite some stuff and remember the past a little more effectively than if we’re just running around being totally hands-on all the time.


Pearce: Yes, and I think a structured teaching of the past of security and of architectures would be critical, because all too often we teach the way things are, plus a tidy narrative of how they got there that skips everything that doesn’t fit. And we miss so much along the way.


McGraw: Mm-hmm. Good. That was pretty good. That was even under two minutes, so congratulations on that. Now kind of a strange question. So you transitioned your gender, which gives you a really interesting perspective on sexism in computer security, I think.


Pearce: It does.


McGraw: So I’m really interested to know what your experience is there and what your thoughts are and what we should do to improve things.


Pearce: It’s a hard one because I’ve definitely seen things that most people don’t have the honor of seeing both sides of. And even more so, most of the time, people who don’t know my history can’t tell, which means I don’t get the transphobia and the strangeness around that. But having seen both sides, it is fascinating. I can’t explain how nice it was, in hindsight, to go somewhere like DEF CON and be invisible. You know, wherever I go now, everyone’s like, “Who’s that woman? Who is she recruiting for? What is she—”


McGraw: Right. What’s a marketing person doing in the room?


Pearce: “Who’s she with?” So there’s that. But there’s even more subtle stuff, you know. I’ve had the same customer, beforehand, was all like, “You’re not aggressive enough. You need to push your opinion more and let us know what you think and not let us talk over you.” And then, you know, I go back six, seven months later, having transitioned gender, and suddenly I’m getting in the way of people, I need to let them do the job, and why am I being so obstructive?


McGraw: That is just astounding. I knew you would have things like that, so yeah.


Pearce: And more than that, because I have seen both sides, I can tell both sides that, you know, women are not imagining it when they tell you these things.


McGraw: Yeah, right. Because you’ve experienced it firsthand.


Pearce: That all said, I haven’t—you know, I didn’t grow up and get a lot of the problems around that that happened to girls early on. For me it was considered natural to be interested in computers or building things or breaking things.


McGraw: Right, so you weren’t discouraged institutionally, which I think we do and you’ve talked about in some of your previous interviews.


Pearce: Yeah, and so that’s one of the reasons that I am very actively visible. It’s also one of the reasons that, except when it’s relevant, I don’t mention being transgender, because most of the time women, and men, need to see a woman up there hitting with the best of them, doing what needs to be done, and showing them that you can get that respect, and that’s hard. I mean, as alluded to, I did get a lot of that weirdness. The first time I spoke at Black Hat, I got the “Who are you? Are you a recruiter? Are you...” and all that, and I’m like, “I’m a speaker.” And they’re like, “Oh, what are you speaking on?” And I tell them.


McGraw: And it would blow their mind.


Pearce: And they look at me blankly. And then one person explained my own talk to me.


McGraw: What did you do about that?


Pearce: So I asked him a question about it that was unsolvable and watched him try to answer it.


McGraw: Well, that’s good. We need more of that. Thanks for answering that so honestly. I really appreciate it. I think the listeners will care about that too.


Pearce: It’s a perspective we don’t see enough. And as I say, I think people who are like me need to see people like me, and people who are not like me need to see people like me succeeding.


McGraw: Okay, one other topic. So tell us a bit about volunteering, what you do and why you do it. Because I think that’s an important thing too.


Pearce: Sure. So my volunteering comes about because, you know, when I was growing up, I came from a comparatively underprivileged situation. You know, single mother in a government house, etc. And so I didn’t get a lot of the opportunities or mentors or examples that I needed. And so when I find myself in a position where I have influence and where I have the ability to guide others through things, I do that where I can.


So I’ve been doing a lot of stuff helping out in startup communities, helping out in women’s mentoring programs or internship programs. And one of my focuses is less on the technical or even on the “Here’s how you live in this,” and more on the softer stuff, like “Here’s how you deal with a power structure that, you know, you’re told you shouldn’t question.” We’re told growing up, particularly from certain backgrounds, to just not make a fuss or someone will notice. Well, the world doesn’t work like that. Telling people that it’s okay to take charge of your career, it’s okay to take charge of your job when you can get away with it—that seems to be a really big thing to some of these people.


And so things like mentoring. Before someone can take advantage of advice you give them, they have to know that it’s okay to do something that isn’t the standard routine of whatever.


McGraw: Yeah, and I agree that some people just need someone to tell them that that’s okay. That questioning authority is what you’re supposed to do, so do it.


Pearce: Yeah, well, I mean, you can’t always do so, but when you can, you should. And one of my lessons of the last couple of years has been that if you have a good-enough foundation, you can get away with a surprising amount of truth to power, as long as you’re confident and correct.


McGraw: Yeah, yeah, yeah. Very nice. That’s a good thing to spread. Okay, last question, because we’re slightly over time: What is your favorite book of all time?


Pearce: Oh, wow. You’re asking someone who has a thousand books in her house.


McGraw: Yeah, me too.


Pearce: I think the book of all time that I sort of like—and it’s not necessarily because I think it’s a good book, but because of what it covers and the viewpoint it gives—is probably Bill Bryson’s “A Short History of Nearly Everything.”


McGraw: Oh, nice. That’s an interesting one to choose.


Pearce: Because it sort of covers science, technology, the world around us, and how everything got to be where it is in a super overview way. And his writing style is always pretty funny.


McGraw: Yeah, his writing style is funny. That’s a great one. Thanks. This has been absolutely interesting, and I really appreciate your time, Kate.


Pearce: Thank you so much for your time, and it’s a real honor to be on the show.


McGraw: Great.


This has been a Silver Bullet Security Podcast with Gary McGraw. Silver Bullet is cosponsored by Synopsys and IEEE Security and Privacy Magazine and syndicated by Search Security. The January/February issue of IEEE S&P Magazine has articles about containerization and also driving privacy. The issue features an interview with Marie Moe, who has become quite famous for hacking her own pacemaker. Find all 131 Silver Bullet episodes, including video of episode 120, online. Show links, notes, and an online discussion can be found on the Silver Bullet web page. This is Gary McGraw.
