Show 130: Jessy Irwin Discusses How to Make Security and Privacy Accessible

January 23, 2017

Jessy Irwin is Vice President of Security and Privacy at Mercury Public Affairs. Her work focuses on human-centric technology and security. Jessy works tirelessly to make security and privacy accessible to the average person through education and awareness. As an outspoken advocate, she writes and speaks publicly about security research, strong crypto, and security education. She studied Art History and French at Virginia Tech and is now based in San Francisco.

Listen as Gary and Jessy discuss social engineering, security research, and security education and accessibility.



Gary McGraw: This is a Silver Bullet Security Podcast with Gary McGraw. I’m your host, Gary McGraw, Vice President of Security Technology at Synopsys and author of “Software Security.” This podcast series is cosponsored by Synopsys and IEEE Security and Privacy Magazine. This is the 130th in a series of interviews with security gurus, and I’m super pleased to have today with me Jessy Irwin. Hi, Jessy.


Jessy Irwin: Hello. Thank you for inviting me on.


McGraw: It’s going to be fun. Jessy Irwin is Vice President of Security and Privacy at Mercury Public Affairs. Her work focuses on human-centric technology and security and is centered on making security and privacy accessible to the average person through education and awareness. Jessy is an outspoken advocate for security research, strong crypto, and security education, all topics which she writes and speaks publicly about. Jessy studied art history and French at Virginia Tech, and she now lives in San Francisco. So, thanks for joining us today.


Irwin: Thank you so much. I’m really excited to get our talk on.


McGraw: How did you end up getting started in computer security? And how have you seen the field change, from kind of a general-public or person-on-the-street perspective, since you’ve been working in the field?


Irwin: I had a really nontraditional path to security. In my house growing up, my grandparents had a home office. And I remember being like 10 or 11 with this little teeny-tiny Dell laptop, and for whatever reason, we couldn’t get the modem to call out to do what they needed it to do for business, and I really don’t want to say—


McGraw: That’s definitely secure.


Irwin: Right? So the problem that I had back then—I mean, I’m 10 or 11, and I just did whatever I needed to actually get it to work. And I think that was probably my first real experience with security and computers.


McGraw: Hilarious.


Irwin: I have not really been as involved in the industry as a lot of my peers, and that’s OK. I kind of enjoy being fresh eyes on security. But I’d say over the past few years, I’ve seen a lot more attitudes start to shift in favor of being more human-centric in what we do. I think we’re really at a point where we’ve invested so much into the technology pieces of security that now really the way to get through is to attack the human elements of these systems we’ve designed.


McGraw: I totally agree with that, yeah.


Irwin: Unfortunately, a lot of what we have designed sets the human up for failure. So I like to think that I am extra, extra blue team because I think of myself as the person who is trying to fix that human element with a lot of other smart people.


McGraw: Mm-hmm. That makes sense. And I think you’re right about that. There’s a whole field of usability, of course, which there’s a big literature about and a lot of people talk about that. But I’m not sure we have all of the technical problems sorted just yet, so we have multiple kinds of problems to work on. What is your current role at Mercury Public Affairs? What do you do all day these days? This is a pretty new job for you, I think.


Irwin: It is pretty new. So I joined Mercury in October. I can’t talk about everything that I’m doing right now. I’ve got some secrets in my hair—that’s why it’s so big. But right now, I am working on a couple of events to get different areas of the world talking about creating cyber security industries or investing in technology industries that are already in, you know, their location and looking at how they can be more focused on security.


McGraw: That’s cool.


Irwin: A lot of what I’m building out as well is really just me having a love for small businesses—my grandparents ran one out of their house—and thinking about the ways we can take all of this stuff I see inside of huge enterprises and make it reasonable and not hard for other people to really be able to implement or think about. I just don’t think it’s fair that we have all of these security professionals who have millions of dollars to spend or hundreds of thousands of dollars to spend, you know, on security tools and all kinds of security hires, but over 50% of the American economy is small businesses, and some companies are never going to hire a CSO.


McGraw: Yep.


Irwin: So one of the first things I do from a marketing and communications perspective, in my role, is I will hop on the phone with the most, you know, public-facing people in a company and say, “Hey, let’s talk about what’s up with your security. Is your blog using the right plugins? Are your things up to date? Let’s make sure you’re not reusing passwords.” That’s all super simple to security people, but the average person needs to hear that probably 20 or 30 times until the behavior change actually happens.


McGraw: So I got a quick question for you. How big is the SMB market compared to the enterprise market with regard to, say, you know, IT budget and gear? I’m just trying to put my head around it.


Irwin: I actually haven’t seen any really good numbers for it, and I think that is one of the problems that I regularly encounter, right? Because from an enterprise perspective, we can say, “Well, we have eleventy billion dollars in this market, and it’s grown by twelvety billion dollars in the past year.” I don’t know that we know enough about small business behavior and security, and I think that small businesses don’t know enough about security to realize that they need certain things. So I’m not convinced that we have enough data to really make a judgment there.


McGraw: Yeah, sure. But how about just complete market value of small businesses versus all large enterprises? If you just summed them up and weren’t thinking about IT or security, is there...? I’m trying to get a feel for whether like 90% of the money in the economy is tied up in huge corporations or 10%, and I really don’t know. Do you have an idea?


Irwin: Yeah. I don’t really know. I’d have to go do some good digging in some really, really painful government documents.


McGraw: I don’t know. That’s what the Department of Commerce is supposed to do, isn’t it?


Irwin: Let’s go call them.


McGraw: All right. So let’s move on to the next question. I just gave the keynote at ShmooCon last Friday in a room full of hackers, and there was some talk of the term “researcher” or “security researcher” during that conference. So I want to know, what does that term mean to you?


Irwin: So I go back and forth on that term, but I think what I like about the term “security researcher” is it makes it really easy for all of the other people around us who don’t understand what we do. I’m primarily thinking of the media and the policy people. It makes it easy for them to talk about us without using the word “hacker” and that being heard as “criminal.” So I get like our industry doesn’t love the term sometimes and there are problems with it, but hey, I am all for anything that gets us away from terms that make us sound like criminals.


McGraw: Yes. Okay. Well, I mean, I’m so old school that “hacker” means the MIT sense of the word for me. But that’s because I started getting on the net in 1985. Let me tell you what it means to me, because it’s something completely different. The term “researcher,” to me, means a science person, you know, an academic who’s doing security research. And so it’s weird to have the hacker term “researcher” and the science term “researcher” kind of collide, and I’d actually like to see those two worlds combine forces if possible. Do you ever see it used the second way?


Irwin: I have seen it used the second way. So I sit in a really interesting spot because with the work my company does, we’re very involved with policy, and I’m consistently keeping my eyes on what’s coming out of academia. You know, Berkeley’s got all kinds of great research, Stanford has all kinds of great research coming out, and that’s where I tend to see security researcher terms and research methodology talked about.


McGraw: Right.


Irwin: What I would love to see more of—you know, instead of somebody just firing up Burp and calling themselves a researcher without an actual methodology—would be a way that our industry came together and said, you know, “To legitimize more of our research and what we do, let’s follow a particular process.” You know, we have the scientific method. Why don’t we have the hacker method for breaking things?


McGraw: That’s good. I think there are some people who cross the lines, like Matt Blaze comes to mind, and there are many others, but Matt’s a good example. I do think that people that do hacking, you know, who call themselves security researchers could do with a good dose of academia, and I think that academia could do with a good dose of hands-on. So maybe that’ll happen in the future.


Irwin: Yeah. I think there’s a lot that both sides can learn from each other. I don’t know that we have appropriate places where they can both meet and really talk about how one helps the other and how one causes problems for the other.


McGraw: Mm-hmm.


Irwin: And I think sometimes the people who are doing, you know, security research or participating in bug bounty programs don’t always have the incentive and they don’t always have the right—how do I say this lightly?—personality to be doing that work. They’re not really about building bridges.


McGraw: Right.


Irwin: So I think we need some really, really good diplomats who can help bridge those sides. We need a lot more connectors and, I don’t know, like people who are standing next to the bridge and making sure it doesn’t get burnt down.


McGraw: Interesting. So next topic: When you’re teaching normals about security—which I think you do a fair amount of out there—why is it so hard?


Irwin: I would say one of the reasons that teaching security is so hard is that there are so many misconceptions out there that get perpetuated by the media and perpetuated by the internet that people really just don’t have the tools to compare and contrast and work their way through it, so they give up. We have so much information and misinformation out there that it’s exhausting to try to find an answer.


So the way that I tend to approach security education is to really tailor the conversation to the person or the group that I’m talking to. If I’m talking to teachers, I’m not going to go pull up something that happened on “Mr. Robot.” I’m going to talk about examples of where they are being compromised by students or how parents have compromised, you know, school system security—which isn’t hard—and really make sure that they see how this could happen to their lives and be a problem.


We have too many pie-in-the-sky examples, and we are so bad in general—all of us, even me sometimes—at really bringing our security issues back down to earth.


McGraw: Yeah. Talking to normals is a real difficult thing. I mean, I’ve done it for many, many years, but you have to remember what other people don’t know. That’s the key.


Irwin: There’s that, and then there’s also another issue. A great example for me: I have an old-lady gang on my street. And they know that I call them the old-lady gang. I love them. But they all have these really expensive sewing machines and quilting machines and sergers. Well, not every company that connects a sewing machine to a computer and builds software for it is going to be ready the first day that the new, you know, Mac OS comes out or the new Windows comes out...


McGraw: Right, with their drivers and stuff.


Irwin: ...ship a new product. So we tell everyone to upgrade software and update software all the time. But there are actual use cases where that advice doesn’t work, and we need to be able to make sure that we understand why, and not just write off any reason that someone won’t update their OS, you know, as a major fail or someone being stupid.


McGraw: That’s a really good example.


Irwin: I think we need to be a lot more measured in these things and we need to recognize how much of a burden we put on users. Because the more burden we put on them or the more we make them feel bad for not doing certain things that are valid, the harder it is to get our message through.


McGraw: Do you think that people have a security mind-set naturally? Like the old-lady gang—do they get it?


Irwin: I think that people do have a mind-set, but I think it’s really hard for them to connect some of the principles they might think about in the regular physical world to what’s going on in the digital world.


McGraw: Oh, OK.


Irwin: That divide there is so abstract. I have found that with the older populations I’ve worked with, they get really excited about it once they see it. But I would say that with some of the younger populations I work with, especially teenagers, they feel like they are pretty much invincible and it’s not going to happen to them.


McGraw: That’s why they make good soldiers, you know. You’re immortal till you’re 18.


Irwin: Exactly, exactly. But I think for some people, especially people who maybe haven’t had the easiest path through the planet, they get it. But I think others who maybe have never had to deal with a hard problem in their lives, they don’t understand, you know, why you can’t just reuse a password everywhere and write it on your monitor.


McGraw: Right.


Irwin: And all that great stuff.


McGraw: Do you think you can teach people to have a better security mind-set? I assume the answer to that is yes.


Irwin: Oh, yes. I definitely think that it’s something that is teachable. I think exposing it to people in their own terms and in creative ways is the way to do it. The ultimate problem that I think we all forget about education is that we have to reinforce these principles and these ideas quite often. And the continuous part of education is the one that I think a lot of people are really, really bad at. You can’t just tell someone one time, you know, “In 10 years, this is what you have to do to make sure you’re safe,” or one video that happens at work every year is going to take care of that education. It’s tiny little touch points and little silly things, creative little pieces of content and jokes and emoji, that help remind people and maybe plant a seed that will help them, you know, a week or two weeks later, identify something they’re doing that might not be safe and change it, or at least ask about it.


McGraw: Well stated. So for what it’s worth, I don’t think you can teach all developers how to think like a bad guy, which may be the edge of what we’re talking about. A guy named Steve Lipner was instrumental in changing my mind about that after I wrote “Exploiting Software.” So it seems like you can teach people some things and you can remind them, but thinking like a bad guy might be something you can’t teach them. Am I just totally insane, or what do you think about that?


Irwin: I agree with you there. I also think that we shouldn’t have to teach everyone how to be bad to make sure that they’re good.


McGraw: Totally agree.


Irwin: I think that we do much more good when we tell people how to do things safely. You know, telling someone not to use a piece of software or telling someone not to do something rarely ever results in what you want. So if you give them enough information about the risks that they face and about what’s against them, you can influence a decision. But I just don’t think that requiring everyone to be an expert in hacking or an expert in security is the way that we make everything we do, you know, be more widely adopted, period.


McGraw: I totally agree. So, you know, related to this idea of teaching normals about security—which is an important job, and I’m glad you’re on the case—do you think it’s even harder to teach normals how to use, say, crypto?


Irwin: Oh, is that a trick question? I’m just checking.


McGraw: No, it’s just...I mean, the laugh aside, it’s not really a trick question.


Irwin: I would say—so I have a lot of feelings about this one, especially after the election that we have all just gone through. I remember—so I was in Europe at the time, so I was like a full eight hours removed from day-to-day conversation. But the number one thing that I started seeing happen was a whole bunch of people suddenly downloaded all of these impossible-to-use crypto tools and started using Signal, out of nowhere.


And I think that the thing that is hard to do is to teach people when to use crypto. The tools are getting better and better. We’re getting amazing integrations—Moxie did great work with the Signal Protocol, and I’m super, super excited that we have that wonderful thing—but cryptography and encryption are things that you use at the end of the security process. And what I keep seeing is when people feel like, you know, especially with the election, that their world may suddenly become more dangerous, instead of following the process of security—which is not reusing passwords, and updating software when you can, and not clicking on suspicious links, and all of that other good stuff we take for granted—they don’t follow that process. They just go and install a product, and then they feel better.


McGraw: It’s kind of the application of magic crypto fairy dust—or magic crypto fairy tool, if you want—because, you know, they think that that one thing will keep them safe, where you have to do SecOps the whole time.


Irwin: Exactly. So if you’re worried, for example, that you might get pulled over and you might be profiled by a police officer, using Signal is not going to save your butt if your passcode is 123456. You know, there’s no point in using and adopting all of this strong crypto if you don’t have strong passwords and that strong security operations background, which doesn’t have to require a whole ton of orchestration if you’re a regular person. It doesn’t.


But I think that we are so quick to just slap a product on something that we are really going to be at a spot where, you know, government agencies and places like Cellebrite—they’re going to recognize this, and they’re going to find ways around it because they can say, “Well, you know, all the technology people did their part in making everything strong, but here’s the place where the user was set up for failure. Here is how we know they’re going to fail. So go crack that password.”


McGraw: Okay. I want to push on that a little bit. So I was in a recent Twitter war about this ancient quote of mine from like 1996. And the quote is “Given the choice between dancing pigs and security, users will pick dancing pigs every time.” And I got in trouble with some people because they were like, “Users aren’t so stupid, and you shouldn’t denigrate users,” and blah, blah, blah. But the point really was not to give the users a choice while you’re doing the engineering but rather to make things secure by default. So, you know, doing better security engineering.


Irwin: I completely agree with you.


McGraw: So what’s your view on whether users should get to choose their security level? I mean, where do you draw the line there in terms of what users get to do?


Irwin: To be perfectly honest, over the past year or so, we have seen some great research come out about consumer attitudes on security. And what we have seen—I know that the paper I’m thinking of came from SOUPS this past year, and it was by a team from Google. I love their work, by the way. But it basically said consumers are never going to choose security as their top priority when they’re selecting an app or selecting software. What’s most likely to happen, especially for communication, is they’re going to pick what everyone else uses.


So I think really the answer—and I think we’ve seen this with Moxie and Signal, once again—is instead of expecting everyone to go move over to the super-secure thing, we can just put the super-secure part of the super-secure encrypted thing in the middle of what they already use.


McGraw: Yeah. You know what’s weird, Jessy? The guy I was having the Twitter war with is actually the security engineer for Chrome. I mean, we came to détente, but I think ultimately realizing that this is about the choices that engineers make so that users have an easier life was the key to coming together.


Irwin: Yeah. I think sometimes what happens, too, with Twitter is we get in these wars and we don’t get to have the nuance because the conversation happens so quickly. I think if you were at a table, you probably would have said, “Well, yeah, we don’t want to talk crap about users. But we should probably talk about ways that we can at least help them out, because we already know that they’re not going to put security at the top of their list.” He probably would have gotten there.


McGraw: Oh, he got there pretty easily, because he started saying, “Well, I’ve been reading, you know, Bellovin and Cheswick.” And Steve is a good friend, and he hopped on the thread and said, “You need to look at this background here and think about it this way.” So it was helpful to have a little air cover from him.


Irwin: Yeah, definitely.


McGraw: We’ll be right back after this message.


If you like what you’re hearing on Silver Bullet, make sure to check out my other projects, where you can find writings, videos, and even original music.


So another topic, and probably the last one we can fit in. Tell me about your work in K-12 classrooms in security. It seems like you’ve done a lot of different things there and they’re all interesting. So, you know, what about that work?


Irwin: Sure. So when I first moved to San Francisco, I worked for an education-technology company. Before that, I had had the experience of dealing with students in a huge experimental education-technology class that I was part of at Virginia Tech. And what I realized is that even though I was in the middle of this whole teacher world, you know, on the weekends and when I went out to dinner, or when I went to DEF CON, none of the stuff that we talked about in the security world ever made it over to the people who are literally teaching kids how to use the internet and to put their identities and their information online. So there’s this huge disconnect.


And around the same time in my life, I decided I was going to start being the kind of person who would not accept the things that I had within my power to change. So for almost four years, anytime I found teachers, you know, meeting up to talk about new tech tools, or I found a teacher conference, or I got invited to speak at a teacher conference, I went and talked, and I gave teachers exposure to some of this stuff so that at least students could have a fair chance at being able to maybe not have their kindergarten project and their third-grade YouTube Shakespeare video, you know, passed around the office in 10 years when they’re an intern at Google or wherever else.


McGraw: Yeah. That’s very interesting. You know, that calls into question, in terms of education and teachers, you know, science doesn’t really move that fast in, say, biology, when you’re talking about very basic stuff. But science moves faster when it comes to technology stuff, and teachers have to keep up with it. So training the teachers is a real challenge, I suppose.


Irwin: It is. And if you look at public school infrastructure, you know, the ZIP code that a school is in is going to define whether they have a good-enough budget to, you know, have good infrastructure and the physical space, much less have enough money to purchase books and have enough money to keep the lights on.


Teachers never—and I mean never—get education at any point during their preservice education about privacy and security. And yet all these things that schools have done forever—I mean, we’re talking about data collection that goes well past the end of our natural life. We see people’s school records from the 1900s all the time. That is all online, at a glance, you know, at your fingertips in 30 seconds or less.


And what does it mean if it’s not secure and it’s not being taken care of appropriately? What does that mean for students if, you know, another kid unlocks the teacher’s computer and decides to mark up the permanent record? And it can’t be tracked because, you know, we don’t have the logging software or the appropriate tools in place to really get to the bottom of what happened. These are some really big problems, and we don’t know what they mean yet.


McGraw: Yeah, very interesting. All right. I lied to you about the last topic because I forgot about the last topic, and this is it. So let’s talk about sexism a little bit. Do you think that security is a field that’s any worse or any better than other high-tech fields? What’s your view with regard to sexism?


Irwin: So I feel like personally, my worst experiences working in technology didn’t come to me when I have worked in security. They came to me when I was working for an education company or a consumer-technology company. It doesn’t mean that we don’t have problems in security. I would say the bigger problem that I have experienced is that people look at me and say, “Well, you’re not an engineer. How can you be in security?” And I am the first person who will throw down on that and say, “Wait a second. You guys all love security engineering, and I get it, but you need people like me around to help make what you do more successful.”


And I will go to the mat any day about saying, “Yeah, I don’t look like a typical security person, and that’s good. That’s good for all of us.” And we need more people who don’t fit the current type. We need lots more women around to say, “Hey, what about this problem over here?” or “What about that one you’ve never seen?”


McGraw: I totally agree, but I also think that women engineers are just as powerful and important, you know? So you don’t want to say, “We need more women because we need to bridge the gap,” but “We just need more women, period.”


Irwin: Yes, I totally agree with that. I would say we need more women in every spot of the blue team and all the spots you can get them on on the red team, just in general. But I also would like to see us expand our definition of what a security professional looks like. Because I think if I go to another DEF CON and someone looks at me and says, “Oh, are you so-and-so’s girlfriend?” it’s going to be really hard for me to not punch them in the face.


McGraw: I think you should punch them in the face. I’m all for violence in that situation. Go for it. So with regard to sexism in the field itself, other than changing the ratio of males to females, is there something else we can do to eradicate sexism in security as a field?


Irwin: I think one of the things that I have experienced that could really be easily fixed is that we change the way that we present women in a professional space. There are so many times in a meeting where I have been talked over, or I have had an idea that I just had come out of my face come out of a male colleague’s face, and he got the credit for it. And it’s not appropriate to throw the conference room phone at someone’s face. So I’ll never do that. But I think making sure that when that happens, the men around us can be allies and say, “Hey, wait a second, that was her idea.” You know, give the idea back, give the voice back, and make sure that the recognition is given to the right person and not bundled up and shipped off to someone else who just repeated an idea.


McGraw: A really good point.


Irwin: I think that’s one easy, quick thing, and that overall will help us move forward. Because it’s actually the tiny little things that happen every single day on teams that set us back. We can all sit there and say, “Well, we’re only going to do all of these big massive things industry-wide,” but it’s the little stuff that piles up. It’s the little stuff that makes the biggest difference. And if we start with those details and make sure that the women’s voices get amplified at the table and in the meetings and when incidents and things happen, that will help us, you know, at least get more recognition and more leadership when we need it.


McGraw: Super good. I like that a lot. And I’m glad you have the opportunity to tell everybody that’s listening to Silver Bullet that, so listen up, everybody.


You are very hard to dig stuff up about on the net. You know that. I assume you do that intentionally.


Irwin: Yeah. So I’ve had some issues with people deciding that they wanted to try to threaten me or show up at my front door or stalk me or whatever.


McGraw: That sucks.


Irwin: I don’t have the appetite for that, so it’s a little bit hard. Part of it is just that I’m lazy and I don’t update things. But overall, I recognize the more information I expose about myself and my family—look, I’m an opinionated, blond-haired lady who likes to yell about security. At some point, somebody is going to decide to be a very big thorn in my side, and I don’t want to give them too much information to hurt me or the people I love. Because ultimately, you know, my significant other, my family—they didn’t sign up for being targets of what I do as work and what I do publicly. So it’s my job to protect them wherever I can.


McGraw: That seems fair. But usually, I ask this last question that has something to do with an outside interest, but I’ll just ask you something generic. So what’s your favorite piece of music at the moment?


Irwin: Oh, that’s a really good one. So what have I been listening to on repeat? OK. Actually, this is terrible. I just drove up to Tahoe with a car full of friends this past weekend, and all of those terrible songs that we used to listen to in college came on the radio, and I have been all over the Spice Girls ever since. I know it’s not cool, but I have been all about them today. Usually, I’m all up in the classical music when I’m working because I can’t have lyrics. They get really—they mess up my writing, but...


McGraw: Totally agree with that, yup.


Irwin: ...Spice Girls, it has been all day, and let me tell you, I have been boogying in my office. It’s been fabulous.


McGraw: All right, last word to the Spice Girls. Thanks, Jessy. This has been really fun.


This has been a Silver Bullet Security Podcast with Gary McGraw. Silver Bullet is cosponsored by Synopsys and IEEE Security and Privacy Magazine and syndicated by Search Security. The November/December issue of IEEE S&P Magazine is devoted to real-world crypto. The issue also features our interview with Jim Manico—though frankly, that one’s more fun to listen to than to read about. Find all of the Silver Bullet episodes under “Tech.” Show links, notes, and an online discussion can be found on the Silver Bullet web page. This is Gary McGraw.
