Show 127: Dr. Marie Moe Discusses Medical Device Security

October 25, 2016

Dr. Marie Moe is a Security Researcher at SINTEF and an Associate Professor at the Norwegian University of Science and Technology. She was previously a Team Leader at NorCERT, the Norwegian national CERT, where she managed incident response to cyberattacks against national critical infrastructure. Marie’s recent work focuses on public safety and security systems that impact human life. She is renowned for her work in medical device security; in fact, her own life depends on a pacemaker. She holds a PhD in Information Security and an MSc in Industrial Mathematics from NTNU. She lives in Trondheim, Norway with her family.

Listen as Gary and Marie discuss her research and the future of medical device security.

Listen to Podcast

Transcript


Gary McGraw: This is a Silver Bullet Security Podcast with Gary McGraw. I’m your host, Gary McGraw, CTO of Cigital and author of Software Security. This podcast series is co-sponsored by Cigital and IEEE Security and Privacy Magazine where a portion of this interview will appear in print. This is the 127th in a series of monthly interviews with security gurus and I am super pleased to have today with me Marie Moe. Hi, Marie.

Marie Moe: Hello, Gary.

McGraw: Professor Marie Moe is a Security Researcher at SINTEF in Trondheim, Norway and an Associate Professor at the Norwegian University of Science and Technology. She’s been a Team Leader at NorCERT (the Norwegian National CERT) and has taught security classes at Gjøvik University College. Marie’s recent work focuses on public safety and security systems that may impact human life. Lately Marie’s become quite renowned in medical device security since her own life depends on a pacemaker. Dr. Moe has a Ph.D. in Information Security from NTNU as well as a MSc degree in Mathematics. She lives in Trondheim, Norway with her family. So thanks for joining us today.

Moe: Thanks for having me.

McGraw: You are both a teacher and a researcher, and I think you’ve taught university courses (I assume, to undergrads and grad students). And, you do security research now for SINTEF. So which is more fun?

Moe: I think both teaching and doing research are fun. And of course, I’ve been so lucky to get out there and talk to a lot of people and to go to conferences. I like talking about my research, doing the research, and teaching future generations of security researchers. So I think everything is equally important, actually.

McGraw: I knew you were going to say that. What courses are you teaching or have you taught that you really enjoyed?

Moe: I’ve been responsible for one course, which is on incident response and preparedness planning. I’ve been teaching this for three years. And it’s going to go on this spring semester at NTNU. Gjøvik University College has been merged with the bigger NTNU. So that’s why some courses are changing and it seems like this will be the last time that I give this course in this form. But it’s going to continue as a more specialized course for Master’s students in incident response.

McGraw: Gotcha. I remember visiting Gjøvik, it was really fun. I guess the campuses are not really the same?

Moe: Yeah. Gjøvik is smaller than the bigger NTNU campus. It’s a two-hour train ride from Oslo to get to Gjøvik, so it was easy during the couple of years I lived in Oslo while working at NorCERT. But now that I live in Trondheim, it’s quite a journey to get there.

McGraw: Yeah. I remember riding a bus in Norway which actually had WiFi and I was thinking, “So, if buses in Norway can have WiFi, why can’t airplanes have WiFi?”

Moe: I know that bus really well.

McGraw: It was years ago.

Moe: Yeah. We are pretty advanced in Norway when it comes to adopting technology.

McGraw: Absolutely. So tell us how the lab at SINTEF works and what kind of work you do there.

Moe: I started at SINTEF one and a half years ago. SINTEF is an independent research institute. It’s the biggest one in Scandinavia, actually, with 2,000 employees. My institute is focused on ICT research and we are doing contracted research for businesses, companies, and also for the government or for the public sector in Norway. There’s a small percentage that is funded directly from the Norwegian state. But usually we have to compete for the funding of our projects. Since I started I’ve been involved in a lot of grant writing to fund research projects—

McGraw: I know those days. I started my career that way a million years ago at what was called then Reliable Software Technologies, on soft money, writing grants for the government and inventing technology. It’s pretty fun but it’s a lot of work. What work are you most proud of that you’re doing now at SINTEF outside of the grant writing procedures?

Moe: I’m really proud of this project that I initiated myself and started working on, which is the pacemaker security project. It actually started as a hobby project of mine a year ago. I was off to give a keynote talk at hack.lu, a hacker conference in Luxembourg. The talk was supposed to be about what it is like to be a security researcher who depends on a medical implant, a pacemaker. I have this pacemaker that I’ve been depending on for five years now, since it corrects my heart rhythm, which was really slow. I actually passed out suddenly because my heart was taking a break. So the pacemaker saved my life. But of course, being a security researcher and getting this implanted medical device, I started to do some research of my own on its security.

I had a lot of questions about how this worked and whether it could have any security vulnerabilities—whether it could be hacked. Previous research from 2008 by the University of Michigan and Kevin Fu shows that it actually can be hacked. But I wanted to know more about my own device, of course, from my own perspective as someone with the device implanted. I had been talking to people about this, and then I was asked to do this keynote talk and I was thinking, “What should be the key message of this talk? I’m talking to a room full of hackers, what do I want them to take away from my keynote?” What I really wanted was for them to be aware that there are lots of insecure medical devices out there, that they need to pay more attention to this problem, and that we need more security research on medical devices. I wanted to inspire the hackers in the room to pick up medical devices and start doing research. And as I was preparing this talk I was thinking, I’m a researcher, why am I not doing this research myself?

McGraw: Good question.

Moe: Yeah. So, I was really motivated to do this. I talked to a couple of friends about it, and one of them, Eireann Leverett, agreed to join me in this as a hobby project. We started a year ago, working on it in our spare time, and several other researchers have also donated their free time to the project. But it wasn’t really a research project for SINTEF until I started to get a lot of publicity about it.

McGraw: And then they realized, “Wait, this is good.”

Moe: Yeah. So I got a lot of support from my manager, which has been really great. I got accepted to give a talk at CCC last year in December, which is the biggest hacker conference in Europe. I didn’t really know what I was getting into—I hadn’t been there before. I just submitted a proposal and thought, “Okay, let’s see if this works out.” Then I got accepted and I was kind of panicking, “Wow, I’m going to talk to this huge crowd of hackers about this project.” By coincidence, there was a Norwegian journalist in the room who picked up on this, and he did a portrait interview with me. It was published in one of the biggest newspapers in Norway.

Of course, this also attracted attention in Norway and my manager approached me and said, “Your hobby project, it’s really interesting. We want to support you in this.” I got some internal funding to start working on creating a bigger research project, which is actually what I’m doing right now. I’m writing a big grant proposal to try to get some money from the Norwegian Research Council to work more on this.

McGraw: Outstanding. That’s a really great story. It shows that when you’re passionate about something, you can actually turn that into your livelihood, if you do it right. So when you first got your pacemaker and the doctors couldn’t answer some of the questions you had about the technology and how it worked, that was interesting. Do you think you understand your own pacemaker now better than they do?

Moe: Actually, I was admitted to the hospital a couple of weeks ago because of a problem with my pacemaker, and after that I’ve been in for further check-ups. The last time, just last week, I went in to the local hospital here in Trondheim, and the doctor doing the check-up was talking to me as if I were more of an expert on this than he is.

McGraw: But are you? I think you probably are.

Moe: At least on the more technological side, I might be a bit more knowledgeable about some of these things.

McGraw: That raises the question of who should approve modern medical devices: doctors, or technologists, or security people? What’s the deal? Because doctors don’t know anything about technology, for the most part, and technologists don’t know too much about how the body actually works. So it’s a tricky bit. What’s your opinion?

Moe: We need to bring together the different stakeholders. When it comes to securing medical devices—and especially connected medical devices, as in this case; my pacemaker has functionality that allows it to connect to the Internet and be part of the Internet of Things, or of medical things—we need to include all the stakeholders. That means the device makers, of course, but also the security researchers. Then we have the regulators, policymakers, and standards bodies, and of course the healthcare providers—the physicians and patients should be included in this too. Because I’m really conscious of the ethical dilemmas when I’m doing this security research. If I find some vulnerabilities, I don’t want that to create fear among patients, my co-patients.

McGraw: Yeah. What did you think about the St. Jude’s thing that happened not too long ago?

Moe: Yeah, actually I was quite upset when that happened. I felt really emotional when the news came out. I got a lot of people contacting me and wanting me to comment on it. For the first time in my life, I was subjected to a lot of outreach from financial investors, Wall Street people, who contacted me and wanted my consultancy on this. So I was kind of overwhelmed for a week before I actually decided to write a piece about it on a blog in Norwegian. But my first reaction was that I couldn’t really believe that they didn’t contact St. Jude Medical with their findings, and that they didn’t contact the FDA either. It seems like they were really financially motivated.

McGraw: Yeah, they shorted the stock instead. I actually know Justine Bone. She’s usually a reasonable person. But that didn’t seem like very reasonable behavior, especially when it comes to life-critical issues. So, I guess that’s how you feel too.

Moe: Well, I was puzzled by that approach, but when I read the report I calmed down a little bit because there were no real details in it. It was not like they were disclosing zero-days in the pacemakers, because you couldn’t really do anything with the detail in the report. Also, Kevin Fu’s group looked into some of their claims and actually debunked one of them. They had published a video claiming to show that they had hacked the pacemaker by bricking it, and the proof was an error message shown on the pacemaker programmer. But Kevin Fu’s group showed that you could get the same error message just by interrogating a pacemaker that was not connected to muscle tissue—where the wires were just open to the air.

McGraw: I see, because of wrong conductivity through the loop.

Moe: Yeah. But they had also claimed to have an exploit that could deplete the battery of the pacemakers, which I think is a really bad scenario. This week, totally unrelated, St. Jude actually announced a recall for a brand of their implanted cardiac devices because the battery could fail and the device would cease to deliver therapy. Two patients actually died because of this failure.

McGraw: Yeah. I mean that goes to show you how real this stuff is. This is not theory. This is not about philosophy. This is about people’s lives.

Moe: Yeah. People’s lives, and people potentially being killed by software bugs or errors.

McGraw: Yeah. We’ve had safety-critical software conversations for quite a long time, even with medical devices—think about the Therac-25, the radiation therapy machine that delivered overdoses to patients because of software errors. There have been some very famous software bugs over the years, but it seems like we haven’t made as much progress as we need to. What’s your feeling about that? Kevin Fu did that work in 2008. Have we made progress in the field since then?

Moe: I have a feeling that they have made some progress, which is good. I was invited by the FDA to be on a panel earlier this year, in January, where they were presenting their new approaches to pre-market and post-market risk management of products. So the FDA has started to work on this. Also, the medical device vendors have started to adopt incident reporting and vulnerability disclosure policies, which I think is a good thing, so that there actually is a place where researchers can contact the companies in case they find vulnerabilities—something that was lacking previously. Now they are slowly adopting this and they want to be in dialogue with security researchers, which I think is a good thing. They seem to be pretty proactive on this. After I went public with my concerns, I’ve been contacted by several vendors that wanted to talk to me and be in dialogue about this, which I think is good.

McGraw: Yes. So the attitude is right.

Moe: Some progress has been made. That was one thing I was actually worried about when it came to the St. Jude, MedSec, and Muddy Waters disclosure: how it could affect future collaboration between researchers and vendors. Yeah, it’s hard to say.

McGraw: Yeah. Well, that brings up a question I wanted to push on a little bit. So, this idea that all you have to do to secure something is hack it and then fix what you find, kind of “testing security in,” doesn’t seem to be right to me. I think you have to design things properly and you have to review code, and it’s not just about testing at the end of the life cycle. So the idea of, say, running a bug bounty for pacemakers seems really like a bad idea to me. What are your thoughts about that?

Moe: I completely agree that you need to build security in. You’re the expert on this. Bug bounties are not the answer to this, I think. There are a lot of things you should do before you get to the point where you’re a mature enough organization to run a bug bounty program. So maybe that’s not the first thing that medical device vendors need to focus on. As you say, they should start with building security in and having more transparency and more testing. I really would like to see more third-party testing of devices.

McGraw: We’ll be right back after this message.

If you like what you’re hearing on Silver Bullet, make sure to check out my other projects on garymcgraw.com. There you can find writings, videos, and even original music.

McGraw: So you’re on the record as wanting medical device code to be open source. Why?

Moe: Well, it doesn’t have to be open source. But there has to be some transparency that allows end users, patients like me, to trust the products. Of course, you can have open source software, but there are other alternatives in case that doesn’t work. For example, a Software Bill of Materials is a good way to go, because it means that the vendors declare all the different software components that are shipped with the product. A good use case for this is hospitals that get hit by ransomware and don’t have a good enough overview of all their different products, which makes them vulnerable to that type of attack. Of course, they need to have their inventory management in place first before they can identify what needs patching.
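To make the Software Bill of Materials idea concrete, here is a minimal sketch of how a hospital might check a device's SBOM against known-vulnerable component versions. It loosely assumes a CycloneDX-style JSON file; the device_sbom.json file name, the components_needing_patches function, and the hard-coded KNOWN_VULNERABLE advisories are all hypothetical, and a real workflow would pull advisories from a vulnerability feed rather than a small dictionary.

```python
import json

# Hypothetical advisory data: component name -> versions with known vulnerabilities.
# In practice this would come from a vulnerability feed (e.g. NVD), not be hard-coded.
KNOWN_VULNERABLE = {
    "openssl": {"1.0.1f"},
    "busybox": {"1.21.0"},
}

def components_needing_patches(sbom_path: str) -> list[tuple[str, str]]:
    """Return (name, version) pairs from a CycloneDX-style SBOM that match known advisories."""
    with open(sbom_path) as f:
        sbom = json.load(f)
    flagged = []
    for component in sbom.get("components", []):
        name = component.get("name", "").lower()
        version = component.get("version", "")
        if version in KNOWN_VULNERABLE.get(name, set()):
            flagged.append((name, version))
    return flagged

if __name__ == "__main__":
    # Hypothetical SBOM file shipped by the device vendor.
    for name, version in components_needing_patches("device_sbom.json"):
        print(f"Patch needed: {name} {version}")
```

The point of the sketch is simply that once the vendor declares what software is inside the device, the hospital can answer "what needs patching?" mechanically instead of guessing.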

McGraw: Absolutely.

Moe: But today the hospitals don’t know what’s in the products, and the vendors want to control the devices directly themselves. So they actually tell the hospitals that they need to put holes in their firewalls to give the vendors remote access, so that they can remotely manage the devices. Yet the hospitals are the ones that have to take responsibility for this. So there’s a problem of asymmetric information, and also the people who decide what security measures to take and what requirements to put on the vendors are not the same people who take the risk in case something goes wrong.

McGraw: Yeah, there are all sorts of economic imbalances like that, and that’s why I’m really pleased to hear your very subtle answer to the question because I think you’re right. If you just open source something, then you have the economic problem of who should pay to secure it. And if somebody pays to secure it, should their work benefit their competitors and so on. It really gets tricky. And I think that pushing all this through inventory is a really good idea. One of the things that is of note along those lines is that the way the TAO, the NSA attack guys, really work is pretty straightforward from what I can tell from their public statements. They just have a better inventory than their target. And they keep track of when there are problems in that inventory and they go after it. So inventory control turns out to be kind of a fundamental aspect of software security. When you think about scalability, you have to do that first.

Moe: Yeah. There’s also the issue of putting in measures that can be helpful in an investigation. Today there aren’t any requirements for forensic capabilities in medical devices. So if something goes wrong, it’s up to the vendor what kind of logging they have implemented. You might get crash logs, or you might get nothing. I actually have a very good example of this, using myself as an example again. Three weeks ago, I was on an airplane on my way to give a talk at a conference in the Netherlands—the hardwear.io conference, a hardware hacking conference—and while I was up in the air, my pacemaker suddenly failed.

McGraw: Ugh. That’s pretty awful.

Moe: Yeah. I didn’t know what was going on at the time. I just felt a really strange sensation. It was like my heart was beating really hard, and I could also feel my chest muscle twitching in the rhythm of my heart. I could actually see it on my chest. So I notified the aircrew, and when we landed, there was an ambulance waiting for me and they took me straight to a hospital in Amsterdam. I had to be debugged, in a way. When they hooked me up to the pacemaker programmer, there was an error message displayed saying that a data error had been detected and the pacemaker needed a reset, a re-installation. There was a file created with the memory dump and logs from the event.

McGraw: Holy cow.

Moe: So I think this is a really good thing. Of course, it was a bit problematic, because after my pacemaker was reset, it was back to its factory settings.

McGraw: Right. So it needed to be reprogrammed with all your settings.

Moe: So I needed to get it reprogrammed for my settings.

McGraw: Holy cow.

Moe: Luckily I had the printout from my previous pacemaker check-up, so we could get my settings back. The next day, I was giving a talk at the conference, so I was good to go after that. What’s interesting is that there were actually some files created, and these files have now been sent to the vendors to investigate the issue. So I don’t know what—

McGraw: And you survived this because you understand technology and you had a backup, and you knew what your parameters were. But frankly, most people don’t have that level of understanding the way that you do.

Moe: Yes. Luckily I detected that something was wrong and went to the hospital. It was actually a safety function of the pacemaker that went into effect—it switched into a backup mode, which is good because I’m 100% paced, so I depend on the pacemaker working at all times. This safety mode felt really strange because it uses a higher voltage and so on, and that was what I detected. But of course, if you don’t notice that something is wrong, you might go on unaware of it, and that is not good for your health. I also chatted with the pacemaker technician at the hospital, and he said that some of these devices don’t give any error message on the screen when things like this happen. So the technicians just have to guess, and they can’t really know what’s going on. It’s a problem not to have logging and forensic capabilities in place.
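This is not how any real pacemaker logs events; it is only an illustrative sketch of the kind of forensic capability being described here: timestamped, tamper-evident event records that an investigator could later verify. The DeviceEventLog class and the event names are hypothetical.

```python
import hashlib
import json
import time

class DeviceEventLog:
    """Minimal append-only event log with a hash chain for tamper evidence."""

    def __init__(self) -> None:
        self.entries: list[dict] = []
        self._last_hash = "0" * 64  # genesis value for the hash chain

    def record(self, event: str, **details) -> dict:
        # Each entry carries a timestamp and the hash of the previous entry,
        # so gaps or edits in the log become detectable after the fact.
        entry = {
            "timestamp": time.time(),
            "event": event,
            "details": details,
            "prev_hash": self._last_hash,
        }
        serialized = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(serialized).hexdigest()
        self._last_hash = entry["hash"]
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the chain to detect missing or altered entries."""
        prev = "0" * 64
        for entry in self.entries:
            if entry["prev_hash"] != prev:
                return False
            body = {k: v for k, v in entry.items() if k != "hash"}
            digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if digest != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

if __name__ == "__main__":
    # Hypothetical events echoing the incident described above.
    log = DeviceEventLog()
    log.record("data_error_detected", subsystem="memory")
    log.record("safety_mode_entered", mode="backup_pacing")
    print("log intact:", log.verify())
```

The design choice worth noting is the append-only hash chain: even a very small device log becomes far more useful in an investigation if it can show not just what happened, but that nothing was silently dropped or rewritten afterwards.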

McGraw: So given your own personal experience with these things (and you’ve already explained a couple of aspects of what I’m going to ask) how can a manufacturer demonstrate to your satisfaction that the medical device they’re going to put in your body is secure?

Moe: As I said before, third-party testing would be a good thing—not having to just trust that because the vendor says it’s secure, it is secure. In my case, as a more informed user, I would actually like to see the software myself and do my own third-party testing of it.

McGraw: That makes sense. What are your views on this kind of emerging software/device certification marketplace?

Moe: I think there’s use for software certification, and I think that’s something that will help to regulate the device makers and give them some incentive to create more secure products.

McGraw: That’s good.

I want to switch gears a little bit and ask you something completely different. Do attitudes towards women in security in Norway differ from attitudes in the U.S. as far as you know, or is it the same deal?

Moe: Wow. That was a difficult question. I haven’t been that much in the U.S., so I don’t know if I can answer that.

McGraw: Well, maybe just explain what it’s like in Norway to be a highly technical woman in an advanced field and doing security work.

Moe: Yeah. There are not a lot of women in the security field, generally. But actually, I know a lot of women in Norway who are working in security, so maybe it’s a little bit better here. My group here at SINTEF, we are actually 50% women working on security, which I think is exceptional and I like that a lot because I like to be in a mixed work environment. I’m used to being one of the few women in the room and it doesn’t really bother me that much. Sometimes it can actually be to my advantage because I get noticed, you know, people remember me.

But at the same time, you need to make a good impression, because if you do something wrong, then it’s as if every woman is not good at this thing because Marie is not good at it. I think that’s a really common feeling. In Norway, at Gjøvik actually, there’s a yearly conference held by NorSIS which is focused on women and security. I think they usually have only women speakers at this conference, and a lot of women attend, which I think is great because it creates a community and people get to know each other. So it’s kind of a networking event for women in security in Norway. I don’t know if that’s the answer for everything, but at least for students and people starting out in the field, I think it’s a good place to go to not feel like you’re always standing out in the crowd.

McGraw: That makes sense.

Moe: Like you would if you go to normal security conferences.

McGraw: Yeah, that makes sense. Thanks.

And then the last question, which has nothing to do with anything at all, which establishment makes the best cocktails in Trondheim in your opinion?

Moe: Which establishment?

McGraw: Sure.

Moe: Well, you’re planning on going to Trondheim in the future?

McGraw: I’ve been to Trondheim a couple times.

Moe: Yeah, actually I would recommend a cocktail bar/restaurant called Nordøst (or Northeast). I think they have really good cocktails.

McGraw: Okay, you heard it here first. Thanks. This has been really interesting, Marie. Thanks for your time.

Moe: Thanks.

McGraw: This has been a Silver Bullet Security Podcast with Gary McGraw. Silver Bullet is co-sponsored by Cigital and IEEE Security & Privacy Magazine and syndicated by SearchSecurity. The July-August issue of IEEE S&P includes articles on verifiable electronic voting and breaking down barriers between security and business. The issue also includes our interview with Marty Hellman, co-inventor of public key cryptography and recent Turing Award winner.

While you’re there, make sure to watch the video we produced to celebrate the 120th episode of Silver Bullet. Ten years, holy cow. 
