This week I am talking to Neil Clauson, Regional CISO at Mimecast, a company focused on stopping "bad things from happening to good organizations by enabling them to work protected".
We talk about the unique challenges that healthcare faces in securing its attack surface and the ongoing problem of balancing what is possible with what is practical. The bad actors follow the money, like everyone else, and there is a lot of money in healthcare, along with a rich field of information and opportunity for attack. For cybercriminals, patient records, medical data, and the systems that store and process them have high value, which makes healthcare providers and their affiliated businesses an attractive target.
Neil has some great analogies and approaches to security that start with good, better, and best, and elegantly link the concepts to the approach we took to battling COVID: we discovered the threat, learned about that threat, and then applied layers of protection to prevent the disease from spreading and treated outbreaks as they occurred.
We talk about attack surface area and how to reduce it as much as possible by understanding threat actor tactics, techniques, and procedures, and then leveraging internal and external resources to raise security to a board-level focus. Neil stresses the importance of approaching security with a methodical, data-driven approach to assessing risk:
"How quickly can I get to what I call the mean time to conviction, where I can quantify that the risk is real and there's something to be done about it, or the mean time to innocence, which is: yup, we are within our risk appetite, this is something that I don't need to take further action on."
Listen in to hear our discussion on raising red team thinking in your facility and some of the online tools and data available to assess your status, especially in the context of increasingly important cyber insurance and the "cyber hygiene" it demands.
Listen live at 4:00 AM, 12:00 Noon, or 8:00 PM ET, Monday through Friday for the next week at HealthcareNOW Radio. After that, you can listen on demand (See podcast information below.) Join the conversation on Twitter at #TheIncrementalist.
Listen along on HealthcareNowRadio or on SoundCloud
Raw Transcript
Nick van Terheyden
Today I'm delighted to welcome Neil Clauson. He is the Regional CISO for Mimecast. Neil, thanks for joining me today.
Neil Clauson
Thank you very much for having me.
Nick van Terheyden
So as I do with all my guests, I think it's important to get a little bit of background. You're in healthcare, but actually, that's not your original area of focus; you came at this slightly differently. Tell us how you got to this point in your career?
Neil Clauson
Yeah, I've been a lifelong IT and security practitioner, going way back to my days with home computers and dial-up, and being a hacker when that wasn't necessarily a bad term, right, figuring stuff out. That's what I think...
Nick van Terheyden
Hacker is a bad term? I didn't know that. I'm one of those.
Neil Clauson
Exactly, right. It means somebody who likes figuring stuff out. And so, again, just through that journey, that squiggly career they call it, working on desktop, IT, and help desk for a variety of organizations, including hospitals and healthcare, but always having that security focus: how do I prevent those risks from occurring? How do I make things better and prevent bad things from happening to good companies?
Nick van Terheyden
Yeah, so security is a recurrent theme on this show, certainly with some of the guests that I've had, and people that know me know that it's definitely an area of focus. So I'm always excited to get new perspectives. And I'm curious to know, as you look at the landscape, and I'll give you my perspective first, it seems like this is a nuclear arms race with the hackers, and in this version I'm using that as a bad term for those who are essentially attacking. In some instances it's actually a business at this point, and healthcare has, for a long time now, had one of the largest targets on its back. How do we cope with this? I don't want to say it's a never-ending problem, but it's certainly one that we struggle with, and we've had challenges.
Neil Clauson
Exactly. I think it's partially because that's where the money is, right? There's an underground economy. It's not just the kid in their mom's basement like it used to be; it's a whole ecosystem of the bad guys facing off against the good guys. So I think it's a matter of learning from different fields and applying the techniques that help us implement that defense-in-depth strategy, that good, better, best, the things that help you quantify risk and raise cybersecurity risk from being purely technical to the financial aspect. It's impacting a company's ability to generate revenue, to serve patients, to deliver the outcomes that they've promised. Quantify that in a way that raises it to senior leadership so that we can be aligned, have those priorities mutually understood, and drive the funding, drive the changes, drive the difficult conversations that maybe haven't been happening in the past, to get over that hump of the bad guys winning because we have to get it right every time and they only have to get it right once.
Nick van Terheyden
Yeah, and you bring up an interesting point when you talk about it being where the money is. To be clear, some of the attacks are certainly focused on extracting funds with ransomware, but there's a whole other side: when we talk about where the money is, it's the data that's been exfiltrated that has high value, because it's very rich in terms of content. You have these criminal organizations, and they even have HR functions and, you know, payroll, the whole thing; it's really quite extraordinary. They are essentially looking at this and going after different groups. If you were sitting in a hospital, what do you worry most about? And how do you cope with that?
Neil Clauson
Yeah, I mean, there's another cliché, which is that it's not what you don't know that'll hurt you, it's what you do know with absolute certainty that just isn't so. Right? And I think it's: have I implemented those core capabilities, that good, better, best, and where have I assessed my risk? So I always say, where are my areas of unmitigated risk? And how quickly can I get to what I call the mean time to conviction, where I can quantify that yes, that risk is real and there's something to be done about it, or the mean time to innocence, which is: yup, we are within our risk appetite, this is something that I don't need to take further action on. In the end, every company is going to be different, but it's how well do you have your finger on the pulse of your various risk scenarios? And have you followed, again, what we call the NIST CSF: have you identified your assets and those key systems, have you properly protected them, can you detect when those protections fail, how quickly can you respond, and how effective are your recovery techniques? The wrapper around all of that is learning from it; it's that continuous cycle, just like if you're a doctor, right? When COVID came: how do we identify it? What is our defense-in-depth strategy? What is the risk appetite? What are the different levers we can pull? Can I reduce my attack surface? Can I maximize my controls? And overall, quantify where those areas are, because security is part of a much bigger picture. As technologists, we love turning those nerd knobs, we love building our firewalls and networks and all that kind of stuff, but in the end it's to serve a business purpose. So every security strategy is really comprised of your threat vectors, which might be generic or might be unique to you, as well as the business drivers, and I think a lot of folks forget about having your foot really in that other camp. We serve at the pleasure of the President, to use a Colin Powell analogy. So that's what I would focus on: where are you? How do we address these issues? It's by first getting your finger on the pulse and understanding where your controls are acceptable and where risk is unmitigated.
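To make Neil's "mean time to conviction / mean time to innocence" idea concrete, here is a minimal sketch of how you might track those two metrics from a log of risk findings. The finding records, field names, and timestamps are entirely hypothetical; this illustrates the metric, not any particular tool.

```python
# Minimal sketch: measure how long it takes, on average, to decide whether a
# flagged risk is real ("conviction") or within appetite ("innocence").
# All records and field names below are hypothetical.
from datetime import datetime
from statistics import mean

findings = [
    {"opened": datetime(2023, 3, 1, 9, 0), "closed": datetime(2023, 3, 1, 13, 0), "disposition": "conviction"},
    {"opened": datetime(2023, 3, 2, 8, 0), "closed": datetime(2023, 3, 2, 9, 30), "disposition": "innocence"},
    {"opened": datetime(2023, 3, 3, 10, 0), "closed": datetime(2023, 3, 4, 10, 0), "disposition": "conviction"},
]

def mean_hours(disposition):
    """Average hours from a finding being opened to reaching the given decision."""
    hours = [
        (f["closed"] - f["opened"]).total_seconds() / 3600
        for f in findings
        if f["disposition"] == disposition
    ]
    return mean(hours) if hours else None

print(f"Mean time to conviction: {mean_hours('conviction'):.1f} hours")
print(f"Mean time to innocence:  {mean_hours('innocence'):.1f} hours")
```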
Nick van Terheyden
Yeah, I've got to say, I like the analogy with COVID-19, although there's certainly some tiredness associated with the term and people are frustrated, but I think it resonates in part because people can really see the analogy. And as I'm thinking that through: understand the risks, so let's identify, in this case, the virus and know what to do. Then we had mitigation: we had masking, we had distancing; those analogies actually stretch into the healthcare setting. And then, importantly, and this is where I'd like to get your thoughts, one of the challenges with that COVID mitigation strategy was that we knew a lot of this. We knew that masks work, and I'm probably going to be in trouble and get a flood of people telling me that they don't, but they do; the science is clear on that particular point. But knowing that and actually implementing it proved to be quite difficult in some cases, because there was resistance. And that brings me to the point in security, and I forget her last name from the TV show: you are the weakest link. That's the case in healthcare, and that's everybody in your population. How do you focus in on that and start to resolve it under the model that you described?
Neil Clauson
Meaning, how do you get to those weakest links? Yeah. So we do something called red team thinking, where you run tabletop exercises, pre-mortems and post-mortems, and you look at those failure scenarios: what are the most likely failure scenarios that can impact me? We use techniques that give people psychological safety, so that in that system of systems you're trying to build, the gaps and the areas that somebody might not be willing to share publicly can surface through simple anonymous web tools, where folks can share those different value-chain failures or things that might break along the process. If you can consult with your team, and the people who are most affected by the change, we talk about change management, and the people who are most affected by that change should be helping with the diagnosis of the problem and coming up with the solution, because then you're most likely able to unfreeze the behavior, change it, and refreeze it where you want it to go. So: running tabletop exercises with psychological safety, getting those key stakeholders, and expanding beyond just the technical group to your legal, your PR, the folks on the business side who are going to have to deal with the fallout if that control fails and that threat scenario is realized, and doing it in a way that is actionable. It's all about getting that shared consensus and getting the people, process, and technology working together. And again, I'm a big fan of that devil's advocate, red team thinking concept, because groupthink is among the things that caused the Space Shuttle Challenger explosion, along with the things that you assume are true but really aren't that accurate. Oh yeah, our backups are great. Are they really? Our controls are great. Are they? So yeah, there are a lot of methodologies here, and this is where, as cybersecurity experts, we can learn a lot from the healthcare side and how they do things, and vice versa.
Nick van Terheyden
Great points. Just for the benefit of the audience, I think it's important to explain red team and blue team. Can you clarify what you mean by that?
Neil Clauson
Sure. So the blue team are typically the defenders, right? The people who are putting up the castles, the guns, the guards, and the gates, and trying to detect and respond. And the red team folks are either the actual aggressors trying to break in or, typically, the ones you're paying to figure out your flaws before the attackers do. Red team thinking is a perspective of looking at things with that devil's advocate mindset: okay, flip it, right? You might be believing, you might be thinking, that things are great, but what could go wrong? Because that's usually where you have that one person who says, yeah, this is an actual risk, or it's a longer-tail risk that frankly could occur.
Nick van Terheyden
And the other thing that you brought up that I think is really important here, and it wasn't the term you used, but that's what I got from it, was essentially a no-blame approach to identification of problems. I've certainly seen punitive, even if not deliberately punitive, arrangements where people are essentially punished, and even if it's not punishment in the traditional sense, it can be punishment in terms of the time, effort, and resources that you end up having to devote. How do you create, essentially, a no-fault, no-blame approach to this? Because that's the way, I think, you get the maximum value from red team thinking, from the people that potentially see those gaps but maybe don't want to reveal or share them.
Neil Clauson
Yeah, I mean, obviously, culture starts at the top. But again, there are some internet tools, Lean Coffee Table is one of them, where the folks participating can log in anonymously, post little mini sticky notes anonymously, and raise those issues. Then you can aggregate them and do group-level, still anonymous, dot-vote ranking on what we think these things are. So it's about that process of continuous improvement, it's about psychological safety, and using some tools in a way that helps you surface those issues. And then having leadership, right, it's all about that. Another methodology I'm a big fan of is the GROW methodology: what are our goals, what's the reality, what are our options, and what's the way forward? By getting everybody focused on what the goals are and then talking about the pros and cons and the real, true reality of it, you can then really start talking about options. I always say the first option is do nothing, because a lot of the time people say, no, no, we can't do nothing. Got you, right? Now you're part of the team and you're working on those other options. By putting doing nothing out as an option, people are usually forced to engage. So: if we had a breach, we're going to have to use our cyber insurance, and we don't have enough coverage; we're going to impact patient outcomes. Or: our goal is to make sure all our medical devices are fully patched and up to date; well, the reality is that sometimes means getting FDA approval, and there's all that legislation coming through, so there's that challenge and that back and forth. But I think by surfacing these from what our common goals are, and doing it in a way that offers psychological safety, we can all focus on the outcomes. It's about us together versus the bad guys, not us versus each other. And how do you build that out? I think it's by having good leadership and good methodology.
Nick van Terheyden
So for those of you just joining, I'm Dr. Nick the Incrementalist, and today I'm talking to Neil Clauson. He is the Regional CISO for Mimecast. We were just talking about the major challenge of getting people to come together, and I like that one of the options is to do nothing; that creates almost a sense of camaraderie in a group that realizes that's not acceptable, but you actually present it. I think that's an interesting approach. You've obviously got lots of experience in this space. Tell us a little bit about where you see the biggest challenge as you think about the threat landscape for a typical hospital. I know that's a little bit unfair, because they're all different and so forth, but there must be some key areas that people can start with. And obviously, I'm an incrementalist, so if I can fix just one thing, what would it be?
Neil Clauson
Yeah, agreed. Again, I'll go back to that good, better, best methodology. The analogy is: if you had an open wound, would you really just walk around the London or New York subway system without doing something about it? So focus, like a hospital ER, on triage. Cyber risk quantification is probably the concept I would focus on; there are tools to continually scan and monitor your attack surface area. By focusing on that low-hanging fruit and being able to quantify it, you apply the most critical fixes to the most critical systems in the most critical manner, triaging it, and you bring in experts as well, because you can't do it all yourself. There's another methodology called Wardley mapping, which maps your value chains: what are the things that I have to do internally, and what are the things I'm better served by leveraging an expert for, like penetration testing or continuous attack surface monitoring. As we've evolved in the security space, we've evolved methodologies to leverage third parties more effectively, because staffing is so difficult and the threat vectors change so frequently. So: continuous attack surface monitoring, with a person behind it, vulnerability scanning and baselining, and this thing called EPSS, which is an ungainly name, but it's the Exploit Prediction Scoring System. You can't patch everything all the time, it's just inconceivable, so which vulnerabilities are actually being exploited in the wild? What are the viruses that are actually out there, to use the medical perspective? Then apply surgical precision so that you are reducing your attack surface. So after you quantify your risk in a more rigorous manner, the two levers you have are reducing your attack surface as much as possible and maximizing your controls as much as possible, getting that ecosystem of best-of-breed tools you've purchased to start cross-pollinating threat intelligence and raising actionable information, because with a limited timeframe you want the most likely vectors addressed. And on the internet-facing side, I talk a lot about cyber insurance now, because of the financial aspect. Those insurance firms are looking at what's called your internet credit score: SecurityScorecard, BitSight, there are companies that monitor your externally facing footprint and give you a score, A, B, C, D, on how good your digital hygiene is. And your insurance premiums are being set based in part on what that internet-facing score is. So it behooves you to do your best to address those findings and find out where there are false positives, maybe you don't even own that IP subnet space anymore. I think it's part of that overall lather, rinse, repeat: security is a process, not a one-time thing.
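As a rough illustration of the EPSS-driven triage Neil describes, here is a minimal sketch that scores a backlog of CVEs by exploit probability and patches the likeliest first. It assumes the public EPSS API hosted by FIRST.org; the CVE list and the 0.10 "patch now" threshold are purely illustrative, not a recommendation.

```python
# Minimal sketch: prioritize a vulnerability backlog by EPSS score instead of
# trying to patch everything. Assumes the public EPSS API at
# https://api.first.org/data/v1/epss; example CVE IDs and threshold are illustrative.
import requests

def epss_scores(cves):
    """Return {cve_id: estimated probability of exploitation} for a batch of CVEs."""
    resp = requests.get(
        "https://api.first.org/data/v1/epss",
        params={"cve": ",".join(cves)},
        timeout=30,
    )
    resp.raise_for_status()
    return {row["cve"]: float(row["epss"]) for row in resp.json().get("data", [])}

if __name__ == "__main__":
    backlog = ["CVE-2021-44228", "CVE-2017-0144", "CVE-2019-0708"]  # example IDs
    scores = epss_scores(backlog)
    # Patch the vulnerabilities most likely to be exploited in the wild first.
    for cve, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
        urgency = "patch now" if score >= 0.10 else "schedule"
        print(f"{cve}: EPSS={score:.2f} -> {urgency}")
```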
Nick van Terheyden
I'm sure that a large number of folks are probably already looking at that score. It sounds very much like quality scoring on physicians, you know, satisfaction scoring. But for those that maybe heard that and said, wait, what, there's a score on me? Where do they go? How do they find that?
Neil Clauson
Yeah, I mean, SecurityScorecard is one, there's BitSight, and there's RiskRecon, which was recently acquired by Mastercard. These firms have been in business for a long time and they've been evolving. It's a way, for example, for your third parties or your customers, if they don't have the capability to do a deep-dive assessment on whether to trust you when you're handling part of their patient care data or processing their data, to get that neutral third-party, Kelley Blue Book perspective, right? The Kelley Blue Book value of where they fall: where do they fall within my risk appetite? They're a B, okay, well, it depends what type of data is involved. We always talk about doing vendor risk assessments and application risk assessments, right? How trustworthy is that vendor? Are they tall enough to ride the ride, from a roller coaster perspective? And what types of data are they storing, and where is that data? It obviously goes back to the identify function: where do you have your sensitive data and what risks do you have?
Nick van Terheyden
Right. And I think that's helpful, because clearly there's a cadre of people spending their time on this, especially if they have the fortune of having a CISO in place, and not all do. I think that's one of the challenges around this: security is maybe subordinated to somebody who is potentially learning it as they go. So I think this is very helpful. One other thing I want to pick up: you talked about the risk assessment, and I like that incremental approach of finding the areas to focus on with surgical precision. But one of the challenges I've seen, and I'm curious to know how you approach elevating this, and I'll be very specific: I spent a long time in the voice industry, and we had dictaphones, as they were called, I don't think you even hear that term now. People used to pick up the phone, and then it was a digital dictation system. And that digital dictation system was never seen as a critical resource, right up until the point that it failed. It wasn't an attack, it just failed. And then suddenly everybody realized that this was probably one of the highest priorities, because suddenly they were flooded with physician calls. How do you elevate those things before you actually reach the failure? What's the process to go about that?
Neil Clauson
I love it, and I'm a fan of engaging with your key stakeholders and asking them. Even if you're at the lowest level of the org and you're struggling to get funding, talk as high up as you can: on a scale of one to five, chief financial officer, is losing a million dollars a five or a one? Get that baseline. Chief marketing officer, is losing a thousand customers, or a million customers, a five or a one? And when you start to get those, I always say, if you can't beat them, join them, and then beat them. So baseline that, and then take industry examples: hey, this just happened to our competitor, and CMO, you said if this happened to us it would be a five. Our goal is not to be like our competitors; I don't want to be a five as well. So what can we do? What are our options to address this? And do that in a way that asks: do I have a blueprint? Do I have an ongoing capability and capacity to deal with those threats and to bring them back down within the risk appetite? Security folks typically want to get everything down to zero, and the business has to look at that and define the risk appetite. And again, good enough might just be a scale of one to five; better is that FAIR analysis, where you're really putting some math and statistics behind it.
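For the "better" level Neil mentions, a FAIR-style analysis puts math behind the one-to-five gut score. Here is a minimal Monte Carlo sketch of annualized loss for a single scenario; the event frequencies and loss ranges are made up for illustration and are not drawn from any real analysis.

```python
# Minimal sketch of FAIR-style risk quantification: simulate many possible years,
# each with a random number of loss events and a random loss per event, then
# report the median and 90th-percentile annual loss. All inputs are hypothetical.
import random

random.seed(7)

def simulate_annual_loss(freq_min, freq_max, loss_min, loss_max, trials=100_000):
    losses = []
    for _ in range(trials):
        events = random.randint(freq_min, freq_max)            # loss events this year
        losses.append(sum(random.uniform(loss_min, loss_max)   # loss per event
                          for _ in range(events)))
    losses.sort()
    return {"median": losses[trials // 2], "p90": losses[int(trials * 0.90)]}

# Hypothetical scenario: ransomware on a clinical system, 0-2 events per year,
# $250k-$2M per event.
est = simulate_annual_loss(0, 2, 250_000, 2_000_000)
print(f"Median annual loss: ${est['median']:,.0f}")
print(f"90th percentile:    ${est['p90']:,.0f}")
```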
Nick van Terheyden
Yeah, I think that's great. And, you know, if you can't beat them, join them, and then beat them; I've got to say that reminds me a little bit of Caesar and Brutus, I'm just saying, you've got to get behind somebody first. But I think you're not actually going quite that far; you're getting to the point of understanding and then creating the barriers, or the focus, that appropriately attacks those challenges.
Neil Clauson
Well, not beat them in a negative way, but, I guess, speak their own language. Everybody needs to be able to understand their peers and what their peers' challenges are, and be able to communicate on their level.
Nick van Terheyden
Excellent. So as you think about the future, as I said at the beginning, it feels a little bit like a nuclear arms race. I mean, obviously, we're applying technology to help us prevent attacks, but I also know that the threat actors are doing the same; they're using machine learning. How do you think about the future and where it's going?
Neil Clauson
I'm positive. I was just at a Cisco event yesterday in Boston, and again, the methodologies and the ways we're able to quantify risk, reduce our attack surface, and maximize our controls, and vendors realizing that it's not just one solution, right, it's that system of systems and being able to identify that value chain. I feel positive because we as a community are starting to mature our understanding, our ability to communicate that risk, and our ability to elevate it to the business level, where we're getting the funding, getting the attention, and getting the initiatives and the buy-in at that top level. So I feel confident. The vendors are realizing this as well and moving toward building products that really help mitigate that risk from a variety of perspectives.
Nick van Terheyden
So, essentially good news: we have a positive perspective on the continued challenge. I mean, let's be frank, this is an ongoing problem. I think people increasingly are exposed not only from a business standpoint but also from a personal standpoint. Anybody that knows me knows I've certainly had my own challenges with security attacks that, fortunately, I've managed to mitigate. It sounds like there's been a shift in thinking, and we're at a point of better, clearer focus from the top of an organization, and that really helps drive it and bring everybody along for the ride. Unfortunately, as we do each and every week, we've run out of time, so it just remains for me to thank you for joining me on the show. Neil, thanks for joining me.
Neil Clauson
I sincerely appreciate the time, and thanks to all the audience as well.