The Incrementalist Graphic Connor Landgraf

This week I am talking to Connor Landgraf (@connorlandgraf), co-founder and CEO at Eko (@Eko_Health) – a company rethinking the iconic stethoscope, augmenting its capabilities with innovation.

Connor founded Eko while studying bioengineering at the University of California, Berkeley, with the goal of improving cardiovascular and pulmonary disease monitoring through machine learning and non-invasive sensors.

As you will hear, Eko has received FDA clearance for its solution as a medical device able to detect heart disease by processing the signals received from the stethoscope and its EKG capability. We discuss this augmentation and dive into ChatGPT, the AI tool currently taking the internet by storm, and what it could mean for healthcare.

Listen in to hear about the incredibly large database of heart sounds and how Eko Health is putting disease detection capabilities comparable to those of an expert cardiologist into any clinician's stethoscope.

 


Listen live at 4:00 AM, 12:00 Noon, or 8:00 PM ET, Monday through Friday for the next week at HealthcareNOW Radio. After that, you can listen on demand (See podcast information below.) Join the conversation on Twitter at #TheIncrementalist.


Listen along on HealthcareNowRadio or on SoundCloud

Raw Transcript

Nick van Terheyden
and today, I’m delighted to welcome Connor Landgraf. He is the CEO of Eko Health. Connor, thanks for joining me today.

Connor Landgraf
Dr. Nick, thank you so much for having me. It’s a pleasure to be with you.

Nick van Terheyden
So, if you would, tell us a little bit about your background and how you arrived at this point. You know, what have you done, and how has that contributed to this point in your career?

Connor Landgraf
Yeah, absolutely. I think every journey probably starts at the beginning, so let’s start there and kind of understand the background of what led to the formation of Eko Health and what we’ve been working on since then. The journey really starts back in grad school for me. I was studying biomedical engineering at Berkeley and got the chance to take some classes devoted to med device development, focused on understanding the common med tech, the common tools that clinicians use in their day to day practice. A lot of the coursework focused on workflow and change management and thinking about how new technologies get adopted into clinical practice. One of the examples that we were thinking about in that coursework was the stethoscope, and the way that clinicians use it to make really quick assessments of cardiac and pulmonary status, how you can use it as that very fast triage tool to understand the key vital signs, the key information about patient status. What was so fascinating to me, though, as a technologist, I guess, or as an engineer, was just how much healthcare has changed in the last 200 years, and how the stethoscope has remained unchanged during that time. You know, the stethoscope of today is largely the same as it was in the mid 1800s, and functionally, from a workflow perspective, so much of that history has not been transformed at all. What was also fascinating was that human hearing is very subjective. You don’t necessarily hear things the same way that I do. Every person has their own level of hearing loss, their own level of hearing acuity. And we’re also not great at hearing all the ranges of frequencies; it can be hard for us to hear low frequency sounds. So it was just fascinating the way that there was a lot of inconsistency and lack of objectivity in how clinicians use the stethoscope. That was particularly intriguing and interesting for me, to try to think about how we could transform it.

Nick van Terheyden
So, I mean, you’re right, and, you know, I think every medical student on this planet would agree with you that the experience is somewhat challenging, let’s be frank about it. I mean, I’ve seen a number of instances, if I go back to my medical training over the course of time, where there were sorts of amplification tools that would be put on the patient, and everybody could plug in so that you could talk it through at the same time. You know, technology has tried, but you’re right, the sort of persistence of that standardized tool or instrument really has remained, for the most part, the same, although I think the original was almost Pinard-like, but, you know, it has progressed a little bit from that. And it’s become strongly associated with physicians. I mean, my sense of it is that people tend to use it as a sort of tool to project, in some respects. And we’ve also seen some pushback against the stethoscope that says, well, you know, it’s an age old instrument, is it really offering value? I think you have a different view on that.

Connor Landgraf
I do, yeah. We can talk about the symbology of it and the relevance of it today. In modern medicine, we have so many great diagnostic tools that are available as well, but you said it right. I mean, the stethoscope is a symbol as much as it is a clinical tool. It’s worn by some of the most educated, most credible, most respected people on the planet. It’s a symbol of healing. And so we thought about this a lot when we started the company. You can’t change that symbolism; it has to look the part, it has to be something that says what clinicians want their stethoscope to say about them. And that’s really important just because of what it means for clinical excellence, rigor, all those things. But I do think the other side of it as well is that the scope has stuck around because of the fact that it is so incredibly fast and accessible. It’s one of those things that just works. When you’re in a busy setting, when you’re just trying to get those basic vital signs, instead of having to think about more complex, more invasive, more time consuming methodologies, the stethoscope is always the thing you can do in 10 seconds. And that can serve as that kind of first set of status markers, the vital signs, when you’re thinking about evaluating or triaging a patient. And, you know, I think what we’ve been excited by is that there is a tremendous amount of data encoded in heart sounds, breath sounds, lung sounds. It is a really rich source of information about mechanical flow, about airflow in the lungs, and that’s part of the reason why it also has stuck around: it is a really rich source of information. And if we can just augment the clinician’s ability, give them more confidence and more utility with this tool, then it can retain all of the speediness, the accessibility, the low cost nature of it, but add in really powerful intelligence and insights to help that clinician make an even better decision. And that’s what we’ve hoped to do: marry both, retaining the simplicity, retaining all the great things about the stethoscope, and then augmenting that with reliability and consistency.

Nick van Terheyden
You know, it’s interesting you bring up that sort of speed of access. I think that was one of the critical elements that most physicians would agree with. You know, I’ve taken it around the world; I mean, I’ve traveled well over two, two and a half million miles, and it was the one thing I always took with me, to every country, every continent, all the places. It was the one thing that I would always carry, because it was so facile and useful. But you raise one of the challenges for me. One is, you know, background noise. So in a noisy environment, and I’ve seen this a number of times in a plane, I’ve got to be honest, a plane is really noisy; the background noise is somewhere in the order of 70 decibels, I don’t know how much inside, but it’s high. Very hard to perceive. And actually, as you age, your hearing sort of diminishes, so that’s one of the challenges you’ve looked at. Tell us a little bit about the background to that and how you saw or thought about this in terms of improving it.

Connor Landgraf
Yeah, it’s a great discussion point as well. I mean, at the end of the day, for many clinicians, heart sounds can be very subtle; they can be hard to hear. We have all those questions of patient body habitus, background noise, environmental factors; there’s a lot of reasons that it can be really challenging to easily hear those heart sounds and breath sounds. When we looked at this problem, what was fascinating to us was that almost none of the stethoscopes on the market at the time really had any sort of highly effective noise cancellation capabilities. And we were looking at consumer products, Bose headphones and AirPods, and we’ve just seen this proliferation of noise canceling capabilities. It’s gotten really good; you can put those Bose headsets on in an airplane and they’re pretty magical. If you haven’t tried them, they make a huge difference. And yet the stethoscope, the clinical tool where noise cancellation is so crucial, doesn’t have that. So we developed, adapted, and spent a lot of time tuning the acoustics to give us noise cancellation capabilities that we can actually use. We have a two-microphone system: we listen to the heart sounds with one microphone, and we listen to the ambient noise with the other microphone. And we can do a subtractive process where we actually remove the ambient noise and background noise from the heart sounds. The end result is that even on an airplane, in a helicopter as a matter of fact, you can hear the heart sounds with far more fidelity; the back of an ambulance with sirens and road noise, EMTs on the side of the highway trying to assess a patient’s respiratory or cardiac status, all of those situations where being able to focus, zoom in, and isolate out the background can be crucial to these frontline first responders.
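To make the two-microphone subtraction Connor describes a little more concrete, here is a minimal Python sketch of reference-based noise cancellation using a normalized LMS (NLMS) adaptive filter. This is an illustrative textbook technique, not Eko's actual processing; the filter length, step size, and signal names are assumptions.

```python
import numpy as np

def nlms_noise_cancel(primary, reference, taps=64, mu=0.1, eps=1e-8):
    """Remove ambient noise picked up by a reference microphone
    from the primary (chest) microphone signal.

    primary   -- chest mic samples: heart sounds plus ambient noise
    reference -- ambient mic samples: noise only (approximately)
    Returns the cleaned signal (the filter's residual, i.e. the heart sounds).
    """
    w = np.zeros(taps)             # adaptive filter weights
    out = np.zeros(len(primary))   # cleaned output
    for n in range(taps, len(primary)):
        x = reference[n - taps:n][::-1]    # most recent reference samples
        noise_est = w @ x                  # estimate of noise at the chest mic
        e = primary[n] - noise_est         # residual: heart sounds estimate
        w += mu * e * x / (x @ x + eps)    # NLMS weight update
        out[n] = e
    return out
```

The filter learns the acoustic path between the two microphones, so noise that is correlated across both (sirens, cabin roar) gets subtracted, while heart sounds, present only at the chest microphone, pass through.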

Nick van Terheyden
Yeah, as you describe it, I’m smiling, I think, because I’ve had some of those experiences, you’re right. And, you know, the ability to narrow down; I’ve had that experience, except obviously when I’m listening to a movie, for crying out loud, but not when I was listening to a patient. And, you know, it’s shocking that it took us this long. But clearly, you saw that opportunity, you saw the potential. But there’s another thing that’s going on that I think is really important, and, you know, it’s the distribution. Let’s be frank, we are not all highly and supremely qualified cardiologists. Literally, and I’ve been on ward rounds with these folks, they put the stethoscope on and, oh yes, you can hear such-and-such a murmur, whatever it was, and the procession of students goes, oh, yes, yes. And, you know, let’s be honest, it was a real struggle. You can learn that, but not everybody does. But there’s a challenge in what I would call signal processing, and as soon as I say something like that, I think, hmm, I wonder if we can add some technology? What are you doing in that space?

Connor Landgraf
Yeah, it’s a fascinating topic. I mean, we’ve spent some time thinking about just how humans learn to perceive and differentiate audio sources. What is interesting is, we looked at med school textbooks to try to understand how clinicians actually learn about murmurs and try to differentiate heart sound timing. And the funny thing is, when it’s described in the textbook, it’s shown visually. They’ll show a graph and they’ll say, okay, here would be the first heart sound, here’s the second heart sound, here’s the holosystolic murmur, or something like that, but it’s shown visually in the textbooks. And when it’s shown visually, it’s very easy to understand and appreciate what you should be listening for. But then when you have to translate a visual representation of a waveform into listening, where you have changes in heart rate and all of those different factors, it gets a lot more challenging, translating a visual representation of what disease should sound like to, okay, now am I hearing the same thing? Do I hear the pitches right? Do I hear the timing right? There’s a reason that very few people have perfect pitch or, you know, a musical ear. As a musician, that’s a very specific skill set that requires a tremendous amount of training to develop, and listening to heart sounds is the exact same thing. And so we said, well, let’s marry the visual representation back to the sound. We show what we call a phonocardiogram that lets you see the representation of the heart sounds right there, and it becomes far, far easier for the clinician to say, oh, I can see the murmur, in addition to being able to see the normal heartbeat. But we’ve also gone a step further and said, let’s train a machine learning algorithm, let’s train artificial intelligence to help us do this pattern matching process, because again, this is where computation can be really, really good. If we can build a large enough data set where the machine learning model can see enough variations in patients and different types of disease, then the model can actually be as good as, if not better than, the clinicians at identifying abnormal heart sounds and lung sounds. And what we found is that we’ve been able to both build the dataset to do that, and then take the computational techniques and signal processing techniques to actually translate that dataset into algorithmic insight. At this point, the dataset is being curated from millions and millions of recordings. You could think about it: the algorithm has probably seen more patients than almost any other clinician on the planet in terms of just body of knowledge. And that size of data set does translate into clinical accuracy, allowing it to be really consistent and reliable. And I think the great thing is that the algorithm doesn’t have hearing loss over time, doesn’t have a bad day, doesn’t get sick. You know, there’s that kind of consistency that machine learning algorithms can provide, which we think is really valuable to clinicians. And I think the last point would be, this is meant to be, what we hope this will be, is an augmentation, an aid to the clinician. We don’t see this as a diagnostic; we see this as a first step to a diagnostic process. You know, you imagine any patient with heart disease or risk factors for heart disease, and the reality is, we all start our journey in the healthcare system at primary care or at an urgent care clinic or at, you know, some frontline care setting. And we don’t get the diagnostics until later; there’s that referral that’s required, or there’s additional testing. And one of the things is, you know, we rely on these frontline clinicians to be able to kind of make those judgment calls, raise those kinds of suspicions that help us get to additional testing. And where we can aid is allowing that frontline clinician to make that decision with more accuracy, so that we can get all of the right patients to additional diagnostic testing early and appropriately, and increase the likelihood that these patients get treatment later on.
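As an aside on the phonocardiogram idea, here is a minimal Python sketch that plots a synthetic heart-sound waveform with the first and second heart sounds (S1, S2) in each beat, the kind of visual-plus-audio pairing described above. The waveform is entirely fabricated for illustration; real recordings and Eko's rendering will differ.

```python
import numpy as np
import matplotlib.pyplot as plt

fs = 4000                        # sample rate in Hz (illustrative)
t = np.arange(0, 3, 1 / fs)      # three seconds of signal

def burst(center, freq, width):
    """A short damped oscillation standing in for a heart sound."""
    return np.exp(-((t - center) ** 2) / (2 * width ** 2)) * np.sin(2 * np.pi * freq * t)

pcg = np.zeros_like(t)
for beat in np.arange(0.2, 3.0, 0.8):          # roughly 75 beats per minute
    pcg += burst(beat, 40, 0.020)              # S1: lower pitch, slightly longer
    pcg += 0.7 * burst(beat + 0.3, 70, 0.012)  # S2: higher pitch, shorter

# A murmur would appear as sustained energy between S1 and S2.
plt.plot(t, pcg, linewidth=0.8)
plt.xlabel("time (s)")
plt.ylabel("amplitude")
plt.title("Synthetic phonocardiogram: S1 and S2 in each beat")
plt.show()
```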

Nick van Terheyden
So, for those of you just joining, I’m Dr. Nick, the Incrementalist, and today I’m talking to Connor Landgraf. He’s the CEO of Eko Health, and we were just talking about the innovation and the artificial intelligence that’s being applied. I think, you know, the combination of taking sound and creating a visual, and it’s interesting you talk about that sort of visual that was presented in the medical textbooks. That’s exactly right. I mean, what did we learn? We learned through books; it was much more of a book orientated education that I experienced. It wasn’t so much of the sort of opportunity to hear it, and the best that we got was, well, that’s lub dub. I mean, that was it. So, you know, exciting, but you bring up AI, and that obviously raises the question, I think contextually at the moment, everybody’s talking about ChatGPT. It just took the USMLE Step 1, 2 and 3 and passed all of them, so, you know, revolution, it’s here. It’s going to replace, well, I don’t think replace, but at least, you know, it’s doing some amazing things. It sounds like you’re doing some of that, but there’s got to be some limitations. And, you know, my sense of this, and certainly my experience with the FDA, is there’s a little bit of oversight on that. Tell us where we are with all of that.

Connor Landgraf
Yeah, it’s a good question. And of course, I think it’s always interesting to see the ways that ChatGPT will be applied to new problems. It seems like ChatGPT is particularly good at answering, you know, linguistic synthesis questions where there is a large body of knowledge, and I think about the USMLE, and I imagine all of the flashcards and all of the training materials that have been curated specifically for this exam. You could imagine that the body of linguistic knowledge that’s solely devoted to passing the USMLE is massive, so it’s probably a relatively easy place for it to just regurgitate the answers it’s really learned for those USMLE exams. I think I still have lots of questions, and I think I’m a little bit more of a generalist AI skeptic. I think AI is really powerful for specific problems, where you can fine tune the focus of the algorithm or the models to a really defined set of boundary conditions. I’m still skeptical that that generalizes well, and I think we see lots and lots of places where ChatGPT comes up with very goofy answers to questions, and those edge cases really start to emerge quickly. I think in the healthcare system, we’re not really ready to have a clinician or a clinician assistant, you know, an augmentation tool, that has unknown failure modes, and those failure modes can be rather basic and catastrophic in certain situations. So it’ll be interesting to see what happens, but I think I’m relatively skeptical of those really large language models yet. Still, really, really interesting things are happening there.

Nick van Terheyden
And in the case of Eko Health, you’re applying it, but in a way, you know, I think is extremely interesting, and you make such a compelling point. You’ve probably listened, or not probably, I’m sure you have, you’ve listened to more recordings, more content, than any clinician probably could in their entire lifetime. And that’s experience that’s now contributing. Where are you with that AI piece? And what’s it doing for that experience with the Eko Health product?

Connor Landgraf
Yeah, absolutely. So we’ve been building the algorithms themselves, doing the clinical data collection, running those larger validation studies. Last year, we submitted a regulatory application to the FDA for a machine learning algorithm to detect structural murmurs associated with valvular heart disease, so being able to differentiate, is this likely an aortic stenosis murmur, or is this an innocent murmur? That was a big step for us. But the FDA is definitely taking a very cautious and thoughtful approach to how they approve machine learning algorithms. We’re definitely seeing a proliferation of new machine learning technologies get approved by the FDA, but the FDA is getting very savvy about evaluating the datasets that a company uses, to ensure that there isn’t overtraining or overfitting happening. For the non machine learning folks, overfitting is what happens when a machine learning algorithm has been trained on a training data set and then evaluated on that same training data set. It’s like if you have the answer sheet and you’re taking the exam: you’ve already seen the answers before, so it’s easy to score really well, right? And that oftentimes happens, where machine learning algorithms get overfitted to a training data set, and they don’t then work actually that well in the real world. They’re just kind of memorizing the answer key; they don’t really understand the material on the test itself. And so, you know, I think we’ve definitely seen cases, even in big tech companies, where machine learning models don’t really perform that well in the real world, and the FDA is starting to hold those companies to a higher bar in terms of how they validate the datasets that they’re using.
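Connor's answer-sheet analogy maps directly onto the standard train/test split. Here is a minimal Python sketch with scikit-learn on synthetic data; the dataset and model are stand-ins chosen purely to show the gap between training score and held-out score, not anything from Eko's pipeline.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for labeled heart-sound features
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Hold out data the model never sees during training
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# An unconstrained decision tree can memorize its training set
model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

print("score on training data:", model.score(X_train, y_train))  # near 1.0: the answer sheet
print("score on held-out data:", model.score(X_test, y_test))    # lower: honest estimate
```

Validating only on the first number is exactly the overfitting trap described above; regulators care about the second.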

Nick van Terheyden
So, just to be clear, I want to be sure that we understand where you are. So you submitted that algorithm to the FDA; did they approve it?

Connor Landgraf
They did, yeah. So last summer, we got regulatory approval for a valvular heart disease detection AI. That was a pretty big milestone for us and another AI approval for the business. And we’re still among just a handful of companies that have machine learning algorithms approved by the FDA.

Nick van Terheyden
Wow, that’s exciting. So if you would, just sort of give people a picture of what that looks like at the clinical workface. What goes on when somebody uses the toolset? How do they use it, and how does that work?

Connor Landgraf
Yeah, absolutely. So with this tool set, a clinician can take the Eko device, put it on the patient’s chest, and record the heart sounds or ECG. We transmit it to the mobile or tablet software in the background, so a tablet that runs alongside the device. We send it to the cloud, interpret it, and send the result back to the clinician, saying: is there a murmur present here, and is that an innocent murmur, or is that a murmur that likely indicates structural disease? And all that happens within 30 seconds at the patient’s bedside. We do a 15 second recording, and it takes the AI about five seconds to analyze, so it’s very, very quick and very workflow friendly for the clinician to get that information.
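As a sketch of what that record, upload, and interpret loop might look like from the client side, here is some hypothetical Python. The endpoint URL, payload fields, and response shape are all invented for illustration and are not Eko's actual API.

```python
import requests  # third-party HTTP library

# Hypothetical analysis endpoint, not a real Eko URL
ANALYZE_URL = "https://api.example-cloud.test/v1/analyze"

def analyze_recording(heart_sounds: bytes, ecg: bytes) -> dict:
    """Upload a 15-second heart-sound and ECG recording and return
    the murmur interpretation (hypothetical request/response shape)."""
    resp = requests.post(
        ANALYZE_URL,
        files={"heart_sounds": ("sounds.wav", heart_sounds),
               "ecg": ("ecg.bin", ecg)},
        timeout=30,  # the whole round trip is described as under 30 seconds
    )
    resp.raise_for_status()
    # e.g. {"murmur_present": true, "structural": false}
    return resp.json()
```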

Nick van Terheyden
And this is essentially a tool set that’s available to anybody that has access to the internet and, obviously, the device. So this is now broadening the scope of that expertise, so that, you know, your clinician, in this case the one that’s listened to these millions of recordings, is like having that clinician there, not as oversight, but as sort of support. Because we can all hear that, it essentially broadens access and allows us to find, you know, those cases that might have been missed otherwise. Exciting times. Where is this going? I mean, that’s already fantastic, but where do you see this going now?

Connor Landgraf
Yeah, I mean, to summarize what you said, our goal is to have cardiologist-level accuracy at doing early disease detection with the stethoscope and with ECG, in the pocket, in the hands, around the neck of every frontline clinician everywhere. And the cost is such that, because we can do it with AI, it can be very low cost for us to enable this. The end result is that more patients can get higher quality assessments at the frontlines of care. We’re going to continue to develop new algorithms for different types of cardiovascular disease, so really specializing within those conditions, and we’re going to continue that product development process. But it’s a very iterative process for us.

Nick van Terheyden
So, all in all, very exciting. I think, you know, it’s always the sort of incremental approach to technology. So this is not the replacement; we’ve heard the death of the stethoscope, I’ve heard it a number of times. Long live the stethoscope; it’s still here. I still have one, most people have one. An additive device that can be incorporated into the existing one, or you can have a new one, but it essentially looks and feels the same. I know you’ve got some alternatives, and the potential to add in capabilities, essentially a little bit of connectivity, and we’ve got it. Unfortunately, as we do each and every week, we’ve run out of time, so it just remains for me to thank you for joining us on the show today. Connor, thanks for joining me.

Connor Landgraf
Dr. Nick, thank you so much for letting me be here. This was so much fun.

