This is the unedited transcript from S1 E1 of The Learning Health System Podcast
For a more human-consumable version, check out this page, which also has all of the links to the podcast.
We really only put this here so that bots would index our website better so it's not going to be much fun to read unless you are a machine.
But if you really want to, read on - if the text has typos, blame the AI and its inability to understand our accents. :-)
James Green (00:01)
All right, here we are on our first episode of our first podcast. My name is James Green. I'm with.
Parsa (00:07)
Parsa Mirhaji here.
James Green (00:11)
That's Dr. Parsa Mirhaji to you. A little bit of a... It's true, everyone does... Yeah, it is. But still, you are Dr. Parsa Mirhaji.
Parsa (00:12)
Everyone calls me Parsa, so you better get used to it.
Well, if you want to be accurate, I'm doctor, doctor. Yeah. Now that we're bragging: MD, PhD, and obnoxious, and interrupting, you know. Better be precise.
James Green (00:27)
That's true, PhD. Yeah, yeah, it's true. He's far smarter and better educated than me and obnoxious and interrupts me. So we're going to... So a little bit about our backgrounds first. So I'm just a business guy. I've done a bunch of startups and Parsa found me because he wanted to take some of the technology he's built.
and spread it far and wide. And we have a company, Cognome, that is doing that. But this podcast is not about that. This podcast is about the genesis of AI. I'm going to talk a little bit, Parsa, about your background, and then you're going to round it off with the things that I forget. Parsa was born in Iran. He basically, I was going to say, brought the internet to Iran, but certainly brought the internet to academia in Iran. And...
Parsa (01:08)
Okay.
To Tehran University of Medical Sciences to be exact. That was the very first time that in medical school someone thought about connecting to the rest of the world in almost real time. So I worked, you know.
James Green (01:26)
Thank you.
And as if that wasn't enough, he built an entire EHR that is still one of the largest EHRs in the Middle East. Is that not true? It is.
Parsa (01:45)
Yes, it is. It is hundreds of hospital systems, an integrated network of health systems, running one operating system, which is our EMR. More than 50 million patients in it.
James Green (02:02)
And after those modest achievements, moved to the United States, University of Texas, I've heard fun stories about how that started in very modest means, which I will not say, but you can say as much of that as you like. And spent, how long were you at the University of Texas?
Parsa (02:24)
I think 10 or 11 years. 11 years.
James Green (02:28)
And then came to Montefiore. That must have been 2012, so 12 years ago. And in that whole time he's been focused on building what we think of as a learning health system. And...
Parsa (02:30)
2012.
Informatics platforms necessary for, yeah.
James Green (02:48)
We should talk about what we're going to call the podcast. I got some names here. I had one which I liked, Parsa Parses AI. No. Or here's another one I liked, Artificial Intelligence or Natural Stupidity. I kind of like that.
Parsa (02:56)
Wow, we laughed about it. Move on.
We may talk about it, but it's not a good title. I mean, it's neither one. And both, yeah. Let's move on. I'm not into those zingers.
James Green (03:15)
No. We should probably have something around... Yeah, yeah, yeah. I do...
Well, we're going to talk about learning health systems, so maybe just learning health systems.
Parsa (03:26)
Yeah, I would call it that, not focused only on AI or machine learning. Let's talk about paninformatics, right? Everything that you need to consider. AI and machine learning is part of it, but how you translate it to practice is also a big part. How you empower patients with it and push all of that technology into patients' hands is also a part. Where you bring the information and data from to feed into this machinery is part of that conversation. So let's
just call it paninformatics for a learning health system. However we define a learning health system, I would like to talk about what it takes to support one with the tools of informatics and data science.
James Green (03:56)
Let's do that.
There we go.
All right, so we have a name. So now let's have a beginning, Parsa. Where would you like to begin? I think we should begin the story while you were at the University of Texas. What was the first thing that you did that was the first kernel of a learning health system?
Parsa (04:36)
Yeah. So let's roll back a little bit before that, because I think it is relevant. So I was deeply involved in building EHR systems and connecting EHR systems at the service of a healthcare system, taking care of patients, using data and
That's when I came to the United States. I was at the Texas Heart Institute and the University of Texas Health Science Center at Houston to basically practice cardiology. I had just arrived when 9-11 happened. And that was my background, right? Health information, connecting hospitals to each other to take care of patients, focused on
how to apply data to take care of patients and the hospital system and so forth. And 9-11 happens. And overnight, I turn from an informatician into a public health informatician. I talked to my supervisor, the late
Dr. Ward Casscells, who later became the Assistant Secretary of Defense for Health Affairs during the Bush administration. But at the time, he was a cardiologist, Chief of Technology Transfer in the medical school in Houston. I talked to him about something that I was actually working on when I came to the United States. At the time, it was called RDF.
and people had already just started to talk about the semantic web and the web of machines, and later it became Web 3.0. How do you connect the dots across hundreds and thousands of data sets, web-scale data integration where automated agents could navigate this space,
hop from one website to another, from one database to another, gather information, assimilate information, and make an inference? And Tim Berners-Lee, by the way, the same person that invented the WWW, HTTP, the web concept, the Web for Humans, was behind the Web of Machines as well. So that was what I was working on.
I will never forget this: when I first Google searched for RDF, which is one of the frameworks used for the semantic web, there were only 11 entries in Google at the time. 11. It's now probably 11 million websites. I don't really know, but that was how it was.
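The semantic-web idea described here, data expressed as subject-predicate-object triples that automated agents can merge across sources and reason over, can be sketched in a few lines of plain Python. This is a toy illustration of the concept, not the actual RDF tooling of the era; every dataset name, predicate, and triple below is invented:

```python
# Toy sketch of the semantic-web idea: data as (subject, predicate, object)
# triples, merged across sources, with a simple inference rule applied.
# All names and facts here are invented for illustration.

hospital_db = {
    ("patient:42", "hasDiagnosis", "icd:J96"),       # respiratory failure
    ("patient:42", "seenAt", "site:houston-clinic"),
}

registry_db = {
    ("site:houston-clinic", "locatedIn", "region:harris-county"),
    ("region:harris-county", "locatedIn", "region:texas"),
}

def merge(*graphs):
    """An agent 'hops' across sources by unioning their triples."""
    merged = set()
    for g in graphs:
        merged |= g
    return merged

def infer_transitive(graph, predicate):
    """Compute the closure of a transitive predicate, e.g. locatedIn."""
    graph = set(graph)
    changed = True
    while changed:
        changed = False
        for (a, p1, b) in list(graph):
            for (b2, p2, c) in list(graph):
                if p1 == p2 == predicate and b == b2 \
                        and (a, predicate, c) not in graph:
                    graph.add((a, predicate, c))
                    changed = True
    return graph

graph = infer_transitive(merge(hospital_db, registry_db), "locatedIn")

# A question neither source could answer alone: who was seen in Texas?
in_texas = {s for (s, p, o) in graph
            if p == "seenAt" and (o, "locatedIn", "region:texas") in graph}
```

Real RDF stacks use standardized serializations and SPARQL queries rather than Python sets, but the core idea is the same: once sources share identifiers, a query can span data that no single source holds.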
James Green (07:42)
That's not the case anymore.
I'm going to look it up. Hang on. RDF, the semantic web. Let's see what Google says. No, no, no, no, no, no. They show it here somewhere, don't they?
Parsa (07:55)
Yeah, you proved me wrong.
I don't know.
James Green (08:04)
They used to show how many results come up.
Parsa (08:04)
Now I think Gemini takes over and
James Green (08:13)
Yeah, they don't. Suffice it to say it's a billion. Yeah.
Parsa (08:15)
Yeah, it's just a big number. Doesn't matter. But I'm just saying, I was working on connecting the dots using large-scale data integration in an automated way. I talked to...
James Green (08:29)
here we go. 1.38 million.
Parsa (08:32)
It's a big number, yes. And it has fallen from favor since. But that was when people were talking about it. I talked to Dr. Casscells. He talked to his people in the
Army, at TATRC, the Telemedicine and Advanced Technology Research Center. They came to Houston. We talked a little bit about the implications of using the semantic web to create what we call the next-generation public health preparedness platform, which was about, in real time,
linking hospital systems to each other, doing outbreak detection, signal detection, and large-scale syndromic surveillance in vast geographical areas where multiple healthcare systems were seeing different populations in different geographical locations, in order to identify potential man-made disasters.
It got funded.
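The outbreak and signal detection described here boils down to flagging anomalies in streams of syndrome counts. A minimal sketch of that kind of detector, with invented counts and an arbitrary z-score threshold (not the actual algorithm the Houston network used):

```python
# Illustrative sketch of the kind of signal detection a syndromic
# surveillance network runs: flag days where respiratory-complaint
# counts exceed a moving baseline. The data and threshold are invented.
from statistics import mean, stdev

def detect_spikes(daily_counts, window=7, z_threshold=3.0):
    """Flag indices whose count exceeds baseline mean + z * stdev,
    using the preceding `window` days as the baseline."""
    alerts = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            sigma = 1.0  # avoid divide-by-zero on flat baselines
        if (daily_counts[i] - mu) / sigma > z_threshold:
            alerts.append(i)
    return alerts

# Ten quiet days, then a spike on day 10:
counts = [12, 15, 11, 14, 13, 12, 16, 14, 13, 15, 48]
alerts = detect_spikes(counts)
```

Production systems (e.g. CDC's EARS family of methods) use more robust baselines and account for day-of-week effects, but the moving-baseline z-score is the basic shape of the idea.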
James Green (09:53)
And I know that health systems can be sometimes maligned for moving slowly, but between September 11th, what date did you launch this thing?
Parsa (09:59)
Yeah, it was.
We talked about it and launched a web portal about this like five, six days after 9-11, where we brought people together inside the Houston community to create agreements and a coalition of the willing to put up a surveillance network for bio-preparedness, for bioterrorism preparedness, in Houston. It was later called the Defense of Houston
James Green (10:14)
pretty fast.
Parsa (10:34)
website. We won a best practice award for public health preparedness from the Department of Health and Human Services the next year, 2002, I think. So yes, we were quick to act. The day after, I talked to Dr. Casscells; two days after that, we met with the Army folks, who told me that DARPA was also looking into utilization of the semantic web for
several of their own logistics problems. But here we're talking about public health preparedness, larger-scale syndromic surveillance for population health purposes. And we built a network that later became probably the third-largest surveillance network for biological surveillance, obviously.
We didn't have bioterrorism at the scale that we were thinking would happen. We didn't have any of those man-made public health problems, but we had bird flu and we had SARS and we had several hurricanes down south. I don't know if you remember Hurricane Katrina, Hurricane Rita. So the technology we built just
gradually evolved into really large-scale automated data integration across health systems to tease out patterns and clusters of associations in the health data that could be used to characterize patients and subpopulations in a timely manner,
in as timely and fast a real-time manner as possible. Look, again, these things really never were used for their intended purpose of bioterrorism, but they were robust enough when the CTSA was announced for the first time by NIH. The CTSA is one of the biggest research infrastructure grants that NIH had,
to allow creating platforms, informatics solutions, and services for healthcare systems to support translational research. So I was part of CTSA number one at the University of Texas. And then for the second CTSA, I came to the Albert Einstein College of Medicine in New York and brought all of those concepts about large-scale data integration to allow as-real-time-as-possible
integration and analysis of data to support the CTSA program in New York. Now suddenly I was a clinical informatician, because population health gave rise to a vision of using distributed, heterogeneous, and expansive data sets.
And clinical informatics allowed us to translate that to, again, going back to my roots, taking care of patients, applying data not only for clinical decision support, but also here for research. A combination of research informatics and clinical informatics, translation to research. I think we were really helped by the
Affordable Care Act's creation of PCORI. Now we're talking about 2014, 2015, when my watch is trying to help me. Yeah, join the conversation. Yes, at the worst time possible.
James Green (14:38)
join the conversation. That's what happens with AIs. They just get out of control and they start to change the conversation. They don't listen. Yeah.
Parsa (14:43)
Exactly. And they don't listen. Or they listen too much. Anyway, I was just saying that PCORI helped us to, again, translate some of these technologies that we were using for institutional support of translational research and translation to practice inside one health system, to doing it at, again, a much larger scale,
across multiple collaborating institutions to support much larger-scale clinical research, comparative effectiveness research, pragmatic clinical trials and such, which gave rise to a real commitment to and understanding of standards, common data models, and interoperability at large scale across large clinical data sets.
I helped build one of the largest clinical data research networks in New York. It was called the New York CDRN, but it is now called the INSIGHT Network, which is a coalition across six CTSA programs in New York, by application of these common data models. I was one of the early contributors to the OMOP common data model. I still support
OMOP extensively, both the schema as well as the vocabularies. And then, obviously, I think it was almost around the same time that people were talking about self-driving cars. AI here, AI there, but not really in the healthcare system. We weren't really talking about AI in 2015, 2016. But we had an
NIH grant with one of my collaborators, kudos to Dr. Michelle Gong. We got a clinical trial for five years to look into the application of machine learning in predicting prolonged mechanical ventilation and respiratory failure up to 48 hours in advance. The questions were actually twofold. The first round of
the grant just supported building and validating one retrospectively, and then we won the second round of the grant, a continuation for three years, to translate that into a clinical decision support system integrated into our EMR to drive care using machine-learning-generated advisories.
And we spent two years to build the model, validate it retrospectively, and validate it prospectively in parallel to the clinical systems, without really injecting those analytics into routine care.
James Green (17:55)
And what was it the model did exactly?
Parsa (17:56)
The model predicted two things. One, whether or not a patient will have respiratory failure in the next, up to, 48 hours. We were 78 to 80% accurate in predicting respiratory failure 18 hours in advance. I think we have a publication
with a more detailed analysis of how robust the algorithm was. And two, whether or not a patient would die in the hospital, in an inpatient setting, a few days, three days, in advance of the terminal event. So it's essentially looking into two things. We could do this because, in general, like everywhere else,
in other sectors of the economy, we were collecting a lot more data in a digital format from a lot more places in the healthcare system. We weren't really assimilating it as robustly and properly as we do now, but we were an OMOP shop. We were standardizing, we were building data pipelines, looking into the quality of data for large-scale analytics
for our OMOP implementation anyway. So we were somehow on top of our data quality and fitness-for-use, integrating real-time data sets at Montefiore, and the computational power to be able to process this information was also becoming available. We actually started collaborating with Intel, starting with their
data center group, just to understand what it takes to build big data analytics in healthcare systems, what kind of processing power and network infrastructure we need to build these analytics. So several things came together at the same time. The grants allowed us to think about data at scale, data became available because of the really
larger-scale adoption of EMRs everywhere, including at Montefiore, the standardization that we invested in, and just partnerships that allowed us to apply these technologies at scale. We went live with our first machine learning algorithm on January 19, 2017. Again...
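To make the shape of such a system concrete, here is a deliberately simplified sketch of an EMR-integrated early-warning advisory: a logistic model scoring recent vitals against a risk threshold. The features, weights, and threshold below are all invented for illustration; this is not the actual Montefiore model:

```python
# Hypothetical early-warning advisory: a logistic model over recent
# vitals that outputs a 0-1 risk of respiratory failure within 48 hours.
# All coefficients and thresholds are invented for illustration.
import math

# Coefficients of the sort a trained model might produce (invented)
WEIGHTS = {"resp_rate": 0.15, "spo2": -0.20, "heart_rate": 0.03}
BIAS = 12.0

def risk_score(vitals):
    """Map a dict of recent vitals to a 0-1 risk via logistic regression."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in vitals.items())
    return 1.0 / (1.0 + math.exp(-z))

def advisory(vitals, threshold=0.5):
    """Fire an EMR advisory when the predicted 48h risk crosses the threshold."""
    return risk_score(vitals) >= threshold

stable  = {"resp_rate": 16, "spo2": 98, "heart_rate": 72}
at_risk = {"resp_rate": 32, "spo2": 84, "heart_rate": 118}
```

In a deployed system, the hard parts are everything around this function: real-time feature pipelines from the EMR, prospective validation, and routing the advisory to the right clinician, which is what the two-year integration effort described above was about.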
James Green (20:46)
So that's quite a thing, because you're now talking about a model that predicts respiratory failure in 2017. There was a need for such a thing coming in 2020.
Parsa (20:56)
Yeah, serendipity, right? We had almost everything ready. Not knowingly, but we were building everything that we needed to build in anticipation of who is going to need mechanical ventilation, because we had a model for it, and who is going to die inside the hospital. We had just finished building a different machine learning algorithm for ARDS,
James Green (21:02)
Yeah.
Parsa (21:26)
acute respiratory distress syndrome, which is probably what kills patients that have respiratory failure inside the ICU. We had integrated our platform across the hospital system. So it wasn't, you know, a toy model running on my laptop or on my server. It was running across the hospital system, connected to our Epic EMR everywhere.
And obviously, when the pandemic happened, we were as surprised and caught off guard and unprepared as everybody else. But we also had infrastructure that we could quickly, really quickly, bring on board and use. Our algorithm had not seen
James Green (22:24)
This is a very loaded question, but how many lives do you think were saved because you had this model?
Parsa (22:29)
I'm not going to answer that question, because care provisioning is a multi-team effort. It's much bigger than an algorithm, much bigger than a model. You are a cog in a much bigger machine, and I don't want to steal attention away from our clinicians and nurses and doctors that were at the forefront. I'm just happy that we were part of the solution.
But we were part of the solution a few days after everybody got together, and we weren't a latecomer. I think that's what matters. At the time, we didn't really have enough patients to fine-tune and train our model to understand the dynamics of COVID, which was different than other forms of respiratory failure. You know, when you build a model to predict
respiratory failure in 18 hours or 48 hours or 24 hours, it tells you that you're not anticipating a rapid deterioration, like in minutes. It is something that you're expecting to happen over hours. But in COVID, we were seeing patients that were just deteriorating in 20 minutes, in just two hours from their admission. So we needed to fine-tune our algorithm to do this
for COVID, with recognition of the specific characteristics that COVID had. So we worked with Intel to adopt federated learning, to connect to other hospital systems and other data sets. We actually worked with a company to create synthetic data sets
that would allow us to federate learning across synthetic data and other data sets from other hospitals and our own health system to optimize the model, which worked fantastically. And we could, in four to six weeks, fine-tune our model and basically build not only
models that were much more accurate at detecting respiratory failure and ARDS in COVID cases, but also do projections, two or three weeks in advance, of how many patients would be hospitalized or admitted to any of our facilities. So we were providing the backbone of information that would be
useful for our care coordinators and clinicians to take care of patients. And we learned, and we learned.
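The federated learning described here rests on a simple core operation, often called federated averaging (FedAvg): each site trains locally and shares only model weights, which are combined in proportion to site size. A minimal sketch with invented numbers:

```python
# Sketch of federated averaging (FedAvg), the core idea behind the
# federated learning mentioned above: sites share model weights, not
# patient records. All numbers here are invented for illustration.

def fed_avg(site_updates):
    """site_updates: list of (n_samples, weight_vector) per site.
    Returns the sample-weighted average of the weight vectors;
    no patient-level data ever leaves a site."""
    total = sum(n for n, _ in site_updates)
    dim = len(site_updates[0][1])
    merged = [0.0] * dim
    for n, weights in site_updates:
        for i, w in enumerate(weights):
            merged[i] += (n / total) * w
    return merged

# Three hypothetical sites with different cohort sizes:
updates = [
    (1000, [0.10, -0.40]),   # large site
    (500,  [0.40, -0.10]),   # medium site
    (500,  [0.20, -0.30]),   # smaller site / synthetic cohort
]
global_weights = fed_avg(updates)
```

In practice each round redistributes the averaged weights to the sites for further local training; the key property is that only parameters cross institutional boundaries, which is what makes cross-hospital collaboration tractable under privacy constraints.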
James Green (25:29)
I'm going to stop you here for just a second. I'll take this out of the podcast, but we're now at 25 minutes, so I want to find a way to wrap up this thought so the episode can end. Continue.
Parsa (25:33)
Mm-hmm.
Okay.
Yeah, okay. So we emerged from the pandemic with a lot of understanding of how these systems work, how these systems interact with delivery systems, how they scale, and how they become part of the bigger machinery of the health system. Learning from every patient that we have seen, not only in our own healthcare system, but in neighborhoods,
in other locations, and how we can do this learning in rapid cycle and translate everything into practice in rapid cycle. Having the outcomes and results of your analytics validated, integrated into practice, and becoming the feedback loop for continuous monitoring and continuous improvement,
which is essentially the definition of a learning health system. And that is why I think we are focused on that now: how you bring everything that you have, your data sets, your analytic assets, your computational assets, your EMR, to become part of a learning system that cycles through outcomes as feedback and continuously improves.
James Green (26:42)
So, Parsa.
There you go.
So, just to summarize what we've covered so far today, 9-11 helped us have this concept of gathering data from multiple places.
Parsa (27:20)
Yeah, it was the kick that pushed us into the frying pan of thinking seriously at scale, ultra-large scale. Yeah, ultra-large scale. I think that's what happened, right? It wasn't a project anymore or an academic curiosity anymore. It was an existential problem that needed to be solved. And we took our time to solve it, but I can tell you that we have a solution.
James Green (27:26)
Yeah.
But then once we, then once. Right.
Yeah. And then once you have all of that data in place, of course, the logical thing is to use it for predictive models. So you started building predictive models. Then a pandemic came along and, look, we already had the foundations in place to make this happen. So I think that's a great place to wrap up this first episode.
Parsa (27:54)
Yeah.
Yeah, but there is a difference. Let me tell you one thing. Foundations for learning health systems are not only laid out in one healthcare system. Foundations for a learning health system should be planted across collaborating health systems. No health system will have all of the data
James Green (28:12)
You're not gonna let me wrap it up, are you? Okay.
Parsa (28:34)
and all of the kinds of patients that they need to really solve complex problems or rare problems or build generalized solutions. What we emerged with, I would say, from all of what we have done in the past decade, is the foundations for a federated learning health system augmented by machine learning and advanced analytics. And that's why I call it paninformatics,
because we really built everything, from data to collaboration and federation and AI, into a learning health system, or the definition of a learning health system.
James Green (29:14)
So Parsa, what should we talk about next episode?
Parsa (29:16)
You tell me. You can talk about some specifics or examples. We can talk about, yeah.
James Green (29:23)
Let's do that. Let's dig into some specific examples next episode about some of the ways
Parsa (29:25)
Yeah, and maybe some of the, you know... we went through a journey. And I don't think that we should expect that other health systems should also go through the same torturous path we went down the first time. Yeah, so we may want to also talk about an incremental roadmap of how you can benefit from existing
James Green (29:41)
We would like to help them a little bit.
Parsa (29:51)
experiences and infrastructure and research that we've done that may translate to other health systems and so forth. So we can talk about the maturity model. You tell me. Okay.
James Green (30:04)
Perfect. Okay, Parsa, thank you very much. Until next week.
Parsa (30:06)
Thank you. Thank you for allowing me to interrupt you. Okay.
James Green (30:13)
You do all the talking, Parsa. You know you do. All right, thank you. All right, now let's stop this recording.
Parsa (30:16)
Thank you. Bye bye.