At NorthShore University HealthSystem, All Systems Go on AI

April 29, 2022
Chad Konchak, assistant vice president of clinical analytics at NorthShore University HealthSystem in the northern suburbs of Chicago, discusses his team’s groundbreaking work in artificial intelligence

Chad Konchak has been assistant vice president of clinical analytics at NorthShore University HealthSystem, the nine-hospital integrated health system based in the Chicago suburb of Evanston, Illinois, for five years. He has been at NorthShore for 12 years in a clinical analytics role, beginning with a two-person team that included himself and expanding the team over time as the demand for its services has grown organically.

On March 14, as part of the Machine Learning & AI for Healthcare Forum, one of the specialty symposia held on the first day of HIMSS22 in Orlando, Konchak moderated a panel entitled “The Name of the Game is Implementation,” in which he and several other healthcare leaders spoke of the challenges and opportunities involved in developing and implementing artificial intelligence (AI) algorithms. Among the numerous areas of discussion was the fact that, as Konchak stated, “We’re learning that 60-90 percent of predictive models never make it to production,” and the fact that what has historically been the normal model for analytics development, having data analysts and data scientists go off on their own to develop analytics programs, is not working well when it comes to developing AI algorithms for patient care delivery, including for clinical decision support.

As a follow-up to that excellent discussion, Healthcare Innovation Editor-in-Chief Mark Hagland caught up with Konchak recently, in order to extend the conversation around the development of AI algorithms, and what’s being learned right now by teams of developers in patient care organizations. Below are excerpts from that interview.

How many individuals are currently on your team?

Our analytics team comprises about 25 people, about five of whom are data scientists; most of the rest are doing data enrichment. Another 25 people are doing data engineering and data architecture. So, altogether, about 50 people are making analytics successful at NorthShore.

How is your team connected to IT? And do you report to the CIO of your organization?

We are in IT, and that works well for us, because we have higher reliability; but we also have very close connectedness with our business operations and clinical teams, and that’s very important. You need close connectedness to the strategic needs and clinical operations; that’s the right place to put analytics.

How do you frame your work, and how does executive management frame your work, in AI, at NorthShore?

AI is a broad-based term that can mean anything from predictive analytics to general analytics; I think of AI as computers and systems replicating human-like behaviors. The reality is that a lot of these terms are being used somewhat interchangeably. But, per our organizational strategy, there are two main components. One is anchoring our operations around core analytics competency: we can’t improve performance unless we can measure it. Our team measures all our performance; that in itself isn’t AI, but it’s core to what we do. And teams that are aligned with analytics can better ensure that AI activity is aligned with those core strategic priorities. The other thing that really gets to the core of AI is population segmentation: using advanced systems, machine learning, and analytics to understand our groups of patients, from a population health or risk-strata perspective, who might share similar characteristics, around which we can organize strategies. Our executive leadership sees the value of the measurement system aligned with our priorities, and the true AI that allows us to target groups of patients who are more homogenous with each other, who share similar outcomes, and to whom we can apply similar strategies.
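To make the population-segmentation idea concrete, here is a minimal, hypothetical sketch in Python: it clusters synthetic patients on a few illustrative features so that each resulting segment could be paired with a tailored strategy. The features, cluster count, and data are placeholders for illustration only, not NorthShore’s actual approach.

```python
# Hypothetical population-segmentation sketch: cluster patients on a few
# utilization/risk features so each segment can anchor a care strategy.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Synthetic patients: [age, chronic condition count, ED visits last year]
patients = np.column_stack([
    rng.integers(18, 90, 300),
    rng.poisson(2, 300),
    rng.poisson(1, 300),
]).astype(float)

X = StandardScaler().fit_transform(patients)
segments = KMeans(n_clusters=4, n_init=10, random_state=42).fit_predict(X)

# Each segment's average profile can then be reviewed with clinical teams.
for k in range(4):
    print(f"segment {k}: n={np.sum(segments == k)}, "
          f"mean profile={patients[segments == k].mean(axis=0).round(1)}")
```

The point of a sketch like this is less the algorithm than the output: a small number of relatively homogenous groups that operational and clinical leaders can attach concrete strategies to.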

What have been the most challenging things you’ve gotten into, as you and your colleagues have worked to develop true AI?

Well, the idea that technology can just solve all our problems has been a problem. The reality is that maybe the best we can do sometimes is point our fingers at where the problems are; the hard work involves process improvement and performance improvement, meaning how we fix the problems. And you can identify people at high risk of an outcome, but it doesn’t help if all you’re doing is admiring your analytics work.

As in, “Oh, look, Mrs. Smith is at high risk!”

Yes, exactly that. The real hard work is identifying process improvement, and that’s not an AI problem; it’s a process improvement problem. So there’s this false sense of AI as a silver bullet. I think it’s an accelerator or a differentiator, but just one element in driving better outcomes.

Some people in healthcare do seem at times to be dazzled by the weird idea that analytics itself is an end goal, perhaps?

Yes, you see that across the industry, this idea that we’ll hire an expensive bunch of people to do advanced analytics; they’re not necessarily getting you towards the goal, or employing the strategies to change the outcomes. You can’t just hire a bunch of Ph.D. mathematicians and expect to make a difference with AI. And you can use AI and analytics interchangeably, right? Creating dashboards, etc. And AI is just the most advanced form of that. And the degree of understanding that’s behind a complex algorithm is even harder to sell to a group of people who are resistant to change. And actually, it might not even be resistance to change; it’s the idea of moving forward into change.

What are one or two of the biggest accomplishments of your team so far?

It goes back to robust measurement systems that are trusted by the organization. We’re the team that’s building predictive analytics, and guiding our goals and scorecards. And doing that across the entire enterprise is a huge success, because it gets everyone bought into the goals involved. And it’s also that last-mile problem: deploying analytics into clinical workflows that people can really buy into, to change clinical care. Where we’re really good is how we partner with clinical teams to deploy the analytics and create standardized workflows to improve clinical care. So we have a system called the Clinical Analytics Prediction Engine, or CAPE. We built all these predictive models, but asked whether they’re really helping anyone. If you’re a physician working in the EMR, and you get a pop-up saying the patient is at a 73-percent probability of deterioration, what do you do with that? CAPE brings together all our predictive models into a single engine and ties them into workflows that can be used across the organization, where physicians, nurses, and care coordinators can identify possible outcomes and respond using a checklist of interventions. That then becomes an actionable platform based on AI; but what’s unique about it is developing standards for clinical care, which is really the basis for intervention.
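As a rough illustration of the pattern Konchak describes, the sketch below shows how multiple model outputs might be consolidated into a single per-patient summary that is paired with an intervention checklist rather than surfaced as a raw probability. The model names, thresholds, and interventions are entirely hypothetical; this is not NorthShore’s actual CAPE implementation.

```python
# Hypothetical sketch of a CAPE-style prediction engine: several predictive
# models feed one per-patient summary that carries a standardized checklist
# of interventions instead of a bare score. All names and thresholds are
# illustrative placeholders.
from dataclasses import dataclass, field


@dataclass
class PatientRiskSummary:
    patient_id: str
    scores: dict[str, float]          # model name -> predicted probability
    risk_tier: str = "low"
    interventions: list[str] = field(default_factory=list)


# Illustrative mapping from a flagged model to a standardized workflow.
INTERVENTION_CHECKLISTS = {
    "deterioration": ["escalate to rapid-response rounding", "increase vitals frequency"],
    "readmission": ["schedule follow-up within 7 days", "pharmacist medication review"],
}


def summarize_patient(patient_id: str, scores: dict[str, float],
                      threshold: float = 0.7) -> PatientRiskSummary:
    """Combine multiple model outputs into one actionable summary."""
    summary = PatientRiskSummary(patient_id=patient_id, scores=scores)
    flagged = [name for name, p in scores.items() if p >= threshold]
    if flagged:
        summary.risk_tier = "high"
        for name in flagged:
            summary.interventions.extend(INTERVENTION_CHECKLISTS.get(name, []))
    return summary


if __name__ == "__main__":
    # A patient flagged at 73-percent probability of deterioration, as in the example above.
    print(summarize_patient("MRN-0001", {"deterioration": 0.73, "readmission": 0.41}))
```

The design point is the pairing: a score only becomes actionable when it is mapped to a standardized set of next steps that physicians, nurses, and care coordinators have agreed on in advance.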

How will things evolve over the next few years at NorthShore?

We’ll be applying this analytics work across more diseases, and moving forward across the care continuum. And more broadly, it’s about how we promote better health, and segment the population based on what’s happening in their lives, such as via the use of social determinants of health (SDOH) data, and then connecting patients to services in the community. And that’s true for the industry as a whole. From a connectedness perspective, that’s one thing. And then… right now, we’re abstracting SDOH from unstructured text, because NLP is getting better. Models are learning and are taking the output from the interventions, and then learning from those interventions. That will be what is revolutionary; most models are still based on static data. It’s like Waze [the traffic app]: everyone who logs into that app adds information into it; whereas the Farmer’s Almanac makes predictions based on historical observations, without taking in real information on the ground.
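For readers unfamiliar with what “abstracting SDOH from unstructured text” can look like, here is a deliberately simple, rule-based sketch. Production NLP systems use trained models rather than keyword matching; the categories and phrases below are illustrative assumptions, not NorthShore’s pipeline.

```python
# Minimal, rule-based sketch of pulling social-determinants-of-health (SDOH)
# signals out of free-text clinical notes. The keyword patterns are placeholders.
import re

SDOH_PATTERNS = {
    "housing_instability": re.compile(r"\b(homeless|eviction|unstable housing)\b", re.I),
    "food_insecurity": re.compile(r"\b(food insecurity|skipping meals|food pantry)\b", re.I),
    "transportation": re.compile(r"\b(no transportation|lacks a ride)\b", re.I),
}


def extract_sdoh_flags(note_text: str) -> dict[str, bool]:
    """Return a structured flag for each SDOH category found in a free-text note."""
    return {category: bool(pattern.search(note_text))
            for category, pattern in SDOH_PATTERNS.items()}


if __name__ == "__main__":
    note = "Patient reports unstable housing and has been skipping meals this month."
    print(extract_sdoh_flags(note))
    # {'housing_instability': True, 'food_insecurity': True, 'transportation': False}
```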

So you might have someone at high risk of readmission, but now, with CAPE here, we can make sure that a patient’s risk of those outcomes goes down. But the models aren’t smart enough yet; we need to get to real-time feedback and updating; and we need to connect the analytics to the entire continuum of care, so that the risk of readmission later becomes a formula that takes into account what happens once the patient has been at home for a certain period of time.
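The “Waze versus Farmer’s Almanac” contrast Konchak draws maps roughly onto incremental (online) learning: instead of scoring patients with a static model, observed post-intervention outcomes are folded back in as new training signal. The sketch below uses scikit-learn’s partial_fit on synthetic, placeholder data purely to illustrate that loop; it is not a description of NorthShore’s models.

```python
# Hedged sketch of a feedback loop: a readmission-risk model is updated
# incrementally as post-intervention outcomes arrive. Data are synthetic.
import numpy as np
from sklearn.linear_model import SGDClassifier  # requires scikit-learn >= 1.1 for "log_loss"

rng = np.random.default_rng(0)
model = SGDClassifier(loss="log_loss", random_state=0)

# Initial fit on historical ("almanac-style") data: 3 features per patient.
X_hist = rng.normal(size=(500, 3))
y_hist = (X_hist[:, 0] + 0.5 * X_hist[:, 1] > 0).astype(int)  # synthetic readmission label
model.partial_fit(X_hist, y_hist, classes=[0, 1])

# Each week, outcomes observed after interventions update the model ("Waze-style").
for week in range(4):
    X_new = rng.normal(size=(50, 3))
    y_new = (X_new[:, 0] > 0.2).astype(int)  # outcomes observed post-intervention
    model.partial_fit(X_new, y_new)
    print(f"week {week}: mean predicted readmission risk =",
          round(model.predict_proba(X_new)[:, 1].mean(), 3))
```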

Could you offer one or two very explicit pieces of advice for our audience?

Yes: they should be paying less attention to the advanced AI and more attention to the clinical workflow and what needs to be done. So for example, a risk-of-readmission model tied to coordinated care is better than an advanced AI that isn’t tied to interventions and is just throwing a score at a clinician. The predictive analytics is the easier part; people need to spend more time in the iterative, PDSA (plan-do-study-act) type of improvement cycle, to determine how the data and analytics will impact patient care. And in order to do that, you need to develop a leadership team, ideally led by a clinician champion.
