RISE Radio

Episode 31: The 5 skills every health leader needs to succeed in value-based care

Ilene MacDonald


Dawn Carter, senior director of health policy and regulatory affairs at Centauri Health Solutions, joins RISE Radio for a lively discussion on the five skills health care leaders need to succeed in value-based care, starting with how to turn analytics into decisions that actually change outcomes.

During this hour-long episode, Carter shares practical frameworks for working with data, regulations, interoperability, strategic storytelling, and social determinants of health.

To learn more, see Carter in person at RISE National March 23-25 in Orlando where she will lead an interactive "Bingo" game roundtable discussion to uncover smarter strategies for risk adjustment. She'll also speak at the upcoming RISE webinar on interoperability that powers SDoH referral loop closure in value-based care on April 28 at 2 p.m. EDT.

About Dawn Carter

Dawn Carter, BSBA, MHA, CPC, CRC, CPMA, CDEO, CPCO, AAPC Fellow, is the senior director of health policy and regulatory affairs at Centauri Health Solutions, with over 30 years of experience in the health care industry. She has a proven track record of success in developing innovative products and services for the Medicare Advantage, Medicaid, and commercial health plan markets and is a sought-after industry speaker and author as well as independent consultant and educator.

She holds a Bachelor of Science in Business Administration and a Master of Science in Healthcare Administration and is currently pursuing a Doctorate in Business Administration and Healthcare Management. She is a member of the American College of Healthcare Executives (ACHE) and is a Fellow of the American Academy of Professional Coders (AAPC). Her extensive AAPC certifications and recent appointment to the AAPC National Advisory Board for the 2025-2027 term further demonstrate her deep commitment to knowledge and expertise in the health care field.

About Centauri Health Solutions

Centauri delivers data-driven technology solutions that transform fragmented clinical and member data into actionable intelligence—maximizing accuracy, quality performance, and outcomes for health plans and health systems. Through close collaboration with our customers, Centauri improves patient and member outcomes by providing advocacy, advanced data insights, and intelligent clinical data delivery future-proofed for interoperability.

Overview

Ilene MacDonald, host

Hello and welcome to the latest episode of RISE Radio. I'm your host, Ilene MacDonald, the editorial director at RISE. Today we'll be exploring the five skills health care leaders must cultivate to succeed in the industry. My guest today is Dawn Carter, senior director of health policy and regulatory affairs at Centauri Health Solutions, which offers solutions to health systems and health plans for various health care programs, including Medicare and Medicaid. And thank you, as always, Dawn, for joining me and educating me and the listeners. You and I talked a few weeks ago, and you mentioned that there are five skills leaders must have in order to succeed. And I wonder if we could start with number one. You told me that was a deeper understanding of analytics and data. You mentioned when we were talking prior to the recording that leaders have to have a real deep understanding of analytics, not just dashboards. What does that look like in practice? Why is it no longer optional?

Dawn Carter, guest

So I have often said that no matter how high I'm promoted in an organization, even if I ever get to be a CEO, I will never stop being an analyst at heart. Business analysis was where my career started when I entered the technical realm and product development. And even in my personal life, the concept of root cause analysis reigns supreme. My teenage children, for example, are very familiar with the five whys method of root cause analysis. I always say start them early, right? I would imagine that many people listening to this also feel that they will always be an analyst at heart. And as long as you're in the position to make decisions, you'll need to leverage those skills. Now, analytics is the engine of decision making. And although dashboards, reporting, and analysis are all very important parts of the whole analytical process, they matter only insofar as they lead to a decision, an owner, a start date, and metrics that prove the decision was the right one. What leaders really need are finely honed decision-making skills so that they can most effectively use that analytical engine. And I want to officially start my answer to this question by talking about the history of the word decide. It comes from the Latin decidere, which literally means to cut off or to settle. Now, this idea, of course, is metaphorical, because you essentially cut off all other options when you decide, what I call all of the maybes, and keep only one. Another perspective I like to use comes from the 1980 song "Freewill" by Rush: if you choose not to decide, you still have made a choice. In health care, not deciding is expensive, as is constantly waffling back and forth between a bunch of maybes. It's no longer optional because margins are tight, regulation is moving quickly, and value-based contracts put real dollars at risk. And if you can't turn data into decisions using your finely tuned analytical engine, you're flying blind while your competitors iterate faster. 
Now, a deeper understanding and more effective use of analytics means you do four things. So I'm going to talk about four things. The first one is framing the decision. In plain English, that means don't just sit there and stare at the numbers, or go off and start yelling at people to do things. You start by naming the choice that you need to make, not the chart or the dashboard you're looking at. Things like risk adjustment and quality at their core are about making sure that your funding reflects the acuity of your members, so you can staff, support, and serve them appropriately, because the members are at the center of everything we do, and we have to remember that. And we have to realize we're not changing numbers to that end. We're matching resources to real people's needs. We frame the decision from three perspectives: the member, the provider, and cost in this value-based world. By so doing, we know our decision has the best chance of providing high-quality care for the lowest cost. Now, our decision isn't, in the case of risk adjustment, how do we increase a risk score? That's not the decision. It's where should we focus to accurately capture members' real health needs? Is it during visits, after hospital discharges, or by getting people in for their annual checkups, or some mixture of that? And what's the least burdensome way to do it for members and clinicians? Now, I want to emphasize that this is predicated on a very intimate understanding of the population's unique health needs based on their demographics. And this is a theme you will hear throughout this interview. Fair warning. For example, decisions made for a primarily rural population with limited access to care and low digital literacy will look very different than those for urban or more affluent areas. Having this knowledge significantly reduces the number of maybes that impede effective decision making. And there's a framework you can use for this. 
If you're like, okay, how do we frame decisions? I've got a little framework for you, and it's like a little checklist. For those of you business types out there, this is a riff on the popular SWOT analysis, that's strengths, weaknesses, opportunities, and threats, so you've heard this before. We start with a goal, such as we want the dollars to match reality and to reduce waste and rework. Then we look at the levers we control, which are helping clinicians prepare before visits so that we address known conditions. We make it easy for members to keep their annual checkups and appointments, such as addressing transportation needs or sending reminders. We also follow up after hospital stays so that we don't miss any other conditions. Then we take a look at the constraints. We know that clinician time is a constraint, and member trust is a significant one. We have compliance guardrails that we can't discount, and also budget. You know, it would be nice if we could do everything all the time, but we know that we can't. We look at our trade-offs. In this case, that would be less end-of-the-year scrambling, which sometimes happens when we're executing our risk and quality programs, and more steady, routine capture during normal care. And then we have a success measure. How do we know we've done this successfully or made the right decision? We have more members with complete and accurate records, fewer denials or do-overs, and fewer avoidable ER visits. So that's just an example of a decision-making framework that you could use. Now, the second thing that we talk about with honing our analytical engine is the ability to separate signal from noise. 
Now, what this means is before deciding to change course or make a decision that's going to alter things, make sure the pattern is real, not some seasonal blip or a shift in who is covered. Because as we know in our world, we've got some real things that signal that changes need to happen, and we need to be able to distinguish those from noise. Now, let's use the example of observed drops in year-over-year average risk scores or even Star rating drops. The question to ask to separate signal from noise and inform our decisions is not why did this change happen? It's, is it because our members are different or because our process is different? Now, I'm going to give you some quick reality checks that anyone can do to eliminate this noise as you're, again, honing that decision making in your analytical engine. First, look at the timing. This means asking, did something routine happen, like open enrollment, redeterminations, or new client onboarding, that changed who is in our population? We look at our member mix. Are we comparing new members to longtime members as if they're identical, when we know they're not? We look at hotspots. Is this problem everywhere, or systemic, or is it mainly with a few provider groups, hospitals, or regions? You know, look for those patterns. Were there process changes? Did we change a vendor? Did we introduce maybe new templates and processes in the EHR? Was there a workflow change that explains this? And then take a look at the tie-in with the outcomes, meaning, do the places with the quote unquote worse numbers also have more avoidable ER use or readmissions? Now, coming back to our example with the year-over-year Star ratings and risk scores, this is what it might look like in practice. You see a dip in the RAF averages that says we're down, but a closer analysis shows lots of new members have not seen their PCP yet. That's not a crisis, it's a cue. 
So you accelerate your first-visit scheduling and do some simple pre-visit prep so real conditions get captured in normal care. If it's late in the year, you can evaluate telemedicine options. We know there's a lot of back and forth about what we can and can't do with those, but they're still an option. Now, the message here is not to let an average reflected in the analytics send you into a tailspin and on some wild goose chase. Do that reality check and make sure you eliminate that noise. Now, the third thing as it relates to honing that analytical engine is pressure testing your assumptions. What this means is that every plan and every decision is predicated on a few big assumptions. You need to be able to name them. Yes, we assume them, but you have to be able to name them, because if one is wrong, the whole idea is going to fall apart. So you need to test first with a small pilot. And this is where you ask yourself, before we scale, what has to be true for this to work? And how will we know in a month, or some other specified unit of time, after we do the analysis? So let's look at some of the usual must-be-true items for RAF improvement strategies, going back to our example. The first one is the members will show up. This means if we do outreach and invite 100 people for wellness visits, how many are actually going to come? It's important to know. The second assumption is clinicians have time. Can we fit accurate documentation and clinical documentation improvement into visits without slowing down care? Our third assumption in this case would be our suspect lists are sensible. Are we giving teams realistic, clinically plausible reminders that there are gaps, and not noise that they will simply ignore or dismiss? And then here's one that many people forget, usually because they think that this is IT's responsibility. Hint: it's not. Our encounter data submissions are complete and accurate. 
This means, are our encounter data submissions getting through the first time, or are there rejections? And better yet, are those rejections being addressed systemically? So, a pilot approach, because remember, I said test a small pilot before you scale. This is the lightweight approach, using outreach for member wellness visits as our example. We try it with a small targeted group of providers for maybe four to six weeks, perhaps those where our analytics show the costliest gaps. This is where you get really friendly with your informatics people and your analytics folks. You watch kept appointments and care gap closure, because this is what's going to tell you if this is working and if this is going to be something you're interested in scaling. And then you predefine a stop-or-scale rule. What this is, say, if fewer than 30 percent of invited members show up, we fix our outreach before expanding, and we do some sort of root cause analysis to tell us why we're not getting a higher percentage until we get there. And the fourth thing we want to do to hone that analytical engine that leaders need is to connect insight to action. This might seem like a no-brainer, but what it means is insights gained from analysis only matter if they change who does what by when. In practice, we say, here's what we're doing next week, who's in charge, and how we'll measure it to know that it's working. This kind of harkens back to folks who've been through business school, or even most leaders, who understand the SMART goal framework: specific, measurable, achievable, relevant, and time-bound. So, two quick examples of what that looks like in practice. For a commercial plan, this might be: we noticed people with diabetes and heart failure who hadn't seen a PCP in a year drove a lot of our ER visits, so we funded rideshare for appointments and gave clinicians a one-page prep. 
Kept visit rates rose, ER trips fell, and our funding better matched real needs without the end of the year scrambling to close gaps. In a government plan, this might look like, after hospital discharge, we scheduled a follow-up within seven days and made sure chronic conditions were addressed during that visit. Members recovered faster, we avoided readmissions, and documentation reflected real health status. And so there you have four things to consider when you are honing those analytical skills that you will need to be successful.

Ilene MacDonald, host

Thanks, Dawn. Do you have any sense of how AI tools might work with this process? Does AI enhance analytical thinking in any way?

Dawn Carter, guest

So I would imagine by now that most people in the audience use AI with some regularity as a tool to support their decision making. And I always like to advise people to treat AI like a sharp-eyed analyst that you're guiding, or better yet, the annoying colleague who constantly challenges your assumptions. Because, like with a human, the way you interact with the AI is going to very much affect the quality of the answer that you get. Also, although it is certainly not recommended to provide the AI with a giant paragraph as a prompt, if you are asking it for interpretations or recommendations, it is often helpful to provide it with a basic demographic description of the population in question, because member- and provider-related factors, such as what percentage of the population is rural or has health care access issues, will make a difference in its answer. So, the first thing you do is start with context and constraints. And this is where I'm going to talk about how to interact with the AI a little more intelligently when you're making decisions. An example would be telling it: we're a regional plan with X lines of business, we're deciding between A and B, and the success metric equals Y. So context and constraints, and also other factors that are important, like I said: how much of the population is rural, who has access issues, what's urban, who's low income. Next, ask for comparisons and counterfactuals. Ask it, what would change this recommendation, and what assumptions matter the most? Harkening back to my prior answer. Then, insist on what I call the so what: turn this into three options with cost, risk, and timeline. So have it tell you its opinion based on your prompt, with cost, risk, and timeline for each option, and usually it will present you with a nice little grid of these things. And use it to prototype, not to rubber stamp. Most people know this. 
Meaning, we draft hypotheses, we delineate outreach segments, we look at sensitivity ranges, and then we validate with our team and data. So it provides the general parameters for those things, and then we validate them with the actual data that we have. Now I can take a moment to give relevant examples of what not to do with AI, because all that's great, but here's what we want to consider and here's what we don't want to do, because these are things I see. Go from "write an outreach plan for our members," which is not good, to something like "draft three messages for post-discharge members at a sixth-grade reading level, opt-out respected, with ride options; measure kept-visit conversion." Much more intuitive than "write an outreach plan for the members," because if you do that, you're not going to get anything that's meaningful or that can scale to the population. Here's another example. Go from "explain this regulation," and this is one that I see a lot because I work in regulatory compliance, to "list the top three operational implications for payers and/or providers, and five questions to review with our compliance team." And then another example that I see frequently is uploading a report and saying, "summarize our RAF trends," our risk adjustment score trends. Better to ask: "explain the risk adjustment factor dip, separating new versus returning members and two hotspot provider clinics, regions, whatever. Recommend one 30-day fix per group." Now, I will also say that based on my own experience, I often use AI if I'm stuck and need to get the creative juices flowing with regard to decision making or anything, really. It can be a very useful tool to gently remove barriers so that those juices can flow freely, especially when time is of the essence and I can't afford a protracted case of writer's or analyst's block. 
So hopefully those are some tips and tricks for using AI to support analysis and decision making that our audience will appreciate.

Collaboration with data teams and/or informatics specialists

Ilene MacDonald, host

I think so. Very specific prompts, it sounds like, very technical. And for any of our listeners who are leaders but aren't really analysts by nature, like you mentioned you were at the top of the show, what are some of the ways that you would suggest they collaborate with their informatics specialists or data teams?

Dawn Carter, guest

So speaking their language is always helpful. Here are some tips on how to accomplish that, even if you don't come from an overly technical or analytical background or haven't had much exposure to informatics. The first is to bring them the decision, not the data set. What this means is go to them and say, here's our decision: we need to cut avoidable admissions 8 percent in 12 months. What is the fastest path to doing that? Because they're looking to you for the direction that is framed by the decision. They're going to skate to where the puck is, which is what I like to say. And if they don't know in what direction they need to go, they can't give you what you need to get there, or at least not efficiently. So lead with the decision with these folks. You don't have to know all about their dashboards. They'll come to you with that in an intuitive manner. Just bring them the decision, and bring it very clearly. The second, and this is one that I love, is to agree on what I call a one-page decision specification. Frame the business question, which is a hallmark of business analysis. What is the business question you're trying to answer? Because that's what it comes down to in analysis. Then definitions, time frames, data sources, and how you'll act if X or Y is true. Now, sometimes the act of memorializing something and writing it in this way, with a specified format, goes a long way in bridging communication gaps with these folks. These people like format and structure, so be sure to provide them with it, and you'll have a much easier time interacting with them. The third is to set a cadence with these communications. Just like we learned in middle school science class, we make a hypothesis, we test it, and then we implement. We should move from quarterly quote unquote report reviews to adding in two-week test-and-learn cycles. Now, many of these folks are used to working in an agile development framework. 
So anybody out there who's familiar with software development is very familiar, no doubt, with agile, and this will be familiar to these folks. That's how they operate. So, yes, still have those quarterly report reviews, but supplement them with test-and-learn cycles, whether it's two weeks, three weeks, monthly, some shorter increment of time. And then last but not least, and this is my favorite, co-own a small portfolio of experiments. Now, no, I'm not a scientist. Like I said, I'm an analyst, but I have a great interest in science. And this is where organizations that are involved in Six Sigma have a leg up. Each of your experiments would have a business owner, a data owner, and a before-and-after metric. For organizations that leverage Six Sigma for quality improvement, this offers a great opportunity for doing these experiments in an incubator of sorts. It also offers an opportunity to implement Six Sigma or similar QI methodologies if you don't have any in place, so that you can start moving the needle from analysis to action. So definitely, if you're not doing something formal for quality improvement, Six Sigma is a good place to start.

Ilene MacDonald, host

Thanks, Dawn. You kind of touched earlier on the need for regulatory compliance awareness. You mentioned that that's something that's in your wheelhouse, obviously. What do today's leaders need for that? Do they need to be experts themselves, or do they have to have compliance experts on their teams?

Regulatory compliance awareness

Dawn Carter, guest

So, unpopular opinion: Regulation is actually strategy in health care. Now, if you only comply and view regulation as simply an obstacle to overcome, you're going to miss the competitive edge you could gain. Now, I can tell you from having been in business school through the doctorate level, because this time next year I will be Dr. Carter, yes, one of the things that is drilled into our heads is that in highly regulated industries such as health care, it is imperative that organizations find a way to leverage regulation into competitive advantage. Anyone out there who has read the book The Obstacle Is the Way by Ryan Holiday is familiar with this concept. Because value-based care focuses on cost, quality, and experience, organizations that can move the needle on these three things by leveraging regulatory change will be the winners. Let's look at a couple of examples. First, let's look at data and interoperability mandates, which affect the member and provider experience, and that's how you can move that needle. Again, this is very much Centauri's wheelhouse with interoperability, since we do own a QHIN and we are involved in clinical data interchange. Our example here is that CMS has issued a number of recent interoperability rules, including the Interoperability and Prior Authorization final rule, CMS-0057-F for those playing along, which requires payers to stand up FHIR-based APIs, which are interfaces for patient access, provider access, payer-to-payer interactions, and prior authorization, and to speed up prior authorization decisions. This has been all over the news, so most people in the audience are probably familiar with the work that's going on here. Now, leaders who treat this as an experience redesign exercise, and not just an IT project, can shrink denials, cut cycle time, and become a more provider-friendly plan. And we know that compliance dates for this begin in 2026-2027, depending on the provision. So that's just one example. 
Another one that a lot of this audience is probably more familiar with is the payment model changes we have going on, because this affects and can move the needle on revenue stability and quality bonuses. We know, of course, that in Medicare Advantage the V28 risk adjustment model is phasing in, with updated mappings and coefficients changing how acuity translates into payment. This has been ongoing for a while. Leaders who pivot to accuracy-first, prospective point-of-care capture inside normal care protect revenue without compliance exposure. This means a more balanced risk adjustment strategy between retrospective and prospective, rather than the traditional retrospective-heavy approach, which relies on activities such as chart reviews. And we have seen in the headlines how this is perceived. I actually gave a recent RISE webinar about balanced risk adjustment, and we'll be doing that also at RISE National, and I know we're going to be talking about that a little later. The competitive edge is in the balance: in ensuring that, as with data and interoperability mandates, the member and provider experiences are redesigned to enable decisions, again, here we've got decisions, to be made as close to the point of care as possible. So compliance experts are absolutely necessary. I am one of those, and I would like to think I have some job security in that regard, and we need them in order to identify the things to pay attention to, because there's a lot out there. It is up to the rest of the leaders to work collaboratively with these professionals to turn compliance into competitive advantage.

Ilene MacDonald, host

Excellent. You talked about interoperability, and that's part of your world. What does that look like for business leaders right now? How can that interoperability knowledge help with making members the center of care?

Dawn Carter, guest

So first and foremost, it's important for leaders to remember that interoperability is not just about moving data. That's what a lot of people immediately think when they think interoperability. It's actually about moving decisions, again, our decisions, closer to the moment of care. When it seems that building interoperable systems is an expensive chore, it's very important to remember it's not just about moving data. That's cheap. It's actually about moving those decisions to the point of care. That is not cheap, but the ROI is higher, and we're going to talk about that a little later. And here are some words of wisdom I can impart, especially for those who might be somewhat new to the concept. As I said, interoperability is a business capability. It's not just an IT project or some buzzword. It enables precision outreach, closed-loop referrals, and real-time care navigation, things that are going to move that needle on value-based care. It's the bedrock and the data that value-based care needs to deliver on its promises for members, providers, and cost savings. The second words of wisdom are that timeliness beats completeness. What this means, and this is as simple as I can make it, is no one is likely to dispute that a 70 percent view this week is more valuable than a 100 percent view six months late. We always say hindsight is 20/20, and it sure is, but in health care that's going to put you behind the eight ball. So take a more prospective view. Third, you have to govern definitions in this world. The concepts of things like admission, active member, closed gap, and attributed must match across product, clinical, and finance, just to name a few examples. I also like to give the example of one of the most hotly debated definitions in AI development, and that is human in the loop. 
If you ask 10 different people what this means, you'd likely get 10 different answers, and so it's important to have a good foundation of standard definitions when conducting discussions about interoperability. And then, last but not least, interoperability changes how organizations buy and build holistically, and vendor contracts related to interoperability work should include data rights, response times, event notifications, and real-world use cases, not just a bunch of file formats. And that's my best advice.

Ilene MacDonald, host

Where do you see interoperability headed in the next few years and if leaders haven't been thinking about it, what should they be doing to prepare for it now?

Dawn Carter, guest

So to prepare now, you want to certainly lock data rights into your contracts, build minimally viable data products that are tied to actual business outcomes, and definitely invest in identity management so that you can reconcile members across sources reliably. Now, we have seen privacy and security regulations acknowledge the evolution of identity management, and so we can expect much more progress toward a more efficient way to manage it. This has been a major challenge for interoperability. We can also expect to see increased enforcement of information blocking regulations to ensure that no one is profiteering from interoperability, removing this as a barrier. So, four bullets as a summary. First, we can definitely see more real-time eventing and automation. Think admission alerts triggering same-day outreach and post-discharge scheduling without any manual work. Second, we can expect broader data domains. Things like social determinants of health, pharmacy, behavioral health, and community-based referrals will be pulled into a single member fabric. This is a huge focus of the work going on at the federal level with interoperability, and I'm actually going to be giving a RISE webinar in April that talks about social determinants of health and this data domain, the work that's going on with that, and how it contributes to that single member fabric. That's happening April 28, and I'm co-presenting with our medical director. Third, prior auth transparency and status at scale. The winners will collapse turnaround times and make it painless for providers. So again, yes, we know there are mandates out there for this to happen, and a lot of work going on. Fourth, patient- and caregiver-directed data flows. Expect more member-authorized sharing, and your experience layer has to be ready for this, and this is where identity management comes in. 
Think being able to use the data from what is called the Internet of Things, the IoT, things like wearables and other wireless technologies that give more real-time insights into patient health status. And so those are all some of the exciting ways that we can see interoperability manifesting as we move forward.

Ilene MacDonald, host

Excellent. The fourth skill that you mentioned to me was financial awareness. What concepts do you think leaders currently underappreciate or misunderstand today?

Dawn Carter, guest

So I'm going to focus on a concept that most of our audience is already likely familiar with, and that is return on investment, the ROI, the good old ROI. It's the logical place to start, but it is absolutely not the finish line in value-based care. That might be an unpopular opinion, but I think it's a misunderstood and underappreciated concept, though not for the reasons you might think. Rather, we have to understand that ROI is way more nuanced than we might have learned in undergraduate business school or early in our careers. Value-based care investment ROI should be evaluated on five measures that reflect these nuances. First is speed, and we look at that in terms of time to value this year or in some other time increment. The second is cashability: what are the savings we can realize from this? The third is capacity: does it fit normal workflows? In other words, again, not wanting to disrupt what providers are doing, which is providing care. Fourth is contract fit: does it move the actual measures and the revenue levers? And fifth is conversion. As an example, going back to our wellness exam example, how many of the members who were reached out to ended up with an actual visit? Not just the outreach activity itself. So I'm going to give you another practical example that people are very familiar with: in-home assessments versus pre-visit planning. In-home assessments, of course, occur in the home; pre-visit planning occurs in the electronic health record, which incorporates prompts for care gap closure. So we have two very different activities. Now, most people listening to this know that in-home assessments are expensive but typically show the higher traditional short-term ROI on paper, because they can rapidly capture previously undocumented conditions and stabilize revenue in the current year. But these assessments' headline ROI is often late-year, labor heavy, and audit sensitive, as we know.
So it's great for a quick revenue patch but less great for durability. Now, in contrast, pre-visit planning inside the EHR usually shows a slightly lower short-term ROI, but it delivers steadier multi-year value, better provider trust, cleaner compliance, and spillover gains in quality and total cost of care. So let's apply our more nuanced ROI model that I just talked about, with the five characteristics, to this comparison. For in-home assessments: What's our speed? It's quick; we can stabilize revenue fast. What's our cashability? That's mixed; completion rates vary, and there's very limited impact on medical costs. What's the capacity? Very heavy: vendor dependent by and large, member fatigue, workflow disconnections with PCPs, lots of capacity needed to support it. Contract fit? Meh. Partial. It supports funding accuracy but has a weak link to quality and avoidable utilization. What's the conversion? Variable: many invites, fewer completed visits and downstream care. And of course we know there are a lot of reasons for this depending on the market. So what's the verdict? It's useful as a targeted supplement but not as a primary strategy. So let's do that same exercise with pre-visit planning inside the EHR, using your prompts in the workflow. What's the speed? Moderate but steady; it builds month-over-month gains. What's our cashability? It's durable: better documentation plus fewer bounce backs, duplicative visits, and readmissions. So it's pretty durable. What's its capacity? Very strong. It fits routine care and provides very low friction for clinicians and members if it's done correctly. What's the contract fit? Excellent. It supports accurate funding, quality performance, and total cost of care, and it's tied positively to all those things. What's the conversion? It's pretty high: prompts at the point of care lead to real actions right there at the point of care. So what's the verdict? Pick this for lasting performance.
Add in-home assessments only where they truly complement care. So the moral of the story is if you need a quick revenue patch, in-home assessments win on traditional ROI. If you want durable performance with lower risk, pre-visit planning wins over time with that nuanced approach. Of course there are moderating variables presented by the characteristics of the population that you need to consider when making decisions about the balance that makes the most sense relative to the ROI. And this is why we love those actuaries.
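The five-dimension comparison Carter walks through can be sketched as a simple scorecard. To be clear, the scores and weights below are illustrative assumptions for demonstration, not figures from the episode or any published benchmark:

```python
# Illustrative sketch of the five-dimension ROI scorecard described above.
# Scores (0-5) and equal weights are hypothetical assumptions only.

DIMENSIONS = ["speed", "cashability", "capacity", "contract_fit", "conversion"]

# Hypothetical scores loosely mirroring the discussion: in-home assessments
# are fast but weaker on fit and conversion; pre-visit planning is steadier.
strategies = {
    "in_home_assessment": {"speed": 5, "cashability": 2, "capacity": 1,
                           "contract_fit": 2, "conversion": 2},
    "pre_visit_planning": {"speed": 3, "cashability": 4, "capacity": 5,
                           "contract_fit": 5, "conversion": 4},
}

def weighted_score(scores, weights=None):
    """Average the five dimension scores, optionally weighted."""
    weights = weights or {d: 1.0 for d in DIMENSIONS}
    total = sum(weights.values())
    return sum(scores[d] * weights[d] for d in DIMENSIONS) / total

for name, scores in strategies.items():
    print(f"{name}: {weighted_score(scores):.2f}")
```

A plan could adjust the weights to reflect its own contract terms, which is exactly the kind of tailoring the actuaries Carter mentions would inform.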

Ilene MacDonald, host

I wonder, before we move on to our fifth skill, if you have any thoughts on how stronger financial literacy would specifically help leaders succeed in value-based care?

Dawn Carter, guest

So value-based care works when the math matches the medicine, and there are some things to consider when pursuing that match. First, sharper contract design: you can set realistic benchmarks, risk corridors, and stop loss to protect your downside risk while rewarding real improvement. Second, it enables better panel strategies: you'll balance acuity mix, outreach capacity, and network design so you can hit both your quality and cost targets. Also, better financial literacy allows you to implement care models with faster ROI, and to actually realize that ROI, because you're matching your interventions, such as home health or behavioral health integration, to populations where they actually add to the margins. And last but not least, you're better able to align incentives. You'll pay providers for actions that change outcomes, not just vanity metrics, which are numbers that look impressive at a glance but don't reliably change decisions, behavior, or outcomes. These metrics tend to reflect volume, visibility, or activity rather than conversion, impact, or value. So remember that: we don't want metrics that are just focused on volume, visibility, or activity. We want conversion, impact, and value. In short, vanity metrics make you, and perhaps the C-suite, feel good, but they don't give you actionable insights, so pursue those conversion metrics.

Ilene MacDonald, host

Great. Our final topic, for the skills anyway: you mentioned when we talked earlier that strategic storytelling is one of your favorite topics. Can you define it and explain why it's such a critical leadership skill?

Dawn Carter, guest

So I should probably preface my answer by distinguishing strategic storytelling from regular storytelling. Strategic storytelling is built to drive a specific decision and action, while regular storytelling informs or inspires. So think of this as the difference between Grimm's fairy tales or Aesop's fables, which entertain, and a book like The Obstacle Is the Way, which I mentioned earlier, or even any of the holy texts, such as the Bible, the Quran, or the Torah; those are designed to shape choices and behaviors in the real world. Now, some might argue that fables and fairy tales teach lessons that shape choices, and they do, but strategic storytelling goes a step further. It doesn't just offer a moral. It names the decision and the behavior change it seeks. For example, Aesop's The Tortoise and the Hare illustrates slow and steady wins the race. Memorable, and certainly a lesson, but not prescriptive about how to schedule running practice this week and what to do to get faster to win that race. Conversely, a book like The Obstacle Is the Way asks readers to reframe adversity, act deliberately, and cultivate will, which helps one move from insight to concrete action and actually win the race. Therefore, strategic storytelling is decision-ready communication. It targets a defined audience. It opens with: what should we do now, and why? It follows a tight arc, which we'll talk some more about: context, insight, implication, action. And it measures conversion and results, which I mentioned earlier. It also lives in the workflow, and we see this in EHR tasks, care manager cues, and member texts, so timelines actually change and there are actual measurable impacts. So here's a practical example. Here's what regular storytelling would say: Maria missed her PCP visit and ended up in the ER. We must do better. Let's send her a bill with a missed visit fee.
Now, usually the story stops here, or maybe there might be an attempt to call Maria to reschedule with no regard or care for why she missed her appointment in the first place. Only the punishment of the missed visit fee. Not to mention that missed visit fee has far less ROI than what I'm about to describe to you in the strategic story, especially if you consider the nuances of the ROI that we discussed earlier. So what does the strategic story say in the same case with Maria? Let's apply our story arc. Our context: members like Maria have two times the ER use of other members. The ER provided a report back to the PCP on Maria's visit, and the care navigator at the PCP's office called to follow up on why the appointment was missed. What's the insight? When care navigators or other scheduling staff asked Maria and others like her why they missed their visits and went to the ER, they indicated transportation difficulties as the issue. So what is the action? Starting Monday, care navigators or other scheduling staff will be prompted to auto-offer rides when they book, and we'll push a visit prompt in the EHR. And the implication of this: if we remove the transportation barrier at the moment of scheduling and prompt clinicians in workflow, we'll convert missed visits into timely primary care and cut avoidable ER use. If we don't, this cohort continues to drive twice the ER utilization and higher PMPM costs. And we set some targets for conversion, and again, this is all very simplistic, it takes time to do this, but you get the idea. It might be 45 percent of visits kept in six weeks and/or ER utilization down 15 percent in 90 days. And then the threshold: if the kept-visit rate goes lower than 30 percent by week four of this experiment, we adjust and retry. So strategic storytelling is turning complex realities into a clear decision with aligned action. It's not spin, it's structure. So in summary, context is: what problem matters, and for whom?
Insight is: what's new or counterintuitive that we've learned? Implication is: what happens if we act, or if we don't? Action: the specific decision, owners, and timeline. This is critical because busy stakeholders don't need more data. They need confidence, and a strong strategic story builds that confidence without oversimplifying.
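The conversion targets and the adjust-and-retry threshold in Maria's example (45 percent kept visits in six weeks, a 30 percent floor by week four) can be sketched as a simple check. The function and the sample figures below are hypothetical, invented only to illustrate the targets stated above:

```python
# Hypothetical sketch of the conversion-threshold check from Maria's example.
# The target and the adjust-and-retry floor come from the discussion above;
# the sample kept/booked counts are invented for illustration.

TARGET_KEPT_RATE = 0.45   # 45% of visits kept within six weeks
ADJUST_FLOOR = 0.30       # below 30% by week four -> adjust and retry

def evaluate_pilot(kept: int, booked: int, week: int) -> str:
    """Return a status for the outreach pilot at a given week."""
    rate = kept / booked if booked else 0.0
    if week >= 4 and rate < ADJUST_FLOOR:
        return "adjust-and-retry"
    if rate >= TARGET_KEPT_RATE:
        return "on-target"
    return "monitor"

print(evaluate_pilot(kept=12, booked=50, week=4))  # 24% at week four
print(evaluate_pilot(kept=23, booked=50, week=6))  # 46% at week six
```

The point of the sketch is the structure: a named decision rule lives alongside the metric, so the team knows in advance what triggers a course correction.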

Ilene MacDonald, host

Do you have any thoughts on, maybe not mistakes, but common shortcuts that leaders might take when they're working with data, that you think would be helpful as part of this strategic storytelling?

Dawn Carter, guest

Yes, I'm going to give five of my favorite examples, with what the story should instead include and an example of what it looks like in practice. The first is relying too heavily on averages, and you heard me mention this earlier. The mistake here is quoting a single average that hides the real story. So instead, you want to break it down: maybe new versus returning members, a few hotspot clinics, providers, or regions, or even specific measure families like diabetes or heart failure. Your informaticists can help you decide the best way to do that. And an example of what you say to yourself is: this isn't everywhere. Two regions and new members are driving most of the dip, so we'll fix onboarding and fix those clinics first. Second would be confusing documentation activity with better care. The mistake here is celebrating, ooh, we recorded more conditions, as if it means better health. Yes, we want our clinicians to tell the story through documentation, but we have to be able to see the results of that effort. So instead, we want to pair accuracy with real outcomes, such as kept appointments, fewer avoidable ER visits, faster follow-ups, any number of things. And what we say to ourselves is: yes, documentation is cleaner, and that's great, and the proof is fewer readmissions in the same group. The third thing that I see is chasing year-end numbers instead of building durable routines and optimizing workflows as part of an overall prospective strategy. And I mentioned this earlier also. Here's our mistake, and hopefully none of you are shaking your heads when I say this: big frantic pushes in Q4 that burn everyone out and create abrasion, and nothing changes in daily care. No one's doing that, right? Instead, put simple prompts into normal visits, connect hospital alerts to seven-day follow-ups, and make first PCP visits easy to schedule. This is where thoughtfully designed workflows that incorporate prompts come in to facilitate accurate storytelling.
The story is in that day-to-day, and the challenge is using the data to determine what the day-to-day looks like. So we say to ourselves: no end-of-the-year scrambling, burnout, or abrasion. We want steady, routine capture during regular care. Fourth, and this is related to the one I just talked about, is keeping data in a separate portal, where insights live in dashboards, portals, and pretty reports. Here's the mistake: no one checks these during care. And you are asking for a fight and abrasion if you tell providers they need to consult a bunch of reports before, during, and after they deliver care. Trust me, they want to deliver care, not be in a portal rifling through reports they don't understand or looking at a bunch of dashboards. Instead, push tasks and nudges into systems that people already use in their workflows: the EHR inbox, the scheduling queue, care manager work lists, members' applications, any number of things. And you say to yourself: if it doesn't show up where work happens, it's not going to happen. And then last but not least, my favorite: ignoring equity in the story. The mistake here is declaring victory on overall quality while gaps widen for certain groups. Instead, show your quality and risk results by language, geography, income, or any number of other significant demographic factors, and tie these to targeted outreach or community partners. Again, be very specific for your population, and your informaticists can help you with this. Say to yourself: okay, the average improved, but rural members still lag. We'll need to add transportation support and some other phone scheduling interventions there. And those are my recommendations.
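Carter's first pitfall, relying on a single average, can be shown with a toy breakdown. All of the group names and rates below are invented purely to illustrate the stratification step:

```python
# Toy illustration of breaking a single average into subgroups,
# as recommended above. All figures are invented for demonstration.
from collections import defaultdict

# (region, member_type, gap_closure_rate) -- hypothetical records
records = [
    ("north", "returning", 0.82), ("north", "new", 0.44),
    ("south", "returning", 0.79), ("south", "new", 0.41),
    ("east",  "returning", 0.81), ("east",  "new", 0.78),
]

overall = sum(r[2] for r in records) / len(records)
print(f"overall average: {overall:.2f}")  # a single number hides the story

# Stratify by member type: the dip is concentrated in new members,
# and (looking region by region) in just two regions.
by_type = defaultdict(list)
for region, member_type, rate in records:
    by_type[member_type].append(rate)

for member_type, rates in by_type.items():
    print(f"{member_type}: {sum(rates) / len(rates):.2f}")
```

The same groupby pattern extends to the measure families, clinics, and demographic factors Carter lists; an informaticist would pick the strata that matter for a given population.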

Ilene MacDonald, host

Thank you. I'm excited to be seeing you in a few weeks at RISE National. When we talked, I know you said you're going to be doing an interactive, live Bingo activity that includes what I think you described as dumb strategies. I was wondering if you could share an example of a strategy that you think organizations should maybe put to bed immediately, and what should they do instead?

Dawn Carter, guest

Yes. And when I gave my webinar about this, this particular strategy was one that people admitted was a problem in their organization, at least as much as they knew. And this is what I call spray-and-pray gap closure. This is the mass texts, robocalls, and generic mailers sent to every member who's got some sort of a gap. I will tell you, and most of you know this, it burns trust with members, irritates providers, and wastes money if it's not done right. The judicious and intelligent use of analytics, like we've been talking about, can help organizations avoid these dumb strategies holistically and find the ones that will net the best results for the least cost, including the cost of abrasion, because there is a cost that comes with abrasion. Now, it should be noted that this dumb strategy of spray and pray also includes the situation where an organization only invests in the same old retrospective strategies without really looking at how much of a balance of retrospective and prospective would benefit them and the needs of the population they serve. Because it's going to look different for every health plan. This is not set it and forget it. This is not a rubber-stamp template for everyone. Retrospective strategies are viewed as safe, quote unquote, because their success can be definitively measured. But with the right analytics, those strategies can be complemented and even made more effective with the right targeted prospective strategies. So instead of spray and pray, we want precision outreach. And here are examples of what that looks like, and note that it's really similar to our example with Maria earlier. We want to target members with a high likelihood to close and a high impact, such as rising risk, recent ED visits, or no PCP relationship. Again, informaticists can help you tell who those members are with a high likelihood to close.
Second, use multi-channel nudges that are tied to a scheduled action, such as a text message with a link they can use to schedule, and to schedule transportation if they need it. Third, coordinate with your providers so that your messages match clinic availability. That's important. It seems like a no-brainer, but make sure that clinicians are available when people are going in there to schedule, so they're not frustrated by, wow, my PCP is not available, or no one's available to see me. Also, measure reach, conversion, kept appointments, and outcomes: that whole arc, not just how many messages you sent. Again, that vanity metric of, oh, I sent 5,000 messages. Well, how many of those actually resulted in a kept appointment, and then an outcome like reduced ED visits or PCP visits captured?
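Precision outreach as Carter describes it, ranking members by likelihood-to-close times impact and then attaching a concrete scheduled action, might be sketched like this. The member records, scores, and action strings are all hypothetical; in practice the scores would come from the informaticists' models she mentions:

```python
# Hypothetical sketch of precision outreach: rank members by
# likelihood-to-close x impact instead of blasting everyone,
# then tie each nudge to a concrete scheduled action.

members = [
    {"id": "A", "close_likelihood": 0.7, "impact": 0.9,   # rising risk, recent ED
     "needs_transport": True},
    {"id": "B", "close_likelihood": 0.2, "impact": 0.3,
     "needs_transport": False},
    {"id": "C", "close_likelihood": 0.8, "impact": 0.6,   # no PCP relationship
     "needs_transport": False},
]

def outreach_plan(members, top_n=2):
    """Pick the top-N members by priority and attach a scheduled action."""
    ranked = sorted(members,
                    key=lambda m: m["close_likelihood"] * m["impact"],
                    reverse=True)
    plan = []
    for m in ranked[:top_n]:
        action = "text with scheduling link"
        if m["needs_transport"]:
            action += " + ride offer"
        plan.append((m["id"], action))
    return plan

print(outreach_plan(members))
```

Measuring the whole arc then means tracking, for each planned action, whether it converted to a kept appointment and a downstream outcome, not just whether the message went out.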

SDoH misconceptions

Ilene MacDonald, host

You mentioned, I think earlier, that social determinants of health is still a cost driver, even if it's not necessarily being pushed in regulations right now. What do you think are the biggest misconceptions when it comes to SDoH and what it can do for leaders regarding the impact on financial and quality performance?

Dawn Carter, guest

So this is a timely question, because just within the last few weeks, CMS released its report on the accountable health communities model evaluation. This was a model put forth by the Center for Medicare and Medicaid Innovation, the CMMI; it launched in 2017 and concluded in 2023. And it demonstrated that focusing on patients' needs related to upstream drivers of health, our SDoH, can lead to cost savings while maintaining or improving the quality of care beneficiaries receive. This was evidenced by reductions in inpatient and emergency department utilization; those are the metrics this model focused on. Now, this model had two tracks. In the assistance track, Medicaid beneficiaries had a 3 percent reduction in total expenditures, and fee-for-service Medicare beneficiaries had a 4 percent reduction in total expenditures. In the alignment track, the other one, Medicaid beneficiaries had a 7 percent reduction in total expenditures, and Medicaid beneficiaries in the intervention group also had lower inpatient admissions and unplanned readmissions relative to the control group, indicating that reduced inpatient use, including unplanned readmissions, was a key driver of the lower observed expenditures among Medicaid beneficiaries. There were other positive findings, but the message here is that organizations benefit from addressing those upstream drivers of care. And there are some misconceptions that could prevent organizations from realizing similar outcomes. The first is, quote, this is charity, not strategy. In reality, addressing SDoH often protects margin by reducing avoidable utilization. SDoH has been estimated to drive 80 to 90 percent of health outcomes, and so health plans and providers ignore it at their peril. This is not charity, it is common sense. Need evidence? Well, I'm going to give you some.
Read the paper authored by Berwick, Bachelor, Trotsky, Graybau, Gilfillin, Isasi, Milstein, and Nichols. Say that five times fast. That's the January 2025 Health Affairs article entitled From Laggard to Leader: Why Healthcare in the United States Is Failing and How to Fix It. Great article. The second misconception is: we can't measure the ROI of these things that are focused on SDoH. Yes, you can, because you can tie it to specific cohorts, events like ED visits and readmits, and time windows. Now, in risk adjustment, various HCCs are tied to food insecurity, for example, such as diabetes and hypertension, as well as a variety of quality measures. Remember, ROI is nuanced, and reverting to a traditional black-box ROI here will result in missed opportunities to really move the needle on value-based care. In fact, if you are dealing with a leader who claims the ROI cannot be measured, direct him or her to the paper by Nichols, Waidman, Clemence Cope, Garrett, and Taylor, published in Health Affairs in September 2022, titled Tracing Value from Social Determinant Solutions. Also in September 2022, the Commonwealth Fund published a guide related to calculating the ROI for partnerships that address SDoH. And last but not least, although there are many more misconceptions, they're outside our scope: we can't address homelessness and food insecurity; we can't carry food boxes to everybody or give everybody a home. Maybe not, but contracts, community partnerships, and benefit design give you more levers than you think to address these things. And I'm going to stress this: if you don't remember anything else I've said on this podcast, please remember this. Everyone has a role in addressing SDoH: payers, providers, the government, communities, and the members themselves.

Ilene MacDonald, host

And to end our conversation, do you have any practical ways that care management teams can integrate the SDoH into their value-based strategies?

Dawn Carter, guest

Yes. And I'll give a few. The first is to screen with intention. Of course, we've got to screen first to know what the needs are. Short, repeatable screeners can be embedded in visits or digital touch points, and they focus on actionable needs such as food, housing, transportation, utilities, caregiving, any number of very common social needs. Now, many EMRs and EHRs have templates built in for this purpose, so if you are not using one, inquire; the large ones especially have templates for this. And if not, there are many validated tools available that can be incorporated into workflows. CMS requires certain validated tools to be used for this, and there are plenty, and they are very easy to use. The second is to close the loop. Use referral platforms or curated lists of community partners, and confirm service completion, not just that you sent a referral and then it went into some void. There is ongoing work at the federal level with interoperability to facilitate closing the loop, and again, I'm giving a webinar that talks about this on April 28, so stay tuned for that. It doesn't do any good to send a patient away with a referral to some community-based organization and then have no way of knowing if they actually completed it, because then how are you going to measure any movement in the value that's tied to that? You can't. The third is to bundle the offer: pair social determinants help with a clinical action. An example would be scheduling a colonoscopy paired with transportation help. Or, say somebody's got diabetes and needs diabetes education; maybe they get a food box delivery. The next one is to fund what works, which is tied to bundling the offer. Stand up small pay-for-outcomes pilots with community-based organizations. Get creative. Reward and scale the partners that hit kept-appointment and reduced-acute-use metrics. Then be sure you measure the right outcomes.
And we've talked about this with vanity metrics and such earlier. Event rates such as ED visits and readmissions, appointment adherence, PMPM for the targeted cohort, member satisfaction, provider experience: many different outcomes could be measured. And again, use your informaticists, your actuaries, and your other staff to help you figure out what those outcomes should be based on your population. You also want to make sure you're building these strategies into your contracts and giving them teeth. Have incentives for completed closed-loop referrals and for meeting equity-stratified quality measures, and give support for care team community health workers. And then last but not least, because I am also a medical coder, you want to document completely and accurately. You've got to capture those Z codes; they and the narrative context support care planning and quality and equity reporting, even when they don't impact risk scores and measures directly. Now, these codes generally fall into the range of Z55 through Z65, and there are special rules and medical coding guidelines related to who can capture them, which include a patient's ability to self-report as long as a clinician reviews and signs off. So that's some of my best practical advice for those care management teams who are doing that good work of addressing SDoH with our clinicians.
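As a rough sketch of the Z55 through Z65 range Carter mentions, a reporting pipeline might flag SDoH-related Z codes like this. The helper and the sample code list are illustrative only; official ICD-10-CM guidelines govern what may actually be captured and by whom:

```python
# Rough sketch of flagging SDoH-related Z codes in the Z55-Z65 range
# mentioned above. Illustrative only; defer to official ICD-10-CM
# coding guidelines for the actual capture and sign-off rules.
import re

def is_sdoh_z_code(code: str) -> bool:
    """True if an ICD-10-CM code's category falls in Z55-Z65."""
    m = re.match(r"^Z(\d{2})(\.\d+)?$", code.strip().upper())
    return bool(m) and 55 <= int(m.group(1)) <= 65

# Hypothetical list of codes from an encounter
claim_codes = ["E11.9", "Z59.41", "Z63.0", "I10", "Z71.3"]
sdoh = [c for c in claim_codes if is_sdoh_z_code(c)]
print(sdoh)
```

A check like this supports the equity and quality reporting she describes, surfacing documented social needs even when those codes don't move risk scores directly.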

Ilene MacDonald, host

Thank you, Dawn. I'm really looking forward to learning more at RISE National, for those who want to attend your roundtable and play that live-action Bingo, and also at your webinar on April 28. Much appreciated.

Dawn Carter, guest

Thank you so much for having me. It's always a pleasure.