A new website, created by the South Carolina Center of Excellence in Addiction (COE), makes it easier for state and local leaders, health care providers, researchers, and citizens to understand how well opioid use disorder (OUD) treatment is working in their communities. The website provides an example of how states can report on core metrics for assessing OUD treatment.
Jodi Manz is the director of COE, which is located in the state capital, Columbia; Christina Andrews directs the data analytics core within the COE and is an associate professor in the Department of Health Policy and Management at the University of South Carolina’s Arnold School of Public Health; and Margaret Key gives the COE a local treatment perspective from her position as the executive director of the Aiken Center, which provides addiction recovery services in Aiken, South Carolina.
This interview with the three of them has been edited for length and clarity.
Andrews: We wanted to answer key questions about how the opioid “cascade of care”—a framework used to track and assess the quality of care delivery to people, from diagnosis to outcome—is performing in South Carolina. How many people who have OUD have been diagnosed within our treatment systems? Among them, how many are linked to care within a reasonable period of time? And finally, among those who enter medication or psychosocial treatment, or both, how many of them are retained in care? We began with data from the state’s Medicaid population because it’s really high quality and comprehensive, though we’re seeking to expand to other groups as well.
Andrews: The first iteration focuses on the Medicaid population because it’s a large portion of people within the state and also includes many people who are struggling with OUD.
We developed a set of core measures to understand how people were moving along this cascade of care, or dropping out. We obtained the Medicaid data and computed the rates at each point along the cascade, both for the entire state as well as different key geographic regions and individual counties.
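The cascade computation described above can be illustrated with a minimal sketch. Everything here is hypothetical: the field names (`diagnosed`, `linked_14d`, `retained`), the sample records, and the retention window are invented for illustration and are not the COE's actual data model or code.

```python
# Illustrative sketch with made-up data and field names: computing
# cascade-of-care rates from de-identified, claims-style records.
# Each record marks whether an enrollee was diagnosed with OUD, was
# linked to care within 14 days of diagnosis, and was retained in care.

from dataclasses import dataclass

@dataclass
class Enrollee:
    county: str
    diagnosed: bool    # received an OUD diagnosis
    linked_14d: bool   # entered treatment within 14 days of diagnosis
    retained: bool     # retained in care past some defined window

def cascade_rates(enrollees):
    """Return the share of people reaching each stage of the cascade."""
    total = len(enrollees)
    diagnosed = [e for e in enrollees if e.diagnosed]
    linked = [e for e in diagnosed if e.linked_14d]
    retained = [e for e in linked if e.retained]
    return {
        "diagnosis_rate": len(diagnosed) / total,
        "linkage_rate": len(linked) / len(diagnosed) if diagnosed else 0.0,
        "retention_rate": len(retained) / len(linked) if linked else 0.0,
    }

# Tiny invented sample: 10 enrollees, 2 diagnosed, 1 linked, 1 retained.
sample = (
    [Enrollee("Aiken", True, True, True),
     Enrollee("Aiken", True, False, False)]
    + [Enrollee("Aiken", False, False, False) for _ in range(8)]
)
print(cascade_rates(sample))
# {'diagnosis_rate': 0.2, 'linkage_rate': 0.5, 'retention_rate': 1.0}
```

Grouping the same records by `county` instead of pooling them would yield the county-level profiles described next.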
Andrews: First, we created individual county-level profiles so any resident of any county can come to our website, pull up their community’s information, and see how their systems are performing and compare it to their region as well as the state as a whole.
But we didn’t just want to produce this information and say, “Here’s a bunch of problems to throw into your lap.” We share best practices to try to improve on those outcomes—linkage to care, initiatives to improve retention—and we also try to refer community leaders to our center’s technical assistance team so they can talk with a real human being and get hands-on advice.
We also created a web-based dashboard, which includes maps showing what we found county by county in South Carolina.
Our next steps will be to include the uninsured population, which is tougher to find because there’s no claims database for people without insurance. And I have the dream that down the road we’ll be able to bring in the commercially insured population as well.
Andrews: We’ve found that we’re not where we would like to be. We know from Medicaid data for all states, for example, that on average about 3% of Medicaid enrollees receive an OUD diagnosis. In South Carolina, it’s 1%, yet we don’t have any reason to expect that there’s actually less OUD here. In fact, South Carolina is number 10 among states in terms of the rate of opioid overdose deaths; that suggests that there are some really important opportunities to improve efforts to identify and diagnose people in our state who have an OUD and connect them to care.
We found that among the 1% who are diagnosed, about 20% of people in our Medicaid program are linked to care within 14 days—and it’s only slightly higher when looking at 30 days. It’s particularly problematic in emergency departments, where fewer than 10% of people who come in with an opioid-related emergency are then linked to follow-up care.
Andrews: Originally, we had two main pieces of data: how many people overdosed, and how many people died. Those are really important statistics, of course. But to understand how we can improve our systems of care, we need information about how good of a job we’re doing in figuring out who needs treatment, and how well we’re doing in terms of getting them to care. And that data was lacking.
Manz: We also want to be judicious stewards of the opioid settlement funds and grants coming to the state. In order to do that well, we need to understand how our treatment system is performing.
Key: It’s easy at the county level to feel alone. It’s not necessarily that you don’t have enough data; it’s just that you don’t know how to narrow it down to the metrics that matter.
Behavior change isn’t just about clients: In this constantly changing environment, we have to figure out how to stay nimble as peer support specialists, administrators, front-desk people, prevention specialists, clinicians, and so forth.
The beauty of this project is that it helps us take an organized approach so that we can all start talking in a similar language, measuring points along the journey.
Manz: South Carolina did not expand Medicaid in recent years, leaving the state with more uninsured individuals. So the very first question we get is: “What about the uninsured?” It’s a valid question and figuring out how to get and analyze that data is a very lengthy process.
The other is that people’s eyes often glaze over when you start talking about data and analytics. It’s not necessarily a component of service providers’ day-to-day work. So we have to make it accessible to folks like me who aren’t data analysts.
Andrews: One of the greatest strengths of our center is our partnership with state government, who trust us when we say, “This is important; let’s make it happen.” None of this work would be possible without them. On the flip side, when our partners saw the first dashboard we produced, they said, “No, no, no. This is way too complicated and data-heavy.” We had to strike a balance.
Key: Well, if I click on the dashboard and look at Aiken, I might think anecdotally that we’re about average in one category or another. But this database tells us if we’re a little better or a little worse, and helps us figure out how to prioritize our time. And it helps that across counties, the database’s definitions of the measures are the same—it’s really apples to apples. You’re not lost in the wilderness going with your gut instincts.
Eighteen months ago, there were two people dying each week from an opioid overdose in my county. This past year, it was one person. So, a 50% drop. But that’s still one person per week, and my staff and I find ourselves thinking, “OK, how can we help so that one death is prevented?”
The more this data is built out, the more accurate a picture we’ll have and the more likely we are to be successful in saving lives.
Andrews: I’ll add that if you see some counties that seem to be doing well in a particular area, you can find out more. What are they up to? How are they doing it? And see if we can find some actionable steps that could be shared with other folks.
Manz: We also heard from one of our community partners that she just used the data to help with a grant application. It’s great to hear that local data is helping grant writers do their jobs.
Over time, the center also hopes to contribute to an impact analysis to understand how grant dollars are being spent in our state. And this data starts in 2021, so at particular points when we’ve made specific investments, we’ll be able to see what the impact has been.
Manz: I think the first thing is to identify how long you need to obtain and analyze the data, and then multiply it by two. Data is notoriously challenging. Also, identify potential uses beyond your primary concern. Doing that early on can help you tailor your approach and dissemination.
Key: From my position, I’d tell states to think about the end user of the products you’re going to create. Make sure it’s serving the citizens and will allow service providers to actually improve what they do. That builds credibility from the beginning; it builds trust that there’s care behind the research.