How Counties Can Use Evidence-Based Policymaking to Achieve Better Outcomes

Research can guide budget and policy decisions in local jurisdictions

Overview

Counties play an essential role in delivering front-line services to residents. From administering state and federal benefits to operating jails and supporting local hospitals, they invest more than $550 billion annually in their communities, according to the National Association of Counties (NACo). Given the scope of their responsibilities, counties also have a significant opportunity to improve the quality and efficiency of the programs that serve their residents.

But county leaders often face substantial challenges in their efforts to provide high-quality services. They may lack adequate tools to determine the effectiveness of public programs as well as the data collection and oversight systems needed for improvement. However, better access to research and advances in technology are making it easier for leaders to engage in these activities, particularly through evidence-based policymaking. This approach uses the best available research, such as existing program evaluations and outcome analyses, to guide policy and funding decisions. It allows county leaders to ensure that they are choosing the best services for their residents, maximize limited resources and benefits for taxpayers, and track and continuously improve their programs.

Because counties are smaller than most states and closer to services and providers, county leaders often observe community problems firsthand, have personal relationships with agency leaders and community stakeholders, and can work quickly to solve problems. These factors can help counties get the support, buy-in, and coordination they need for their evidence-based policymaking endeavors to succeed. Although significant research has been conducted on state-level efforts to advance evidence-based policymaking, little research exists about these practices at the county level. This report by the Pew-MacArthur Results First Initiative and NACo is a first-of-its-kind look at how counties engage in this work. It also identifies ways counties can support and sustain their success, including:

  • Building internal support for needed changes. 
  • Starting with small innovations and scaling them up. 
  • Engaging external partners. 
  • Investing in capacity building for provider organizations. 
  • Leveraging existing administrative data.

Key components of evidence-based policymaking

In 2014, Results First created a framework for evidence-based policymaking that includes five key components: program assessment, budget development, implementation oversight, outcome monitoring, and targeted evaluation.

This report builds on that framework and finds that many of the steps applicable to states also apply to counties. In addition to exploring the work that local jurisdictions are doing within these five components, this report examines the lessons that are specific to counties. While they do not need to tackle all of these steps at once, local governments can adopt some of these strategies—those that best fit their needs—to support better decision-making.

Program assessment: Identifying effective programs

A key first step is getting a clear understanding of what programs a county is operating, how effective those programs are at achieving outcomes, and how existing services compare to alternatives.1 While this may seem like a simple task, few county officials have access to these details. An absence of this information can limit policymakers’ ability to direct funding to the most effective services. This section highlights three strategies counties are using to identify which of their currently funded programs are working, which need additional testing, and which are not delivering anticipated outcomes and could potentially be replaced with a more effective alternative.

Creating a common language. Without a shared understanding of what a “program” is or what constitutes an “evidence-based program” versus a “promising practice,” leaders are likely to find that different interpretations of these words lead to confusion among agency heads and inconsistent standards. To promote uniform interpretation of research on program effectiveness across agencies, some counties have established formal definitions or standards of evidence.2

The Sonoma County, California, Board of Supervisors created the Upstream Investments initiative in 2007 in response to concerns over escalating criminal justice costs. The cornerstone of the initiative is an online portfolio that catalogs and includes detailed information on 109 prevention-focused programs operating in the county.3 The portfolio distinguishes between evidence-based and other practices, such as those that are promising or innovative. To be listed in the portfolio, government agencies and provider organizations must apply to have their programs considered for inclusion, and a panel of research and evaluation experts reviews proposals to determine whether the services have sufficient evidence to be included. In their applications, providers must share a logic model—a visual depiction of the relationship between a program’s activities and its intended results4—as well as service procedures, an evaluation plan, and other information that will help monitor effectiveness. Creating common standards for “evidence-based” and for inclusion in the portfolio helps ensure that featured services meet certain requirements and that providers measure and understand the effectiveness of their programs relative to alternatives.

Determining the effectiveness of existing programs. Once counties have defined clear standards, they can inventory the programs they fund and determine whether they are, according to the counties’ definition, evidence-based. National clearinghouses are useful tools in this process. Clearinghouses provide summarized research on programs and assign ratings based on the program’s level of effectiveness. Counties can use information from clearinghouses to assess whether programs in their own inventory demonstrate positive, negative, or no impacts on target populations, or whether they require further evaluation (see Appendix B for additional information on using research clearinghouses). In addition, counties have used baseline data on programs to uncover overlapping programs or gaps in services.
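To make this step concrete, the short Python sketch below shows one way an analyst might organize an inventory like this: each funded program is paired with the rating a clearinghouse assigns, then grouped to flag candidates for continuation, further evaluation, or replacement. The program names and ratings are hypothetical, invented for illustration rather than drawn from any actual clearinghouse.

```python
# Illustrative sketch: classifying a county program inventory against
# clearinghouse-style evidence ratings. All names and ratings are
# hypothetical placeholders.

# Each entry pairs a funded program with its assigned evidence rating.
inventory = {
    "Family Therapy Program": "evidence-based",
    "Peer Mentoring Pilot": "promising",
    "Community Outreach Workshops": "unrated",   # insufficient research
    "Legacy Tutoring Model": "no effects",       # rigorous studies found no impact
}

def triage(programs):
    """Group programs by evidence rating to flag candidates for
    continuation, further evaluation, or possible replacement."""
    groups = {}
    for name, rating in programs.items():
        groups.setdefault(rating, []).append(name)
    return groups

for rating, names in triage(inventory).items():
    print(f"{rating}: {', '.join(names)}")
```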

The Department of Correction and Rehabilitation in Montgomery County, Maryland, worked with Results First to inventory and assess the effectiveness of its Detention Services Division programming. The county cataloged all 86 services operated by the division to help provide people who are incarcerated with skills and information for community re-entry, then collected additional information on 30 of these serving behavioral health needs or specialized populations. These steps helped the county catalog the assistance it offers, the specific target populations, and the outcomes expected, such as reduced recidivism. Using the Results First Clearinghouse Database, a resource providing information on effectiveness from nine national clearinghouses, the department found most of the 30 reviewed practices to be evidence-based or promising. Eleven were unrated, meaning there was not enough research to adequately demonstrate effectiveness. This process of creating an inventory and assessing effectiveness confirmed for division leadership that most of its core specialized services have been tested and are effective and appropriate for the populations they serve. It also enabled the department to identify several programs with limited or no rigorous research supporting their effectiveness, which department leaders now plan to evaluate.5

Using research to select among effective program options. Local governments can use research clearinghouses as a menu of potential program options. After assessing the effectiveness of their programs—and the degree to which these meet the needs and priorities of their communities—department heads or other local leaders may determine that it is time to replace an ineffective or outdated program or add new services. County leaders can then turn to clearinghouses, which often include detailed program information that agency or program managers can use to identify an effective program that will best fit community needs.

Salt Lake County, Utah, utilized existing research to identify and fund two evidence-based programs to meet the needs of at-risk youth in Kearns Metro Township, an underserved community within the county. After a community coalition analyzed data from a state youth survey and found that Kearns youth had higher rates of alcohol and nicotine use, gang involvement, and symptoms of depression and other mental health conditions than youth in the rest of the state, the coalition used the data to identify priorities for its youth programming. The coalition then created a comprehensive list of its programs, compared it to the identified priorities, determined areas where the township needed additional programming, and then consulted Blueprints for Healthy Youth Development, a national clearinghouse, to find proven effective programs that would fit those needs.

“We ended up with two programs: Guiding Good Choices and Positive Family Support. We decided they were the right programs because they fit several of our priorities,” says Caroline Moreno, education program manager at Salt Lake County Human Services.6 The youth survey data exposed a need for services that could help Kearns promote prevention and support youth and families to avoid risky behavior. Working through this process empowered the coalition to select programs that addressed these needs and had demonstrated results under rigorous evaluation.

Budget development: Using evidence to inform funding decisions

In addition to helping local leaders identify and assess programs, counties can use evidence—including performance data, evaluation or audit findings, and national research—to inform funding decisions.8 Instead of relying on anecdotes or a prior year’s expenditures to determine how to allocate limited funds, county leaders can make those decisions based on how effective programs are, according to research and program monitoring data. The strategies outlined below demonstrate some of the ways counties are using research for budget development.

Using evidence in the budget process. Some counties have instituted mechanisms that require agencies to provide research on a program’s effectiveness at some point during the appropriation process. This enables local leaders to assess whether programs are likely to produce positive results as they determine how to allocate resources.

One such mechanism is a requirement that all agencies include information on a proposed program’s effectiveness as part of their requests for funding. The Santa Barbara County, California, Community Corrections Partnership (CCP), a group of representatives from all branches of the local criminal justice system, develops strategies to reduce recidivism among individuals in the county justice system and oversees criminal justice funding totaling more than $13 million annually.9 The CCP recently implemented a funding proposal review process—the Criminal Justice Funding Opportunity form (see Appendix B for a snapshot of the form)—requiring any agency that requests funding from the CCP to cite evidence of program effectiveness, the specific outcomes the proposal will address, how it plans to measure those impacts, and—where possible—cost-benefit analysis information.10 The CCP reviews this information and recommends whether or not to fund a program; the proposing group must supply the recommendation to the Santa Barbara County Board of Supervisors for approval.

“Our goal is to have a robust project so that the supervisors, who are not all criminal justice experts, can feel confident about approving the funding,” says Tanja Heitman, chief probation officer for Santa Barbara County. “The CCP understands the difference between evidence-based programs and promising practices, and the agencies will address the hard questions before going to the Board of Supervisors.”11

Similarly, the Boone County, Missouri, Community Services Department uses research to direct resources in its Children’s Services Fund toward effective programs. Funded by a special ¼-cent tax passed by local voters in 2012, the fund generates approximately $6.7 million annually and can be used for a range of community-based prevention and intervention services for children up to age 19.12 The Boone County Children’s Services Board, a nine-member group of experts and administrators appointed by the county commission to oversee the fund, reviews all applications for funding. The board ensures that programs meet statutory eligibility requirements, are demonstrated to work, and meet current and emerging community needs.13 “The board looks at each proposal and considers if it’s evidence-based or has research supporting it,” says Kelly Wallis, director of the community services department. Wallis notes that proposals don’t have to include specific proven programs to gain approval, but that organizations at a minimum need to develop a logic model showing how the program will affect the specific outcomes they aim to achieve.14

Building evidence requirements into contracts. County governments frequently contract with nongovernmental entities, such as community-based nonprofits, to deliver services to residents. To achieve the best outcomes, county budget offices can prioritize funding for effective programs in their contracts with these third-party providers, or require new programs to include an evaluation of their impact.15

Since 2015, Santa Cruz County, California, has shifted toward a collective impact funding model, in which multiple organizations—all funded by the county and the city of Santa Cruz—share a common measurement system and set of results. The introduction of the Collective of Results and Evidence-Based (CORE) Investments represented a shift not only to increased dialogue around the use of local funds, but also to a results-based process. The county incorporated into the funding model eight features of effective funding approaches adapted from best practices around the country, most of which apply directly to CORE’s contracting process.16 For example, CORE adopted a tiered approach to evidence-based programs and asked all applicants to describe how their program fits within CORE’s three-tier framework (Model, Promising, and Innovative). Moreover, Santa Cruz took steps to ensure that applicants for CORE funding understood what the county was looking for. Through the support of local foundations, the county provided technical assistance to applicant agencies on identifying and implementing evidence-based practices and articulating program outcomes.

“Our county board of supervisors makes significant investments in safety net services and enjoys a strong partnership with our nonprofit service providers. Historically, what’s been missing from our model is agreement on the results we seek, a shared understanding of what makes a program strategy effective, and a standardized performance management system to guide our progress,” says Ellen Timberlake, director of the Santa Cruz County Human Services Department. “Through CORE, we’ve built evidence requirements directly into contracts. We’re better positioned to promote evidence-based programs and engage all stakeholders as we deepen our community understanding of collective impact.”17

"We know that evidence-based services have research showing positive treatment outcomes and are better for long-term recovery management. We believe it makes sense to provide quality services, and pay for these services, because ultimately it can help someone into recovery and perhaps prevent repeated relapses with higher levels of inpatient services."

Donna Carlson, deputy director for managed care Chester County, Pennsylvania

Establishing incentives to implement evidence-based programs. While counties acknowledge the benefits of these programs, they also note several barriers to providing them, including higher costs associated with additional training, monitoring, and other capacity-building needs. To help providers overcome these obstacles, local governments can establish incentives tied to adoption or expansion of proven programs. For example, counties can give preference in grant competitions to proposals that include interventions with demonstrated effectiveness. They can also provide technical assistance or pay higher rates to providers implementing programs with demonstrated effectiveness. Such benefits can help mitigate the costs of training and implementation.

In Pennsylvania, Chester County’s behavioral health system provides a higher reimbursement rate to providers who implement a proven effective practice from a selection designated by the county. Leaders noted that incentivizing proven effective care can lead to cost savings and improved outcomes. “We know that evidence-based services have research showing positive treatment outcomes and are better for long-term recovery management. We believe it makes sense to provide quality services, and pay for these services, because ultimately it can help someone into recovery and perhaps prevent repeated relapses with higher levels of inpatient services,” explains Donna Carlson, deputy director for managed care. “These inpatient services ultimately cost more than the enhanced rates for evidence-based, community-based services.”18

The Department of Human Services and its contracted Behavioral Health Managed Care Company collaborate with providers to consult research clearinghouses and select an appropriate and sustainable practice based on community needs. The county provides training, coaching, and ongoing implementation support to providers, and in turn requires that providers maintain fidelity to the practice to sustain the higher rate. The department conducts regular fidelity reviews, provides additional guidance for programs that do not perform well, and discontinues the incentive funding if they do not make the required improvements. Department Director Kim Bowman says, “In the beginning it was daunting for providers, but now they are excited. ... Now that we are seeing outcomes, it has been well received.”19

Some local governments may choose to develop innovative performance strategies, such as pay-for-success models that leverage private and philanthropic dollars to test or scale up programs. In this approach, private investors rather than governments raise capital for programs, and governments repay the original investment only if the program achieves specific, measurable outcomes. These emerging strategies come with challenges—such as time-consuming data analysis and contract negotiation—but they can, where successful, help scale up proven programs and create systems change in governments.20
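As a toy illustration of the repayment logic in a pay-for-success arrangement, the sketch below shows a county repaying investors only when an independently measured outcome clears an agreed threshold. All figures are invented for the example; real contracts involve far more detailed terms and verification.

```python
# Toy illustration of a pay-for-success repayment rule: the county repays
# investors only if the measured outcome meets an agreed target. All
# numbers are hypothetical.

PRINCIPAL = 1_000_000          # hypothetical upfront private investment
SUCCESS_PAYMENT = 1_100_000    # principal plus agreed return
TARGET_REDUCTION = 0.10        # e.g., 10% drop in recidivism vs. comparison group

def county_repayment(measured_reduction):
    """Repay only when the independently measured outcome meets the target."""
    return SUCCESS_PAYMENT if measured_reduction >= TARGET_REDUCTION else 0

print(county_repayment(0.12))  # target met -> county pays 1100000
print(county_repayment(0.04))  # target missed -> county pays 0
```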

Targeting Funds to Evidence-Based Programs in King County, Washington

When King County identified a lack of stable funding to support a shift in focus from crisis-oriented to prevention-oriented health and human services programming, it conducted research and leveraged academic and provider expertise to form a targeted funding strategy. Working closely with community partners, the county assessed the community’s needs through community conversations and data analysis, calculated where its investments would have the most impact, and created a proposal for a property tax levy that would address the needs and outcomes identified. Approved in 2015, the levy funds the Best Starts for Kids Initiative.

Best Starts for Kids has released a series of funding opportunities designed to direct funds to programs in a thoughtful and measured way and to allow organizations to apply more than once. For example, the initiative is investing in home-based services along a continuum of evidence; separate requests for proposals (RFPs) have been issued for evidence-based, evidence-informed, and community-designed practices. The county endeavors to simplify the application where feasible—limiting the number of pages and questions involved, for example—and supports applicants with technical assistance to make the proposal process equitable and accessible. This approach helps the county direct funds to proven programs while providing opportunities to test innovations.

Additionally, the county created a detailed evaluation and performance measurement plan for programs funded by the initiative, and allotted about $18 million (of the expected $399 million generated by the levy over the next six years) to support evaluation and data collection to contribute to an evidence base that will equip King County and its partners to improve equitable results for its residents.21

Implementation oversight: Ensuring effective program delivery

Research shows that the way a program is implemented tends to significantly affect the outcomes it achieves, making oversight of this process a crucial step in evidence-based policymaking.22 Even when governments know which programs work, delivering them in a manner consistent with the original model—known as fidelity—can be a challenge. Programs require collaboration among multiple entities, an understanding of what adaptations can be made to an evidence-based program to ensure that it works in a local context, and awareness of how such changes could affect outcomes. Fortunately, with smaller populations and sometimes more interagency collaboration, counties can facilitate training, dialogue among key stakeholders, and pilot testing of proven service models and implementation tools to ensure programs are delivered with fidelity. This section focuses on strategies counties are using to build their capacity to support effective program delivery.23

Assessing community needs and gaps in services. A needs assessment—which gathers data about target populations, the prevalence of certain conditions within those populations, and the risk factors that could be addressed through various interventions—provides information to ensure that any proposed intervention is a good fit for the problem being addressed and for the community where it will be delivered. Counties can often borrow from existing assessments, which are frequently required by federal grants, or data on prevalence, rather than gathering source data for each new program.

The Oklahoma City-County Health Department utilizes the Mobilizing for Action Through Planning and Partnerships framework, a strategic planning process for improving community health that leverages four distinct assessments, to better understand the various factors influencing its public health system.24 Every three years the county collects data to determine communitywide health needs and inform strategic planning decisions. The county uses the data to target the appropriate resources to areas and populations that need them the most. “We’ll look at ZIP codes that have low access to health care, for example, and using visual guides such as our ZIP code map, can see where areas may also have the highest populations or are very rural without access to transportation. Then, in working with our community partners, we can target specific resources to the more urban and rural areas, to ensure equitable access across our community,” says Megan Holderness, epidemiological support at the Oklahoma City-County Health Department.25

Screening Individuals to Better Target Interventions

Screening practices—used, for example, to detect mental illness or trauma—can help providers identify an individual’s specific needs and target interventions properly. Without the appropriate screening and assessment tools and training to implement them correctly, agencies may direct individuals to interventions that are not effective in addressing their problems and that fail to achieve the impact properly selected services would. Validated risk assessments or screens—those tested for accuracy, consistency, fairness, and utility—can improve outcomes and help staff make a more accurate and objective determination of the type of care needed. Once staff are trained to screen individuals for needs or risk factors, they can reach better-informed conclusions while still allowing for individual discretion.
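The sketch below illustrates, in highly simplified form, how a screen’s total score might route an individual to a level of care. The items, scoring, and cutoff are hypothetical; actual validated instruments are developed and tested by specialists, and staff retain discretion over the final decision.

```python
# Minimal sketch of how a validated screen's total score might route
# individuals to a level of care. Items, scoring, and the cutoff are
# hypothetical.

CUTOFF = 3  # hypothetical threshold from a validation study

def screen_score(responses):
    """Sum yes/no screening items (True counts as 1) into a total score."""
    return sum(1 for answered_yes in responses if answered_yes)

def route(responses):
    score = screen_score(responses)
    # The score informs rather than dictates; staff retain discretion.
    return "refer for full clinical assessment" if score >= CUTOFF else "standard intake"

print(route([True, True, False, True]))  # -> refer for full clinical assessment
```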

In Miami-Dade County, Florida, the Criminal Mental Health Project (CMHP) has significantly reduced the number of individuals entering local jails and returning to jail for a new offense by connecting them with appropriate community-based treatment and support services. Over the past 18 years, the CMHP has implemented a crisis intervention team (CIT) training program to teach law enforcement officers to better recognize and respond to individuals experiencing psychiatric emergencies. People who may otherwise be arrested for minor offenses are diverted to crisis units to receive treatment in lieu of being admitted to the county jail. Over time, the county has trained more than 6,000 CIT law enforcement officers.26 For individuals with mental illnesses who are booked into the jail, standardized screening and assessment protocols have been developed to identify those eligible for diversion into community treatment and support services. These measures have resulted in significantly fewer injuries to individuals with mental illnesses and law enforcement officers, thousands of diversions to crisis units and community-based treatment, and reduced arrests—even closure of a jail facility, saving taxpayers $12 million annually.27

Miami-Dade County has taken the lessons learned over the past two decades and refined its approach to more effectively respond to individuals experiencing mental illness and substance use disorders involved in or at risk of entering the justice system. For example, proponents of the initiative have worked for several years to engage a wide range of stakeholders and help increase their understanding of and comfort with using post-arrest screening tools to make decisions around diversion for their communities. “There was some pushback on post-arrest processes initially, so we just screened nonviolent misdemeanors at first,” says Judge Steve Leifman. “The program was so successful that the state attorney allowed us to expand to felony cases.” Miami-Dade County was also one of four launch sites for the Stepping Up Initiative, a national effort that works with counties across the country to reduce the number of individuals with mental illnesses in local jails, and has leveraged this network to improve its own processes.

Monitoring program implementation. A key step in successfully implementing evidence-based programs is to create a process to regularly monitor implementation and ensure that interventions are being delivered as intended and therefore more likely to achieve results similar to those demonstrated through evaluations. Although securing resources for implementation oversight can be a challenge for many local governments, counties are finding unique ways to build capacity for it both internally and through collaboration with partner organizations.

One approach, used by the Los Angeles County, California, Department of Mental Health, is to develop implementation guidance and internal expertise, and then rely on expert staff to train others in the agency and oversee fidelity. The department requires all providers receiving prevention and early intervention (PEI) funds to implement approved evidence-based practices. Providers implementing one of the approved programs receive PEI funds and comprehensive training assistance to support effective delivery. The county has standardized training for all PEI services; every practice—for example, trauma-focused cognitive behavioral therapy—has a designated practice lead who oversees fidelity for the program or service, required training protocols, and potential certification requirements. Each provider organization has a designated coordinator who works with the county practice lead to ensure that all staff within the organization who deliver that program maintain their certification as well as any other requirements. The county also compiled all requirements—such as those for training and fidelity—and other detailed program information for all approved practices into a comprehensive reference guide. Providers can bill only for services they have been trained and authorized to provide. Debbie Innes-Gomberg, deputy director of the Los Angeles County Department of Mental Health, says, “There’s a cost to the organization, but it’s often offset by the improved outcomes they attain. The county provides technical assistance and implementation support by way of training. Then the practice lead becomes the expert in that practice, and knows a network of experts.”28

Poorly delivered programs are unlikely to achieve the outcomes policymakers expect. Thoughtful identification of community and individual needs, and careful implementation of programs in the way they have been designed or proved to work, can help ensure that funded programs are delivered successfully.

Outcome monitoring: Measuring results 

After selecting proven programs and ensuring quality delivery, the next step is to make sure those programs are working as intended. Outcome monitoring refers to the systematic tracking of program performance to determine if government programs are achieving desired results. Policymakers can use this information to make informed budget and policy decisions, mitigate risk, and strengthen accountability.29 While many counties operate some type of performance measurement system, how they use these systems varies widely. The most common challenges faced by counties are identifying the most meaningful measures to track performance and developing the systems required to collect and report those performance measures.30 Even where counties have managed to develop effective systems for tracking performance data, using this information to support continuous improvement and inform policy or funding decisions remains a significant challenge. Smaller counties or those with fewer resources may need to consider how to prioritize these activities in a manner most suitable for their government and residents (see Appendix B for additional resources). This section details the various ways counties can overcome these common hurdles.

Developing meaningful outcome measures. Developing meaningful, consistent outcome measures is an essential first step to gathering comparable performance data across a county. Standardized measures enable local leaders to analyze and report countywide results, compare programs and providers, and identify underperforming areas. Using research to identify appropriate outcome measures and performance benchmarks can enable appropriate comparisons—for example to similar counties, industry standards, or best practices.

The Boone County, Missouri, Community Services Department, in collaboration with the Boone Impact Group (BIG)—a collection of local funders that meets regularly to discuss strategic opportunities, community needs, and overlaps in program funding—created a glossary that standardizes descriptions of local programs and services, called the “taxonomy of services.” The taxonomy functions as a universal language—and the building blocks of a performance management system—by ensuring that all services purchased or funded by the city and county governments and the local United Way use common names and definitions. “Our first step was creating a common taxonomy to get organizations on the same page,” says Kelly Wallis, director of the Community Services Department. “If organizations are proposing a service—for example, a developmental screening—they are calling it by the same name.”31

The BIG worked with the University of Missouri Office of Social and Economic Data Analysis to create a data dashboard that reports on more than 100 community-level indicators aimed at measuring the health, education, safety, and economic well-being of residents. Many of the indicators reported in the dashboard come from existing data sources, such as the U.S. Census Bureau or the Missouri Department of Elementary and Secondary Education.32 The data collected in the dashboard will create a baseline for measuring community-level indicators. As a next step, the group will add outcomes to the taxonomy to align services with the results they are expected to achieve. “We are trying to get providers on board with the [outcome monitoring] component,” says Wallis. “For example, if a provider provides therapy services, are those children having less disruptive behavior? We will work with providers on how they measure that and how that gets aggregated so that we can see the impact programs are having on community-level indicators.”
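The sketch below illustrates the basic mechanics of such a baseline: each community indicator is recorded for a baseline year so later values can be compared against it. The indicator names and values are invented for illustration, not taken from the Boone County dashboard.

```python
# Illustrative sketch of a dashboard-style indicator table: record a
# baseline year for each community indicator so later values can be
# compared against it. Values are hypothetical.
import pandas as pd

data = pd.DataFrame({
    "indicator": ["children in poverty (%)", "on-time graduation (%)"],
    "baseline_2018": [18.5, 87.0],
    "value_2020": [17.2, 89.1],
})

# A positive change is an improvement for some indicators and a decline
# for others; real dashboards track the desired direction per indicator.
data["change_from_baseline"] = data["value_2020"] - data["baseline_2018"]
print(data)
```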

Refining systems to track and report outcomes. Because governments collect large amounts of data through performance management systems, making sense of the information collected and ensuring quality control can be challenging. To promote the use of these data, local governments can train staff to properly collect and verify data, and create reports that provide concise analyses that enable decision-makers to take action.
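A minimal sketch of the kind of quality-control pass this implies appears below: each performance record is checked for missing fields and out-of-range values before it feeds a report. The field names and bounds are hypothetical.

```python
# Minimal sketch of a quality-control pass over performance records before
# reporting: flag rows with missing fields or out-of-range values. Field
# names, values, and bounds are hypothetical.

records = [
    {"program": "Family Therapy", "clients_served": 120, "completion_rate": 0.82},
    {"program": "Peer Mentoring", "clients_served": None, "completion_rate": 0.91},
    {"program": "Job Readiness", "clients_served": 45, "completion_rate": 1.35},
]

def validate(row):
    """Return a list of data-quality issues found in one record."""
    issues = []
    if row["clients_served"] is None:
        issues.append("missing clients_served")
    if not (0.0 <= row["completion_rate"] <= 1.0):
        issues.append("completion_rate out of range")
    return issues

for row in records:
    problems = validate(row)
    if problems:
        print(row["program"], "->", "; ".join(problems))
```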

In Olmsted County, Minnesota, the Health, Housing, and Human Services Division33 created a continuous improvement and analysis (CIA) team, consisting of six analysts, which works across numerous program areas within the division to help program staff identify gaps in service areas, track outcomes, and ensure implementation of needed changes. “When we look at and monitor performance, we’re asking two questions: ‘Are we worried about this?’ and if yes, ‘Are we worried from a policy or practice standpoint?’” explains Sarah Oachs, director of the Health, Housing, and Human Services Division. This process complements the efforts of the departments, helping them to analyze data, identify and review trends, determine which trends warrant changes, and create plans to implement those changes—work that would not be possible otherwise.

For example, when the analyst supporting Child and Family Services noticed that the department was struggling to find permanent homes for children in foster care within target timelines, the analyst conducted a root cause exercise—a structured process to identify the origins of a problem—with social workers of the unplaced children, on a case-by-case basis. The analyst shared findings—including identified barriers and potentially misaligned policies—with the performance management team, and the division’s social workers have already changed their behaviors to try to address those barriers. The CIA team has received requests to repeat the approach in other areas. “It’s strategic that we have an embedded team, housed in administration, working toward the same goals. This structure allows analysts to float across the top of the program areas and be aware of all the collective priorities at the same time,” says Oachs. “The practice experts really understand the work that’s happening day to day, and the analysts bring a lot of quality improvement expertise and that broad perspective.”34

In Wilson County, North Carolina, the Department of Social Services has adopted an outcome-focused, continuous quality improvement framework—called Leading by Results—that it has applied across the agency with an emphasis on child welfare for the past 10 years. It is credited with helping the agency achieve significant improvements, including lowering the number of children in foster care by 45 percent over the past five years, and reducing the number of children suffering repeated maltreatment to zero.35 The department started with a relatively simple goal—trying to figure out whether the services it provided were having a positive impact on the children and families it served in the community—and became a pilot site for Leading by Results, an approach the state wanted counties to adopt.

“We identified five or six key results that we want to achieve; big-picture results,” says Glenn Osborne, director of the Department of Social Services. “An example would be children growing up in safe and secure homes, or that older and disabled adults live in the least restrictive environments and have as much self-determination in their lives as possible. These are the kinds of outcomes we hold ourselves accountable for. We create performance measures—indicators of success—that we track on an ongoing basis and that we use to determine how we’re doing at what we say we’re trying to achieve.”

Once the system was up and running, the department began using meetings to discuss the data and identify gaps or areas needing additional attention. Osborne notes that many of the department’s strategies, including its practice model for working with families and its strategies for recognizing and addressing trauma, stem from reviewing its performance data and thinking of how it could improve measures the department tracked. “Based on the measures, the important work begins to get done: aligning our resources and efforts toward achieving those results,” says Osborne.36

Creating forums to share and apply outcome data. Several counties interviewed for this report noted that lacking real-time, accurate data made it more likely that policymakers would make decisions based on anecdote, political pressure, or public criticism, rather than outcomes and performance measurements. Regularly providing performance information to policymakers can help empower them to use this information when making programmatic decisions. It is essential to provide the data in clear and easy-to-digest formats, highlighting key findings to ensure that local leaders can easily transform the information into action. To facilitate this process, several state and local governments communicate performance data through report card systems37 or regular meetings.

The Oklahoma City-County Health Department collects data on health outcomes in the community, aggregating the data at the ZIP code level to track outcomes tied to smaller geographic regions, and has found several forums to present its findings to decision-makers. Department leadership has presented to the city council on health status in Oklahoma City and County, and at the state capitol on the influence of state policymaking at the local level. The department also held a legislative breakfast for lawmakers in the region, and provided them with county- and district-level data and data story maps that represented the health and wellness status of their population. The data-driven breakfast enabled a focused discussion on individual districts and the role policymakers could play in improving health in their communities.38

“Simplifying the data and showing policymakers the impact of it in their district is really helpful. It speaks to them and their areas of focus. When we hold program status meetings, we bring in clients to share their stories firsthand. We saw health improvements in maternal and child health, so we brought in a mother who talked about how our Children First program helped her get a job and raise a healthy baby,” says Holderness, from the local health department’s epidemiology division. “We’re giving the details to policymakers, both through quantitative and qualitative data, and making sure to hear the community members’ voices and make decisions on health programming together.”39 Gary Cox, executive director of the Oklahoma City-County Health Department, leads an organization dedicated to seeking out forward-thinking ways to deliver health services. “This data improves the way our local public health system delivers services in Oklahoma County, and we will continue to innovatively serve our community alongside our dedicated partners,” he says.40
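For analysts reproducing this kind of ZIP code rollup, the sketch below shows the underlying operation: individual records are grouped by ZIP code and summarized into a rate suitable for mapping. The records and fields are hypothetical, not Oklahoma County data.

```python
# Sketch of aggregating individual records to the ZIP code level, the kind
# of rollup described above. Records and fields are hypothetical.
import pandas as pd

records = pd.DataFrame({
    "zip": ["73102", "73102", "73129", "73129", "73129"],
    "has_health_coverage": [True, False, True, False, False],
})

# Share of residents with coverage per ZIP code, suitable for mapping.
by_zip = records.groupby("zip")["has_health_coverage"].mean().rename("coverage_rate")
print(by_zip)
```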

While some counties present data to policymakers on a regular basis, others establish councils of local leaders that are continuously involved in reviews of performance data to inform decisions. The Maricopa County, Arizona, Smart Justice Committee engages elected officials and appointed leaders to promote proven solutions and public safety outcomes. The committee coordinates the efforts of several departments working to reduce recidivism and improve reintegration. “Smart Justice is made up of leaders at the chief, deputy chief, or director level, who are in a position not only to influence policy but to make decisions that impact practices. It’s a very effective group for that reason,” says MaryEllen Sheppard, assistant county manager. “It was founded based on the recognition that good work was being done in probation, the sheriff’s office, human services, correctional health, etc., but we hadn’t integrated the work to make sure it was evidence-based and that we were coordinating our responses in a way that enabled us to meet the needs of individuals over time.”

For example, when the county introduced a tool to screen all individuals entering the county jail system and connect them with the appropriate interventions, the committee began a baseline study of its pretrial detention population, enabling it to monitor subsequent years’ outcomes against the baseline. Through its research, the county has ensured that the groupings in the screening tool adequately represent the county’s justice population. It has also determined that while the tool does a good job separating categories of risk, it is not as valid for individuals with serious mental illness, and the county is assessing new measurements to predict risk for this population.41

Outcome monitoring systems can help local leaders track and report countywide or agency-level progress on key indicators to help determine whether publicly funded programs are achieving the results that constituents expect. These systems also provide useful information to department leaders and county managers to help direct resources to areas in need of improvement.

Targeted evaluation: Assessing untested programs

Evaluations of various kinds can help counties understand the effectiveness of public programs. Process evaluations assess whether programs are being delivered effectively and prove useful in implementation oversight, while outcome evaluations assess whether intermediate outcomes are being achieved. While these types of evaluations are particularly useful when applied to evidence-based programs to ensure that they are producing expected results, a third type—the impact evaluation—proves most useful for programs that are untested or have limited evidence.42 Impact evaluations test whether an intervention is the cause of observed changes, and can be particularly useful when considering which programs to scale up, which to improve and continue testing, and which to scale back or eliminate. Some untested programs may not be good candidates for an impact evaluation—for example, if they have been improperly implemented or lack adequate data—in which case their success can be measured through other means. This section discusses three ways counties can enhance their ability to make use of evaluations.

Leveraging external resources to conduct evaluations. Counties can build partnerships with external research entities, such as local universities, to benefit from outside expertise where they lack internal evaluation capacity.43 These entities can support government efforts by conducting evaluations in their entirety or by providing technical assistance or training to government staff.

Outagamie County, Wisconsin’s Criminal Justice Treatment Services partnered with a researcher at the University of Wisconsin-Milwaukee to evaluate a new program aimed at reducing drunk driving in the county. Wisconsin has one of the nation’s highest rates of alcohol-impaired driving.44 Examining other initiatives in the state aimed at reducing impaired driving, county leaders identified the Safe Streets Treatment Options Program (SSTOP). The program aims to provide treatment and rehabilitation services to individuals with operating while intoxicated (OWI) offenses, keep them in the community, encourage behavior change, and reduce recidivism rates. After seeing the program reduce recidivism among individuals with OWI convictions in Winnebago County (also in Wisconsin), Outagamie County decided to implement it and see whether it could produce similar results. The county approached a University of Wisconsin-Milwaukee professor, Tina L. Freiburger, to request assistance in assessing the program’s impact on the community. Freiburger, who agreed to conduct the evaluation pro bono provided she could publish the results, found that SSTOP resulted in a 31 percent reduction in OWI offenses.45 Bernie Vetrone, criminal justice treatment director in Outagamie County, emphasized that the county would have adjusted the program if it had not demonstrated positive impacts on recidivism.

Salt Lake County, Utah, used philanthropic funding through its partnership with the Annie E. Casey Foundation to evaluate youth prevention programming in Kearns Metro Township. To ensure that its work is positively affecting Kearns youth, the county plans to examine community-level outcomes, program participant outcomes, and process outcomes (including program fidelity) for two programs, Guiding Good Choices and Positive Family Support. The county notes that much of this work would be difficult without partners. It collaborates with the Pennsylvania State University evaluation team to assess community-level outcomes (supported by the foundation); with a Utah-based evaluation firm to measure participant outcomes; and within Kearns and Salt Lake County to complete the process appraisal. Once implementation of the two programs is underway, the evaluation firm will present program data in regular meetings where team members can discuss what’s working, what’s not, and what can be improved. Caroline Moreno, education program manager at Salt Lake County Human Services, notes the importance of the Casey Foundation’s funding to carry out this work. “I don’t want to say that it’s what’s making fidelity implementation possible, but it’s making it more possible and better,” she says. “We wouldn’t have had the funding to hire external evaluators and not a lot of funders understand the need for that. In light of all other funding needs, evaluation funding is not always considered a high priority.”46 The county is already thinking about how to integrate this approach into everyday work when its current funding finishes.

Targeting evaluations to high-priority programs. It is not practical or necessary for local governments to rigorously evaluate every one of their programs. When deciding which programs to evaluate, local leaders can consider several parameters; for example, prioritizing programs that are costly or that serve many clients, or considering the level of research supporting a program’s effectiveness (as gathered during the program assessment process). Programs with a strong evidence base are lower priorities for impact evaluations than programs that are untested or that have limited research. Additionally, evaluations are more feasible for programs that are properly implemented and that have sufficient data available.
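The sketch below expresses these triage criteria as a simple ranking: programs with weaker evidence, larger budgets, and larger caseloads rise to the top of the evaluation queue. The weights and program data are invented for illustration; a real prioritization would weigh these factors according to local priorities.

```python
# A sketch of triaging programs for evaluation using the criteria above:
# existing evidence, cost, and number of clients served. Scoring rules
# and program data are hypothetical.

programs = [
    {"name": "Program A", "annual_cost": 2_000_000, "clients": 1500, "evidence": "strong"},
    {"name": "Program B", "annual_cost": 250_000, "clients": 300, "evidence": "none"},
    {"name": "Program C", "annual_cost": 900_000, "clients": 800, "evidence": "limited"},
]

EVIDENCE_NEED = {"none": 2, "limited": 1, "strong": 0}  # untested programs rank higher

def priority(p):
    # Weaker evidence, then larger budgets and caseloads, raise priority.
    return (EVIDENCE_NEED[p["evidence"]], p["annual_cost"], p["clients"])

for p in sorted(programs, key=priority, reverse=True):
    print(p["name"])  # B (untested), then C (limited), then A (strong)
```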

The Sonoma County, California, Upstream Investments initiative has historically prioritized programs with limited evidence for evaluation, which allowed the county to focus its technical assistance on a smaller subset of programs. Currently, the portfolio consists of 109 programs; 39 of these have evaluation plans in place, and the rest have been previously evaluated (not necessarily locally) to varying degrees.47 The programs with the most limited evidence, by contrast, offer the county little information to help it understand their likely impact.

Some providers have not had the capacity to complete their evaluations. To address this concern, Upstream is refreshing portfolio application requirements and its technical assistance program. Moving forward, all portfolio programs, regardless of the level of evidence informing their design and implementation, will be required to implement an outcome evaluation and demonstrate local results in a specified time frame. Upstream will continue to provide broad communitywide education and will begin providing targeted support to help organizations implement evaluations and use the data for program improvement. The type of evaluation conducted may vary depending on the resources of the organization, what data they collect, and what works for a program and its logic model. Upstream will target evaluation support to programs that address outcomes prioritized by the community.48

Building internal capacity for evaluation. Conducting and applying high-quality evaluations requires technical and analytical expertise, as well as access to substantial data. Counties that set aside or identify new resources for evaluation can dedicate staff with the necessary skills to understand evaluations and oversee external evaluators49—or, in some cases, to conduct such studies themselves. They can also increase their ability to make use of existing administrative data.

The Hennepin County, Minnesota, Department of Community Corrections and Rehabilitation (DOCCR) has an Office of Policy, Planning, and Evaluation (PPE) that supports its mission with evaluation, planning, research, data reporting, and technical support services. This dedicated office consists of seven analysts and facilitates performance management and planning by tracking correctional facility populations, creating outcome measures, and generating annual and special reports on detained populations. The county has utilized numerous assessments and evaluations—such as process evaluations of its Beyond Violence programming and a six-year recidivism study of participants in its One Day DWI program—to guide its decision-making.50 Recently, the PPE commissioned a report evaluating whether a DOCCR screening tool was predictive of recidivism. While the tool was found effective overall, it did not prove useful for forecasting recidivism among black males. As a result of the PPE’s efforts, the department is developing a tool that better encompasses outcomes for this population.
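Analyses like the PPE’s subgroup review typically ask whether a tool separates higher- and lower-risk individuals equally well across groups. The sketch below illustrates one common approach, comparing the area under the ROC curve (AUC) by subgroup; the scores and outcomes are fabricated for the example, and real analyses would use case records and larger samples.

```python
# Illustrative check of whether a risk tool predicts equally well across
# subgroups, in the spirit of the review described above. All scores and
# outcomes are fabricated.
from sklearn.metrics import roc_auc_score

# (risk_score, reoffended, subgroup) triples — hypothetical data.
cases = [
    (0.8, 1, "A"), (0.7, 1, "A"), (0.3, 0, "A"), (0.2, 0, "A"),
    (0.6, 0, "B"), (0.4, 1, "B"), (0.7, 1, "B"), (0.5, 0, "B"),
]

for group in ("A", "B"):
    scores = [s for s, y, g in cases if g == group]
    outcomes = [y for s, y, g in cases if g == group]
    # AUC near 0.5 means the tool barely separates reoffenders in that group.
    print(group, round(roc_auc_score(outcomes, scores), 2))
```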

Local governments can also build internal capacity by strengthening their use of existing administrative records—collected by government agencies primarily through the delivery of services and for administrative purposes. Harnessing this information effectively can reduce the most expensive component of evaluations—data collection—and facilitate the evaluation process for counties of various means. Counties are beginning to gather and centralize records from overlapping departments and jurisdictions, addressing privacy or usage concerns by developing sharing agreements that outline the purpose, privacy protections, and public use of data.51 Intermediaries, such as applied academic centers, nonprofit research organizations, or regional planning agencies, exist to help local governments implement best practices in organizing, sharing, and using their data.52
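One privacy-conscious linkage technique counties and their intermediaries sometimes use is hashing identifiers with a shared key so records can be joined without exchanging raw names. The sketch below illustrates the idea; the data and salt are hypothetical, and production systems require formal sharing agreements and stronger safeguards than this toy example provides.

```python
# Sketch of privacy-conscious linkage of administrative records across two
# agencies: identifiers are hashed with a shared salt so records can be
# joined without exchanging raw names. Data and salt are hypothetical.
import hashlib

SALT = b"example-shared-salt"  # hypothetical value agreed on by both agencies

def pseudonym(name, dob):
    """Derive a stable pseudonymous key from identifying fields."""
    return hashlib.sha256(SALT + f"{name}|{dob}".encode()).hexdigest()

jail_bookings = {pseudonym("Jane Doe", "1980-01-01"): {"bookings": 3}}
er_visits = {pseudonym("Jane Doe", "1980-01-01"): {"er_visits": 5}}

# Linked view for identifying frequent utilizers of both systems.
for key in jail_bookings.keys() & er_visits.keys():
    print({**jail_bookings[key], **er_visits[key]})
```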

County leaders care about funding what works, and targeted evaluation can provide important information about the effectiveness of public programs, particularly untested ones, to inform decisions about when to scale up, scale back, or adjust a program. This and other evidence-based policymaking strategies can help counties identify and fund the most effective programs, ensure that they are implemented successfully, and monitor and evaluate outcomes so they are producing desired results. Counties can work closely with leaders, partners, and providers to build, support, and sustain the success of these initiatives.

Lessons learned on building and sustaining evidence-based policymaking

Setting the right conditions for success can help counties more effectively implement evidence-based policymaking. From conversations with county leaders across the country, Results First identified the following lessons as ways counties can strengthen their efforts to identify and fund the most effective programs, ensure that those programs are implemented successfully, and monitor and evaluate outcomes so that programs produce desired results.

Support of leadership and other local actors is critical to success. A lack of buy-in from community and local leaders can inhibit the implementation and long-term sustainability of new practices, resulting in low participation from staff and limited financial, political, or administrative support. On the other hand, early endorsement from key stakeholders can help county officials and agencies overcome obstacles.

For example, Marion County, Oregon, Commissioner Janet Carlson emphasizes the need to bring stakeholders together to get them on board with the use of evidence, and to ensure that they have the support they need to participate in the process. She notes the power of commissioners, the public, judges, or other county officials to block, or support, changes. When judges were not completely comfortable diverting individuals to certain evidence-based programs, for example, the Oregon Criminal Justice Commission presented data on the matter to the presiding judge and the district attorney’s “trial team.”

“It takes everyone leaning in on this effort—other counties, sometimes individuals—to make sure we are keeping people out of the criminal justice system. We govern in the county with an elected board of commissioners. My ability to effect change depends on what the other commissioners and what the community wants,” says Carlson. “What we do is provide forums so everyone can get together to hash things out. Not every policymaker responds to the data, so we also need to talk about the human story.”53

Dauphin County, Pennsylvania, noted the importance of securing buy-in from its contracted services providers when it introduced new measures of effectiveness for each type of service funded through its Human Services Block Grant. The Human Services Department developed measures that were consistent for every type of program and could demonstrate the impact of the program on important outcomes. “When we started the process, we met with every provider individually. We talked about what we were doing and why we were doing it and how it would impact them in the upcoming fiscal year,” explains Randie Yeager, director of human services. “Though it was very time consuming, it was very important to do that at the beginning. With few exceptions, folks got on board right away because they were part of the process.”54

Piloting changes on a small scale allows counties to test whether they work, and demonstrate success, before scaling them up gradually. Testing new projects allows time for policymakers to assess their performance and impact. If those projects prove successful, the information collected can provide proof of concept that helps local leaders obtain the political and financial support needed to scale them up. If a project is shown to be ineffective, or effective only under certain conditions, such an assessment can help a county determine whether to make adjustments to improve the program, eliminate it, or refrain from replicating it at full scale. Counties need not overhaul an entire agency overnight; instead, they can test evidence-based policies, programs, or processes on a small scale and expand them once they have proved successful, leveraging positive results to bring key stakeholders on board.

In Maryland, the Montgomery County Department of Health and Human Services—which oversees a wide range of programs—decided to start small by assessing the research supporting its youth mentorship programs. “Few programs had evidence-informed models and while most had some performance metrics, they varied widely,” says Matthew Nice, former manager of planning, accountability, and customer service at the department.55 The county created a cross-departmental work group—including subject matter experts and members from Results First—that reviewed national research to clarify which outcomes effective mentoring programs should affect, mapped those outcomes against existing contracts, and selected 12 measures aligned with its services for inclusion in contracts. “Because we have so many providers, this allows a standardized approach. It makes it easier to talk about results systemically,” says Nice.

The department anticipates that creating common outcomes will enable it to compare results across providers and quantify community-level impact. The county hopes to demonstrate the success of this initiative and apply this process to other policy areas to ensure that contracts across departments are evidence-informed.

External organizations and partners are crucial sources of financial and technical support. Many of the counties interviewed for this report noted that they either started doing evidence-based policymaking or accelerated their work through the support or technical assistance they received from external organizations, such as a foundation or university, or through participation in a national or state initiative. Given the resource constraints facing most counties, these partnerships can help by providing expertise on how to distill existing research on what works and guidance on ways to adapt evidence-based practices to their ongoing efforts. They also create formal networks that can help connect local leaders to discuss common challenges, learn from peers, and make adaptations to programs in their counties. Some initiatives provide tools that allow participants to find and link data sets, which enable them to better understand their populations, identify policy problems, and evaluate programs.

For example, 150 jurisdictions (counties, cities, and states) across the country participate in the Data-Driven Justice project, which promotes the use of data to identify safe alternatives to incarceration or emergency room use for “frequent utilizers,” individuals with mental illnesses, chronic health problems, or substance use disorders who are frequently involved with the criminal justice and health systems. The project, supported by the Laura and John Arnold Foundation and co-facilitated by NACo, provides guidance to counties and other jurisdictions on adopting proven reforms, along with tools to help refine, anonymize, and share data. Michael Daniels, justice policy coordinator in Franklin County, Ohio, says, “Peer-to-peer sharing is the biggest asset that the Data-Driven Justice project provides. It gives us a forum to talk to peers facing the same type of issues as us. The Data-Driven Justice project shows the value of sharing data and it has been a tremendous benefit.”56

Building the knowledge and capacity of service providers helps sustain new practices. Implementing evidence-based programs often requires additional training and analytical capacity among service providers. Counties that have successfully raised expectations for these organizations have balanced higher standards with additional supports, including training, education, and resources. Some counties have found success in creating forums where agencies, contracted provider organizations, and other community stakeholders can share information, clarify expectations, and foster peer learning.

Hennepin County, Minnesota, uses the Correctional Program Checklist (CPC)—a tool that measures correctional programs against recognized principles of effective intervention—to enhance provider capacity for continuous improvement. While the tool helps with implementation oversight, Hennepin is focused on using it in a way that supports community providers. The county meets bimonthly with other Minnesota counties that use the CPC to share lessons learned, ensure that checklist assessors review programs consistently and with fidelity, and strengthen the network of assessors across the state.57 Through the CPC process, the county has gained a deeper understanding of local treatment services and has strengthened its capacity to deliver them.

“We’re trying to bring the counties and program providers with us when doing this work, as opposed to the department leading the effort and simply telling the providers what to do,” says Danette Buskovick, manager of policy, planning, and evaluation at the Department of Community Corrections and Rehabilitation. Rather than using the CPC simply to decide whether to continue or discontinue funding, she says the tool helps them improve programs collaboratively: “We would prefer to focus on consistent improvement and building relationships to get to a point where the organization can successfully provide evidence-based service.”58

Data are important assets that some counties are using more frequently to identify problems, inform solutions, and track results. Local governments often collect and maintain administrative data such as criminal arrest, health, or education records through general reporting systems, which they can leverage to clarify existing problems and monitor progress against them. Yet, many counties noted the need for more comprehensive shared data systems across departments and jurisdictions to identify and solve problems sooner. Educating staff on the importance and impact of using data to drive decisions can help facilitate sharing and use.

For example, several counties are using data to identify the most frequent users of emergency services and then develop targeted services to meet this population’s needs more effectively and efficiently. In 2013, Bexar County, Texas, created the Department of Behavioral and Mental Health to address gaps and to plan and coordinate an improved system of care for individuals with mental illnesses. As a first step, the county commissioned a study of its community mental health resources to ascertain what gaps existed in its care continuum and what issues the new department could address. Analyzing data on more than 1 million encounters from numerous providers of mental health and support services across county agencies, the study identified a population of roughly 3,700 “super-utilizers,” uninsured or underinsured individuals with complex health and mental health needs that were not being met by the existing system and whose annual cost of care was nearly $175 million. Total expenditures for services to the public safety net (individuals with poor or no insurance) totaled $1.2 billion in 2016.59
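
To make the study’s screening rule concrete, it can be expressed as a simple filter over summarized encounter records. The Python sketch below uses the thresholds stated in endnote 59; the table layout, column names, and sample values are hypothetical assumptions, not the county’s actual data system.

```python
import pandas as pd

# Hypothetical per-patient summary built from raw encounter records.
# Column names and sample values are illustrative; only the thresholds
# come from the study's stated criteria (see endnote 59). The study's
# exclusions for pediatric cancer and neonatal care are omitted here.
patients = pd.DataFrame({
    "patient_id": [101, 102, 103, 104],
    "underfunded": [True, True, False, True],  # unfunded or underfunded coverage
    "inpatient_discharges": [3, 2, 5, 0],
    "serious_mental_illness": [False, True, False, False],
    "er_visits": [1, 0, 2, 11],
})

# A patient qualifies as a "super-utilizer" if unfunded or underfunded AND
# has 3+ inpatient discharges, OR a serious mental health diagnosis with
# 2+ inpatient discharges, OR 9+ emergency room visits.
is_super_utilizer = patients["underfunded"] & (
    (patients["inpatient_discharges"] >= 3)
    | (patients["serious_mental_illness"] & (patients["inpatient_discharges"] >= 2))
    | (patients["er_visits"] >= 9)
)

print(patients.loc[is_super_utilizer, "patient_id"].tolist())  # [101, 102, 104]
```

A real analysis would first aggregate raw encounter records from multiple providers into one row per patient, which is where the data-sharing agreements discussed above become essential.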

The county used the data to secure support from local leaders. “Presenting this data to the hospitals did two things,” explains Gilbert Gonzales, Bexar County director of behavioral and mental health. “It convinced the CEOs they were personally being affected—we could point to the data on costs for specific hospitals—and it highlighted the fact that the left hand was unaware of what the right hand was doing. There was no level of transparency.”60 As a direct result of its findings, the county has taken steps toward improving collaborative care of the high-needs population, such as creating the MEDCOM system to help county law enforcement divert psychiatric emergency patients to the nearest, most appropriate treatment facility. The county will continue to facilitate information sharing across the system, and to track services and outcomes for this population.

Conclusion

Without the appropriate tools, local leaders often face difficulties in getting the information they need to distinguish between local programs that are effective and those that are not. Evidence-based policymaking provides a valuable approach for directing limited resources in a high-impact way. While implementing some evidence-based policymaking strategies can be challenging and take time, counties can take gradual steps toward driving their decisions with rigorous research and systematic processes. Doing so can help them enhance fiscal stewardship amid funding constraints and achieve better results for their communities.

Appendix A: Methodology

For this report, Results First researchers collected data in three steps. First, they conducted comprehensive searches of websites from 30 counties, distributed geographically and across population categories of 250,000-499,999, 500,000-999,999, and 1 million or more residents. Second, they developed and administered a questionnaire to National Association of Counties members regarding evidence-based policymaking efforts in their jurisdictions. Third, they gathered additional feedback through 40 follow-up interviews with county officials, including department heads, county executive officials, and staff who engage in evidence-based policymaking. Researchers then analyzed findings from the web searches, questionnaire, and interviews, applying the criteria outlined in the Pew-MacArthur Results First Initiative’s 2014 report, “Evidence-Based Policymaking: A Guide for Effective Government,” to identify promising practices that counties can learn from and potentially adopt.

Appendix B: Evidence-based policymaking resources

Research clearinghouses

Clearinghouses systematically review and summarize rigorous research to identify what works in a given policy area.61 While specific criteria and procedures vary by clearinghouse, they all review research studies to determine what evidence exists to support a conclusion about a program’s effectiveness, and assign ratings to programs based on that evidence. Counties can use these clearinghouses to:

Assess the evidence supporting a specific program

Even though clearinghouses have different rating systems, they all generally allow users to distinguish between programs with strong evidence that they work, programs with some evidence that they work, and programs with limited or no evidence on their outcomes. Users can:

  • Search for specific programs.
  • Review the program’s description and other details (such as target population, setting, and duration) to determine whether it is a true match to the program in their jurisdiction.
  • Review the rating assigned by the clearinghouse to understand what the evidence says about that program’s effectiveness.

Users can follow these steps for each program in an inventory to determine the level of effectiveness of all programs funded by a given office or agency.
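
As a rough illustration of that inventory-wide step, the Python sketch below tallies a hypothetical program inventory by the evidence rating retrieved from a clearinghouse. The program names, rating labels, and data structure are all illustrative assumptions, since each clearinghouse defines its own categories.

```python
from collections import Counter

# Hypothetical inventory mapping each county-funded program to the rating
# found in a clearinghouse; names and rating labels are illustrative.
inventory = {
    "Functional Family Therapy": "strong evidence",
    "Local Mentoring Pilot": "no rating found",
    "Adult Drug Court": "some evidence",
    "Jail Reentry Workshop": "no rating found",
}

# Tally programs by evidence level to see, at a glance, how much of an
# agency's portfolio is backed by rigorous research.
for rating, count in Counter(inventory.values()).most_common():
    print(f"{rating}: {count} program(s)")
```

A summary like this can flag which programs lack ratings and may warrant further evaluation, as discussed in the next sections.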

Identify new evidence-based programs to replicate

County leaders and service providers can use clearinghouses as a menu of program options for potential investment. They can learn about effective programs in detail (including their target populations, outcomes, delivery setting, or implementation requirements) to select the best program for their community’s needs.

Use evidence to inform decision-making

In general, interventions with higher evidence ratings are likely to produce positive outcomes when implemented with fidelity, while programs with limited or no information require further evaluation to confirm their effectiveness. Depending on their needs, users will apply these ratings to budget and policy decisions in different ways (e.g., considering whether to fund a program or how to allocate resources within a budget) and should weigh numerous factors in addition to evidence ratings, such as the needs of target populations, local information on an intervention’s effectiveness, and the availability of implementation capacity and resources.

Add value to a grant application or justify current funding decisions

When applying for a potential funding opportunity, county agencies can use clearinghouses to strengthen the application by citing evidence to support their proposal. Additionally, in some cases users can access clearinghouses to fulfill the requirements of an application that incentivizes or specifies a need for evidence-based programs.

Evidence requirements

Santa Barbara County, California’s Criminal Justice Funding Opportunity Form 

The county’s Community Corrections Partnership requires agencies to submit this form, which requests citations of evidence supporting proposed programs, with their funding proposals.

Prioritizing outcome monitoring activities

Any outcome monitoring process, no matter its size or level of formality, must be tailored to meet the needs of the government and its residents. NACo acknowledges that some officials in small counties may consider a formal performance measurement process unnecessary, so the association offers suggestions that can help these and other counties find an approach to assessing outcomes that works best for them. These include:

  • Consider which metrics are most relevant to the county’s priorities, perhaps starting with metrics for the most important projects.
  • Consider how to organize the process so that results can be used to manage and improve performance, including integrating metrics with the budget process where possible.
  • Find opportunities to learn from other counties in your state when possible. Meaningful measures enable comparisons against peers or within an organization.

Endnotes

  1. For more information on best practices in program assessment, see Pew-MacArthur Results First Initiative, “Program Assessment: Identifying What Works in Your State or Locality” (2015), https://www.pewtrusts.org/en/research-and-analysis/issue-briefs/2015/06/program-assessment-identifying-what-works-in-your-state-or-locality.
  2. For more information on establishing and sharing clear definitions of evidence, see Pew-MacArthur Results First Initiative, “A Common Language for Evidence-Based Programming: Results First Provides a Guide to Establishing and Sharing Clear Definitions” (2017), https://www.pewtrusts.org/en/research-and-analysis/fact-sheets/2017/11/a-common-language-for-evidence-based-programming.
  3. Upstream Investments, Sonoma County Human Services Department, internal communication, Jan. 26, 2018.
  4. Logic models are an important component of program planning and evaluation. Governments can require promising programs or those that lack strong evidence of effectiveness to develop logic models that specify their expected results. These can be used to establish outcome measures and performance targets for those programs. The Centers for Disease Control and Prevention (CDC) defines a logic model as a “graphic depiction (road map) that presents the shared relationships among the resources, activities, outputs, outcomes, and impact for your program. It depicts the relationship between your program’s activities and its intended effects” and notes that it may also be referred to by other terms including roadmap, theory of change, model of change, or outcome map. It notes that logic models are instrumental in making assumptions explicit and answering three questions: Where are you going? How will you get there? What will show that you’ve arrived? For more information, see Centers for Disease Control and Prevention, “Logic Models,” last updated Feb. 21, 2018, https://www.cdc.gov/eval/logicmodels/index.htm.
  5. Montgomery County Department of Correction and Rehabilitation Detention Services Division, “Pew-MacArthur Results First Initiative” (2017), https://www.montgomerycountymd.gov/cor/.
  6. Caroline Moreno, education program manager, Salt Lake County Human Services, interview with the Pew-MacArthur Results First Initiative, Jan. 12, 2018.
  7. The National Association of Counties reports that 29 states permit counties to implement a local option sales tax, and 23 allow counties to implement a secondary sales tax for statutorily defined purposes. Many states require voter approval to introduce such taxes. To learn more, see National Association of Counties, “Doing More With Less: State Revenue Limitations & Mandates on County Finances” (2016), http://www.naco.org/resources/doing-more-less-state-revenue-limitations-and-mandates-county-finances.
  8. For more information on using research to inform program funding decisions, see Pew-MacArthur Results First Initiative, “A Guide to Evidence-Based Budget Development: How to Use Research to Inform Program Funding Decisions” (2016), https://www.pewtrusts.org/en/research-and-analysis/issue-briefs/2016/07/a-guide-to-evidence-based-budget-development.
  9. Tanja Heitman, chief probation officer, Santa Barbara County, internal communication, May 18, 2018.
  10. Santa Barbara County Probation Department, “CCP: Criminal Justice Funding Opportunity,” accessed Feb. 20, 2018, https://www.countyofsb.org/probation/crimjustfund.sbc.
  11. Tanja Heitman, chief probation officer, Santa Barbara County, interview with the Pew-MacArthur Results First Initiative, July 26, 2017.
  12. Boone County Community Services Department, “2016 Annual Report and Program Directory” (2016), www.booneimpact.org/wp-content/uploads/2017/05/2016-Annual-Report-3.0.pdf.
  13. Kelly Wallis, director, Boone County Community Services Department, interview with the Pew-MacArthur Results First Initiative, Oct. 26, 2017.
  14. Ibid. 
  15. For more information on evidence-based contracting, see Pew-MacArthur Results First Initiative, “How to Use Evidence in the Contracting Process: Data and Research Can Increase the Efficiency and Effectiveness of Government Programs” (2016), https://www.pewtrusts.org/en/research-and-analysis/issue-briefs/2016/12/how-to-use-evidence-in-the-contracting-process.
  16. The eight features are: (1) Provide empirical basis of need and address disparities; (2) Develop shared set of results or goals; (3) Use tiered approach to evidence-based programs (EBP); (4) Measure fidelity to EBP models; (5) Provide support for applicant organizations; (6) Adopt clear and transparent proposal review processes; (7) Monitor or evaluate outcomes at the community and program level; and (8) Adopt values of collaboration, alignment to other initiatives. For more information see Santa Cruz Human Services Department, “Implementing a New Community Programs Funding Model: Status Report and Recommendations” (2016), https://santacruzcountyca.iqm2.com/Citizens/FileOpen.aspx?Type=4&ID=4374&MeetingID=1033.
  17. Ellen Timberlake, deputy director, Santa Cruz County Human Services Department, interview with the Pew-MacArthur Results First Initiative, Sept. 19, 2017.
  18. Donna Carlson, Chester County Human Services, internal communication, April 17, 2018.
  19. Kim Bowman, director, Chester County Human Services, interview with the Pew-MacArthur Results First Initiative, Oct. 27, 2017.
  20. Urban Institute, “What Are the Challenges of Pay for Success?”, https://pfs.urban.org/faq/what-are-challenges-pay-success.
  21. Jennifer DeYoung, deputy policy director, Seattle & King County Public Health, email communication with the Pew-MacArthur Results First Initiative, July 24, 2018.
  22. Joseph A. Durlak and Emily P. DuPre, “Implementation Matters: A Review of Research on the Influence of Implementation on Program Outcomes and the Factors Affecting Implementation,” American Journal of Community Psychology 41, no. 3-4 (2008): 327-350, https://doi.org/10.1007/s10464-008-9165-0.
  23. For more information on ensuring effective program delivery, see Pew-MacArthur Results First Initiative, “Implementation Oversight for Evidence-Based Programs: A Policymaker’s Guide to Effective Program Delivery” (2016), https://www.pewtrusts.org/en/research-and-analysis/issue-briefs/2016/05/implementation-oversight-for-evidence-based-programs.
  24. The four assessments are: (1) a community health profile (called a Wellness Score) derived from available data such as income level, mortality rates, and ER visits; (2) a locally administered survey on the influences, barriers, and supports affecting community health; (3) qualitative feedback on the barriers and supports for improving community health gathered from town hall discussions over the course of a month; and (4) a local public health assessment survey in collaboration with the CDC National Public Health Performance Standards Program, and analyzed by the CDC. For more information on the Oklahoma City-County Health Department communitywide health needs assessments, see Oklahoma City-County Health Department, “Wellness Score 2017,” https://www.occhd.org/WellnessScore.
  25. Megan Holderness, epidemiological support, Oklahoma City-County Health Department, interview with the Pew-MacArthur Results First Initiative, Oct. 20, 2017.
  26. Steve Leifman, Miami-Dade County judge, interview with the Pew-MacArthur Results First Initiative, Nov. 13, 2017.
  27. Ibid.
  28. Debbie Innes-Gomberg, deputy director, Los Angeles County Department of Mental Health, interview with the Pew-MacArthur Results First Initiative, Oct. 2, 2017.
  29. For more information on outcome monitoring, see Pew-MacArthur Results First Initiative, “The Role of Outcome Monitoring in Evidence-Based Policymaking: How States Can Use Performance Management Systems to Achieve Results” (2018), https://www.pewtrusts.org/en/research-and-analysis/issue-briefs/2018/08/the-role-of-outcome-monitoring-in-evidence-based-policymaking.
  30. Emilia Istrate, Stacy Nakintu, and Jonathan Harris, “Building Trust: Performance Metrics in Counties” (2018), National Association of Counties, www.naco.org/featured-resources/building-trust-performance-metrics-counties.
  31. Wallis interview.
  32. Boone Impact Group, “Boone Indicators Dashboard,” http://booneindicators.org/.
  33. At the time of the creation of the Continuous Improvement and Analysis team, it was a smaller Community Services Department.
  34. Sarah Oachs, director, Health, Housing, and Human Services Division, interview with the Pew-MacArthur Results First Initiative, Nov. 13, 2017.
  35. Glenn Osborne, director, Wilson County Department of Social Services, interview with Pew-MacArthur Results First Initiative, March 14, 2018.
  36. Ibid.
  37. For examples of this, see LiveHealthy Fairfax, “Fairfax County Health and Human Services Report Card,” accessed July 15, 2018, http://www.livehealthyfairfax.org/index.php?module=Tiles&controller=index&action=display&id=94476444240349481; and Children’s Trust of South Carolina, “County-Level Profiles,” accessed July 15, 2018, https://scchildren.org/research/kids-count-south-carolina/county-data-profiles/.
  38. Megan Holderness, epidemiological support, Oklahoma City-County Health Department, questionnaire response, Oct. 16, 2017.
  39. Ibid.
  40. Megan Holderness, epidemiological support, Oklahoma City-County Health Department, email communication with the Pew-MacArthur Results First Initiative, July 26, 2018.
  41. According to the National Institute of Mental Health, serious mental illness (SMI) is defined as a mental, behavioral, or emotional disorder resulting in serious functional impairment, which substantially interferes with or limits one or more major life activities. The burden of mental illnesses is particularly concentrated among those who experience disability due to SMI. For more information, see National Institute of Mental Health, “Mental Illness,” last updated November 2017, https://www.nimh.nih.gov/health/statistics/mental-illness.shtml.
  42. For more information on how to prioritize different types of evaluations, see Pew-MacArthur Results First Initiative, “Targeted Evaluations Can Help Policymakers Set Priorities: A Policymaker’s Guide to Building Evaluation Capacity” (2018), https://www.pewtrusts.org/en/research-and-analysis/issue-briefs/2018/03/targeted-evaluations-can-help-policymakers-set-priorities.
  43. For more information on collaborating with researchers, see William T. Grant Foundation, “Research-Practice Partnerships: Working Together to Improve Outcomes for Youth,” accessed Oct. 5, 2018, rpp.wtgrantfoundation.org.
  44. Tina L. Freiburger and Alyssa Pfeiffer, “Assessment of the ‘Safe Streets Treatment Options Program’ (SSTOP)” (2017), http://www.outagamie.org/Home/ShowDocument?id=49470.
  45. Ibid. 
  46. Moreno interview.
  47. Joni Thacher, training and technical assistance manager, Upstream Investments, Sonoma County Human Services Department, email communication with the Pew-MacArthur Results First Initiative, Jan. 26, 2018.
  48. Joni Thacher and Leah Murphy, on behalf of Upstream Investments, Sonoma County Human Services Department, interview with the Pew-MacArthur Results First Initiative, Jan. 18, 2018.
  49. The Abdul Latif Jameel Poverty Action Lab (J-PAL) provides numerous resources on evaluations that can help county staff understand how evaluations are used, what considerations go into conducting an evaluation, and what questions to ask. For more information, see https://www.povertyactionlab.org/research-resources. The Bureau of Justice Assistance also provides a resource that describes criteria for assessing the quality of outcome evaluations. For more information, see Bureau of Justice Assistance, “Is This a Good Quality Outcome Evaluation Report? A Guide for Practitioners” (2011), https://www.bja.gov/evaluation/reference/Quality_Outcome_Eval.pdf.
  50. Hennepin County Department of Community Corrections and Rehabilitation, “Community Corrections Act Comprehensive Plan 2016-2017.”
  51. For information on the importance of data sharing, see Susan Urahn, “The Barriers to Shared Data and How to Overcome Them: Promising New Approaches Have Emerged to Overcome Rules That Inhibit the Sharing of Information Critical to Evidence-Based Policymaking,” Governing, Jan. 27, 2015, http://www.governing.com/columns/smart-mgmt/col-state-barriers-shared-data-how-overcome-them.html. For more information on elements contained within data sharing agreements, see National Neighborhood Indicators Partnership, “Key Elements of Data Sharing Agreements” (2014), http://www.naco.org/sites/default/files/documents/Key%20Elements%20of%20Data%20Sharing%20Agreements.pdf.
  52. For more on data intermediaries, see Urban Institute, “Improving Public Decision-making: Local Governments and Data Intermediaries” (2018), https://www.urban.org/sites/default/files/publication/97111/improving_public_decisionmaking_local_governments_and_data_intermediaries_2.pdf.
  53. Janet Carlson, commissioner, Marion County, Oregon, interview with the Pew-MacArthur Results First Initiative, Oct. 6, 2017.
  54. Randie Yeager, director, Dauphin County Human Services, interview with the Pew-MacArthur Results First Initiative, March 22, 2018.
  55. Matthew Nice, former manager of planning, accountability and customer service, Montgomery County Department of Health and Human Services, interview with the Pew-MacArthur Results First Initiative, Jan. 26, 2018.
  56. Michael Daniels, justice policy coordinator, Franklin County, Ohio, interview with the Pew-MacArthur Results First Initiative, Nov. 15, 2017.
  57. Hennepin County Department of Community Corrections and Rehabilitation, “Community Corrections Act.” 
  58. Danette Buskovick, manager, Hennepin County Department of Community Corrections and Rehabilitation Office of Policy, Planning and Evaluation, interview with the Pew-MacArthur Results First Initiative, Aug. 15, 2017.
  59. The study defined “super-utilizers” as unfunded and underfunded patients who had 3+ inpatient discharges or both a serious mental health diagnosis and 2+ inpatient discharges, or 9+ emergency room visits (excluding pediatric cancer care and neonatal care). From a study conducted for Bexar County, internal communication, March 9, 2018.
  60. Gilbert Gonzales, director, Bexar County Behavioral and Mental Health, interview with the Pew-MacArthur Results First Initiative, March 8, 2018.
  61. Pew-MacArthur Results First Initiative, “Results First Clearinghouse Database User Guide,” updated April 2017, http://www.pewtrusts.org/~/media/assets/2015/06/results_first_clearinghouse_database_user_guide.pdf?la=en.
  62. Santa Barbara County Probation Department, “CCP: Criminal Justice Funding Opportunity.”
  63. Istrate, Nakintu, and Harris, “Building Trust.”