State and local governments collectively spent more than $116 billion in 2012 on contracts and grants to a range of organizations that provide health and human services, according to a 2013 brief from the Urban Institute. These governments typically consider factors such as past performance, price, and organizational capacity—but often do not prioritize other crucial factors, including whether a program meets the needs of a particular community or has been proved effective in achieving the desired outcomes.
But over the past 10 years, a wealth of data and research on the effectiveness of programs has become available and is increasingly accessible through tools such as online research clearinghouses. Governments can use these tools to embed such evidence into their contracting processes, thereby improving the likelihood that the services achieve desired results.
This process, known as evidence-informed contracting, can vary across jurisdictions but includes some consistent core principles. This issue brief highlights those principles and provides examples from states and counties that are putting them to work.
Principle 1: Use data to identify service needs.
Jurisdictions should systematically gather information on the target population, such as its needs and risk factors, and the capacity of provider organizations to address those needs.
Example: Bernalillo County, New Mexico, issued several requests for proposals (RFPs) in 2016[1] based on findings from a comprehensive scan of the state’s behavioral health needs and gaps in service.[2] The scan was conducted in partnership with state agencies, managed care organizations, subject matter experts, and members of the community. For example, to reduce the number of children exposed to adverse childhood experiences such as abuse and neglect, the county issued an RFP for $3 million. The RFP was informed not only by strong research linking this mistreatment to poor mental and physical health outcomes later in life but also by findings from the community scan that identified the need for such services in the county.
Takeaway: Jurisdictions can use information collected from a needs assessment to clarify priorities for contracted services and ensure that their communities have the right programs to address them.
Principle 2: Collaborate with provider organizations.
Government staff should work closely with providers, policymakers, and other key stakeholders to build support for and understanding of evidence-based practices.
Example: The Human Services Department in Santa Cruz County, California, engaged community providers to develop a new contracting model and offered technical assistance on selecting appropriate, evidence-based programs. Some of the comprehensive changes identified through this planning process included prioritizing funds for evidence-based programs, scaling back funding for programs not supported by research, and combining existing county and city budget streams dedicated to reducing homelessness and poverty.
Takeaway: Collaboration between government and providers can increase support for introducing evidence-based programs into a jurisdiction’s range of services and help ensure that such programs are a good fit for the population.
Principle 3: Identify clear, measurable outcomes.
Government staff should identify desired short- and long-term outcomes, embed them into contract requirements, and verify that the necessary data are collected consistently across providers.
Example: Seattle’s Department of Human Services worked with the Harvard University Government Performance Lab to restructure its homeless services contracts around specific performance goals, such as clients’ ability to sustain permanent housing.[3] In its initial review of contracts, the city found that the measures being tracked were largely process metrics,[4] such as the number of clients served, that did not reflect desired outcomes. City staff worked with the lab to develop a set of key metrics to track the performance of homeless services, held discussions with providers to ensure that the metrics aligned with the program models, and then embedded them into existing contracts. Since then, the city has developed processes to regularly review performance data and form strategies to improve services.
Takeaway: Specifying desired outcomes in contracts can help clarify providers’ responsibilities and allow for meaningful comparisons of performance. Agencies can then use this information to identify areas in which providers need additional training or support.
Principle 4: Prioritize evidence-based interventions.
Directing funding toward interventions that have been shown through research to be effective can increase the likelihood that programs will achieve desired impacts.
Example: The New York State Division of Criminal Justice Services conducted cost-benefit analyses and used other research tools to identify effective, evidence-based programs aimed at reducing crime.[5] This process revealed that investments in some of those programs could reduce offender recidivism by up to 12 percent and generate upwards of $4 in taxpayer benefits for each $1 spent on programming. Based on these findings, the division restructured more than 100 contracts, representing approximately $21 million annually, to prioritize and support evidence-based, cost-beneficial alternatives to incarceration.
Takeaway: Jurisdictions can use the contracting process to prioritize funding for programs that have been proved effective.
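As a rough illustration of the cost-benefit arithmetic behind findings like New York’s, the sketch below computes a benefit-cost ratio and a net benefit. All figures in it are invented for demonstration; they are not the division’s actual program data, and the function names are hypothetical.

```python
# Hypothetical sketch of the arithmetic behind figures such as
# "$4 in taxpayer benefits for each $1 spent." Numbers are invented.

def benefit_cost_ratio(total_benefits: float, total_costs: float) -> float:
    """Taxpayer benefits generated per dollar of program cost."""
    if total_costs <= 0:
        raise ValueError("total_costs must be positive")
    return total_benefits / total_costs

def net_benefit(total_benefits: float, total_costs: float) -> float:
    """Benefits remaining after program costs are recovered."""
    return total_benefits - total_costs

# Example: a program costing $500,000 that avoids an estimated
# $2,100,000 in incarceration and victimization costs.
costs = 500_000
benefits = 2_100_000
print(f"Benefit-cost ratio: {benefit_cost_ratio(benefits, costs):.2f}")  # 4.20
print(f"Net benefit: ${net_benefit(benefits, costs):,.0f}")  # $1,600,000
```

A ratio above 1.0 indicates the program returns more in taxpayer benefits than it costs; analyses of this kind typically also discount future benefits to present value, which this sketch omits for simplicity.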
Principle 5: Build in measures to track program implementation.
Programs are most effective when they are implemented according to their original design. Deviations—such as a change in location, population served, or setting—can jeopardize the anticipated benefits.
Example: The Pennsylvania Commission on Crime and Delinquency contracted with EPISCenter, an implementation support organization, to develop common indicators for measuring implementation fidelity and outcomes for 16 evidence-based community prevention programs operated by grantees throughout the commonwealth.[6] The metrics help providers monitor their programs and allow the commission to remotely track implementation across grantees to identify problems and make needed adjustments in real time, rather than waiting until the contract is up for renewal. Through these efforts, Pennsylvania generated an estimated $55.8 million return on these investments in fiscal year 2013-14.[7]
Takeaway: Leaders can specify in contracts key milestones of program implementation, such as the time from intake to treatment, to ensure that public programs yield intended outcomes.
Principle 6: Allow for experimentation or adaptation when evidence is limited.
While prioritizing evidence can help produce desired outcomes, evidence-based options are not always available. In these circumstances, jurisdictions can allow providers to pilot new programs, adapt existing ones to incorporate evidence-based principles, or evaluate new and untested programs to build the evidence base.
Example: Following passage of a 2007 law that required state funds to support the use of evidence-based programs, the Tennessee Department of Children’s Services built evidence requirements into its contract guidelines with local juvenile justice program providers.[8] The department can still contract for programs that are not evidence-based as long as they incorporate key elements of evidence-based practices. The department uses a standardized assessment tool developed by researchers at Vanderbilt University[9] to determine the extent to which juvenile justice programs align with elements of practice that research has shown to be effective at reducing recidivism.
Takeaway: Allowing for experimentation or adaptation enables jurisdictions to meet the unique needs of their communities and fill gaps where there is limited evidence of what works.
Principle 7: Support providers in meeting rigorous standards.
State and county leaders can support providers’ capacity to deliver services by creating forums for collaboration between contracting agencies and service organizations, as well as offering technical assistance and training on implementing evidence-based programs.
Example: The South Carolina Center of Excellence in Evidence-Based Intervention at the University of South Carolina helps behavioral health providers in the state identify and implement evidence-based interventions for children, youth, and families. In 2016, the center issued a report that named key evidence-based interventions that could support this population, along with information on each program’s implementation requirements, related costs, expected outcomes, and target population.[10] The center then worked with provider organizations across the state to help them select programs that targeted the specific needs of their communities and were feasible to implement given each organization’s capacity.
Takeaway: Enhancing provider capacity and expertise can help jurisdictions better serve diverse populations and deliver high-quality services.
The contracting process provides a strategic opportunity for state and county leaders to prioritize evidence in their budget and policy choices. These key principles can help jurisdictions strengthen the quality of their contracted services and, together with providers and other stakeholders, foster a culture of evidence-based decision-making, while increasing the likelihood that the services they purchase meet community needs and produce positive outcomes.
1. Bernalillo County, New Mexico, “$5 Million Approved for Behavioral Health Proposals,” news release, Sept. 30, 2016, https://www.bernco.gov/general-news.aspx?aa431b263de84365b8eaae43ab63bd6dblogPostId=1cb29ed4ca74456593c0d5d3ee601e1f.
2. New Mexico Legislature, “Behavioral Health Needs & Gaps in New Mexico” (2002), https://www.bernco.gov/uploads/files/Behavioral%20Health%20Needs%20and%20Gaps%20NM-Executive%20Summary%202002.pdf.
3. Harvard University Government Performance Lab, “Shaking Up the Routine: How Seattle Is Implementing Results-Driven Contracting Practices to Improve Outcomes for People Experiencing Homelessness” (2016), https://govlab.hks.harvard.edu/files/siblab/files/seattle_rdc_policy_brief_final.pdf.
4. Harvard University Government Performance Lab, “Outcome and Process Metrics Recommendations Developed for Seattle’s Homeless Services Contracts,” n.d., http://govlab.hks.harvard.edu/files/siblab/files/gpl_homelessness_metrics.pdf.
5. New York State Division of Criminal Justice Services, “Cost Benefit Analysis for Criminal Justice” (2013), http://www.criminaljustice.ny.gov/crimnet/ojsa/resultsfirst/rf-technical_report_cba1_oct2013.pdf.
6. Pew-MacArthur Results First Initiative, “4 Ways Implementation Support Centers Assist in the Delivery of Evidence-Based Programs” (2017), https://www.pewtrusts.org/en/research-and-analysis/fact-sheets/2017/07/4-ways-implementation-support-centers-assist-in-the-delivery-of-evidence-based-programs.
7. EPISCenter, “2014 Annual Report” (2014), http://www.episcenter.psu.edu/sites/default/files/outreach/EPISCenter-Annual-Report-2014.pdf.
8. Tennessee Department of Children’s Services, “Contract Provider Manual Section One (1)—Core Standards” (2018), https://files.dcs.tn.gov/policies/contractProviderManual/Section_1-Core.pdf.
9. Peabody Research Institute, “SPEP Information,” accessed Jan. 24, 2019, https://my.vanderbilt.edu/spep/spep-information.
10. South Carolina Center of Excellence in Evidence-Based Intervention, “Evidence-Based Interventions for Youth With Behavioral Health and Substance Use Problems” (2016), https://bestpracticesforyouthsc.files.wordpress.com/2017/01/coe-report-4-15-16.pdf.