How Policymakers Prioritize Evidence-Based Programs Through Law

Lessons from Washington, Tennessee, and Oregon

Overview

Policymakers seeking the best return on taxpayer dollars are increasingly focusing on cost-effective programs that have been proven to achieve desired outcomes. Evidence-based policymaking, which relies on rigorous analysis of program results to inform budget, policy, and management decisions, is one strategy gaining support among public leaders who want to reduce wasteful spending, expand successful programs, and strengthen accountability.

With state and local governments often challenged by budget shortfalls or changes in leadership priorities, laws that promote evidence-based programs are one of several strategies that can help institutionalize these practices and save taxpayer dollars over the long term.1 Several states have passed laws to prioritize evidence-based programs, including providing incentives for using such programs, requiring agencies to inventory and categorize existing programs, and prohibiting funding of those shown to be ineffective.2

This brief highlights laws in three states—Washington, Oregon, and Tennessee—mandating the use of evidence-based programs and practices, and documents each state’s experience, the impact of these efforts, and lessons learned. The analysis found that the laws drove state agencies to develop baseline information on existing services, create new data systems to monitor program implementation and outcomes, and prioritize funding for programs backed by evidence. The states’ success has led policymakers to consider evidence-based approaches in policy areas beyond those the laws initially covered.

The experiences in Washington, Oregon, and Tennessee provide important lessons for states interested in expanding their use of evidence-based programs. These insights include:

  • Engage key stakeholders. Leaders from the three states noted the importance of gaining support early in the process from key stakeholders, including state agency leadership and contracted service providers charged with carrying out the legislative requirements.
  • Funding percentage mandates can be powerful tools for change. Requirements to direct designated percentages of funds to evidence-based programs can be an effective tool in promoting evidence use. In Tennessee and Oregon, stakeholders reported that mandates helped promote reforms and establish valuable goals for agencies and providers. However, percentage mandates aren’t a necessity. Washington saw value in forgoing percentage requirements and giving agencies flexibility, as stakeholders noted that uniform requirements to fund evidence-based programs could have created incentives for agencies to overprescribe certain services.
  • Require monitoring of program fidelity. Programs implemented with fidelity to their original design are significantly more likely to achieve positive outcomes than programs that deviate from that design.3 Washington and Tennessee required that programs be implemented with fidelity and that agencies establish systems to regularly track and report this information. Several Oregon departments, recognizing the need to monitor program implementation, have created advanced data systems.
  • Include provisions for evaluating “homegrown” programs—those not based on a specific model. In Oregon and Tennessee, program leaders used assessment tools to determine whether providers were implementing services associated with effective, research-based practices, even if such programs were homegrown. Tennessee leaders in particular noted that this was important to building support among providers.4 Washington enables providers to submit evaluations of homegrown programs to a panel of experts; programs with strong evaluations may be included in the state’s inventory of evidence-based interventions.
  • Establish a process for verifying compliance with evidence mandates. Options include surveying providers, developing an inventory of programs and comparing each to an approved list of evidence-based interventions, or using tools such as the Standardized Program Evaluation Protocol or Correctional Program Checklist to assess providers. Each state profiled in this brief used a different process to gather and verify this information, and in some cases the method was left to the agencies to determine, resulting in inconsistent reporting.
  • Consider available resources when requiring evidence-based programs. Although these programs can achieve better outcomes than others, they can also require additional expertise to implement. The three states noted challenges in building and retaining staff expertise, creating data systems that support fidelity monitoring, and evaluating programs in areas where few evidence-based program models exist. Washington had the benefit of the Washington State Institute for Public Policy (WSIPP) and the Evidence Based Practice Institute (EBPI) at the University of Washington to aid implementation. Stakeholders from Oregon and Tennessee noted that similar institutions would have helped in their states.

Case studies

Washington: Required four state agency divisions that deliver prevention and intervention services for children’s mental health, child welfare, and juvenile justice to substantially increase their use of evidence-based programs using research and inventory methods developed by WSIPP and EBPI. The divisions were also directed to establish monitoring and quality control procedures to ensure that programs were effectively implemented.

Oregon: Required five state agencies to spend an increasing percentage of their funding on evidence-based programs, reaching 75 percent of available funds by the 2009-11 biennium. The law also required biennial reports to the Legislature assessing each agency’s programs and its progress toward meeting the percentage goal.

Tennessee: Required the Department of Children’s Services to direct an increasing percentage of funding, reaching 100 percent by fiscal 2012-13, to evidence-based juvenile justice programs. The law directs the department to require evidence-based services in all provider contracts, along with monitoring and quality control procedures.

Washington

Legislation: H.B. 2536, passed in 2012.

Policy areas: children’s mental health, child welfare, and juvenile justice.

Key requirements: Each state division must establish a baseline of existing programs and substantially increase its use of evidence-based programs.

Over the past three decades, Washington has created a strong foundation for using evidence systematically to inform budget and policy development. In 1983, the Legislature created WSIPP to conduct and aggregate research on social policy programs. The institute has developed a cost-benefit model that calculates the return on investment from funding evidence- and research-based programs and promising practices.5 In 1997, the Legislature expanded its use of evidence-based strategies by enacting the Community Juvenile Accountability Act (CJAA), which was among the first laws in the nation to mandate that agencies direct funding toward cost-effective programs in juvenile courts.6 In 2007, the Legislature created EBPI to help develop effective practices in children’s mental health and improve access to services for children.

Building on this foundation, the Legislature passed H.B. 2536 in 2012 to significantly increase the use of evidence-based programs in the state’s child behavioral health, child welfare, and juvenile justice systems. These programs are administered by the Department of Social and Health Services—which includes the Behavioral Health Administration, Rehabilitation Administration, and Children’s Administration—and by the separate Washington State Health Care Authority. Although these agencies were already taking limited steps to implement evidence-based programs and practices, the Legislature wanted to expedite this progress by replicating the reforms initiated by the CJAA.7

“Change comes hard, and everyone believes they are already doing the best thing” for children and families, said Mary Lou Dickerson (D), a former state representative who was the champion and prime sponsor of H.B. 2536 and other critical legislation supporting evidence-based practices.8 “The concept I wanted to push was that over time, we would replace unproven practices with those that are evidence-based.”

The Legislature also sought to establish a baseline to track which evidence- and research-based programs were available statewide. To accomplish this, it required WSIPP and EBPI to create formal definitions of evidence, systematically review scientific research to identify effective programs in each policy area, and develop inventories categorizing programs based on their evidence of effectiveness. The agencies were directed to use these inventories to prepare a baseline assessment of the programs they administered and establish fidelity monitoring and quality control procedures.

Although the initial version of the bill included mandates that specific percentages of funding be directed to evidence-based programs, the Legislature changed its approach after receiving public feedback during committee hearings. Community-based provider representatives were concerned that the bill would negatively affect providers who were not yet implementing proven programs.9 The final legislation stipulated that each agency attain “substantial increases” in their use of evidence-based programs above the baseline assessment and submit a report to the Legislature by 2013 detailing strategies, timelines, and costs for achieving this increase.10

Each of the agencies affected by the law had a unique structure and varied in its use of evidence-based programs when the law was enacted. For these reasons, the law’s impact has also varied.

Behavioral Health Administration

The legislation had the greatest impact on the Behavioral Health Administration (BHA). Although the agency had implemented some evidence-based programs before the law went into effect, it lacked a reliable means of tracking them and relied on informal provider surveys for this information.11 The law spurred the BHA to create a standardized process for tracking the use of evidence-based programs over time, which has become a valuable tool for managers. After WSIPP updates the list of evidence-based practices each year, service providers must update their electronic codes to track and report program utilization to agency leaders.12

The BHA was the only state agency to create internal targets for evidence-based programs based on the law. According to Gregory Endler, former program administrator in the Children and Youth Behavioral Health Unit, the agency knew that it had the potential for significant improvement and sought to increase the use of evidence- and research-based programs by 7.5 percent a year, eventually reaching 45 percent.13

Although meeting the targets has been challenging, the agency has made progress, including embedding its goals in provider contracts and using corrective action plans for providers who have not met the thresholds.14 To help support the providers, the BHA collaborated with the University of Washington in 2015 to identify key barriers to successful implementation. The university also provides targeted technical assistance to the agency.15

Rehabilitation Administration

The Washington State Rehabilitation Administration was already well advanced in evidence-based policymaking when the 2012 act passed, based on its experience implementing the 1997 CJAA, which directed the agency to target funding to programs that rigorous research had shown to reduce recidivism in the juvenile justice system. By 2012, the agency was spending 74 percent of its treatment program funds on evidence-based programs.16

The legislation benefited the division’s work in several ways. It prompted increased dialogue about evidence-based policymaking, particularly with provider organizations and communities across the state that were initially resistant to the approach. It also clarified the division leadership’s intention to invest scarce resources in programs that were backed by strong research. “We were already doing the intent of the bill, but it enabled us to further our conversations and expand to subpopulations that didn’t have evidence-based programs,” said Cory Redman, acting director of the Office of Juvenile Justice, who noted that this dialogue promoted greater consistency within the division, such as ensuring that juvenile justice drug courts across the state followed a similar model.17

Despite these successes, the agency faces several challenges in creating a comprehensive system that supports evidence-based policymaking, including establishing more robust fidelity monitoring and quality assurance protocols. “In 1997, the Legislature provided us with funding to implement evidence-based programs. However, additional funding to conduct quality assurance and fidelity monitoring was not provided. The tough decision was made to use some of the direct service dollars to build a quality assurance infrastructure,” said Redman. The agency is also developing services in areas where there is limited research on effective programs, particularly in substance abuse treatment and sex offender treatment.18 To address these gaps, the division seeks to expand the use of “promising” programs, which do not meet the stringent requirements of “evidence-based” but which available data indicate are likely to be effective.

Children’s Administration

The Children’s Administration started implementing evidence-based programs in 2006 and was administering nine of them when H.B. 2536 was passed, although these programs represented a relatively small percentage of all funds spent on treatment for children and families.19 The law helped raise awareness of evidence-based programs and promote support among agency staff, which was particularly important because Washington administers child welfare programs through regions, and regional staff have substantial authority in selecting programs. “The law helped us get a clearer mandate that we were going to utilize evidence-based programs and demonstrated to our staff that prioritizing these programs was a requirement,” said Tim Kelly, program manager of family preservation services.20

Throughout the implementation process, the Children’s Administration has sought to work collaboratively with its providers to address common challenges, including the need to provide financial incentives for implementing programs that use an evidence-based model. “We recognized that the only way to do this successfully was by establishing strong partnerships with our providers,” said Kelly.21 Realizing that evidence-based programs can be more expensive than other programs, the agency went through a rate-setting exercise with its providers and now funds a 25 percent higher rate for programs backed by evidence, in addition to paying for time spent monitoring program fidelity. The agency also worked with regional program leaders and providers to identify funding targets (to meet the legislative requirement) and used these targets to create regional funding levels. According to Kelly, it was beneficial that the law itself didn’t mandate specific percentages, allowing for greater flexibility in setting feasible targets.22

Although the law has helped increase the use of evidence-based programs, some regional leaders and providers have raised concerns about becoming overly focused on such programs, which may not meet the needs of all families. Based on available local data and information from partners at the University of Washington, Kelly estimated that evidence-based programs currently meet the needs of about half of the families receiving services, in part due to gaps in programmatic research.23 To mitigate this concern, the agency has created a library of “evidence-informed” services to provide program alternatives and is considering using the Child and Adolescent Needs and Strengths assessment tool to gauge the need for new programs.24

Health Care Authority

The Health Care Authority (HCA) has responded to the law by increasing its investment in evidence-based programs while moving toward measuring key outcomes in all of its programs (both evidence-based and those informed by practices in the field). As of 2015, the HCA administered 16 evidence-based programs and had worked with its providers and managed care organizations to collect information on program outcomes while including specific expectations regarding these programs in its contracts.25 The agency is addressing feedback from its providers on barriers to using evidence-based programs—including training on program models, supervision costs, and retention of staff.

According to Lin Payton, HCA mental health program manager, focusing exclusively on evidence-based programs initially raised concerns from providers who were required to replace homegrown programs that they believed were getting good outcomes. The HCA intends to continue to promote the use of programs backed by evidence but is altering its approach to also fund programs lacking rigorous evaluation if they are meeting their outcome targets. “Ultimately, the Legislature wanted to ensure people are getting better, and we are going to be focusing more on outcome measurement,” said Payton. “I believe the best approach is to track functional outcomes while also providing more evidence-based program training and trying to encourage these programs where appropriate.”26

Oregon

Legislation: S.B. 267, passed in 2003.

Policy areas: criminal justice, juvenile justice, behavioral health, and child welfare.

Key requirements: An increasing percentage of programmatic funding (up to 75 percent) must go toward evidence-based programs in five state agencies.

Oregon passed one of the first and most comprehensive evidence-based laws in the country, which served as a precedent for Washington’s H.B. 2536 and similar efforts in other states. After making significant increases in Oregon’s community corrections budget, the Legislature wanted to ensure that the additional resources were invested in criminal justice and other social service programs that had been rigorously evaluated and proved effective.27

The legislation provided a wide-ranging mandate for evidence-based policymaking in Oregon. It required five agencies—the Department of Corrections, Oregon Youth Authority, State Commission on Children and Families (which in 2011 became the Oregon Early Learning Division), the section within the Oregon Health Authority that administers addictions and mental health services, and the Oregon Criminal Justice Commission—to spend an increasing percentage of funds on evidence-based programs.28 Funding targets were established: 25 percent of funds were to be spent on evidence-based programs by 2007, growing to 50 percent by 2009 and 75 percent by 2011, and remaining at that level thereafter.29 Agencies must submit biennial reports to the Legislature, including an assessment of each program, the percentage of state and federal money the agency receives that is spent on programs backed by evidence, and a description of agency efforts to meet the percentage goals.30

Implementation of the law

The Oregon Youth Authority and Department of Corrections met with contracted service providers and other state officials to identify the programs subject to the 2003 law. The agencies then used the Correctional Program Checklist, a research-based assessment tool, to determine whether those programs included characteristics that have proved effective at reducing recidivism.31 These assessments began in 2005, and most programs have been assessed multiple times, although the agencies have encountered ongoing staffing and resource challenges in using the checklist. The results have been included in the agencies’ biennial reports to the Legislature.32

Several Oregon stakeholders noted that the law was instrumental in getting support from community-based providers to evaluate their programs and prioritize those that are evidence-based. The Department of Corrections initially encountered resistance from providers in using the checklist. But with the legislation as impetus, the department has been able to work with providers to build evaluation requirements into their contracts and in some cases replace ineffective programs with ones that are backed by research. “When we’ve encountered providers that are reluctant to shift course, it’s been helpful to have the law behind us to say that this is now a requirement. The more we contracted with counties, the more we required them to do evaluations and show what programs they are providing,” said Jeremiah Stromberg, the department’s assistant director of community corrections.33

Agencies have generally met the required funding thresholds established by the law. In its 2012 report to the Legislature, the Health Authority reported that 82 percent of its programmatic funds went toward evidence-based programs, while the Youth Authority reported spending 89 percent of its offender treatment programming resources on evidence-based practices.34 In both agencies, the percentages had increased significantly since the law was enacted.

However, the percentage requirements described in the legislation weren’t always stringently applied. Officials from the Department of Corrections noted that they didn’t feel overly constrained by the benchmarks and chose not to set up a comprehensive reporting mechanism, instead creating incentives for increasing use of evidence-based programs through their contracting process.35 In the 2010 report to the Legislature, the department noted that not enough programs had been reviewed to see if they were meeting the goal—due in part to resource constraints and an influx of new programs around that time—but the majority of those that had been reviewed were deemed evidence-based.36

Although it hit the funding targets in 2012, the Health Authority has since adjusted its plans for meeting the law’s requirements. Initially, agency leadership applied the legislative requirements broadly to all agency operations, going beyond the recidivism-reduction programs that were the law’s intended focus. While this approach helped cultivate awareness of evidence-based programs, it also generated criticism from Native American tribes and other minority populations that said cultural, homegrown programs weren’t being adequately considered. The agency eventually narrowed its application of the law to corrections-related health programs only and shifted its emphasis in other areas to tracking program outcomes, allowing providers greater flexibility to operate programs that outcome data showed had promise but that had not been rigorously evaluated. “We still encourage [our providers] to use evidence-based programs when appropriate, but we are more focused on outcomes now,” said Jon Collins, director of the Health Authority’s Office of Health Analytics.37

Recent developments

Oregon stakeholders conclude that the 2003 law has introduced a major shift toward evidence-based policymaking, spurring plans to improve program monitoring and service matching as well as expand the use of justice reinvestment and cost-benefit analyses. “The legislation drove the use of evidence-based practices and was very beneficial for the state. After meeting the requirements of the legislation, folks have taken the initiative to move beyond the law with practices that support its intent,” said Paul Egbert, the Criminal Justice Commission’s operations manager.38

In 2011, the Youth Authority began developing the Program Evaluation Continuum (PEC) model to more comprehensively measure the success of all programs, including both evidence-based interventions and those lacking rigorous evaluation. The agency’s internal analysis found that the programs deemed evidence-based by the Correctional Program Checklist hadn’t achieved significantly lower recidivism rates than other programs, due in part to inconsistent implementation and a failure to properly screen individuals and match them to the most effective intervention based on their needs.39 To address this concern, the agency created the Youth Reformation System tool to help forecast the needs of incoming youth, predict which treatment would be most successful for each and match the person to that program, and determine program effectiveness based on short- and long-term outcomes, as measured by the PEC model.40 The Department of Corrections is adopting a similar program evaluation model and service-matching tool for adult offenders.41 Implementation of these new systems is ongoing, and they are seen as an important next step in advancing evidence-based policymaking in the state.

“Promoting the use of evidence-based programs through legislation was a great start, but once we developed a strong research unit, we wanted to quantify program effectiveness to answer the question of what actually happened with evidence-based programs,” said Shannon Myrick, strategic initiatives manager for the Oregon Youth Authority. Myrick said the service-matching tool has led to improved outcomes and given the agency a better sense of the evidence base in Oregon.

The Oregon Criminal Justice Commission has also built upon the legislation with related efforts, including the Justice Reinvestment Initiative—an approach that focuses on reinvesting dollars saved from lower incarceration in effective community programming42—and cost-benefit analyses. Justice reinvestment has been closely linked to evidence-based policymaking, with much of the reinvestment funding going toward programs proved to reduce recidivism and 3 percent set aside to research program outcomes, Egbert said. The commission has also developed—in partnership with the Pew-MacArthur Results First Initiative—a robust cost-benefit analysis to examine the effects of programs and plans to partner with the Department of Corrections to further this work. “S.B. 267 spurred our investment in evidence-based programs and led us to pursue related initiatives such as justice reinvestment and cost-benefit analyses,” said Egbert. “We created a culture in Oregon of investing in evidence-based programming, and we see all of these initiatives as being linked together and supporting one another.”

Tennessee

Legislation: Tennessee Code Section 37-5-121, enacted in 2007.

Policy areas: juvenile justice (Department of Children’s Services).

Key requirements: An increasing percentage of programmatic funding must be spent on evidence-based programs, reaching 100 percent by fiscal 2012-13 and remaining there each year thereafter.

In 2007, Tennessee enacted legislation codified in Section 37-5-121 to ensure that state juvenile justice programs were supported by strong evidence and designed to meet the specific needs of the population in each community. The Legislature also sought more effective program monitoring.43 “The Legislature had a sincere desire to be good conservators of state dollars and for the state to provide the very best services for youth that had research behind them,” said Debbie Miller, deputy commissioner for juvenile justice.44

The law mandated that the Department of Children’s Services target an increasing percentage of program funding to evidence-based juvenile justice programs. To “prevent undue disturbance to existing department programs,” the percentages were phased in: 25 percent of funding by fiscal 2009-10, 50 percent by 2010-11, 75 percent by 2011-12, and 100 percent by 2012-13.45 The law also established tiered definitions for rating program effectiveness—including “evidence-based,” the highest tier; “research-based”; “theory-based”; and “pilot program”—and stipulated that the department require evidence-based services in all provider contracts, that providers use monitoring and quality control procedures, and that the department complete a baseline assessment of its existing programs and report those findings by Jan. 1, 2009.46

Implementation of the law

Children’s Services determined which of its programs were evidence-based and realized, after an initial scan, that some of its providers were using homegrown programs rather than proven “brand names.” These typically lacked important elements, such as a practice model for implementing the program or tools for monitoring fidelity.47 Seeking to keep these providers while implementing the law’s requirements, the department decided to use the Standardized Program Evaluation Protocol—a rating scheme that measures how closely juvenile justice programs align with research on programs shown to reduce recidivism—in partnership with Vanderbilt University and professor Mark Lipsey.48 Vanderbilt researchers surveyed contracted providers and conducted site visits to assess whether the services being delivered aligned with research on reducing recidivism.

The initial results, which were published in a 2008 baseline assessment report to the governor and General Assembly, were promising.49 The report analyzed 30 provider organizations in 80 locations and concluded that approximately 94 percent of the programs included components associated with decreases in youth recidivism.50 The report also noted that none of the practices then in use produced negative outcomes.

However, several factors delayed further implementation and compliance with the law. Compiling the additional program data necessary for analysis using the evaluation protocol tool was an ongoing challenge due to capacity and technological constraints, particularly for homegrown programs. “A number of things had to be in place in order to score each provider using the tool, and along the way getting good data on the levels of risk and needs for the population we are serving was difficult,” said Elvie Newcomb, special projects manager for the Office of Juvenile Justice.51 Furthermore, while Children’s Services generally complied with the law and placed evidence-based program requirements into provider contracts, there were communication challenges and not all providers consistently reported the quantity and quality of services provided to each youth they served.

These challenges were exacerbated by budget cuts in 2012 and ongoing staff turnover. As Children’s Services shifted priorities to address these issues, it made little progress in implementing the law for several years. “The department had major budget issues and lost key staff who had conducted the initial program survey, planning, and evaluation design phases, contributing to the delay in the law’s implementation,” said Newcomb.52 In 2014, a state comptroller audit noted that although the department had made progress in implementing the law, it had not yet attained full compliance and there had been gaps and inconsistencies in implementation.53

Recent developments

Since that performance audit, the department has placed new emphasis on implementing the law and ensuring oversight of the delivery of evidence-based programs. It upgraded its case management system to enable providers to record data on the specific services provided to youth. This data is sent to researchers at Vanderbilt University for further analysis to determine whether the services being provided match the levels that research has found effective.

The enhanced data collection will enable the department to document each time an evidence-based service or program is provided to an individual and to monitor the fidelity of the services being delivered. Newcomb said the automation improves upon past data collection efforts in which providers entered information in a spreadsheet. “The law has served as a catalyst for our department to collect and analyze data that demonstrates the use of evidence-based approaches in the treatment of our delinquent youth in residential facilities. But in order to accomplish this, all of the needed data elements have to be in place. It’s been a learning process for us, but we are very close to having a good program evaluation system in place, and it’s exciting,” said Newcomb.54

To assess the quality of services, the department’s program accountability review team began collecting the data required for the Standardized Program Evaluation Protocol tool in January 2016. A random sample of youth files from provider organizations is being reviewed and compared with reported data on the length of time spent in treatment. The review will also assess program protocols and processes to ensure that programs are implemented with fidelity. The agency will share scoring information with providers and discuss how their programs fit into the spectrum of effectiveness and how best to improve their scores.

Although getting provider support has been challenging at times, Children’s Services reports that it has been helpful to include evidence-based program requirements in provider contracts. “With the law in place, providers are now aware of this requirement from the beginning; we ask them upfront about what evidence-based programs they provide and follow up with them to ensure the programs are implemented with fidelity,” said Newcomb.55

Overall, stakeholders from Tennessee are optimistic about their recent progress in implementing the law. Newcomb said she hopes the evaluation tool will help determine whether homegrown interventions are as effective as “brand name” interventions in reducing recidivism. “It’s best practice and just good business to know that you are getting effective programs for your money,” she said.

Conclusion

Although the three states profiled here were at different starting points when implementing evidence-based program mandates, each reported that its law has helped drive greater use of evidence-based policymaking. Key impacts of the laws have included spurring dialogue about and awareness of evidence-based programs, generating baseline information on current services and the level of evidence that exists on these programs’ effectiveness, creating new data systems to monitor ongoing implementation and measure outcomes, and prioritizing evidence-based programs when making funding decisions.

Although the laws were generally prescriptive, the states were flexible in interpreting and applying them, helping to encourage the affected agencies to act while allowing them leeway to make adjustments when necessary. All three states also have moved beyond the initial intent of their legislation—identifying and investing in evidence-based programs—and are grappling with related issues. For example, certain agencies in Washington are concerned with overprescribing evidence-based programs and are looking at ways of filling service gaps where there is limited evidence on what programs are effective. The Oregon Youth Authority is implementing a research-based tool to help match youth to the correct evidence-based program (which research shows is critical to achieving predicted outcomes) and has employed a model that can evaluate and compare providers based on the results they achieve. And Tennessee has created a system for monitoring program providers to ensure that the programs they operate are delivered with fidelity to their research-based models.

Institutionalizing evidence-based policymaking through state law is one of several strategies states can use to invest in programs that yield the greatest impact for limited taxpayer dollars. In the states profiled here, this strategy has produced measurable results. “Other states should absolutely do this,” said Cory Redman of the Rehabilitation Administration. “You are talking about spending state tax dollars, so you want to make sure they are spending that money on things that are proven to work.”

Endnotes

  1. Other strategies states have used include making internal policy changes or changing budget or contracting processes. See The Pew Charitable Trusts, “Evidence-Based Budget Development” (July 2016), http://www.pewtrusts.org/en/research-and-analysis/issue-briefs/2016/07/a-guide-to-evidence-based-budget-development.
  2. Pew-MacArthur Results First Initiative, “Legislating Evidence-Based Policymaking” (March 2015), http://www.pewtrusts.org/~/media/assets/2015/03/legislationresultsfirstbriefmarch2015.pdf?la=en.
  3. Pew-MacArthur Results First Initiative, “Implementation Oversight for Evidence-Based Programs” (May 2016), http://www.pewtrusts.org/~/media/assets/2016/05/rf_programimplementationbrief.pdf.
  4. Elvie Newcomb (special projects manager, Office of Juvenile Justice, Tennessee Department of Children’s Services), interview by the Pew-MacArthur Results First Initiative, April 28, 2016.
  5. Eric Trupin and Suzanne Kerns, “Introduction to the Special Issue: Legislation Related to Children’s Evidence-Based Practice” (June 2015), Administration and Policy in Mental Health and Mental Health Services Research, http://link.springer.com/article/10.1007/s10488-015-0666-5.
  6. Washington Rev. Code § 13.40.530, “Community Juvenile Accountability Programs—Effectiveness Standards” (1997), http://app.leg.wa.gov/rcw/default.aspx?cite=13.40.530.
  7. Ibid.
  8. As a representative, Mary Lou Dickerson was also the prime sponsor of other legislation supporting evidence-based practices, including a 2007 bill (H.B. 1088) calling for the increased use of evidence in funding children’s mental health services.
  9. Gabrielle D’Angelo, Michael D. Pullmann, and Aaron R. Lyon, “Community Engagement Strategies for Implementation of a Policy Supporting Evidence-Based Practices: A Case Study of Washington State” (June 2015), Administration and Policy in Mental Health and Mental Health Services Research, http://link.springer.com/article/10.1007/s10488-015-0664-7.
  10. Washington State Legislature, “Engrossed Second Substitute House Bill 2536” (June 2012), http://apps.leg.wa.gov/documents/billdocs/2011-12/Pdf/Bills/Session Laws/House/2536-S2.SL.pdf.
  11. Gregory Endler (former program administrator, Children and Youth Behavioral Health Unit, Washington Behavioral Health Administration), interview by Pew-MacArthur Results First Initiative, March 31, 2016.
  12. Ibid.
  13. Washington State Department of Social and Health Services, “Report to the Legislature: Evidence-Based and Research-Based Practices Updates and Recommendations” (December 2014), 7, http://app.leg.wa.gov/ReportsToTheLegislature/Home/GetPDF?fileName=Evidence-Based and Research-Based Practices E2HB2536_80910208-1bee-4d86-9d99-b3f2a1906e2a.pdf.
  14. Endler interview.
  15. Washington State Department of Social and Health Services, “Report to the Legislature,” 10–11.
  16. Cory Redman (acting director, Office of Juvenile Justice, Washington Rehabilitation Administration), interview by the Pew-MacArthur Results First Initiative, April 26, 2016.
  17. Ibid.
  18. Washington State Department of Social and Health Services, “Report to the Legislature,” 21–22.
  19. Tim Kelly (program manager, family preservation, Washington Children’s Administration), interview by the Pew-MacArthur Results First Initiative, April 20, 2016.
  20. Ibid.
  21. Ibid.
  22. Ibid.
  23. Ibid.
  24. Children’s Administration, “Family Preservation Resource Library,” accessed April 29, 2016, https://www.dshs.wa.gov/ca/contracted-providers/family-preservation-library.
  25. Washington State Department of Social and Health Services, “Report to the Legislature,” 22–24.
  26. Lin Payton (program manager, mental health, Washington Health Care Authority), interview by the Pew-MacArthur Results First Initiative, April 20, 2016.
  27. Elizabeth Craig, Jeremiah Stromberg, and Heidi Steward (Oregon Department of Corrections), interview by the Pew-MacArthur Results First Initiative, April 29, 2016.
  28. Oregon Health Authority, “Senate Bill 267 Summary,” accessed April 20, 2016, http://www.oregon.gov/oha/amh/ebp/Summary of ORS 182.525 (SB 267).pdf.
  29. Ibid.
  30. Ibid.
  31. Craig, Stromberg, and Steward interview.
  32. Ibid.
  33. Ibid.
  34. Oregon Youth Authority, “Interim Judiciary Committee Progress Report on SB 267 (ORS 182.525)” (September 2014), https://www.oregon.gov/oya/docs/SB 267_2014 OYA.PDF; Linda Hammond, interim director, Oregon Health Authority, letter to lawmakers, “ORS 182.525 Evidence Based Programs Report—Summary,” Sept. 28, 2012, https://digital.osl.state.or.us/islandora/object/osl:79765.
  35. Craig, Stromberg, and Steward interview.
  36. Oregon Department of Corrections, “2010 ODOC Report on Senate Bill 267 Compliance” (September 2010), http://www.oregon.gov/doc/CC/docs/pdf/2010_agency_report_on_senate_bill267_compliance.pdf.
  37. Jon Collins and Karen Wheeler (Oregon Health Authority), interview by the Pew-MacArthur Results First Initiative, May 18, 2016.
  38. Paul Egbert (operations manager, Oregon Criminal Justice Commission), interview by the Pew-MacArthur Results First Initiative, Dec. 17, 2015.
  39. Shannon Myrick and Paul Bellatty (Oregon Youth Authority), interview by the Pew-MacArthur Results First Initiative, Jan. 6, 2016.
  40. Oregon Youth Authority, “Interim Judiciary Committee Progress Report on SB 267.”
  41. Oregon Department of Corrections, “Interim Judiciary Committee Progress Report on SB 267 (ORS 182.525)” (2014), 3–4, http://library.state.or.us/repository/2014/201412011552541/2014.pdf.
  42. See also Bureau of Justice Assistance, “What is JRI?” https://www.bja.gov/programs/justicereinvestment/what_is_jri.html.
  43. Elvie Newcomb, Debbie Miller, and Michelle Hamblin (Tennessee Department of Children’s Services), interview by the Pew-MacArthur Results First Initiative, Jan. 20, 2016.
  44. Ibid.
  45. Tennessee Code, “37-5-121: Pilot Programs—Evidence-Based Programs for the Prevention, Treatment, or Care of Delinquent Juveniles” (Justia 2010), http://law.justia.com/codes/tennessee/2010/title-37/chapter-5/part-1/37-5-121.
  46. Ibid.
  47. Newcomb, Miller, and Hamblin interview.
  48. Vanderbilt University, “About PRI and SPEP,” accessed Jan. 20, 2016, https://my.vanderbilt.edu/spep.
  49. Tennessee Department of Children’s Services, “Progress Toward Evidence-Based Practices in DCS Funded Juvenile Justice Programs” (December 2008).
  50. Ibid.
  51. Newcomb, Miller, and Hamblin interview.
  52. Ibid.
  53. Tennessee Comptroller of the Treasury, “Performance Audit: Department of Children’s Services” (January 2014), 69–71, http://www.comptroller.tn.gov/repository/SA/pa12104.pdf.
  54. Newcomb, Miller, and Hamblin interview.
  55. Ibid.