Colorado’s ‘Evidence Continuum’ Promotes Efficient, Effective Public Programs

States could benefit from research standards that support program advancement in their own budget processes

An illustration of the five steps: program design, identify outputs, assess outcomes, attain initial evidence, and attain causal evidence.


Colorado is a leader among states using data and research to strengthen their budget investment choices. Since 2014, The Pew Charitable Trusts’ Results First initiative has partnered with Colorado to help the state apply research evidence to fund programs that work. To be sure that the state invests taxpayer funds in projects that are effective, the governor’s Office of State Planning and Budgeting (OSPB) requires evidence in an agency’s budget requests for changes to program funds, including expansions, reductions, and new projects.1 Colorado’s legislature passed a bipartisan bill in June 2021 that requires agencies and the OSPB to use consistent definitions of evidence-based programs in budget requests.2

By defining categories of evidence in budget guidelines, states create a common language to clarify when a program is backed by research showing impact, or when it lacks studies showing effects.3 While other states have set such language in budget plans to define evidence-based programs, Colorado has created a distinct framework called the evidence continuum that has evolved over the years to encourage innovation and expand the use of evaluation to understand whether programs work. With the continuum, the OSPB has increased the number of budget recommendations supported by research. The continuum helps agencies build in program research to ensure efficacy, which the OSPB supports with grants for evaluation, and to encourage continuous improvement. During the economic downturn caused by the COVID-19 pandemic, the evidence continuum also helped to inform budget plans to preserve or reduce program funds.

This brief provides a detailed look at the characteristics of Colorado’s evidence continuum, its use in budget planning, the collaboration across government branches and with nongovernmental stakeholders who developed the continuum, and ways in which the continuum continues to evolve to meet stakeholder needs. To study the evidence continuum, researchers at Pew interviewed 13 individuals involved in the creation and use of the framework and analyzed the content of 30 reports, government documents, legislative testimonies, department presentations, and email communications with officials and nongovernmental stakeholders.4 Policymakers and government staff members in states aiming to develop evidence criteria that strengthen budget planning can learn from this brief how Colorado implemented and sustained standards that support innovation.

The evidence continuum sets standards for building evidence to assess whether programs work

Defining levels of evidence is a step that other states have taken to incorporate data and research into their budget planning. Colorado’s evidence continuum stands out as an example of step-by-step criteria that both enable agencies to adopt programs and services guided by rigorous research and help the executive budget office determine funding priorities it can recommend to the governor and the legislature.

Colorado’s evidence continuum serves as a guide for determining whether programs have rigorous research that can show they are achieving their intended outcomes and how they could be improved if not.6 Programs that are studied with methods consistent with scientific standards can give legislators assurance that money given to state agencies supports effective programs. “We’re trying to create a methodical way to look at budget submissions. We want to see the best available evidence, and your chance of success goes way up if you’ve made that case. Over time, it will pay big dividends to Colorado,” said Senator Chris Hansen (D), a member of the Joint Budget Committee, the legislative body that reviews executive agency budget change requests.7

In 2016, the OSPB, which prepares the annual state budget, began requiring agencies to provide in their budget requests for new programs, and for expansion of existing ones, the research showing the expected effects on outcomes.8 Ann Renaud Avila, formerly director of research and evidence-based policy initiatives at the OSPB, helped create the evidence continuum. She said its purpose “was to help the legislature and governor’s office get a handle on when departments say they have a proven or an evidence-based program.… It was meant to be a way for them to have consistency when they were looking at different programs or initiatives.”9

Yet many of the state’s services and operations had not yet been evaluated to show their impact.10 For example, the OSPB found in its 2017 assessment of Colorado’s 108 programs to prevent substance use and mental health disorders that 69 programs had no studies to determine their effectiveness.11 To address this gap, the OSPB in 2018 worked with legislators and a nonpartisan group of professionals from universities, nonprofit organizations, private companies, and government agencies called the Colorado Evidence-Based Policy Collaborative. (See the list of stakeholders and their roles in Appendix A.) Together, they developed evaluation benchmarks for agencies to show when programs have research demonstrating they could yield the intended outcomes, or what additional support and evaluation they needed.12

The continuum’s five ascending steps represent the stages of building and assessing program information. (See Figure 1.) The first step is Program Design, which requires a logic model showing how program activities should cause the desired changes. At the top of the staircase is the fifth step—Attain Causal Evidence—which could include randomized controlled trials (RCTs), experiments in which researchers randomly assign participants to a treatment group that receives the program or a control group that does not.

The Colorado Evidence Continuum Provides a Path to Measuring Impact: Following these 5 steps helps programs to verify effectiveness. STEP 1: Program Design, Create Theory of Change/Gather Evidence. STEP 2: Identify Outputs, Performance Measures (Outputs). STEP 3: Assess Outcomes, Performance Measures (Outcomes). STEP 4: Attain Initial Evidence, Outcome Evaluation. STEP 5: Attain Causal Evidence, Rigorous Outcome Evaluation. Effective Implementation: Theory-Informed, Evidence-Informed, and Proven. Source: Colorado Governor’s Office of Planning and Budgeting, “FY 22-23 OSPB Budget Instructions Evidence Section (Updated)” (2021).

As programs move up the steps of the evidence continuum by using more advanced research designs, studies with higher levels of precision offer decision-makers greater confidence that program investments could lead to intended outcomes. These research levels range from “theory-informed,” denoting advice from program providers, experts, and user satisfaction surveys (at the lowest level); to “evidence-informed” in the middle, indicating preliminary evaluation with a before and after design; to “proven” at the highest level, requiring one RCT or two or more comparison studies with strict statistical controls.
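The relationship between the five steps and the three rigor bands can be sketched as a simple lookup. This is an illustrative encoding only: the names follow the brief and Figure 1, but the exact mapping of steps to bands is an assumption, and the function name `rate_program` is hypothetical.

```python
# Hypothetical encoding of Colorado's five-step evidence continuum.
# The step-to-band mapping below is an assumption for illustration;
# the brief names the bands but not precise step boundaries.

CONTINUUM = {
    1: ("Program Design", "theory-informed"),
    2: ("Identify Outputs", "theory-informed"),
    3: ("Assess Outcomes", "evidence-informed"),
    4: ("Attain Initial Evidence", "evidence-informed"),
    5: ("Attain Causal Evidence", "proven"),
}

def rate_program(level: int) -> str:
    """Return a readable label for a program's continuum level."""
    if level not in CONTINUUM:
        raise ValueError("continuum levels run from 1 to 5")
    step, band = CONTINUUM[level]
    return f"Level {level} ({step}): {band}"
```

For example, `rate_program(5)` labels a program with RCT-backed results as “proven,” while levels 1 and 2 signal that a program is still articulating its theory of change.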

Program evidence requirements in agency budget requests help demonstrate that state programs can deliver intended outcomes

The OSPB uses the evidence continuum when considering agency budget requests that propose changes to program funding amounts. (See Figure 2.) A core staff of four analysts in the OSPB’s evidence-based policy unit provides guidance to agencies on preparing their budget requests. All OSPB staff analysts are involved in reviewing the research summaries that the agencies provide and deciding at which level of the evidence continuum to place a program in a budget request. The OSPB considers the evidence rating information in final budget decisions and uses the evidence continuum information in presenting the governor’s budget to the legislature.13

Colorado Uses the Evidence Continuum in the Budget Cycle: OSPB integrates program research in planning, reporting to the governor, and presenting to the legislature. Agency Management, OSPB Review, Governor’s Briefing, Budget Submission, Joint Budget Committee. Source: Colorado Governor’s Office of Planning and Budgeting, “FY 22-23 OSPB Budget Instructions Evidence Section (Updated)” (2021).

When submitting their budget requests to the OSPB, executive branch agencies assign each program a level on the evidence continuum in their budget request form. The OSPB analysts review that information for each program’s evidence level and confirm the agency’s rating. If the OSPB disagrees, analysts work with the agencies to make changes and help them understand the reason for the discrepancy. The OSPB and the agencies then use the evidence rating details when briefing the governor and the legislature to demonstrate how well programs have achieved their outcomes.14

Generally, the OSPB receives requests for program budget expansions and new program proposals. In the governor’s fiscal year 2020-21 budget request, the OSPB included evidence ratings on the continuum for 46 existing or newly proposed programs. For example, the Department of Education requested $27.6 million to expand the delivery of the Colorado Preschool Program to a greater number of at-risk 3- and 4-year-old children. The OSPB gave it Level 5, the highest rating on the evidence continuum, based on research from multiple experimental studies and analysis of state data showing children meeting or exceeding learning expectations after attending the preschool program.15

Less common are proposals to reduce program funding or eliminate services based on the evidence. The OSPB instructs agencies to propose decreased budgets where appropriate, reflecting the level of program effectiveness shown in research and performance management data. The OSPB looks to the evidence in program budget proposals to inform how agencies plan to allocate limited resources when facing fiscal shortfalls.16

When the COVID-19 pandemic shocked the economy in 2020 and forced agencies to propose budget reductions, the OSPB used the evidence continuum to inform decisions on preserving or expanding funds as well as reducing them in response to the economic downturn. The evidence continuum brought a perspective on how to prioritize those cuts. For example, the Colorado Office of Children, Youth and Families (CYF) proposed reductions to programs that were rated lower on the evidence continuum while maintaining budgets for statutorily required services for child and family safety.17 OSPB Director Lauren Larson said: “In that budget cycle, where we were doing major cuts, [the evidence continuum] definitely came into play in discussions. We’re proposing to not increase this item or even to reduce it some because it’s just not proven to be working. But things that were absolutely working we wanted to, especially in areas where the economy was weak, we put more money into.” She continued: “So, in that budget season, programs that received increases or that were not reduced rated higher overall on the evidence continuum than those that did not.”18

The legislative Joint Budget Committee (JBC), which monitors government operations and prepares budget recommendations for the General Assembly, uses the OSPB’s program ratings on the evidence continuum along with other information sources to decide on the governor’s recommended budget requests. The committee’s legislative staff reviews the governor’s budget program items and explains to the committee members what the evidence level means in each request, and whether the program results in improved outcomes.19 “This is how we can look at the work that these departments do in the context of available research and how we could improve the evaluation of their programs,” said Robin Smart, the JBC’s principal legislative budget and policy analyst.20

Along with interpreting the amount of research and its degree of rigor reported in the evidence continuum, the JBC looks for facts on how proposed programs improved outcomes and results from cost-benefit analyses.

The OSPB submits budget requests to the JBC for programs rated at Level 1 or 2 (starting points for assessing programs) to support testing and building up programs before they are funded for statewide deployment. For example, in the governor’s fiscal 2020-21 budget request, the Department of Personnel and Administration requested $400,000 to install a vehicle telematics program.21 The OSPB rated it at Level 2 on the evidence continuum. The data collection system enables the evaluation of the state fleet, which would raise the program’s level on the evidence continuum.22 The OSPB encourages departments to prioritize programs with robust research (outcome evaluations shown in Levels 4 and 5) indicating favorable results and to gather more evidence on programs with uncertain or unknown effects.23

Mollie Bradlee, deputy director of CYF at the Colorado Department of Human Services, said that including research findings about programs in budget requests fits into the agency’s system of monitoring child welfare outcomes and using data to plan services. “From our perspective, when we are making decisions, this is kind of an initial roster to understand the level of maturity of our programs and how much we know about their efficacy,” said Bradlee. “The way we use [the evidence continuum] is not to say step one is something that is not working. Step one is something that is very early in implementation of performance management or evaluation. At step five are the things that we are seeking to replicate and to grow.”24

The OSPB reported an increase in recent years in budget recommendations for programs with higher levels of evidence. According to the OSPB, the fiscal 2021-22 budget included 53 program requests (33% of all requests) that contained a rating along the evidence continuum, an increase from the prior year. Programs for agencies that asked for increased spending averaged a Level 3 on the continuum—a high standard that, according to the OSPB definition, has shown preliminary benefits from outcome assessments. Programs for agencies that requested a decrease in funding averaged a Level 2, indicating the programs’ effects had yet to be determined.25 “[This] tells us that agencies are asking for more money for programs for which we have a greater confidence in their level of evaluation. And agencies are asking for reduced funding for programs that we have less confidence in by their level of evaluation. And we think that that’s one appropriate use of the continuum,” said Aaron Ray, former deputy director for policy at the OSPB.26

Cross-branch collaboration to develop evidence-based definitions helped the executive and legislative branches build common expectations about the state’s investments in effective programs while refining those standards and requirements over time in response to changing needs. The OSPB and the JBC prioritize research and data in the budget process, but the JBC had used different criteria to define levels of evidence to inform decisions on funding requests. The OSPB and JBC recognized the need to align their rankings and found ways to do so through dialogue over the past few years. A bipartisan bill proposed by JBC members and enacted in June 2021 established definitions for different levels of evidence to be used in analyzing programs and requires state agencies and the OSPB to use consistent definitions when describing evidence-based programs in budget requests. The evidence level definitions align with the steps in the evidence continuum. The act also appropriates funding for a JBC staff analyst to assist legislators in understanding the findings from reviews of evidence.27

Senators Hansen and Bob Rankin (R), who proposed the bill, said they want to ensure that the JBC can give the same level of analysis to evidence as the OSPB does. Sen. Hansen said, “We think that creates the right feedback loop to the executive branch when they’re proposing new programs or their spending changes.”28

Agencies use the evidence continuum to plan evaluations

Colorado’s evidence continuum also plays a role in the awarding of funds for outcome evaluation and technical assistance to improve programs. Colorado is one of only three states with a dedicated revenue source for program evaluation grants. Since 2017, the General Assembly has appropriated $500,000 annually from the Marijuana Tax Cash Fund to the OSPB for program evaluation and implementation enhancement. Departments apply for these funds and include their programs’ rating on the evidence continuum in their request.29 Collaborative member David Anderson, vice president of evidence-based policy at Arnold Ventures, pointed out the importance of this incentive. “The idea behind [the continuum] is aspirational: You may be at a one right now on the rating system, but what is your plan to move up that continuum? And if you have a thoughtful path for doing that, that should also be rewarded.”30

In the first year the funds were available (2017), the OSPB received 13 proposals from six departments and selected five projects after rigorous review. All but one project received funding for multiple years so that each evaluation could be completed without interruption, with the OSPB granting a total of $1.7 million over that period. In 2020, the OSPB granted $250,000 to four new evaluations. Programs receiving evaluation funds include the Marijuana Impaired Driving Program, School Counselor Corps Grant Program, and Telemedicine Practices on Access to Care. The OSPB expects the funds to increase the evidence base for programs operating in Colorado.

Programs use the funds to ensure that their implementation follows the research the original program was based on to increase the chance of replicating the original program’s results. Those program evaluations can also inform program modifications to meet the needs of Colorado’s diverse populations and settings, yet remain consistent with the program’s original design.31

The CYF used the evidence continuum to inform investments in evaluation that aim to improve program effectiveness. Bradlee gave the example of Multi-Systemic Therapy (MST), a program designed to reduce child substance use and delinquency, which has multiple experimental studies showing success; Bradlee wants to confirm its effectiveness throughout Colorado. “[MST] was an example of us taking a program that we already knew existed in Denver, already existed in Colorado Springs, existed where there were resources and where therapists could easily access homes, where there was lots of broadband, etc., and then you try and drop it in a region like Park County, which does not have the same sorts of resources. Therapists have to drive longer. It’s more difficult to access families,” she said. “And then we used that as an opportunity, really, to learn about how we implement something that’s a step five [on the evidence continuum] in rural areas specifically.” The office plans to use information gleaned to better understand how to invest in and implement MST services in rural and underserved communities.32

The evidence continuum’s long-term success relies on leadership, dedicated staff, and collaboration with experts

Ensuring that the evidence continuum is officially incorporated into Colorado’s budget and planning processes takes time, according to many stakeholders inside and outside government. Bradlee said the OSPB embeds the evidence review in the budget instructions so that over time, the agency staff will become used to it and come to expect it. “In one or two cycles, folks aren’t going to immediately jump on board and be ready and be thinking about this all the time. It is a process,” Bradlee said. “So I think the most we can do is just institutionalize, institutionalize, institutionalize.”33

Integrating the evidence continuum into existing organizational cultures requires resources in the form of dedicated staff, training personnel, and hours of collaboration with stakeholders. For example, the OSPB has a team that coordinates the use of research evidence in budget development and management of evaluation grants to agencies. That staff provides training to state agencies in assessing evidence using the continuum, conducts literature reviews to verify the evidence in the budget proposals, and selects successful evaluation grant applications.34 To help agencies determine their programs’ levels on the evidence continuum, the OSPB also added a flow-chart diagram to the latest budget instructions (for fiscal 2022-23). (See Figure 4.) Bradlee described how during the last budget cycle (preparing for fiscal 2021-22), the OSPB supported the Department of Human Services in assessing evidence and preparing evaluation fund requests. “There were a couple things that we put forward where we weren’t quite sure how to identify something on the continuum, and so our analyst went back and forth with us to try and figure out where it landed.”35

Tiffany Madrid, director of legislative affairs and policy for the Child Protection Ombudsman of Colorado, who previously worked in the OSPB, recalled assessing the program evidence that agencies provided in their budget requests. “If we noticed there were gaps, then we would call up an agency. We would get program folks on the phone or meet in person and have a conversation really driving the message home that we want to make sure that they’re well supported,” said Madrid. Sometimes the departments did not have the research citations for the evidence. The OSPB staff would find the references, “because ultimately, these agencies’ requests are a reflection of the governor’s final budget package,” said Madrid. “So we really had the onus to make sure that the research that we were passing along to the legislature really reflected the best available research and was rigorous.”36

When the OSPB began asking agencies to rate their programs on the evidence continuum, staff members worked to build trust with departments to dispel the fear of being audited.37 Adrienne Russman, former senior policy adviser to then-Governor John Hickenlooper, said the OSPB staff helped agencies understand that committing to evidence use was intended to improve, not threaten, their work: “There’s a lot of hand holding and talking to different stakeholder groups about what is it that you want, what is it that you want to see,” Russman said.38

With respect to the legislature, Russman, a member of the collaborative, has observed a shift away from confusion about evidence-based policymaking in 2016 to legislators now asking for levels of evidence along with what research says about program effectiveness. “There’s discussion of evidence for particular programs, and that’s been a big change, quite frankly,” she said.39 Sen. Rankin similarly remarked on the change: “We are less susceptible to lobbying, you know. … How much are we influenced by lobbyists as opposed to data?”40 His committee colleague, Sen. Hansen, said asking for research evidence increased credibility: “Our colleagues trust us over time. They see repeated decision-making. They see the approach that Bob [Rankin] takes to the budget, that I take to the budget, our colleagues on the JBC take, and you build up that trust even if they don’t have as in-depth understanding of double-blind evidence-based data,” said Hansen.41

OSPB Instructs Colorado Agencies How to Place Their Programs in Levels on the Evidence Continuum: The budget office assists departments in determining evidence levels

External stakeholders provide crucial guidance on using the evidence continuum

Colorado’s experience also shows how collaboration with external stakeholders improved the evidence standards and sustained the continuum. The Colorado Evidence-Based Policy Collaborative played a key role in ensuring the evidence continuum’s routine application across state agencies. Scholars from universities and program implementation experts provide training to legislators and program staff in agencies to help them understand scientifically sound research and how to use it.42 Nongovernmental and university stakeholder groups, including collaborative members such as the Colorado Evaluation and Action Lab at the University of Denver, continue to partner with the OSPB to evaluate agencies’ programs.43

Members of the collaborative lent expertise to shaping the evidence continuum into a learning process for programs to realize effective outcomes.44 The application of the evidence continuum has shifted over time from building confidence in the evidence that programs work to determining how to make the programs better.45 (See timeline of changes to the evidence continuum in the text box below.) Collaborative member Cindy Eby, CEO of ResultsLab, proposed an update to the evidence continuum to recognize that programs in early stages of development have different assessment needs, emphasizing continuous improvement as a building block for effective programs.46 That model added steps to indicate the need for strong program theory, and its testing, to inform improvements when deficiencies are found.47 In November 2018, the OSPB and the JBC adopted the revised model for the committee staff to use when reviewing budget requests. Collaborative members were available as a resource to answer the JBC members’ and staff members’ questions.48

The Evidence Continuum and Its Use Evolved Over Several Years

The OSPB made changes to evidence-level criteria to meet needs of legislators and nongovernmental stakeholders

2016: Under the administration of Governor John Hickenlooper, the OSPB began adding evidence requirements to its budget requests for new or expanded programs.

2018: The OSPB introduced an evidence continuum that emphasized levels of confidence in program effectiveness based on the status of scientific research. Later in the year, the JBC recommended guidelines for describing evidence-based programs in budget requests. The Evidence-Based Policy Collaborative proposed an expanded evidence continuum framework, which the OSPB and JBC adopted.

2019: Governor Jared Polis succeeded Gov. Hickenlooper and continued to emphasize data-driven policymaking. The new OSPB staff reorganized research and evaluation efforts but continued to use evidence utilization tools developed under the previous administration. The OSPB updated the evidence requirements in budget instructions to include the expanded evidence continuum and guidelines recommended by the JBC.

2020: The OSPB expanded budget request guidelines to require evidence ratings for all requests and asked agencies to provide inclusion, diversity, and equity information.

2021: The Legislature passed S.B. 21-284, which the governor signed, to require agencies and the OSPB to use consistent definitions for levels of evidence in budget requests. The evidence continuum aligns with the new law. The act added legislative staff for the JBC to review the OSPB’s program evidence summaries submitted in agency budget proposals.

Legislative leadership and advocacy by the collaborative’s experts helped continue the evidence continuum’s use after Gov. Hickenlooper’s administration ended in 2018. In January 2019, under Governor Jared Polis, the OSPB maintained the evidence continuum in budget planning guidelines and expanded its use in budget instructions to apply to all agency requests.

Historically, many states have struggled to sustain new evidence-based policymaking efforts after a major change in legislative leadership or gubernatorial administration.49 Colorado’s ability to continue use of the evidence continuum through a gubernatorial change reflects the state’s cross-branch commitment cultivated over several years involving external stakeholders, as well as leadership buy-in from the new administration. The law enacted in 2021 that requires agencies to use specific evidence definitions in budget requests will further sustain the use of the evidence continuum.50

The evidence continuum remains a work in progress aiming to meet stakeholders’ needs

Although changes to the evidence continuum address concerns that stakeholders have raised, some limitations remain. Findings from rigorous program research can be complicated and require expertise to interpret. Officials in the governor’s office and legislators need guidance from OSPB and JBC staff to understand how to properly use the summarized evidence rating information.51 Smart observed that the outcomes from the research on programs need to be clearly stated so that JBC members can make decisions quickly.52 She said the continuum does not clearly express all outcomes of a given initiative, along with their degree of effectiveness, which are critical in the budget process. Similarly, because of this complexity, determining the extent of research and drawing conclusions can be challenging for agency staff members preparing program budget requests. Some observers from outside the OSPB expressed concern about the accuracy of evidence levels indicated in program budget requests.53 Russman said: “Just because something has been evaluated in an RCT doesn’t make it evidence-based. Many RCTs show that the program being evaluated is either ineffective or harmful, which is an important distinction that was missing from some of the conversations.”54

The OSPB and JBC have found common ground to align continuum and statutory evidence-level definitions to identify programs capable of affecting outcomes and the kind of studies that would be needed to show effectiveness. This process involves ongoing discussion, meetings, and training with the help of the collaborative. The OSPB maintains a regular dialogue with the JBC to ensure understanding of the JBC’s needs. Likewise, the JBC relies on frequent communication with the OSPB to discern the criteria and content of program evidence presented in budget requests.55 In its fiscal 2022-23 budget instructions, the OSPB notes how the four evidence definitions in the new law align with the five research levels in the evidence continuum.56

Stakeholders also observed that, although the evidence levels indicate research rigor, they do not detail the effects of programs specific to Colorado.57 This is because many programs are studied in different locations around the United States and even other countries. Much of the information about program research is summarized in national program evidence clearinghouses, like those found in Pew’s Results First Clearinghouse Database. However, studies demonstrating the effects of well-tested programs elsewhere should be replicated in Colorado, in part because programs often need to adapt to cultural, ethnic, and regional context differences specific to a state.

According to agency staff and collaborative members, programs developed in Colorado might not be rated highly on the evidence continuum because they had not been evaluated, are more suited to qualitative than quantitative evaluation, or are in the formative stages of operating effectively.58 To address this challenge, the budget instructions state that the OSPB will not penalize requests for new or locally created programs that have not yet been evaluated and therefore have a low rating on the evidence continuum. But agencies should still provide data collection and evaluation plans for those programs.59

While acknowledging the limitation of the evidence continuum, the OSPB emphasizes continuous learning and improvement. “What we have said to agencies, and what we believe is, every program can seek to be more effective, and the first step in being more effective is evaluating your current effectiveness. And the continuum helps you do that if you’re a program,” Ray said. He emphasized that there’s no requirement for a program to be at a specific step on the continuum or to reach an arbitrary level of research. Data and research gathered at each step of the continuum are valuable for programs to inform changes that can improve services and outcomes.

A program may need to revisit earlier steps in the evidence continuum to make changes to be more effective. Assessments at the lower stages might even be routinely conducted for quality assurance, while the impact evaluations represented in the highest two stages might never be appropriate or possible for some programs.60

The CYF monitors its programs against the evidence continuum criteria, which helps the agency identify when to request funds to bolster program delivery with additional research. Bradlee described how the agency applies scientific studies to inform program choices: The office regularly partners with university researchers to pilot services in one county or in multiple locations around the state, and when findings show beneficial changes, the agency proposes funding the program statewide. Other programs, however, still lack rigorous study. “We also have a lot of programs that haven’t had a rigorous evaluation. So we have gone back in a couple of different instances for those programs that are maybe step two or step three, and we’ve actually asked for funding or contract changes to introduce an evaluation to learn more about it,” said Bradlee.61

In addition, the evidence continuum does not measure ethnic and racial equity, that is, the extent to which a program’s evidence supports reducing disparities between people of color and White people in outcomes related to health, education, and criminal justice. As Ray acknowledged, “It’s not something that we think the continuum is designed to do, but we’re working on other ways to address those considerations as part of the budget process.”62 Collaborative members stressed the importance of reducing ethnic and racial disparities through effective program design and implementation, including ensuring that programs serve a representative cross-section of the community’s demographics while conducting evaluations.63 Some collaborative members also observed that few programs led by and for non-White communities have research evidence at Levels 4 and 5, because such programs are often locally developed and many lack the investment needed to fund studies.64 Collaborative member Eby of ResultsLab said, “If we’re not investing in strong, locally developed programming with good potential at early stages on the continuum, there’s a chance we will continue to promote racial inequity by only focusing on programs that have been able to garner research funding. We need to be careful there.”65 Ray clarified that since 2020, the governor’s office has incorporated diversity, equity, and inclusion considerations into all its decision-making, from hiring practices to budget and policy choices.66 Although the evidence continuum does not measure equity, the OSPB directs its budget analysts to assess how departments consider equity in their budget requests.67

The OSPB also recognizes the need to communicate these details while maintaining brevity for legislative budget decision-makers. OSPB staff members emphasize to the JBC and agencies that the continuum is intended as a starting point for evidence-based policymaking. “It’s the beginning of the conversation. There’s certainly lots of granularity that has to be understood as you dig into what does the continuum actually mean for a given program,” said Ray.68 The OSPB’s leaders acknowledged the criticisms and said they will keep working to improve their presentation of program evidence levels.69 Further, the new law requires the JBC staff to independently analyze and describe the program evidence accompanying each budget request.70


Colorado’s evidence continuum serves as a framework within the budget planning process that encourages agencies to plan for program evaluation emphasizing innovation and continuous improvement. As an incentive, agencies can apply for grants from a dedicated state fund to pay for evaluations. Key to developing and sustaining the continuum is a collaborative effort involving the governor’s office, legislators, agency staff members, and experts from universities and nonprofit organizations; the governor’s budget staff worked with these stakeholders to develop common evidence definitions. The evidence continuum continues to evolve as a practical tool for funding services based on research and supporting program development through evaluation. Pew’s Results First initiative has demonstrated how incorporating research evidence into budget decisions has helped states make more cost-effective investments. States looking to build on this work can learn from Colorado by using evidence levels, engaging stakeholders across government, committing to fund programs with strong research, and investing in robust program evaluation.

Appendix A: Key Stakeholders Using The Evidence Continuum

Stakeholders from multiple government agencies and nongovernmental organizations contribute to and use the Evidence Continuum

Legislative branch:

Joint Budget Committee (JBC)—the committee of the Colorado General Assembly that drafts the annual appropriations bill for approval during the legislative session.

Executive branch:

Colorado Governor’s Office of State Planning and Budgeting (OSPB)—the governor’s office charged with developing revenue estimates, recommending the state budget, and analyzing operations of executive branch departments. The Research and Evidence division within OSPB promotes use of research, evidence, implementation science, and cost-benefit analysis in state decision-making processes.

State agencies—executive branch agencies that must provide research and data and indicate their programs’ evidence level in annual budget request documents to OSPB. This brief features the Office of Children, Youth and Families in the Colorado Department of Human Services. Some agencies, such as the Colorado Department of Public Health and Environment and the Colorado Department of Public Safety, have staff members who volunteer as members of the Evidence-Based Policy Collaborative.

Nongovernmental organizations:

Colorado Evidence-Based Policy Collaborative—a nonpartisan group of professionals from nonprofit, private, and public organizations in Colorado that advocates for and provides resources on research, evaluation, and program implementation in state and local government. Members include state agency representatives and nongovernmental organization representatives from:

Arnold Ventures, a philanthropy promoting evidence-based policymaking.

ARA Strategies, consultants on state policy and budget processes.

Blueprints for Healthy Youth Development, a clearinghouse of evidence-based programs.

Center for the Study and Prevention of Violence at the University of Colorado, a higher education research institute.

Colorado Evaluation and Action Lab at the University of Denver, a higher education research institute.

Colorado State University Prevention Research Center, a higher education research institute.

Invest in Kids, a nonprofit community services organization.

ResultsLab, a consulting firm focused on quality program design and effective use of data for informed decision-making and stronger outcomes.

Uncharted, a social impact accelerator that supports early-stage ventures tackling economic inequality in the United States.


  1. The Pew Charitable Trusts, “Colorado Dives Into Evidence-Based Policymaking” (2018); The Pew Charitable Trusts, “How States Engage in Evidence-Based Policymaking: A National Assessment” (2017).
  2. Colorado Senate Bill 21-284, Evidence-Based Evaluations for Budget (2021).
  3. The Pew Charitable Trusts, “A Common Language for Evidence-Based Programming” (2017).
  4. We used a case study method to understand Colorado’s evidence continuum, describe the process of using it, learn stakeholders’ perceptions of it, and explain the steps taken to cement the evidence continuum for use after a change in administration. Pew researchers conducted semi-structured interviews using a common set of questions to allow for comparison of responses. We developed a purposeful sample of 13 individuals to interview, prioritizing stakeholders involved in reviewing and approving budget proposals, submitting budget requests, and developing the evidence continuum. We used NVivo 12 software to code the transcripts and documents and compile summaries of each theme. QSR International Pty Ltd., “NVivo” (2020).
  5. The Pew Charitable Trusts, “A Common Language for Evidence-Based Programming.”
  6. Colorado Evidence-Based Policy Collaborative, Evidence Standards (Second Revised Version) (2018); R.J. Smart, Memorandum: Evidence-Based Policy as It Relates to the State of Colorado Budget Process, Colorado Legislature Joint Budget Committee (2021); Colorado Governor’s Office of Planning and Budgeting, FY 22-23 OSPB Budget Instructions Evidence Section (Updated) (2021).
  7. C. Hansen (senator, Colorado General Assembly), interview with The Pew Charitable Trusts, April 6, 2021.
  8. The Pew Charitable Trusts, “Colorado Dives Into Evidence-Based Policymaking.”
  9. A.R. Avila (consultant, ARA Strategies Inc.), interview with The Pew Charitable Trusts, March 26, 2021.
  10. Ibid.; D. Anderson (vice president of evidence-based policy, Arnold Ventures), interview with The Pew Charitable Trusts, March 30, 2021; Christina Biesel et al., Memorandum: Evidence-Based Policy, Colorado Legislature Joint Budget Committee (2017).
  11. Colorado Governor’s Office of Planning and Budgeting, “Colorado Results First Update, February 2017” (2017).
  12. Colorado Evidence-Based Policy Collaborative, Evidence Standards (Second Revised Version); Smart, Memorandum: Evidence-Based Policy as It Relates to the State of Colorado Budget Process; Colorado Governor’s Office of Planning and Budgeting, FY 22-23 OSPB Budget Instructions Evidence Section (Updated).
  13. Colorado Governor’s Office of Planning and Budgeting, FY 22-23 OSPB Budget Instructions Evidence Section (Updated); A. Ray (former deputy director for policy, Colorado Governor’s Office of State Planning and Budgeting), interview with The Pew Charitable Trusts, March 26, 2021.
  14. Colorado Governor’s Office of Planning and Budgeting, FY 22-23 OSPB Budget Instructions Evidence Section (Updated); Ray, interview.
  15. Colorado Governor’s Office of State Planning and Budgeting, FY 2021-22 Budget Attachment 3: Evidence-Based Policy (2020).
  16. The Pew Charitable Trusts, “How Public Officials Can Use Data and Evidence to Make Strategic Budget Cuts” (2020).
  17. M. Bradlee (deputy director of the Office of Children, Youth and Families, Colorado Department of Human Services), interview with The Pew Charitable Trusts, June 1, 2021.
  18. L. Larson (director, Colorado Governor’s Office of State Planning and Budgeting), interview with The Pew Charitable Trusts, April 23, 2021.
  19. Colorado Governor’s Office of Planning and Budgeting, FY 22-23 OSPB Budget Instructions Evidence Section (Updated); Smart, Memorandum: Evidence-Based Policy as It Relates to the State of Colorado Budget Process.
  20. R. Smart (Joint Budget Committee principal legislative budget and policy analyst, Colorado General Assembly), interview with The Pew Charitable Trusts, March 30, 2021.
  21. Telematics is an automated system to collect real-time data on fleet usage, location, maintenance, and other factors that allows the state to measure vehicle utilization.
  22. Colorado Governor’s Office of State Planning and Budgeting, FY 20-21 Governor’s Budget Attachment 3: Evidence-Based Policy (2019).
  23. Colorado Governor’s Office of State Planning and Budgeting, FY 2022-23 Submission Manual for Operational, Capital, and Information Technology Requests (2021).
  24. Bradlee, interview.
  25. Colorado Governor’s Office of State Planning and Budgeting, FY 2021-22 Budget Attachment 3.
  26. Ray, interview.
  27. Colorado Senate Bill 21-284, Evidence-Based Evaluations for Budget.
  28. Hansen, interview; B. Rankin (senator, Colorado General Assembly), interview with The Pew Charitable Trusts, April 6, 2021.
  29. Colorado Governor’s Office of Planning and Budgeting, FY 22-23 OSPB Budget Instructions Evidence Section (Updated); Colorado Legislature Joint Budget Committee, Joint Budget Committee Staff Briefing FY 2020-21 Evidence-Based Policy (2019).
  30. Anderson, interview.
  31. Colorado Governor’s Office of State Planning and Budgeting, “Implementation and Evaluation Grant, November 2020 Update” (2020).
  32. Bradlee, interview.
  33. Ibid.
  34. Colorado Governor’s Office of State Planning and Budgeting, “Using Data and Evidence in the FY 2022-23 Budget” (April 29, 2021); T. Madrid (director of legislative affairs and policy, Child Protection Ombudsman of Colorado), interview with The Pew Charitable Trusts, March 24, 2021.
  35. Bradlee, interview.
  36. Madrid, interview.
  37. A. Russman (director of strategy and insights, Uncharted), interview with The Pew Charitable Trusts, April 2, 2021; Anderson, interview; H. Sobanet (senior vice chancellor for administration and government relations/chief financial officer, Colorado State University System), interview with The Pew Charitable Trusts, April 13, 2021.
  38. Russman, interview.
  39. Ibid.
  40. Rankin, interview.
  41. Hansen, interview.
  42. Avila, interview; Ray, interview; Russman, interview.
  43. Colorado Governor’s Office of State Planning and Budgeting, “Research & Evidence,” accessed Sept. 13, 2021; Colorado Governor’s Office of State Planning and Budgeting, “Using Data and Evidence in the FY 2022-23 Budget.”
  44. Colorado Evidence-Based Policy Collaborative, Evidence Standards (Second Revised Version).
  45. Ray and Smart, personal telephone communication with The Pew Charitable Trusts, Feb. 9, 2021.
  46. C. Eby (CEO and founder, ResultsLab), interview with The Pew Charitable Trusts, May 7, 2021; Avila, interview; Russman, interview; R. Puttick and J. Ludlow, “Standards of Evidence: An Approach That Balances the Need for Evidence with Innovation” (Nesta, 2013).
  47. Colorado Evidence-Based Policy Collaborative, Evidence Standards (Second Revised Version).
  48. Avila, interview; Madrid, interview; Russman, interview.
  49. The Pew Charitable Trusts, “How States Engage in Evidence-Based Policymaking: A National Assessment.”
  50. Colorado Senate Bill 21-284, Evidence-Based Evaluations for Budget.
  51. Smart, Memorandum: Evidence-Based Policy as It Relates to the State of Colorado Budget Process; Smart, interview; Ray, interview; Rankin, interview; Hansen, interview.
  52. Smart, interview.
  53. Russman, interview; Anderson, interview; Smart, interview.
  54. Russman, interview.
  55. Ray, interview; Smart, interview; Rankin, interview.
  56. Colorado Governor’s Office of Planning and Budgeting, FY 22-23 OSPB Budget Instructions Evidence Section (Updated).
  57. Smart, Memorandum: Evidence-Based Policy as It Relates to the State of Colorado Budget Process; Colorado Governor’s Office of Planning and Budgeting, FY 22-23 OSPB Budget Instructions Evidence Section (Updated); Ray, interview; Smart, interview; Hansen, interview; Rankin, interview.
  58. Colorado Legislature Joint Budget Committee, Joint Budget Committee Staff Briefing FY 2020-21 Evidence-Based Policy; Madrid, interview; Bradlee, interview.
  59. Colorado Governor’s Office of State Planning and Budgeting, FY 2022-23 Submission Manual for Operational, Capital, and Information Technology Requests.
  60. Colorado Governor’s Office of Planning and Budgeting, FY 22-23 OSPB Budget Instructions Evidence Section (Updated); Ray, interview.
  61. Bradlee, interview.
  62. Ray, interview.
  63. Madrid, interview.
  64. Russman, interview; Eby, interview; Madrid, interview.
  65. Eby, interview.
  66. Ray, interview. In August 2020, Gov. Polis issued Executive Order D 2020 175, which directs the Department of Personnel and Administration to lead state action on equity, diversity, and inclusion for the state of Colorado.
  67. Colorado Governor’s Office of State Planning and Budgeting, FY 2022-23 Submission Manual for Operational, Capital, and Information Technology Requests.
  68. Ray, interview.
  69. Ibid.; Larson, interview.
  70. Colorado Senate Bill 21-284, Evidence-Based Evaluations for Budget.