Maryland Works With Government Agencies to Track Program Performance

State budget office and legislators consider outcomes in strategic planning

To improve the effectiveness of state programs, the Maryland Department of Budget and Management implemented a performance measurement and strategic planning system called Managing for Results (MFR) in 1999. The process has been tremendously useful in helping agencies measure performance, but the system remained largely unchanged since its creation, and policymakers felt it was due for an update to ensure that the indicators being tracked were still accurate and meaningful.

In 2017, the department began an effort to reinvigorate the MFR process by helping agencies identify better ways to measure impact and by streamlining the measures being reported.

“They had become really long documents over time with a lot of measures. So we worked with the agencies to pare them down and focus on outcomes,” said Carissa Ralbovsky, performance budgeting specialist with the budget department.

The initial law required each state agency to develop a strategic plan outlining its goals for the coming year, as well as specific outcome and efficiency measures with meaningful targets to track progress. The state reports the collected information in the annual budget book and in a separate online annual report to the Legislature that gauges progress on roughly 100 key indicators, such as homeownership and reductions in youth recidivism.

Analysts at the budget office provide guidance to agencies on recommended research-based indicators to track outcomes but rely on the agencies to make final decisions on which to include. The feasibility of collecting certain information is an important consideration, because some data take significant time and financial resources to gather. To foster greater efficiency, analysts help the agencies identify and use data already available and advise them on updating and revising indicators and strategic plans when necessary.

“We work with agencies to bring existing data sources together so as not to reinvent the wheel,” Ralbovsky said.

For example, to measure visitor satisfaction at Maryland state parks, the Department of Natural Resources worked with budget staff members to pick appropriate proxy measures. The agency chose to survey those who make camping and pavilion reservations because that process is online, making it easy and inexpensive for the department to follow up. The survey results provided the state with important data on how people feel about the facilities and helped identify areas for improvement.

Although agencies are required to certify that the data they report are accurate, the budget staff also regularly monitors the information collected to ensure that it is reliable and that the measures used to analyze it are appropriate for gauging success. Budget staff members are encouraged to ask probing questions when they see discrepancies, or when they identify measures that are not meaningful or that could be replaced with something better. For example, agencies that consistently achieve 100 percent on a particular measure may be asked to identify other indicators that can measure progress over time.  

Lawmakers strongly support the MFR process and have required agencies to add outcome measures that can provide clearer pictures of performance and help identify areas for improvement. For example, as part of the fiscal 2019 budget preparation, they required the budget department to add performance measures to track the state’s Wellness Program, starting in fiscal 2020.

The MFR framework has pushed agencies to think critically about how to allocate time and funding to achieve desired outcomes. The yearly process of developing strategic plans with specific performance measures has become a useful tool for agencies to better understand why they collect certain information and how it influences statewide outcomes. For the Legislature, MFR has provided targeted outcome data for analyzing agency performance and appropriating state funds more effectively.

“Some agencies have taken the push that we started 20 years ago and have really run with it. It’s a great conversation piece between us, budget folks, and the agencies about their priorities and goals,” Ralbovsky said.

Maryland’s lessons can be applied elsewhere

For states interested in updating their performance management systems, Maryland’s experience shows that policymakers should:

  • Regularly review performance measures to ensure that they are accurate and meaningful.
  • Work with agencies to pare down the use of indicators that aren’t necessary or useful.
  • Find ways to measure the impact of programs or services when multiple factors can influence an outcome.
  • Tie performance measures to strategic planning goals and track progress over time to ensure that data are useful to decision-makers. This also helps to improve data accuracy, because staff members know that the information is being used.

Sara Dube is a director and Mariel McLeod is an associate with the Pew-MacArthur Results First Initiative.