How States Can Gather Better Data for Evaluating Tax Incentives

Solutions for compiling and analyzing information on key economic development programs

Overview

Over the past five years, many state governments have begun to study their economic development tax incentives—programs that are central to creating jobs, attracting businesses, and strengthening the economy. Today, 28 states have processes in place to regularly evaluate tax incentives, such as credits, exemptions, and deductions, as do Philadelphia, New York City, and the District of Columbia. And many states are reforming incentives based on the findings of those assessments, leading to better results for their budgets and economies.

But states have also faced obstacles to implementing evaluation processes. The personnel tasked with studying incentives—such as auditors, economists, and tax policy experts—report that many of their foremost challenges relate to the availability and usability of data that are spread across multiple agencies, are sensitive or confidential, or are incomplete. In some cases, these difficulties have prevented analysts from determining the effectiveness of tax incentives, leaving lawmakers without the information they need to make informed decisions.

These barriers are not insurmountable, however. To identify solutions, The Pew Charitable Trusts reviewed tax incentive studies and other documents and interviewed staff from evaluation offices across the country. This research shows that lawmakers, agency leaders, and analysts each have a role to play to ensure that evaluations draw valuable conclusions about the design and effectiveness of incentives. States can overcome their data challenges by:

  • Ensuring access to existing data. Several states have enacted policies allowing analysts to access sensitive data while protecting confidentiality and are working to ensure that evaluation offices receive data in a format that facilitates high-quality studies.
  • Collecting new information. Companies are often obliged to report on their activities as a condition of receiving incentives, but these data have not always proved useful for evaluation. Recently, states have begun crafting business reporting requirements to facilitate analyses.
  • Conducting high-quality analysis. By using analytic approaches that require limited amounts of data, such as reviewing evaluations of similar incentives in other jurisdictions and considering best practices for designing incentives, analysts are proving that they can draw well-supported conclusions even without perfect information. When robust data are available, these qualitative approaches provide a useful complement to quantitative analyses.

As states expand their use of these strategies, evaluation offices are receiving more data to examine tax incentives and making better use of the information they have. And by doing so, they are demonstrating that even formidable data challenges do not need to stand in the way of high-quality analysis of tax incentives.

Ensure access to existing data

States already possess much of the data they need to evaluate incentives effectively, but it is often spread across multiple agencies and subject to confidentiality restrictions. To overcome these challenges and facilitate high-quality evaluation, states are using three approaches:

  • Authorizing evaluation offices to access relevant data.
  • Creating targeted exemptions from confidentiality rules.
  • Directing agencies to improve the usability of data.

Authorize evaluation offices to access relevant data

Much of the data that are most useful for evaluating tax incentives are sensitive or confidential. For example, tax returns include information about companies’ activities and incentive use, and unemployment insurance records are valuable for analyzing job creation. But the state agencies that house these data sources are typically reluctant to share them because both are generally protected under state and federal law.1

To help address this challenge, the statutes that create evaluation processes generally include provisions to allow analysts to access needed data from other state agencies. In some states, these stipulations have been sufficient. For instance, Indiana’s evaluation office, the Legislative Services Agency, has benefited from a statute mandating that other agencies cooperate with its work and from strong relationships with the state departments of Revenue and Workforce Development and the Indiana Economic Development Corp.2

State agencies often sign memorandums of understanding that establish the terms by which data will be shared. Such interagency agreements generally require analysts to adhere to the same confidentiality standards as the agency that houses the records. In most cases, evaluation offices only need access to these data to analyze an incentive program’s performance and would not have cause to publish sensitive company-specific information.

In addition to granting evaluators access to data in general terms, lawmakers should monitor whether further legislative action is necessary to expand on those permissions. After Nebraska passed a law in 2015 requiring the Legislative Audit Office to regularly evaluate tax incentives, the auditors struggled to persuade the Department of Revenue that they were legally entitled to directly access the department’s tax database.3 To resolve this disagreement, the Legislature in 2016 clarified that the auditors are permitted to access the database.4

By allowing evaluation offices to access existing records, states can avoid asking businesses to report the same information multiple times. Under Washington H.B. 1296, which was approved in 2017, companies are exempt from other reporting requirements if they authorize staff members from the Joint Legislative Audit and Review Committee, the state’s evaluation office, to access unemployment insurance records. State lawmakers intended H.B. 1296 to provide more reliable information for counting jobs, while eliminating the need for businesses to file multiple reports on the jobs they created and wages they paid.5

Create targeted exemptions from confidentiality rules

When states allow evaluation offices to access sensitive data, they usually require analysts to maintain the confidentiality of the information. However, they also have the option of exempting specific data related to an incentive from confidentiality restrictions to allow public transparency.

When Maine created a new tax incentive to encourage businesses to locate their headquarters in the state, it made the cost of the credits used by each participating company a matter of public record.6 Lawmakers viewed this disclosure as a reasonable trade-off for the substantial benefits that businesses would realize from the program—up to $16 million per project.7

Some states have worked to find a middle ground between protecting sensitive business information and allowing public disclosure. After a 2015 North Dakota law required regular evaluations for incentive programs, the early reviews faced data challenges. One problem was that the Office of State Tax Commissioner interpreted state law as forbidding disclosure of the cost of any tax incentive with fewer than five recipients, and several credits up for review in 2016 met this criterion. This policy even prevented legislators—who are responsible for weighing the merits of credits and balancing the state budget—from knowing how much the programs cost.8 In response, the Legislature approved a bill in 2017 that allows the commissioner to disclose the “amount of any tax deduction or credit.” In making the change, however, lawmakers explicitly preserved other confidentiality rules, including the prohibition against disclosing the names of companies in connection to the amount of incentives used.9
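
The threshold at issue here works like the small-cell suppression rules common in statistical disclosure control. A minimal sketch of how such a rule plays out when publishing program costs (the five-recipient cutoff comes from the North Dakota example; the program names and dollar figures are invented):

```python
# Hypothetical illustration of a small-cell suppression rule like the one the
# North Dakota tax commissioner applied: publish a program's total cost only
# if it has at least `THRESHOLD` recipients.

programs = [
    # (program name, number of recipients, total credits claimed) -- made-up data
    ("Credit A", 12, 1_450_000),
    ("Credit B", 3, 620_000),
    ("Credit C", 7, 980_000),
]

THRESHOLD = 5  # fewer than five recipients -> cost withheld

for name, recipients, cost in programs:
    if recipients >= THRESHOLD:
        print(f"{name}: {recipients} recipients, ${cost:,} in credits")
    else:
        # Under the pre-2017 interpretation, even the aggregate cost was suppressed.
        print(f"{name}: {recipients} recipients, cost withheld (fewer than {THRESHOLD})")
```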

Direct agencies to improve the usability of data

Various agencies—including tax, commerce, and workforce departments—often collect data on incentive programs and their beneficiaries as part of their day-to-day administrative responsibilities. Evaluation offices sometimes struggle to produce high-quality evaluations with this information, however, because it is not in a format conducive to analysis. Overcoming these challenges often involves a mix of technological solutions and interagency coordination.10

When analysts in the District of Columbia’s Office of the Chief Financial Officer requested tax data as part of their ongoing evaluation work, they found that some relevant information was only available on paper forms. The city had chosen not to capture the data electronically to avoid paying a third-party vendor. But even when the analysts examined the forms, they found that companies had not always filled them out completely.11

In part to address this problem, in 2017 the city began requiring that the forms be submitted online. This change will ensure that, going forward, all the data are available electronically and will allow the city to automatically reject forms with missing information. The district’s evaluators are also working with tax officials to add fields that capture more data.12
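
The mechanics of rejecting incomplete submissions are simple to sketch. Here is a minimal illustration of a server-side required-field check, assuming a hypothetical set of field names (the district's actual form fields are not specified in the sources):

```python
# Hypothetical check of an online incentive form: submissions missing any
# required field are rejected rather than stored incomplete, which is the
# enforcement that paper filing could not provide.

REQUIRED_FIELDS = {"taxpayer_id", "credit_type", "credit_amount", "tax_year"}

def validate_submission(form: dict) -> list[str]:
    """Return the missing or blank required fields (empty list = accept)."""
    return sorted(
        field for field in REQUIRED_FIELDS
        if field not in form or str(form[field]).strip() == ""
    )

submission = {"taxpayer_id": "12-3456789", "credit_type": "jobs", "credit_amount": ""}
missing = validate_submission(submission)
if missing:
    print("Rejected; missing fields:", ", ".join(missing))  # credit_amount, tax_year
else:
    print("Accepted")
```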

In Iowa, the Department of Revenue, which conducts incentive program evaluations, has overseen the creation of a database of awards and claims that includes information from four state agencies and authorities. The system allows the department to track which companies receive credits and when they apply the credits to their taxes. This information is helpful not only for evaluating credits, but also for other revenue department responsibilities, such as forecasting the cost of the programs.13
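
Conceptually, a system like Iowa's links each award (which agency granted which credit to which company) to the claims later filed against taxes. A minimal sketch using Python's standard library; the two-table layout and the field names are assumptions for illustration, not Iowa's actual schema:

```python
import sqlite3

# Hypothetical two-table layout for an awards-and-claims database: one row per
# credit awarded (possibly by different agencies), one row per claim against taxes.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE awards (award_id INTEGER PRIMARY KEY, company TEXT,
                         agency TEXT, credit TEXT, amount REAL);
    CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, award_id INTEGER,
                         tax_year INTEGER, amount REAL,
                         FOREIGN KEY (award_id) REFERENCES awards (award_id));
""")
con.executemany("INSERT INTO awards VALUES (?, ?, ?, ?, ?)", [
    (1, "Acme Corp", "Economic Development Authority", "R&D Credit", 100_000),
    (2, "Beta LLC", "Department of Revenue", "Jobs Credit", 50_000),
])
con.executemany("INSERT INTO claims VALUES (?, ?, ?, ?)", [
    (1, 1, 2016, 40_000),   # Acme claims part of its award in 2016...
    (2, 1, 2017, 60_000),   # ...and the rest in 2017.
])

# Which credits have been awarded but not yet claimed? Useful both for
# evaluation and for forecasting the programs' future revenue impact.
for row in con.execute("""
    SELECT a.company, a.credit, a.amount - COALESCE(SUM(c.amount), 0) AS unclaimed
    FROM awards a LEFT JOIN claims c ON c.award_id = a.award_id
    GROUP BY a.award_id
"""):
    print(row)
```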

Collect new information

Policymakers and evaluation offices can collect new data specifically for use in evaluating incentives. Key strategies for doing so include:

  • Designing business reporting requirements with evaluation in mind.
  • Surveying companies.
  • Focusing on a subset of beneficiaries.

Design business reporting requirements with evaluation in mind

To increase transparency, many states routinely require businesses that receive incentives to report job-creation and capital investment figures as well as other performance data. However, many of those requirements were enacted before evaluation processes were in place and, as a result, are not tailored to analysts’ needs. Lawmakers should work with evaluation offices to design requirements that provide useful information for analysts and avoid unnecessarily burdening businesses.

In Washington, lawmakers bolstered the reporting requirements for the state’s film incentive program after a 2010 Joint Legislative Audit and Review Committee (JLARC) study pointed out weaknesses in the available information.14 Using the resulting detailed data on films’ budgets and spending, JLARC produced a high-quality evaluation of the program in 2015.15 In addition, H.B. 1296 of 2017 gave the committee the authority to help design reporting requirements, so committee staff members are working with the departments of Revenue and Employment Security to improve the information that businesses report.16

In addition to upgrading reporting for existing tax incentives, states can ensure that new programs launch with appropriate requirements. When the Maine Legislature created an incentive in 2017 to encourage businesses to locate their headquarters in the state, lawmakers worked with the staff of the Legislature’s Office of Program Evaluation and Government Accountability (OPEGA) to ensure the program had well-designed reporting requirements.17 Legislators also tasked OPEGA with conducting a preliminary study to provide guidance on the program’s design and assess whether requiring businesses to submit additional data would be helpful for measuring its effectiveness. OPEGA’s report included a list of potential performance measures for the program and a description of the data that would be needed to use those measures.18 Based on this study, the Legislature in April 2018 approved a bill enhancing reporting requirements for the program before the first credits were issued.19

Survey companies

Analysts occasionally survey companies to collect specific information for their evaluations, but this approach has potential drawbacks. Unlike routine reporting, surveys generally do not provide longitudinal data, and they can suffer from low response rates. Still, some evaluation offices have used them successfully to inform conclusions about the results of incentives.

Business surveys can be particularly useful for determining whether state agencies are administering incentive programs effectively and identifying ways to reduce barriers to company participation. For example, a 2017 evaluation of Minnesota’s Research Tax Credit suggested that the state’s Department of Revenue enhance the information it provides about the credit to make it easier for companies to determine the value of the credits they are eligible for and to substantiate their claims if they are audited. These findings were based partially on a survey of nearly 500 program participants and on interviews with individual companies.20

Because businesses benefiting from incentives have a vested interest in the continuation of the programs, analysts should use care when designing surveys and interpreting the findings. Simply asking businesses whether incentives made a difference in their decisions to locate or expand may not yield reliable results. However, states have found creative ways to use surveys to assess whether programs are influencing business behavior. For instance, Florida’s Office of Program Policy Analysis and Government Accountability has surveyed incentive recipients for each of its annual evaluations dating back to 2014, and when asked directly, respondents have generally said that incentives are important to their decisions. But other questions offer a more nuanced picture of the effects on business behavior: The surveys showed that incentives primarily benefit companies that were in Florida before the awards and that having an established presence was a key factor in a firm’s decision to expand within the state.21

Focus on a subset of beneficiaries

Most evaluations attempt to analyze data on all companies benefiting from an incentive. However, some states have reduced the need for comprehensive data collection by examining a subset of businesses or projects in depth rather than all beneficiaries.

For example, the Missouri State Auditor’s Office used this approach in 2014 evaluations of the Brownfield Remediation Tax Credit Program and the Historic Preservation Tax Credit. The brownfield evaluation closely examined 10 development projects and found that only one had created the promised number of jobs. The study recommended that state officials add provisions to incentive agreements to reclaim credits if recipients fail to meet their obligations.22 The historic preservation evaluation identified “lavish and expensive” owner-occupied residential projects that had minimal economic impact and were unlikely to be caused by the credit, and it recommended that lawmakers consider tightening the eligibility criteria.23
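
The arithmetic behind this approach is straightforward. A hedged sketch with invented figures: draw a subset of projects and compare promised with actual job counts (Missouri's auditors chose their projects purposively, so the random draw here is just one way to pick a subset):

```python
import random

# Hypothetical project records: (project, jobs promised, jobs actually created).
projects = [
    ("Site A", 120, 130), ("Site B", 80, 12), ("Site C", 200, 45),
    ("Site D", 50, 50),  ("Site E", 60, 8),  ("Site F", 150, 20),
]

random.seed(1)  # reproducible illustration
sample = random.sample(projects, k=3)  # examine a subset instead of all beneficiaries

for name, promised, actual in sample:
    status = "met" if actual >= promised else "shortfall"
    print(f"{name}: promised {promised}, created {actual} -> {status}")

met = sum(actual >= promised for _, promised, actual in sample)
print(f"{met} of {len(sample)} sampled projects met their job commitments")
```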

Conduct high-quality analysis

High-quality evaluations use a range of methodologies, from detailed economic modeling to more qualitative strategies. Several qualitative approaches do not require rich data and have proved useful when evaluation offices face data challenges, but they are also valuable supplements to rigorous quantitative analysis when good information is available. These strategies include:

  • Analyzing whether incentive programs reflect best practices.
  • Examining whether programs are designed to achieve their goals.
  • Reviewing evaluations of similar programs in other jurisdictions.

Analyze whether incentive programs reflect best practices

Leading scholars have reached a degree of consensus on the characteristics that generally increase the effectiveness of incentive programs. Even without complete data, analysts can study whether incentives reflect these best practices.

In 2016, analysts at Mississippi’s University Research Center lacked sufficient data to formally study the costs and benefits of the state’s Tourism Rebate Program, but the evaluation still raised concerns about the program based on economic development theory, pointing out, “In general, economic development only occurs when dollars that were previously not here flow into the State.”24 Several shopping malls received incentives under the program, and the study noted that many of the customers at the retailers that benefited were Mississippi residents who would have spent those dollars at other in-state retailers had the malls not been built. The report concluded that focusing on projects that draw a greater share of customers from out of state would have a larger impact.25
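
The report's reasoning can be made concrete with simple arithmetic. In the sketch below, every figure is invented: only the share of sales made to out-of-state customers counts as new dollars for the state, because in-state customers would mostly have spent the money at other Mississippi retailers anyway.

```python
# Hypothetical illustration of the "new dollars" logic in the Mississippi report:
# spending by in-state customers largely substitutes for purchases they would
# have made elsewhere in the state, so only out-of-state spending is net new.

total_sales = 10_000_000      # annual sales at an incentivized mall (invented)
out_of_state_share = 0.15     # share of customers from outside Mississippi (invented)

net_new_dollars = total_sales * out_of_state_share
displaced_in_state = total_sales - net_new_dollars

print(f"Net new dollars to the state: ${net_new_dollars:,.0f}")
print(f"Displaced in-state spending:  ${displaced_in_state:,.0f}")
# A project drawing, say, 60% of customers from out of state would generate
# four times as many net new dollars on the same sales base.
```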

Examine whether programs are designed to achieve their goals

High-quality evaluations often start by identifying the goals of the programs. They then assess the extent to which those objectives have been achieved. In some cases, the design of the program—such as the terms by which individuals or companies qualify—raises doubts about whether the incentive is serving its purpose.

For example, an evaluation of Iowa’s Beginning Farmer Tax Credit Program pointed out that the program defined “beginning farmers” based on their net worth rather than their years of experience, as the U.S. Department of Agriculture does. As a result, many eligible individuals were not new to agriculture; 74 percent had at least 10 years of experience.26

Likewise, a 2017 Office of Program Evaluation and Government Accountability evaluation of Maine’s Pine Tree Development Zones found design weaknesses that could prevent the incentive from effectively serving its goal of creating jobs. The study noted that companies can receive incentives for up to two years without creating any jobs and are not required to return the benefits if they fail to boost employment during that period. OPEGA identified this and other design problems without sufficient data to conduct a full economic analysis.27

Review evaluations of similar programs in other jurisdictions

With more states launching evaluation processes in recent years, common types of incentive programs—such as job creation, film, and research and development tax credits and enterprise zones—have been rigorously examined at least a handful of times. Thanks to this growing body of research, analysts can glean insights from high-quality evaluations conducted in other states as a starting point for studies of their own programs.  

In some cases, this research can uncover challenges and design concerns. For example, a 2017 evaluation of Alabama’s Certified Capital Companies tax credit identified weaknesses of similar incentives in other states—such as design flaws that caused benefits to be offered to established firms instead of startups as intended—and concluded that the state’s program probably suffered from similar problems.28

But reviewing evaluations from other jurisdictions can also help identify aspects of incentive design that promote success. For instance, a 2017 evaluation from the Maryland Department of Legislative Services found that most research and development tax credits obligate participating companies to increase research spending year over year. However, the study noted that one of Maryland’s two R&D credits provides incentives based on a company’s entire research budget rather than incremental spending increases, and it concluded that, compared with other states’ R&D credits, this aspect of the program probably resulted in more awards for investments firms would have made anyway.29
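
The design difference is easiest to see in a worked example. A minimal sketch with invented spending figures and a hypothetical 5 percent credit rate (not Maryland's actual parameters): an incremental credit rewards only growth above a base amount, while a total-budget credit also pays on research the firm was already funding.

```python
# Hypothetical comparison of two R&D credit designs. All figures are invented.

def incremental_credit(current_spending, base_spending, rate):
    """Credit only on research spending above a base-period amount."""
    return max(current_spending - base_spending, 0) * rate

def total_budget_credit(current_spending, rate):
    """Credit on the entire research budget, including pre-existing spending."""
    return current_spending * rate

base, current, rate = 4_000_000, 5_000_000, 0.05

print(f"Incremental design:  ${incremental_credit(current, base, rate):,.0f}")  # $50,000
print(f"Total-budget design: ${total_budget_credit(current, rate):,.0f}")       # $250,000
# The total-budget design pays five times as much here, but $200,000 of that
# subsidizes research the firm was already doing -- the evaluation's concern.
```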

Conclusion

Across the country, states are proving that data challenges do not have to stand in the way of rigorous tax incentive evaluations. But policymakers and analysts both must play a role in overcoming data barriers. Lawmakers and agency leaders can devise policies that improve evaluation offices’ access to high-quality information. And when data are not perfect, analysts can use a variety of methodological approaches to draw meaningful conclusions about the design and effectiveness of incentives. Through these efforts, states can ensure that they have robust analyses of some of their most important economic development programs.

Endnotes

  1. Center for Regional Economic Competitiveness, “Improved State Administrative Data Sharing: A Strategy to Promote Evidence-Based Policymaking for Economic and Workforce Development” (2017), 15, http://www.statedatasharing.org/about/SDS_Initiative_Research_Paper_2017.pdf.
  2. Heath Holloway (senior fiscal/program analyst, Indiana Legislative Services Agency) and Allison Leeuw (senior fiscal/program analyst, Indiana Legislative Services Agency), interview with The Pew Charitable Trusts, Dec. 13, 2017.
  3. Martha Carter (legislative auditor, Nebraska Legislative Audit Office) and Anthony Circo (performance auditor, Nebraska Legislative Audit Office), interview with The Pew Charitable Trusts, Dec. 12, 2017.
  4. Nebraska L.B. 1022 (2016), https://nebraskalegislature.gov/bills/view_bill.php?DocumentID=28631.
  5. Washington H.B. 1296 (2017), http://lawfilesext.leg.wa.gov/biennium/2017-18/Pdf/Bills/Session%20Laws/House/1296-S.SL.pdf; Keenan Konopaski (legislative auditor, Washington Joint Legislative Audit and Review Committee) and Dana Lynn (research analyst, Washington Joint Legislative Audit and Review Committee), interview with The Pew Charitable Trusts, Dec. 27, 2017.
  6. Maine Rev. Stat. tit. 36 §5219-QQ, http://legislature.maine.gov/statutes/36/title36sec5219-QQ.html.
  7. Maine Office of Program Evaluation and Government Accountability, “Assessment of the Design of the Newly Enacted Major Business Headquarters Expansion Program” (2018), http://legislature.maine.gov/doc/2162.
  8. Emily Thompson (counsel, North Dakota Legislative Council), interview with The Pew Charitable Trusts, Dec. 26, 2017.
  9. North Dakota H.B. 1354 (2017), http://www.legis.nd.gov/assembly/65-2017/bill-index/bi1354.html.
  10. The Pew Charitable Trusts, “How States Use Data to Inform Decisions” (February 2018), 33–35, http://www.pewtrusts.org/~/media/assets/2018/02/dasa_how_states_use_data_report_v5.pdf.
  11. Lori Metcalf, Farhad Niami, and Charlotte Otabor (fiscal analyst, director of economic affairs, and fiscal analyst, respectively, District of Columbia Office of the Chief Financial Officer), interview with The Pew Charitable Trusts, Jan. 12, 2018.
  12. Ibid.
  13. Angela Gullickson (senior fiscal and policy analyst, Iowa Department of Revenue) and Amy Rehder Harris (administrator and chief economist, Iowa Department of Revenue), interview with The Pew Charitable Trusts, Dec. 15, 2017.
  14. Washington Joint Legislative Audit and Review Committee, “Review of Motion Picture Competitiveness Program” (2010), http://leg.wa.gov/jlarc/AuditAndStudyReports/Documents/10-11.pdf; Washington Joint Legislative Audit and Review Committee, “JLARC Final Report: 2015 Tax Preference Performance Reviews, Motion Picture Program Contributions” (2016), http://leg.wa.gov/jlarc/taxReports/2015/MotionPictureProgramContributions/f/default.htm.
  15. Ibid.
  16. Washington H.B. 1296; Konopaski and Lynn interview.
  17. Beth Ashcroft (director, Maine Office of Program Evaluation and Government Accountability) and Maura Pillsbury (analyst, Maine Office of Program Evaluation and Government Accountability), interview with The Pew Charitable Trusts, Dec. 12, 2017.
  18. Maine Office of Program Evaluation and Government Accountability, “Assessment of the Design.”
  19. Maine L.D. 1903 (2018), http://legislature.maine.gov/LawMakerWeb/summary.asp?ID=280068606.
  20. Minnesota Office of the Legislative Auditor, “Minnesota Research Tax Credit: 2017 Evaluation Report” (2017), 58–67, https://www.auditor.leg.state.mn.us/ped/pedrep/researchcredit.pdf.
  21. Florida Office of Program Policy Analysis and Government Accountability, “Florida Economic Development Program Evaluations – Year 4” (2017), 10–11, http://www.oppaga.state.fl.us/MonitorDocs/Reports/pdf/1702rpt.pdf.
  22. Missouri State Auditor, “Economic Development: Brownfield Remediation Tax Credit Program” (2014), 18–19, 23, https://app.auditor.mo.gov/Repository/Press/2014023457179.pdf.
  23. Missouri State Auditor, “Economic Development: Historic Preservation Tax Credit Program” (2014), 13, 15, https://app.auditor.mo.gov/Repository/Press/2014018370056.pdf.
  24. University Research Center, Mississippi Institutions of Higher Learning, “The Annual Tax Expenditure Report” (2016), 69, http://www.mississippi.edu/urc/downloads/TER/2015.pdf.
  25. Ibid.
  26. Iowa Department of Revenue, “Beginning Farmer Tax Credit Program: Agricultural Assets Transfer Tax Credit and Custom Farming Contract Tax Credit, Tax Credits Program Evaluation Study” (2015), https://tax.iowa.gov/sites/files/idr/BFTC%20Evaluation%20Study%202015.pdf.
  27. Maine Office of Program Evaluation and Government Accountability, “Pine Tree Development Zones Tax Expenditure Review” (2017), 7, 26–29, http://legislature.maine.gov/doc/1809.
  28. Matthew N. Murray and Donald J. Bruce, “Evaluation of Alabama’s CAPCO Credit and Historic Rehabilitation Tax Credit” (2017), 14–16, https://revenue.alabama.gov/wp-content/uploads/2017/05/TaxIncentives_CAPCO_201701.pdf.
  29. Maryland Department of Legislative Services, Office of Policy Analysis, “Evaluation of the Research and Development Tax Credit” (2017), 46, http://dls.maryland.gov/pubs/prod/TaxFiscalPlan/DRAFT-Evaluation-of-the-Research-and-Development-Tax-Credit.pdf.