How Organizing, Sharing Data Can Boost Court Transparency

Steps for making civil courts more open


Overview

When civil court data is available to relevant stakeholders and court staff, courts can keep other branches of government informed about how people use the courts, support community-based efforts to help litigants resolve legal issues, and increase public trust in the judicial branch. By sharing key information—such as how many court users lack lawyers, who files the most cases, what demographic trends emerge, how cases are decided, what happens after cases conclude, and what litigants say about their experiences—courts can strengthen trust in the civil legal system and free research staff to focus on analyses rather than on fielding requests for information.

Courts seeking to increase transparency and reduce the number and complexity of data requests can begin by implementing two key practices:

  • Determine what data should be shared with internal and external stakeholders and publish that information in usable formats.
  • Dedicate staff to ensuring timely responses to data inquiries from the public, other courts, and other government branches.

After extensive research, The Pew Charitable Trusts has developed a framework outlining how and why courts should modernize.1 These steps arise from that work and can help programmatic and operational court staff, along with court leadership, assess how they are publishing data and making it publicly available; identify opportunities to improve; and decide—with input from relevant stakeholders—which of those opportunities to pursue and how.

Step 1: Bring together relevant court staff and external stakeholders

These groups can contribute important perspectives and insights about making data trends public.

Court leadership can set priorities for which data should be publicly available and work with relevant stakeholders to share the information and prioritize responding to requests.

Court researchers can work with court leadership and access to justice staff (e.g., people who lead initiatives to improve the availability of legal and language services or to train staff on supporting court users) to determine which data fields are most meaningful and conduct analyses required for statistical reports, data dashboards, and responses to requests.

Court website administrators can work with court researchers to upload information to the website and ensure that dashboards are user-friendly and accessible.

Clerks can support court researchers by collecting and reporting data according to agreed-upon standards and providing feedback on which data should be made public.

Access to justice staff can offer input about how to disaggregate data into key categories, such as race and ethnicity, so that information is not personally identifiable but still useful to researchers, policymakers, and other stakeholders.

External researchers can share perspectives on which data fields are most important to aggregate and share and how best to present the data.

Policymakers can identify the information that would most help them understand how their policy agendas play out in the civil courts (e.g., How does increased rental assistance affect the number of eviction cases filed?).

Step 2: Assess current practices and set next steps

The following metrics can help courts assess their progress toward sharing information publicly and making it useful, undertaking necessary reforms, and conducting cross-jurisdictional comparisons. (See Tables 1 and 2.)

For each metric, determine whether the answer to the initial question is yes or no using the suggested measure. If the answer to the metric question is no, pursue the suggested next steps in collaboration with staff and stakeholders. The suggested steps are not prescriptive; instead, they provide ideas and options for getting started. The state examples can help courts determine what actions are feasible given available resources.

Table 1

Aggregated Data on Case Processing and Users’ Experiences Should Be Public

Metrics, suggested steps, and state examples and resources

Metric | If not, suggested next steps | State examples

Does the court regularly and publicly report information on caseloads and processing for all case types?

How to measure it:

Examine the level of detail about case filings, dispositions, and other case data provided on the website.

  • Review annual statistical reports, the data dashboard, and other public-facing sources to identify areas needing greater detail.
  • Convene relevant court staff to identify gaps and determine which information would be most useful to their operations, such as case dispositions by date or time from filing to disposition.
  • Ask external experts to help identify priorities for dashboard development, such as an eviction dashboard that highlights areas with high eviction rates to help improve outreach to communities on rental assistance.
  • Concurrent with improving dashboards and public reports, work with court clerks to address any gaps in local data reporting and to ensure that local data aligns with statewide standards.

Internal Experts

  • Clerks
  • Leadership
  • Researchers

External Experts

  • Researchers
  • The Court Statistics Project’s State Court Guide to Statistical Reporting offers guidance about the level of detail necessary for statistical reports to be useful to users.
  • Indiana worked with local researchers and relevant stakeholders to update the state’s methods for collecting and sharing information about court users without lawyers.
  • States take a variety of approaches to ensuring that their information appears in a usable and relevant format. Colorado provides detailed annual statistical reports, and Indiana and New York publish aggregated data about various case types in a variety of formats, such as graphs and tables. Other states, such as Michigan and Minnesota, have worked with external experts to develop detailed dashboards for certain case types, such as debt collection.

Does the court publicly share aggregated information on users’ experiences at least annually?

How to measure it:

Review whether the website includes publicly available user survey results.

  • Identify the jurisdictions, pilots, or projects that capture user feedback and determine whether that feedback should be included in the court’s annual report.
  • Ask court staff and leaders as well as other stakeholders what they want to know about court users’ experiences.
  • Work with relevant court staff and external experts to survey court users at least annually.

Internal Experts

  • Access to justice
  • Clerks
  • Leadership
  • Researchers

External Experts

  • Researchers
  • States publish user feedback in different ways. The Michigan courts publish a report on their website outlining the results of their recurring public satisfaction survey. After pausing during the pandemic, the courts are relaunching the survey to gather feedback and to develop plans for responding to it.
  • The National Center for State Courts (NCSC) CourTools resource has sample questions for measuring user perceptions of court access and fairness.

Does the court publicly report information about sociodemographic trends in case filings and outcomes?

How to measure it:

Review whether the website includes information about sociodemographic trends.

  • Work with external experts to routinely assess equity in court access, navigation, participation, and outcomes across sociodemographic groups—at a minimum, race, ethnicity, disability, gender, age, and language.
  • Identify what sociodemographic data the court already collects that would support imputing (i.e., making educated guesses about) information—such as race and gender, based on name and location—as well as geomapping (i.e., using addresses and census tracts) to understand demographic trends at the neighborhood level.
  • Add data fields to the case management system as needed to capture sociodemographic information.

Internal Experts

  • Access to justice
  • Clerks
  • Leadership
  • Researchers

External Experts

  • Researchers

Courts can use several methods for capturing demographic data:

  • Linking to census data. The Utah courts are exploring sharing court data with the U.S. Census Bureau to match court users in certain dockets to information those users previously shared with the bureau.
  • Statistical modeling. January Advisors used Bayesian Improved Surname Geocoding—a method that estimates an individual’s probable race and ethnicity from name and address information—to identify and understand disparities in civil filings and outcomes in Minnesota. (A minimal sketch of this calculation appears after this list.)
  • Geomapping. Reinvestment Fund geomapped eviction court data, which revealed disparities in eviction rates and other variables among residents of Black, White, and Hispanic neighborhoods.
  • Self-identification. Courts can collect demographic data by asking court users. The NCSC has a resource on how to do this and has recommended data fields.
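
For readers who want to see how surname-and-geography imputation works mechanically, the sketch below applies Bayes’ rule in the way BISG-style methods do. The probability tables are hypothetical placeholders (real implementations draw on the U.S. Census Bureau’s surname list and block-group population counts), and this is not the code January Advisors used.

```python
# A minimal, illustrative BISG-style calculation. The probability tables below are
# hypothetical placeholders; real implementations use the U.S. Census Bureau's
# surname list and block-group population counts.

RACE_GROUPS = ["white", "black", "hispanic", "asian", "other"]

# P(race | surname): hypothetical values for one surname.
p_race_given_surname = {
    "GARCIA": {"white": 0.05, "black": 0.01, "hispanic": 0.92, "asian": 0.01, "other": 0.01},
}

# P(geography | race): hypothetical share of each group's statewide population
# living in one census block group.
p_geo_given_race = {
    "block_group_123": {"white": 0.0004, "black": 0.0009, "hispanic": 0.0020,
                        "asian": 0.0003, "other": 0.0005},
}

def bisg_posterior(surname: str, block_group: str) -> dict:
    """Combine surname and geography evidence with Bayes' rule:
    P(race | surname, geo) is proportional to P(race | surname) * P(geo | race)."""
    prior = p_race_given_surname[surname.upper()]
    geo = p_geo_given_race[block_group]
    unnormalized = {r: prior[r] * geo[r] for r in RACE_GROUPS}
    total = sum(unnormalized.values())
    return {r: round(v / total, 4) for r, v in unnormalized.items()}

if __name__ == "__main__":
    print(bisg_posterior("Garcia", "block_group_123"))
```

Because the output is a probability distribution rather than a single label, analysts typically report disparities in the aggregate rather than assigning a race or ethnicity to any individual court user.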

Do court reports and dashboards comply with federal, state, and local digital accessibility requirements?

How to measure it:

Test reports and dashboards for accessibility against relevant guidelines (e.g., Is the platform screen reader accessible, and does it use sufficient color contrast?). A minimal color-contrast check is sketched at the end of this table.

  • Consider partnering with an external evaluator with expertise in accessibility to address issues identified through testing.
  • Educate staff members, across departments and levels of seniority, about the importance of accessibility and what needs to be done at the local and state levels to ensure that resources are accessible.

Internal Experts

  • Access to justice
  • IT staff
  • Leadership
  • Researchers

External Experts

  • Researchers
Sources: Court Statistics Project, “State Court Guide to Statistical Reporting” (2023); R. Rath (chief innovation officer, Indiana Office of Judicial Administration), J. O’Malley (director of e-filing and innovation, Indiana Office of Court Technology), and J. Wiese (deputy director of legal support division, Indiana Office of Court Services), Jan. 30, 2023; Indiana Courts, “Indiana Trial Court Statistics by County”; Colorado Judicial Branch, “Research and Data”; New York Division of Technology and Court Research, “Family Court Caseload Activity Dashboard”; January Advisors, “Michigan Debt Collection Data Dashboard”; January Advisors, “Minnesota Consumer Debt Collection Dashboard”; Michigan Supreme Court, “Public Satisfaction Survey”; CourTools, “Why Measure Performance?” (2005); T. Samuelsen (director of judicial data and research, Utah Administrative Office of the Courts), Aug. 29, 2023; D. McClendon and J. Reichman (principals, January Advisors), “Debt Collection Lawsuits in Minnesota,” April 14, 2023; Court Statistics Project, “Collecting Race & Ethnicity Data” (2022); World Wide Web Consortium, “Web Content Accessibility Guidelines (WCAG) 2.1” (2018); Web Accessibility Initiative, “Accessibility Fundamentals Overview.”
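
One of the checks named in the accessibility metric above, color contrast, reduces to a formula defined in WCAG 2.1. The sketch below is a minimal illustration of that single check, not a full accessibility audit; dedicated tools and an expert evaluator are still needed for screen-reader behavior, keyboard navigation, and the rest of the guidelines.

```python
# A minimal sketch of one automated accessibility check: the WCAG 2.1 color-contrast
# ratio. Real audits cover far more, and tools such as automated accessibility checkers
# can run this test across an entire dashboard.

def _linearize(channel: int) -> float:
    """Convert an 8-bit sRGB channel to linear light per the WCAG formula."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple) -> float:
    """WCAG relative luminance: 0.2126 R + 0.7152 G + 0.0722 B (linearized channels)."""
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    """WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

if __name__ == "__main__":
    # Hypothetical dashboard colors: dark gray text on a white background.
    # WCAG AA requires a ratio of at least 4.5:1 for normal-size text.
    ratio = contrast_ratio((68, 68, 68), (255, 255, 255))
    verdict = "pass" if ratio >= 4.5 else "fail"
    print(f"Contrast ratio: {ratio:.2f}:1 (AA normal text: {verdict})")
```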

Table 2

Courts Should Respond to Requests for Information in a Timely Manner

Metrics, suggested steps, and state examples and resources

Metric | If not, suggested next steps | State examples

Has the court published a standard procedure for handling information requests from local jurisdictions and external stakeholders?

How to measure it:

Review court processes and determine whether this information is publicly available online.

  • Develop a clear mandate or chain of command to ensure that inquiries from local courts and external stakeholders can be answered quickly.
  • Publish instructions for requesting information and estimated response times on the court website.

Internal Experts

  • IT staff
  • Leadership
  • Researchers

When responding to requests for information, can court staff quickly produce data that is not included in standard reporting?

How to measure it:

Analyze response times for data requests to see whether the court is meeting its benchmarks.

  • Track past questions from the legislature or executive branch, begin logging new ones, and record the response times. Use the findings to identify additional data to make public so that staff do not have to answer repeated inquiries for the same information manually. (A minimal tracking sketch appears after this table.)

Internal Experts

  • Leadership
  • Researchers
  • In addition to an externally facing dashboard with civil filing information, the Indiana courts have internal dashboards that enable court staff to quickly pull information, such as how court users without lawyers navigate the courts.
  • To help illuminate how case processes differ across jurisdictions, Georgia’s Judicial Council/Administrative Office of the Courts is standardizing the data it receives from local jurisdictions. The office worked with NCSC to obtain buy-in from the clerks responsible for data entry and to ensure that jurisdictions report priority data fields to the state in a timely and consistent manner.
Sources: Arkansas Judiciary, “Office of Research & Justice Statistics”; Minnesota Judicial Branch, “Court Statistics and Reports”; The Pew Charitable Trusts, “How to Standardize Court Data for Greater Transparency and Ongoing Improvement” (2023)
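
As a minimal illustration of the tracking step suggested in Table 2, the sketch below summarizes a simple log of data requests. The CSV layout and column names are assumptions made for the example, not a standard any court uses; the point is that median turnaround times and frequently repeated topics fall out of even a lightweight log.

```python
# Summarize a hypothetical log of data requests: median turnaround time and the
# most-repeated request topics, which are candidates for standing public reports.

import csv
from collections import Counter
from datetime import date
from statistics import median

def summarize_requests(path: str) -> None:
    """Report median turnaround and the most frequently repeated request topics."""
    turnaround_days, topics = [], Counter()
    with open(path, newline="") as f:
        # Assumed columns: received, fulfilled (ISO dates), topic, requester.
        for row in csv.DictReader(f):
            received = date.fromisoformat(row["received"])
            fulfilled = date.fromisoformat(row["fulfilled"])
            turnaround_days.append((fulfilled - received).days)
            topics[row["topic"]] += 1
    print(f"Median turnaround: {median(turnaround_days)} days")
    print("Most-repeated topics (candidates for standing public reports):")
    for topic, count in topics.most_common(5):
        print(f"  {count:>3}x  {topic}")

if __name__ == "__main__":
    summarize_requests("data_requests.csv")  # hypothetical file name
```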

The work in action: Dashboards help Indiana courts identify local needs

In 2022, the Indiana courts used Odyssey, the state’s case management system (CMS), to develop and launch a dashboard that keeps count of court users without attorneys.2 The data on self-represented litigants has helped the courts better track and address the needs of these users. And because all Indiana courts use the same CMS, this important information can be compiled without additional demands on court staff and clerks’ time.

The Indiana Supreme Court created the dashboard in response to findings by Indiana’s Coalition for Court Access—a group of civil legal aid partners, bar foundation representatives, judges, and other stakeholders.3 In 2019, the coalition published a report on civil legal needs that highlighted variation in legal representation rates across and within counties. The findings raised questions about the number of Indiana court users without attorneys, and about inconsistencies in the accuracy and quality of the state’s court data. For example, statewide data indicated that 51.5% of family law cases involved unrepresented court users, but at the individual jurisdiction level, the figure ranged widely from zero to 76%.4

Indiana worked to develop a response. The state’s Innovation Initiative, a court personnel and stakeholder group, and several agencies in the Office of Judicial Administration set out to improve the accuracy of courts’ counting of and reporting on self-represented litigants. But the state first had to define the problem: Are people who have a lawyer at one stage of a case but not another represented or unrepresented? The Indiana courts opted to treat as unrepresented all litigants who did not have an attorney for even one significant event during their case (e.g., a hearing or a trial).
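
That definition translates into a simple rule over case event records. The sketch below is illustrative only: the field names and event types are hypothetical, and it is not the logic Indiana built in Odyssey.

```python
# Illustrative application of the definition above: a litigant is counted as
# self-represented if any significant case event (e.g., a hearing or trial)
# occurred without an attorney of record. Field names are hypothetical.

SIGNIFICANT_EVENTS = {"hearing", "trial"}

def is_self_represented(events: list[dict]) -> bool:
    """True if the litigant lacked an attorney at one or more significant events."""
    return any(
        e["event_type"] in SIGNIFICANT_EVENTS and not e["has_attorney"]
        for e in events
    )

if __name__ == "__main__":
    case_events = [
        {"event_type": "filing", "has_attorney": False},
        {"event_type": "hearing", "has_attorney": True},
        {"event_type": "trial", "has_attorney": False},  # no counsel at trial
    ]
    print(is_self_represented(case_events))  # True
```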

Next, the state had to collect and visualize the data using Odyssey and build the dashboard, which is currently accessible only to court staff and to staff from the Indiana Bar Foundation, the fiscal and administrative agent for the Coalition for Court Access.

The coalition is working with stakeholders to act on the trends revealed in the data and to train court staff on data entry and quality. It intends to share aggregated county-specific data with legal aid providers who will use it to schedule legal clinics and direct services in target areas and—in collaboration with the coalition’s Rural Legal Services work group—to learn more about the challenges faced by individuals living in areas that have few lawyers or are far from a courthouse.

“Ultimately, having this data clearly defined and visualized gives us control of what we’re sharing and analyzing,” says Robert Rath, chief innovation officer at the Office of Judicial Administration. “It means we’re able to be better partners with on-the-ground organizations, draw comparisons between jurisdictions, and answer questions in a timely manner from folks external to the courts.”

Endnotes
