When the coronavirus pandemic shuttered courthouses throughout the country, state courts were forced to move processes online at an unprecedented scale. But even before the public health emergency, some courts had already begun adopting new technologies, including online dispute resolution (ODR)—which has its origins in the private e-commerce sector and enables people to resolve some legal disputes entirely online—as well as other online procedures, such as virtual hearings. However, the evidence base necessary to understand these systems’ effects on core court functions is incomplete.
This interview with Donna Shestowsky, Martin Luther King Jr. Professor of Law at the University of California, Davis, has been edited for length and clarity.
A: Procedural justice has to do with how fairly we feel we’ve been treated during a process or procedure used to determine an outcome in the legal system—for example, during a trial or mediation.
Many people assume that parties to a lawsuit care only about whether they win or lose, and therefore that litigants will say they like a procedure such as a trial or mediation if they felt they won, and will say they dislike it if they felt they lost.
A: No, it’s not. We know from decades of psychological research that people evaluate the process used to resolve their dispute separately from how they evaluate the outcome.
Those who get favorable outcomes through a process they regard as fair are generally highly satisfied. But even those who receive unfavorable outcomes often report high levels of satisfaction if they perceive the process as fair. Process matters, often as much as—and sometimes more than—outcome. We call this the “procedural justice effect.”
A: When people view legal procedures as just, they’re more likely to comply voluntarily with case outcomes—making it less likely that cases will be appealed or that settlement agreements will be breached—and to follow the law more generally.
A: Because procedural justice is largely subjective, when trying to create ODR platforms that litigants will regard as fair, courts need to consider the litigant perspective. And that means surveying litigants to understand their points of view. For instance, survey questions might ask people whether the process used to handle their case was fair overall. Or more specific questions might tap the components of procedural justice: voice, neutrality, respect, and trust.
A: Questions about voice might inquire about how much opportunity people were given to share their version of the story behind the dispute. Neutrality questions might measure perceptions of how unbiased third parties—like the mediators that sometimes help litigants resolve legal issues as part of an ODR system—seemed to be. Questions concerning respect might measure how much litigants felt their concerns were taken seriously or how easy the technology was to use, since perceived barriers to access—or lack of user friendliness—can be viewed as signs of disrespect. Questions regarding trust might assess whether litigants viewed a technology platform as transparent or whether they thought the mediators or other third-party participants genuinely tried to do what was right.
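A court building such a survey might group its items under these four components and average the ratings within each one. Here is a minimal sketch in Python; the item wordings and the 1–5 Likert scale are illustrative assumptions, not questions from any actual instrument:

```python
# Hypothetical survey items grouped by procedural justice component.
# Wordings are illustrative only, not drawn from a validated instrument.
ITEMS = {
    "voice": ["I had enough opportunity to tell my side of the story."],
    "neutrality": ["The mediator treated both sides without bias."],
    "respect": ["My concerns were taken seriously.",
                "The platform was easy to use."],
    "trust": ["The process was transparent.",
              "The mediator genuinely tried to do what was right."],
}

def component_scores(responses):
    """Average the 1-5 Likert ratings for each component.

    `responses` maps item text -> rating (1 = strongly disagree,
    5 = strongly agree). Items a respondent skipped are ignored.
    """
    scores = {}
    for component, items in ITEMS.items():
        ratings = [responses[i] for i in items if i in responses]
        scores[component] = sum(ratings) / len(ratings) if ratings else None
    return scores

# One made-up respondent's answers:
example = {
    "I had enough opportunity to tell my side of the story.": 4,
    "The mediator treated both sides without bias.": 5,
    "My concerns were taken seriously.": 4,
    "The platform was easy to use.": 2,
    "The process was transparent.": 3,
    "The mediator genuinely tried to do what was right.": 5,
}
print(component_scores(example))
```

In this made-up example, the low ease-of-use rating pulls the "respect" composite down to 3.0 even though the litigant felt taken seriously, which is exactly the kind of signal a court could act on.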
A: They can draw from evaluations done elsewhere to get ideas for their own survey questions and use a tool like SurveyMonkey or Qualtrics to create an online questionnaire. Then they can direct litigants to the survey once their cases conclude. They should invite litigants to participate in a welcoming way, assure them that their responses will remain anonymous, and let them know the data will be used to improve court services. For the court to get valid data, it’s important that litigants don’t anticipate negative repercussions for giving the technology an unfavorable review, or positive treatment for reviewing it favorably.
A: Definitely. Surveys can be used, for example, to determine how easy litigants found the ODR registration process, whether they felt they could communicate effectively on the platform, or whether they think the mediator handled their case fairly. This kind of information can help courts pinpoint which parts of their ODR system need improvement. For example, if a large percentage of a court’s ODR users didn’t trust their mediators or view them as neutral, the court might require its mediators to undergo additional training to help them better adapt their approach to online modalities.
When combined with demographic data, such as age, race, ethnicity, gender identity, education level, or disability status, fairness measures could also be used to determine whether some subsets of litigants struggle to use ODR more than others, which can help courts eliminate barriers for the affected populations.
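That kind of subgroup comparison could be sketched as follows. This is a minimal illustration with made-up data, using only the Python standard library; the demographic field and ratings are hypothetical:

```python
from collections import defaultdict
from statistics import mean

# Made-up survey records: an overall fairness rating (1-5) plus one
# demographic attribute; real data would come from the court's survey tool.
records = [
    {"age_group": "18-34", "fairness": 4},
    {"age_group": "18-34", "fairness": 5},
    {"age_group": "65+", "fairness": 2},
    {"age_group": "65+", "fairness": 3},
]

def fairness_by_group(records, attribute):
    """Mean fairness rating for each value of a demographic attribute."""
    groups = defaultdict(list)
    for record in records:
        groups[record[attribute]].append(record["fairness"])
    return {group: mean(ratings) for group, ratings in groups.items()}

print(fairness_by_group(records, "age_group"))
```

A markedly lower mean for one group (here, older litigants in the fabricated data) would prompt a closer look at barriers for that population, such as the platform's accessibility.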
A: Absolutely! It’s never too late for a court to use procedural justice metrics to evaluate its ODR program or to rely on data from other courts to make data-informed decisions for its own program. Even courts that have already undergone extensive evaluations should seek to learn from the latest evaluations and studies conducted elsewhere. They might learn something valuable that can help them decide whether to try something new with their platform.
A: To complement procedural justice metrics, courts can look at other factors likely to be relevant to court users, either in their own right or in conjunction with procedural justice questions. For example, courts might ask litigants to evaluate different elements of the platform, such as how user-friendly the interface was, and what additional information should be included in the Q&A section.
A: Over the past year, many courts began offering parties the opportunity to file cases online or participate in virtual hearings, in addition to the ODR option. Because so much of this experimentation was conducted out of necessity, only time will tell if litigants and legal professionals actually prefer these innovations, and whether, if given a choice, they would use them once regular, in-person activities resume.
I’m currently evaluating ODR platforms in two states in collaboration with Resolution Systems Institute. In one court that required litigants to try to resolve their cases using ODR, we observed that many parties appeared for a trial over Zoom without having tried ODR. Our evaluation included a survey, which revealed that many litigants didn’t know about the ODR platform even though the court had required them to use it.
Innovations can be wonderful. But courts need to invest in resources that educate parties about new alternatives so they can make informed decisions about whether to use those options. Surveys can help to uncover knowledge gaps and identify solutions to improve the litigant experience.