Rethinking: The Justice Algorithm

The access to justice (A2J) community currently sees technology as a vital component of any effort to reduce the “justice gap.” Technology is a hopeful solution because it can amplify and accelerate legal tasks currently performed only by a limited pool of human experts – and many anticipate solutions that reduce the need for human expertise altogether in some situations.

No matter the definition, the justice gap is wide. When looking at civil issues, Legal Services Corporation defines the justice gap as the “difference between the civil legal needs of low-income Americans and the resources available to meet those needs” (LSC, The Justice Gap Report, 2017). For the year 2017, LSC concluded that “86% of the civil legal problems reported by low-income Americans received inadequate or no legal help.”

The World Justice Project (WJP) puts forward an expanded definition: the gap occurs when people “cannot obtain justice for everyday civil, administrative or criminal problems… are excluded from the opportunities the law provides… [or] live in extreme conditions of injustice” (World Justice Project, Measuring the Justice Gap, 2019). By that definition, the justice gap in 2019 was estimated to affect 5.1 billion people, or two out of every three individuals on the planet. In theory, the justice gap exists because too many people have too little ability to access the legal procedures needed to seek justice. If technology can amplify the efforts of the few currently working to close this gap, many argue, the resulting increase in access to legal procedures will lead to justice.

Much of the discussion around A2J revolves around the technological limits of access to legal procedures. Does someone have broadband? Is someone comfortable with technology? Does someone need communication support? Yet effectively defining access raises an important question: access to what?

Operation versus Optimization
Even with truly equitable access to legal procedure, it is not clear that procedural efficiency will lead to justice. In many communities, injustice is baked into the culture such that legal processes reinforce an intentionally designed injustice. And because legal procedures take time to update in a rapidly changing society, the justice they deliver, like light from a dying star, radiates from a reality that no longer exists. Outcomes that a community once did not oppose may become unacceptable once its makeup or power dynamics shift.

In these communities, because access-to-justice technologies are currently little more than process accelerants, accelerating processes that were or have become unjust simply delivers injustice faster. Technology cannot be effectively deployed without clearly designing the process that leads toward the intended outcome. So what should technology’s intended outcome be if it is seeking to enable justice?

The A2J community’s traditional focus on procedural justice has led most dispute resolution technology (DRTech) efforts to digitize legal operations. By focusing on the operation, even the unaided human mind can quickly identify operational injustice: bias can be seen when the same variables produce different outcomes. Yet justice is not an operation, nor is it the same for everyone. Justice is a state of balance. The true value of DRTech lies in its ability to simultaneously manage a breadth of variables that an unaided human mind cannot process, and to optimize those variables for the conditions that lead to balance. Technology cannot find justice by mimicking human operation; it is uniquely capable of discovering an unlimited number of unique states of justice through optimization.

The Target
Every optimization routine must be given a target to work towards. Many of the greatest failings in DRTech come from inappropriate targets. For example, facial recognition has routinely failed the African American community when optimized on datasets of predominantly white faces. Predictive policing has routinely failed when optimized on datasets of inequitable decisions. Online arbitration has routinely failed when optimized on past decisions. So what would happen if we defined the “justice algorithm” as one seeking the following target:

Optimize the deployment of resources in a manner that minimizes the reduction of equitable access to future opportunities.
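Stated more formally, one possible reading of this target (a sketch under assumed notation, not a formulation drawn from the text): let A be the set of feasible resource deployments and L_i(a) the reduction in party i’s access to future opportunities under deployment a. The algorithm would then seek

```latex
a^{*} \;=\; \operatorname*{arg\,min}_{a \in A} \; \sum_{i} L_i(a)
\quad \text{subject to} \quad
\bigl| L_i(a) - L_j(a) \bigr| \le \varepsilon \;\;\; \forall\, i, j,
```

where the constraint is one crude way to encode “equitable”: no party’s loss of future opportunity may exceed another’s by more than some tolerance ε.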

Potential Application
The potential elegance of such an optimization approach is its reflection of our knowledge that “just outcomes” are never static. This approach accepts the fluidity of justice and leverages technology to help us refine what the unaided mind is unable to see. For those in the alternative dispute resolution (ADR) space, the greatest feature of ADR is its ability to discover acceptable outcomes that cannot be arrived at through procedural justice. As the number of participants and issues expands in any ADR process, we are challenged to balance all of them. Arriving at a “just” divorce settlement is challenging enough when balancing the financial, social, emotional, and operational issues around breaking a “micro-community” into equitable parts. Expanding what works in a divorce settlement to define optimized justice in our communities more broadly requires accounting for an immense number of variables.

Implementing such a definition of justice without the aid of online dispute resolution (ODR) technologies is likely impossible. Yet the processing power of technology could allow for more variables to be considered and balanced at once. It could allow us to test which variables are more impactful than others. Technology could offer us the vision we need to see extremely complex systems clearly enough to determine what we, as a society, believe are appropriate and just outcomes. Employing technology where it is most useful, letting it accelerate and amplify our vision, will reveal the outsized impact that many of our more subtle resource allocation decisions have on our ability to access those just outcomes.

For example, consider a scenario in which a man breaks into a house, steals personal possessions while the family is home, and is caught, convicted, and incarcerated. Under this model, justice will have been done only if we optimize across no less than the following variables:
– The cost of retrieving the personal items that were stolen,
– The time and cost of mental health services needed for the affected family to feel safe in their own home again,
– The reduced earning potential of those affected by the home invasion,
– The cost of incarceration,
– The cost of the rehabilitation programs available to the incarcerated individual,
– The economic impact borne by the incarcerated individual’s family during incarceration, and
– The reduced earning potential of the incarcerated individual post release.
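The variables above can be sketched as a toy optimization. This is a minimal illustration, not a real model: the function name `opportunity_loss`, every dollar figure, and the relationships between sentence length, rehabilitation spending, and post-release earnings are all invented assumptions made only to show the shape of the approach.

```python
# Toy sketch of the proposed "justice algorithm" target:
# minimize the total reduction of access to future opportunities.
# All cost functions and weights below are invented for illustration.

from itertools import product

def opportunity_loss(incarceration_months, rehab_spend):
    """Score one candidate allocation of resources (lower is better)."""
    # Victim-side variables (hypothetical dollar-equivalents)
    retrieval_cost = 2_000          # recovering the stolen items
    mental_health_cost = 6_000      # counseling for the affected family
    victim_earning_loss = 500 * 12  # victims' reduced earning potential

    # Perpetrator-side variables, shaped by the allocation we choose
    incarceration_cost = 3_000 * incarceration_months
    family_economic_impact = 1_500 * incarceration_months
    # Modeling assumption: rehabilitation spending and shorter sentences
    # both soften the post-release earnings penalty.
    post_release_loss = max(
        0, 60_000 - 4 * rehab_spend - 1_000 * (24 - incarceration_months)
    )

    return (retrieval_cost + mental_health_cost + victim_earning_loss
            + incarceration_cost + family_economic_impact
            + rehab_spend + post_release_loss)

# Search a small grid of candidate allocations and keep the one that
# minimizes the total loss of future opportunity across all parties.
candidates = product(range(0, 25, 6), range(0, 20_001, 5_000))
best = min(candidates, key=lambda c: opportunity_loss(*c))
months, rehab = best
print(f"lowest opportunity loss at {months} months, ${rehab} rehab spend")
```

Even this crude sketch shows the point of the essay’s framing: the “best” outcome falls out of balancing victim-side and perpetrator-side variables together, rather than out of any single procedural rule.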

Each variable is a vital element within a complex balance. The punishment inflicted upon the perpetrator must be balanced by the cost of the victim becoming whole. The cost of rehabilitation must account for all those impacted by the punishment. The view must be towards what will come next, towards how the path forward is proportionate and equitable. We know the victims will carry this event with them in some manner for the rest of their lives, as will the perpetrator. Technology now allows us to evaluate with remarkable precision the inputs and their impact on the opportunities that will remain for all parties impacted by such an event.

We must embrace the idea that technology can help us see better than it can help us decide.

Justice, as proposed here, is ensuring that resolving and preventing our disputes does not inequitably strip anyone of access to opportunities. It is a solution so large that even conceptualizing it is a challenge, especially since the outcomes at any point cannot be known in advance. Yet it is a uniquely positioned solution, one that turns technology away from accelerating and amplifying known failings and towards exploring what we have previously been unable to effectively define.

Chris Draper is an NCTDR Fellow, Managing Director of Trokt, Co-Chair of the ABA Dispute Resolution Technology Committee, and Visiting Scholar at the Indiana University Ostrom Workshop. He can be found on Twitter as @theotherdraper.