Notice & Comment

Technology is Not the Boogeyman: Orly Lobel’s “The Equality Machine,” by Christopher Slobogin

*This is the first post in a symposium on Orly Lobel’s The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future, selected by The Economist as a best book of 2022. All posts from this symposium can be found here. Further reviews can be found at Science, The Economist, and Kirkus.*

The message of Orly Lobel’s book The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future is distinctly different from that of many books that examine the future of artificial intelligence, machine learning, data mining, and related technological advances. The usual screed cautions that the future is likely to be increasingly haunted by pervasive surveillance, mechanistic decision-making by the numbers, profiling using biased data, and disembodied or robotic replacements for humans who need work. Lobel’s more optimistic take is that technology can improve on the accuracy and neutrality of human decision-making, provide a lever for changing archaic white-male-dominated practices, and even improve our sex lives and other interactions with humans. At the same time, she cautions that government oversight of all of this is necessary: information sources need to be transparent, corporate manipulation exposed and sanctioned, and privacy protected.

The central message of the book might be this: AI and its adjuncts are coming to your workplace, home, and bedroom soon, if they have not already arrived. The response should be to experiment and regulate, not to shy away or prohibit. I am fully on board with this message. In Just Algorithms: Using Science to Reduce Incarceration and Inform a Jurisprudence of Risk, I make the same types of points, albeit confined to the much smaller space of the American criminal justice system.[1] The central problem confronting that system today is the scope of our incarcerated population. Compared to Western Europe, we imprison six times more people per capita, impose sentences three times as long, and resort to pretrial detention far more often. The cost of this system is huge, whether measured in terms of lost human capital, family disruption, or diversion of funding from social programs. And it does very little to reduce crime.[2]

Most of the solutions proposed to date are either too radical (abolish prisons altogether) or not radical enough (release people imprisoned for committing minor crimes, a minuscule number of prisoners). Algorithms—specifically, statistically derived or machine-learned risk assessment instruments—may provide a much better solution. They can help identify those offenders who pose the highest risk of violent recidivism (a group that comprises less than 20% of the prison population[3]) as well as match offenders to the most effective rehabilitation programs, which would then allow early release of most prisoners without significantly endangering the public. The benefits and criticisms of risk assessment instruments at sentencing echo the arguments that Professor Lobel canvasses in her book over the use of digitization in hiring, medicine, policing, and other settings. Algorithms are more accurate than people at assessing risk, as well as less likely to arrive at biased outcomes in doing so, because they eliminate the “noise” that infects human decision-making; furthermore, any bias that does exist is more easily detected and corrected with algorithms than with humans.[4] Algorithms also quantify the trade-offs made in deciding to release certain types of people (specifically, the likely rate of false positives, who should not be detained, and the likely rate of false negatives, who should be). This quantification of risk concretely exposes the normative choices that must be made about whom to keep in prison and whom to release, choices that today are hidden behind vague pronouncements about dangerousness.[5]
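To make that quantification concrete, here is a minimal illustrative sketch, in Python, of how a chosen release threshold translates into false positive and false negative rates. The scores, outcomes, and thresholds are entirely hypothetical and are not drawn from the book or from any actual risk assessment instrument.

```python
# Illustrative sketch: how a release threshold translates into false positive
# and false negative rates. All scores and outcomes below are hypothetical.

def error_rates(scores, reoffended, threshold):
    """Treat anyone scoring at or above `threshold` as high risk (detained),
    then compare those predictions against observed outcomes."""
    flagged = [s >= threshold for s in scores]
    # False positives: detained (flagged) but did not reoffend.
    false_pos = sum(f and not r for f, r in zip(flagged, reoffended))
    # False negatives: released (not flagged) but did reoffend.
    false_neg = sum((not f) and r for f, r in zip(flagged, reoffended))
    negatives = sum(not r for r in reoffended)   # people who did not reoffend
    positives = sum(reoffended)                  # people who did reoffend
    return false_pos / negatives, false_neg / positives

# Hypothetical risk scores (0 to 1) and outcomes (True = violent recidivism).
scores = [0.90, 0.70, 0.60, 0.40, 0.30, 0.20, 0.10, 0.05]
reoffended = [True, False, True, False, False, True, False, False]

for t in (0.25, 0.50, 0.75):
    fpr, fnr = error_rates(scores, reoffended, t)
    print(f"threshold {t:.2f}: false positive rate {fpr:.2f}, false negative rate {fnr:.2f}")
```

Lowering the threshold detains more people and drives down false negatives at the cost of more false positives; raising it does the reverse. That is precisely the normative choice the paragraph above says is otherwise hidden behind vague pronouncements about dangerousness.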

At the same time, algorithms have an impersonal quality to them that may seem particularly troublesome in the criminal context, where there is a strong tradition of procedural justice. Enhancing a person’s sentence based on data derived from the study of other people, and on risk factors over which a person may have little or no control (e.g., childhood upbringing, diagnosis, age, or gender), may be repugnant to many. The perceived lack of individualization is exacerbated when the instruments have been developed by private companies claiming trade secret protection for their data sources and their regression and machine learning analyses. Some have argued that these problems counsel in favor of prohibiting the use of algorithms in the criminal process. A better reaction, consistent with Professor Lobel’s open-minded approach, is to legislatively fine-tune the questions the law wants answered, mandate that algorithms meet certain accuracy thresholds in answering them, and either force companies to release their data (under the Sixth Amendment[6]) or outsource the risk assessment job to universities with government grants.[7] Further, individuals should not be subject to enhanced sentences based on risk unless they receive a full hearing, at which they are represented by counsel and can counter the evidence.

The criminal context raises two other issues that have parallels with Professor Lobel’s discussion in The Equality Machine. The first is whether certain demographic characteristics—particularly race and gender—should ever be explicitly integrated into an algorithm. Lobel’s answer is yes—usually—because it will help discover racialized or gender-oriented tendencies.[8] But Lobel does not address whether this use of legally protected characteristics as classifiers violates equal protection doctrine. The argument in the risk assessment context is that it does not if one trains separate algorithms on data grouped by race and gender, because these characteristics are not being used as explicit risk factors but rather merely to establish which algorithm to use (much like some parts of the military have different physical thresholds for women compared to men[9]). It may be that, because of racialized policing, a black person with three arrests poses no greater risk than a white person with one arrest; Lobel’s approach would discover that fact at the same time it would produce more accurate risk assessments for both races.
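That group-wise training approach can be illustrated with a small sketch. The groups, arrest counts, and outcomes below are hypothetical; the point is only that, when separate models are fit per group, the protected characteristic selects which model applies rather than operating as a risk factor inside the model, so that three arrests in one group can map to the same estimated risk as one arrest in another.

```python
# Illustrative sketch (hypothetical data): fit a separate model per group so a
# protected characteristic determines *which* model is used, rather than
# entering the model as a risk factor.

from collections import defaultdict

def fit_group_models(records):
    """records: iterable of (group, arrest_count, reoffended) tuples.
    Returns the observed reoffense rate for each (group, arrest_count) cell."""
    cells = defaultdict(lambda: [0, 0])  # (group, arrests) -> [reoffenses, total]
    for group, arrests, reoffended in records:
        cells[(group, arrests)][0] += int(reoffended)
        cells[(group, arrests)][1] += 1
    return {key: n / total for key, (n, total) in cells.items()}

# Hypothetical training data: if policing practices inflate arrest counts in
# group B, three arrests in group B may signal no more risk than one in group A.
records = [
    ("A", 1, True), ("A", 1, False), ("A", 3, True), ("A", 3, True),
    ("B", 3, True), ("B", 3, False), ("B", 1, False), ("B", 1, False),
]
model = fit_group_models(records)
print(model[("A", 1)], model[("B", 3)])  # same estimated risk despite different arrest counts
```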

The second issue is privacy. Artificial intelligence usually works best with more data rather than less. But this “data greed” could undermine personal security and chill engagement in activities that create the data.[10] If data is linkable to identifiable people, government (or a private company) can easily compile digital dossiers on everyone. Lobel notes, however, that algorithms can often be trained using anonymized data;[11] that move is particularly appropriate in the criminal context, given its stigmatizing potential. Further, as Lobel proposes, that type of stricture on how to use data is something that should be mandated legislatively, rather than left to chance (or to a profit-motivated company).

Orly Lobel’s book is a clarion call to think carefully about the burgeoning digital world, with a stance somewhere between welcoming it with open arms and rejecting it out of hand. The Equality Machine has significant implications for all walks of life.

Christopher Slobogin is the Milton Underwood Professor of Law at Vanderbilt University.


[1] Christopher Slobogin, Just Algorithms: Using Science to Reduce Incarceration and Inform a Jurisprudence of Risk (Cambridge Univ. Press, 2021).

[2] Id. at 1-7.

[3] Id. at 28-29.

[4] For analogous points by Lobel, see The Equality Machine, at 26-27; 33; 53; 92; 147.

[5] For analogous points by Lobel, see id. at 11; 220-221; 303.

[6] The Sixth Amendment guarantees that “in all criminal prosecutions, the accused shall enjoy the right . . . to be confronted with the witnesses against him.”

[7] For analogous points by Lobel, see The Equality Machine, 213; 294-300.

[8] The Equality Machine, at 29-30; 70; 301-302.

[9] Michelle Dwyer, The Differences in Being in the Military for Men and Women, Oct. 4, 2017, https://classroom.synonym.com/the-differences-in-being-in-the-military-for-women-men-13583848.html (“Each military branch gives physical training tests to its troops, and they all have different requirements. . . . Passing standards for women in all services are usually lower.”).

[10] See Sarah Brayne, Predict and Surveil, at 89 (describing “data greed”); 114-116 (describing “system avoidance” by people who do not want to become data points).

[11] The Equality Machine, at 106; 151; 268; 304.
