Editor’s Note: This article is a review and includes subjective thoughts, opinions and critiques.
This review contains spoilers.
Every winter, Stanford’s COLLEGE 102: “Citizenship in the 21st Century” ranks among the most-enrolled courses of the quarter. As part of the required COLLEGE (Civic, Liberal and Global Education) program, the class asks first-years to confront what it means to be a citizen in a rapidly changing world.
One of the course’s themes examines technology’s role in democracy. Students read a piece in the MIT Technology Review on algorithmic fairness and the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) tool — software used in several U.S. state courts that assigns defendants a risk score from 1 to 10, predicting rearrest. Drawing on criminal histories, demographics and questionnaires, the algorithm acts as a “robo-judge,” generating a score judges consult when setting bail or sentences.
The article invites readers to experiment with an interactive slider representing a threshold for classifying a defendant as “high risk.” Lowering this cutoff imprisons more people who would not reoffend (false positives), while raising it releases more who will (false negatives). In other words, the system cannot eliminate errors; it can only decide which type occurs more often. No single threshold satisfies every definition of fairness. Thus, the article suggests that even a perfectly consistent algorithm cannot be value-free; someone must decide which mistakes matter more.
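The trade-off behind the slider can be sketched in a few lines of Python. The scores and outcomes below are invented for illustration — they are not COMPAS data — but they show how moving the cutoff only exchanges one kind of error for the other:

```python
# Hypothetical defendants as (risk_score, reoffended) pairs.
# Scores run 1-10 as in COMPAS; the outcomes are made up for illustration.
defendants = [(2, False), (3, False), (4, True), (5, False),
              (6, True), (7, False), (8, True), (9, True)]

def error_counts(threshold):
    """Classify scores >= threshold as 'high risk' and tally both error types."""
    false_positives = sum(1 for score, reoffended in defendants
                          if score >= threshold and not reoffended)  # jailed, would not reoffend
    false_negatives = sum(1 for score, reoffended in defendants
                          if score < threshold and reoffended)       # released, will reoffend
    return false_positives, false_negatives

# Sliding the cutoff never eliminates errors; it only trades one type for the other.
for threshold in (3, 5, 7):
    fp, fn = error_counts(threshold)
    print(f"threshold {threshold}: {fp} false positives, {fn} false negatives")
```

On this toy data, raising the threshold from 3 to 7 steadily shrinks the false positives while growing the false negatives — the same dilemma the article’s slider makes tangible.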
This conflict becomes reality in “Mercy,” a 2026 sci-fi thriller directed by Timur Bekmambetov and starring Chris Pratt. The film envisions a near-future Los Angeles where violent crimes are adjudicated by artificial intelligence (AI) judges in the Mercy Capital Court, a system designed to address soaring crime rates by delivering faster, supposedly impartial justice. In court, defendants have 90 minutes to prove their innocence before facing immediate execution.
Pratt plays Detective Chris Raven — one of the system’s original architects and early supporters — who becomes its next defendant. Accused of murdering his wife, he is strapped into a high-tech courtroom while an uncanny, human-like holographic judge named Maddox (voiced by Rebecca Ferguson) computes his guilt to be 97%. To survive, he must lower that probability below 92%. For the next 90 minutes, probabilities flicker across the courtroom’s digital displays as Raven scrambles to chip away at the score.
Portrayed almost entirely through screens — surveillance feeds, drone footage, police body cams and video calls — the film adopts a “screenlife” format that mirrors its themes. The audience never fully escapes the digital interface; instead, we watch probabilities flicker and countdown clocks tick in real time, creating genuine tension. Like Raven, we are confined, dependent on whatever data the system surfaces next.
Watching those numbers tick down reminded me of dragging the slider in our COMPAS reading. In COLLEGE 102, adjusting the threshold changed who was jailed; in “Mercy,” the cutoff determines who lives.
This tension drives the film’s suspense. As Raven scrambles to reconstruct the night of his wife’s murder, Maddox feeds him surveillance clips and cloud data in real time. The system appears rational and impartial — even helpful. It ultimately assists in identifying the real culprit: Raven’s friend Rob, who framed Raven to avenge his brother — the first person the Mercy Court executed.
But we later learn the brother was innocent. A police officer had deleted evidence that would have supported him before the case reached the Mercy system, so the AI calculated his guilt using only the incomplete information it received — data that ultimately pointed toward conviction. Here, the system appeared impartial, but its outcome was in fact shaped by a human decision made long before the probability of guilt flashed on screen.
Where “Mercy” falters is in how fully it explores this insight. As the film progresses, philosophical questions about automated justice give way to more conventional thriller beats and predictable twists. What does it truly mean to hand life-or-death decisions to a computational probability score? Who is accountable when that score is wrong? The film raises these questions but doesn’t expand on them.
Pratt, physically restrained for much of the runtime, conveys desperation through close-ups rather than action, while Ferguson remains deliberately composed and emotionally neutral. In other words, the script ultimately trades reflection for momentum.
Still, the film’s central dilemma lands. Even a perfectly functioning algorithm forces someone to draw a line — to decide how much uncertainty justifies punishment. That line — 70%, 92%, whatever the number — is not objective. Instead, it is a subjective moral judgment disguised as math. Someone sets the cutoff. Someone decides what level of risk is acceptable.
COLLEGE 102 and “Mercy” suggest that citizenship requires questioning the systems that govern us — especially when those systems claim neutrality. When we hand the gavel to a machine, we do not eliminate human judgment; we merely conceal it behind code. But no robo-judge, no matter how advanced, can make that choice for us.