Current Issues at the Intersection of Tech and Justice

With news breaking at the pace of your Twitter feed, it can be a challenge to keep up with all the important topics impacting our communities today. We asked each Paladin team member to take a moment for learning and reflection by choosing a brief article on a topic related to justice and technology, and to share their thoughts.

Read on to see what we learned and how it shifted our perspectives.

Felicity Conrad, Co-Founder and CEO — The Justice System as a Digital Platform (Jason Tashea, The Commons)

From the article: “It is safe to say that courts that function as a digital platform will increase the diversity of people and organizations that can assist them, ultimately expanding services, transparency, and access-to-justice.”

The COVID-19 crisis has shed light on a critical opportunity for the justice system (and courts in particular) to embrace a digital platform model structured as an information layer, a platform layer, and a presentation layer. Adopting such a model would allow for improved service delivery, interoperable data, and increased innovation, ultimately moving the needle on access to justice and client outcomes.
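To make the layering concrete, here is a minimal sketch of how those three layers might separate concerns. The class and method names are hypothetical illustrations, not drawn from the article or from any real court system:

```python
# Hypothetical sketch of the three-layer model described above.

# Information layer: canonical, structured case data.
class CaseRecord:
    def __init__(self, case_id: str, status: str, hearing_date: str):
        self.case_id = case_id
        self.status = status
        self.hearing_date = hearing_date

# Platform layer: a stable API that any approved service
# (legal aid portals, e-filing tools, researchers) can build on.
class CourtPlatformAPI:
    def __init__(self):
        self._records = {}

    def file_case(self, record: CaseRecord) -> None:
        self._records[record.case_id] = record

    def get_case(self, case_id: str) -> CaseRecord:
        return self._records[case_id]

# Presentation layer: one of many possible front ends.
def render_case_summary(api: CourtPlatformAPI, case_id: str) -> str:
    record = api.get_case(case_id)
    return f"Case {record.case_id}: {record.status}, next hearing {record.hearing_date}"

api = CourtPlatformAPI()
api.file_case(CaseRecord("2021-CV-0042", "open", "2021-06-01"))
print(render_case_summary(api, "2021-CV-0042"))
```

The point of the separation is that third parties integrate against the platform layer rather than scraping a website, so new services can appear without the court rebuilding its front end each time.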

With the rise of remote legal services during the pandemic, as well as the growing community of access to justice technologists, there is a real opportunity for the broader ecosystem to collaborate on ambitious initiatives, such as helping court systems evolve toward a digital platform model.

Eric Krawczyk, Senior Software Engineer — A Tipping Point for Asian American Lawyers? (Vivia Chen, Bloomberg Law)

From the article: “Being accepted just enough is also why Asian Americans seem sidelined in the race discussion, even though every law firm, company and organization seems to be engaged in corporate soul searching about race these days. And it’s also why some Asian Americans find it awkward to bring up the racism they’ve faced.”

Spurred on by media portrayals of COVID-19 and a widening cultural divide, the recent terrorist attacks in Atlanta have illuminated exactly how severe anti-Asian racism in the US has become. While Asian Americans have been able to elevate their socioeconomic status in ways that other minorities have not been given the chance to, there is still a glass ceiling in how far Asian Americans can rise, even in the legal industry. Columnist Vivia Chen calls on all Americans to learn and acknowledge the ways that racism manifests against Asian Americans, and implores Asian Americans to make their voices heard in the conversation about building a racially equitable country.

Even though Asian Americans have the appearance of privilege due to higher class mobility than other minority groups, they are still subject to discrimination at every level. Because of this perceived privilege, it is easy to omit the Asian American experience from discussions about race and class. But failing to acknowledge that racism, and its intersection with the experiences of other minority groups, allows the myth of the “Model Minority” to persist and perpetuates systemic inequity.

Mitch Oram, Enterprise Account Executive — The Black Hole at the Heart of the Eviction Crisis (Peter Hepburn and Yuliya Panfil, The New York Times)

From the article: “The federal government collects data on evictions from public housing authorities. But it has little to no eviction information on the private rental market, where the vast majority of American renters live.”

There isn’t an efficient way for the government to track and assess eviction data for the private rental market. During the pandemic, this gap was magnified: individuals were wrongfully evicted without receiving the proper help beforehand.

The eviction crisis is leaving countless people without a roof over their heads, in part due to circumstances completely out of their control. If there were a more reliable way to track who needs help, families and individuals could receive the assistance they need and remain in their homes.

Kristen Sonday, Co-Founder and COO — Whistleblowers: Software Bug Keeping Hundreds Of Inmates In Arizona Prisons Beyond Release Dates (Jimmy Jenkins, KJZZ/Public Radio Arizona)

From the article: “There are choices being made like business decisions, when they need to be thinking about what’s in the best interest of the staff, and ultimately, what’s in the best interest of the inmates.”

According to Arizona Department of Corrections whistleblowers, their inmate management software’s inability to account for a 2019 legislative change to earned credits is keeping hundreds of incarcerated people in prison who should be eligible for release.

The inmate management system clearly wasn’t designed with a thorough understanding of institutional and human needs, nor of the second- and third-order consequences of changes to the system. A human-centric approach to leveraging technology to serve inmates and prisons would significantly increase accuracy and efficiency in managing one of the most charged parts of our judicial system.
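To make the failure mode concrete, here is a deliberately simplified, hypothetical sketch. It is not the actual Arizona software or its real sentencing rules; it only illustrates how a hard-coded earned-credit rate silently diverges once the law changes:

```python
from datetime import date, timedelta

# Hypothetical illustration only -- not Arizona's real system or rules.

def projected_release_legacy(start: date, sentence_days: int) -> date:
    # Legacy logic: earned-credit rate hard-coded at a pre-change value
    # (here, 1 day of credit per 6 days served) and never revisited.
    credits = sentence_days // 6
    return start + timedelta(days=sentence_days - credits)

def projected_release_updated(start: date, sentence_days: int,
                              credit_ratio: float) -> date:
    # Updated logic: the rate is a parameter, so a statutory change
    # becomes a reviewed configuration change, not a code rewrite.
    credits = int(sentence_days * credit_ratio)
    return start + timedelta(days=sentence_days - credits)

start = date(2019, 1, 1)
sentence = 1200

print(projected_release_legacy(start, sentence))                     # later date
print(projected_release_updated(start, sentence, credit_ratio=0.3))  # earlier date
```

When a statute grants more credit but the hard-coded rate stays put, every projected release date from the legacy path comes out too late. Parameters that encode law belong in configuration that can be audited and updated, not buried in code paths no one revisits.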

Maya Bielinski, Director of Product — Can Auditing Eliminate Bias from Algorithms? (Alfred Ng, The Markup)

From the article: “The big problem is, we’re going to find as this field gets more lucrative, we really need standards for what an audit is… There are plenty of people out there who are willing to call something an audit, make a nice looking website and call it a day and rake in cash with no standards.”

In more and more contexts, decisions are being made by algorithms (rather than by human judgment). These algorithms are often black boxes, shielded by proprietary claims. Because of growing concerns about the harmful bias embedded in those algorithms, companies are hiring outside auditors to determine whether the algorithms are fair. But with no industry standards or regulations, internally contracted audits can often be mere PR stunts. Adversarial audits can be effective if there’s public outcry — but what kind of accountability is that?

Algorithmic bias arises in a huge range of important contexts, including healthcare, criminal justice, and employment law. There is great scholarship on ethics in algorithmic applications, and a broad popular consensus that algorithms should not discriminate unjustly. But how do we put that into practice? Algorithmic auditing by a disinterested third party — much like a financial audit — is one way to counter the general lack of transparency and monitoring of algorithmic systems. This article discusses some options for giving those audits teeth.
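As a taste of the quantitative core of such an audit, here is a minimal sketch of one common check, the disparate impact ratio, using made-up decision data; the 0.8 threshold is the “four-fifths” rule of thumb from U.S. employment contexts, and a real audit would go far beyond a single metric:

```python
# Minimal audit sketch with hypothetical data. A real audit covers
# many metrics, error analysis by subgroup, and documentation.

def favorable_rate(outcomes):
    # outcomes: 1 = favorable decision (e.g., approved), 0 = not
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(group_a, group_b):
    # Ratio of the lower group's favorable rate to the higher group's.
    rate_a, rate_b = favorable_rate(group_a), favorable_rate(group_b)
    return min(rate_a, rate_b) / max(rate_a, rate_b)

group_a = [1, 0, 1, 1, 0, 1, 1, 1]  # hypothetical decisions, group A
group_b = [1, 0, 0, 1, 0, 0, 1, 0]  # hypothetical decisions, group B

ratio = disparate_impact_ratio(group_a, group_b)
print(f"Disparate impact ratio: {ratio:.2f}")
if ratio < 0.8:  # four-fifths rule of thumb
    print("Potential adverse impact; flag for deeper review.")
```

Standards would specify which metrics like this must be reported, on what data, and with what access for the auditor; that is exactly the gap the article describes.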

Sigourney Norman, Community Manager — Why filming police violence has done nothing to stop it (Ethan Zuckerman, MIT Technology Review)

From the article: “After years of increasingly widespread bodycam use and ever more pervasive social media, it’s clear that information can work only when it’s harnessed to power.”

Tech advances are empty without structural change; placing agency in the community is therefore a more important tool than body cameras. Consequences for police officers need to be structurally integrated for video recordings of their behavior to become an effective tool for preventing excessive use of force. Accordingly, municipalities will benefit not only from instituting body cameras but also from ending qualified immunity for police officers, as has been done recently in New York City.

Interestingly, cameras can actually skew juries in favor of police officers, because low film quality and the physical speed of all parties suggest a “scary” environment in which a lethal response may be seen as “objectively reasonable” in a court of law. These findings suggest that body cameras, which center the experience of the person with the power to abuse it, may not record the most important perspective. Indeed, bystander filming may ultimately be more valuable for prosecuting police violence, because these recordings communicate the violence done to the victim.

We can only support justice outcomes for communities if we understand the current landscape. Differentiating between the tools that help communities and those that are only surface responses empowers us to promote the work that truly shifts justice outcomes for the better.

Phoebe Duggan, Director of Programs — Scaling Justice: Will regulatory sandboxes help close the justice gap? (Max Houben, The Legal Technologist)

From the article: “Regulatory sandboxes are in essence a policy structure that establishes a controlled environment. In this controlled environment, access to justice innovators can, together with regulators, develop, test and monitor new legal products and services.”

In an effort to lower the barrier to entry for legal innovators seeking to address the access to justice gap, “regulatory sandboxes” are emerging as controlled environments in which those innovators and regulators can develop, test, and monitor new legal products and services together.

Pioneering the concept, the State of Utah, through the Utah Work Group on Regulatory Reform, has established a detailed five-phase process for interested participants. While the jury is still out on whether the framework strikes the right balance and has the “intended effect of safely delivering innovation to users,” what is clear is that sandbox-inspired ideas and methodologies are starting to gain a foothold both in the U.S. and beyond (particularly throughout Europe).

Given that, as the article notes, a “commonly acknowledged barrier to the innovation needed is the strict set of rules regulating our legal services,” the emergence of regulatory sandboxes is a promising sign that regulators are open to (at least testing) new ways of providing legal assistance.

Matt Tucker, CTO — Wrongfully Accused by an Algorithm (Kashmir Hill, The New York Times)

From the article: “[Mr. Williams’ daughter] has since taken to playing ‘cops and robbers’ and accuses her father of stealing things, insisting on ‘locking him up’ in the living room.”

A facial recognition algorithm used by the Detroit Police Department falsely identified Robert Julian-Borchak Williams, a Black man, as the perpetrator of a 2018 theft. He escaped conviction thanks in part to the help of an ACLU volunteer attorney, but the process was expensive, “humiliating,” and traumatic for his young daughter. Even as more departments around the country turn to facial recognition systems to investigate crime, a federal study has found that they are racially biased, “falsely identifying African-American and Asian faces 10 times to 100 times more than Caucasian faces.”

While technology offers the possibility of building a more just society, it also has the potential to scale the biases and systemic injustice already embedded in our system. As citizens, we need to demand transparency into how these algorithms operate and when they are used, and remain vigilant to ensure the rights of our communities are not violated.

Ilana Flemming, Director of Customer Success — “Infodemiologists” Dive Into the Battle Against Vaccine Misinformation on Facebook (Dawn Stover, Mother Jones)

From the article: “[W]hen false information becomes rampant, as in today’s ‘infodemic,’ it becomes difficult for the general public to sort fact from fiction. Eventually, that can lead to a breakdown in public trust.”

A new pilot project is training and organizing “infodemiologists” to respond to misinformation about the COVID-19 vaccine on Facebook. Working in the comment sections of news articles from verified sources, the infodemiologists’ job is not to confront or argue with vaccine skeptics. Rather, they seek to understand individuals’ concerns and provide accurate data in response. The overarching goal is not to change minds one commenter at a time, but to ensure that the social media “bystanders” reading these exchanges see scientific facts that rebut the misinformation.

Cultural competency is an important component of the infodemiology project, as is responding to people with openness and curiosity, not condemnation. The infodemiologists are trained to inquire into vaccine hesitancy to better understand a commenter’s concern, and then provide facts to clarify and counter the misinformation or misunderstanding. Infodemiologists are compared to CDC epidemiologists, who “serve on the front lines, springing into action when they detect an outbreak that threatens public health.” Given the rapid spread of misinformation on Facebook, and the critical need for widespread vaccine uptake, the infodemiology project is a promising strategy for harnessing the power of social media to educate, inform, and spur collective action.

Diego Allen, Contract Software Engineer — ‘They track every move’: how US parole apps created digital prisoners (Todd Feathers, The Guardian)

From the article: “[The use of probation tracking apps] doesn’t do anything to get at the fact that the criminal justice system itself is geared toward punishment, especially when you’re talking about underprivileged communities of colour…And once you get into these practices where you’re pulling data, biometric data, and these companies are using that data to further monetise their programmes and experiment, often it’s people of colour who are having their data extracted from them. This valuable commodity is literally the body of black individuals.”

App tracking can be a less stigmatizing experience than wearing an ankle bracelet, but it can also mean increased supervision and less human interaction. Analyzing location and biometric data to assess risky behavior seems straight out of a Black Mirror episode, and I’m really skeptical of voice recognition algorithms as a way to test for drugs and alcohol. Building a dataset for predictive use out of this sort of technology in a biased justice system could also lead to further biased rulings. Ultimately, the technology doesn’t solve the human problem of a justice system geared toward punishment.

Travis Lloyd, Lead Product Engineer — Powerful DNA Software Used in Hundreds of Criminal Cases Faces New Scrutiny (Lauren Kirchner, The Markup)

From the article: “Life-or-death software programs — like those used in medical equipment or airplanes — must undergo independent validation and verification processes, but not so DNA software. There are few federal regulations about how police or crime labs introduce new technologies and methods into crime-solving.”

Software can be a powerful tool for the legal system, but how it works and how it is used need to be publicly available to ensure it is applied justly. As technologists, we need to make sure we’re open to feedback on the unintended consequences of our software. Bias is present everywhere, and only by being open to public scrutiny and feedback can we address it!

Try this with your own team or even a group of friends! It’s a great way to expand your perspectives and educate yourself and others on emerging questions in legal technology that impact all areas of life today, from racial justice to data privacy to safe housing. And when we know more, we can do more.

