Dealing with Algorithmic Injustice


International Edition ’23


Lifestyle

2023-08-10

WRITTEN BY:
Thomas J. Freeman



Attorneys must constantly adapt to new technology. More and more decisions that were once made by humans are now made by algorithms. A lawyer seeking to challenge a decision that affected a client is increasingly likely to find that the decision was made by an algorithm.

Algorithms are instructions that tell a computer what to do. They are now deployed for all sorts of purposes, from helping to determine which job applicants receive interviews or offers, to deciding which criminal suspects are granted bond and which convicts receive parole.
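For readers new to the term, a short sketch may help. Everything below is invented for illustration; real hiring screens are far more complex. But the principle is the same: a fixed set of instructions turns data about a person into a decision.

```python
# A toy "algorithm" in the article's sense: fixed instructions that
# convert applicant data into a yes/no decision.
# The scoring rule, point values, and cutoff are all hypothetical.

def screen_applicant(years_experience, has_degree):
    """Return True if the applicant is granted an interview."""
    score = years_experience * 10 + (25 if has_degree else 0)
    return score >= 50  # hypothetical cutoff

decisions = [
    screen_applicant(5, False),  # 50 points -> interview
    screen_applicant(1, True),   # 35 points -> no interview
]
print(decisions)  # -> [True, False]
```

Note that no human reviews either file: whoever chose the point values and the cutoff made the decision in advance, for every future applicant.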

One problem is that algorithms are often wrong. Computer programs are not infallible; they make errors. And, surprisingly to many, algorithmic decision-making is riddled with bias. The errors these systems make affect real people, and they fall more heavily on some people than on others.

Here in the United States, algorithms are often trained on datasets drawn mainly from middle-aged white men, which tends to treat their appearances and mannerisms as the norm. As a result, algorithms are more likely to have difficulty assessing women and people of color. For example, many studies have shown that some algorithms have more difficulty ‘seeing’ darker skin.
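A toy example, with entirely invented numbers, shows the mechanism: if a detector's threshold is calibrated only on a well-represented group, it can fail badly on a group whose measurements are distributed differently.

```python
# Synthetic signal values for true positives in two groups.
# All numbers are invented for illustration.
group_a = [0.80, 0.85, 0.90, 0.75, 0.88]  # well represented in training
group_b = [0.55, 0.60, 0.50, 0.65, 0.58]  # under-represented

# "Train" by choosing a cutoff from group A alone, with a small margin.
threshold = min(group_a) - 0.05

def detect(value):
    return value >= threshold

# Fraction of true positives the detector actually catches in each group.
recall_a = sum(detect(v) for v in group_a) / len(group_a)
recall_b = sum(detect(v) for v in group_b) / len(group_b)
print(recall_a, recall_b)  # -> 1.0 0.0
```

The detector looks perfect when evaluated on the group it was tuned on, and misses everyone in the other group, even though both groups contain only genuine positives.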

Algorithms are also designed to use historical data to predict the future, which creates a significant risk of encoding historical biases. In the United States, we have a history of tense relations between police departments and minority communities. Many law enforcement agencies now use predictive policing programs, which apply historical crime data to predict future crimes and deploy police resources accordingly. But if more police are assigned to a neighborhood, they are likely to observe more crime and arrest more lawbreakers there, which can exacerbate tensions and result in still more law enforcement being assigned to that neighborhood.
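That feedback loop can be sketched in a few lines. Every number and the response model here are assumptions made for illustration: two neighborhoods with equal underlying crime, a slightly uneven arrest history, and recorded crime that rises faster than linearly with patrol presence (more officers means more stops and more recorded offenses).

```python
# Hypothetical starting point: equal true crime, uneven records.
arrests = {"north": 10.0, "south": 12.0}  # slightly uneven arrest history
base_crime = 10.0                         # underlying crime equal in both

for year in range(3):
    total = sum(arrests.values())
    # 20 patrol units allocated in proportion to recorded arrests.
    patrols = {n: 20 * arrests[n] / total for n in arrests}
    # Assumed model: recorded crime grows superlinearly with patrols.
    arrests = {n: base_crime * (patrols[n] / 10) ** 2 for n in arrests}

# A 20% gap in the records has compounded into a severalfold gap.
print(round(arrests["south"] / arrests["north"], 2))  # -> 4.3
```

The neighborhoods never differed in actual crime; the gap in the data comes entirely from where the patrols were sent, yet each year's allocation treats that gap as evidence justifying the next year's allocation.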

Algorithms, like all tools, can be used for good or ill. But the assumption that they make better or fairer decisions than humans has been repeatedly disproven. More and more companies and government agencies employ algorithmic decision-making that affects more and more of our clients. As attorneys, we will need to be prepared to fight back. The practice of law will increasingly require a basic understanding of how algorithms work and how to challenge them in court.