By Kairan Quazi
The last few years have made it really tough and overwhelming and frightening just to be a normal kid. It feels like everywhere we look, we can’t escape the angry faces and angry voices of hate. But racism is much more than people screaming at rallies or marching down our streets with assault weapons and Nazi symbols. Racism is a structural disease whose poisonous branches reach into every single aspect of our civil society.
I recently tweeted one of my favorite quotes by Ava DuVernay:
“The system is not broken; it was built to be this way. It was built to oppress, it was built to control, it was built to shape our culture in a specific way that kept some people here and some people there. It was built for profit, it was built for political gain. And it has come upon us, it lives off of us — our taxpayer dollars, our votes. We need to be held accountable.”
Racism and its remedies are complicated because of the many social, political, economic, and psychological forces they involve. There is a lot of discussion now about criminal justice reform. But I believe that for criminal justice reform to be effective, it must include new rules for law enforcement’s use of Artificial Intelligence (AI) technology to ensure that data is used justly. I have been really inspired by the mathematician Hannah Fry, who has written extensively about the complicated relationship between data, power, race, and justice.
While many people imagine a dystopian future ruled by killer robots, some of us are more concerned about a dystopian future ruled by invisible but deadly algorithms straight out of George Orwell’s nightmares. Policymakers at every level of government have exploited AI to perpetuate the status quo of racial oppression in our criminal justice system. As I wrote last year for MIT Technology Review:
“Since predictive algorithms are based on finding patterns in historical data, skewed inputs generate skewed outputs. For example, since our criminal justice system has historically incarcerated minorities disproportionately for every category of crime, any predictive algorithms that law enforcement uses for criminal identification and recidivism patterns will only perpetuate this problem.”
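To see how skewed inputs generate skewed outputs, here is a tiny simulation of that feedback loop. Every number in it is made up for illustration: two neighborhoods have exactly the same true offense rate, but the historical record starts out skewed, and patrols follow the record.

```python
import random

random.seed(0)

# Two neighborhoods, A and B, with the SAME true offense rate.
TRUE_RATE = 0.05    # chance a resident offends on a given day (hypothetical)
POPULATION = 1000   # residents per neighborhood (hypothetical)

# The historical "input" data starts out skewed against A.
recorded = {"A": 60, "B": 40}

for _ in range(50):
    total = recorded["A"] + recorded["B"]
    for hood in ("A", "B"):
        # Patrols are allocated in proportion to past recorded crime...
        share = recorded[hood] / total
        # ...and an offense only enters the record if police are there to
        # see it, so detection tracks patrol share, not actual behavior.
        offenses = sum(random.random() < TRUE_RATE for _ in range(POPULATION))
        recorded[hood] += round(offenses * share)

print(recorded)
```

Even after 50 rounds, neighborhood A is still recorded as the "higher crime" area, purely because the algorithm keeps sending police where the old, biased data points.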
There are three prevailing AI tools used by law enforcement agencies that have significantly worsened systemic injustices against minority communities: Facial Recognition, Predictive Risk Assessment Models, and Crime Prevention software.
- Facial Recognition technology uses biometric scanning to capture millions of faces from videos and photos. Law enforcement agencies then match these scans against databases of known criminals and suspects. The ubiquitous use of facial recognition software encourages false arrests and potentially wrongful convictions, on top of the disproportionate arrests of minority groups for equivalent crimes.
- Predictive Risk Models are used by prosecutors and judges to (i) predict a Recidivism Score, the probability that a criminal defendant will re-offend; and (ii) help decide the length and type of a defendant’s sentence. The models draw on more than 100 data points about a defendant’s history, many of them rooted in racial profiling. Researchers have shown that this data invariably results in black defendants receiving higher recidivism scores than white defendants for identical crimes.
- Crime Prevention software is used by police departments mostly to target nuisance crimes such as loitering and panhandling, which are more widespread in poor neighborhoods. This leads to the over-policing of minority neighborhoods and more arrests for petty crimes.
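A toy risk score makes the second problem concrete. The weights and features below are invented for illustration, not taken from any real tool; the point is that "prior arrests" looks like a neutral input but really measures how heavily a defendant’s neighborhood was policed.

```python
def risk_score(prior_arrests: int, age: int, employed: bool) -> float:
    """Toy recidivism score with invented weights (not any real model)."""
    score = 2.0 * prior_arrests        # dominant term: a biased proxy
    score += 1.0 if age < 25 else 0.0  # youth penalty
    score += 0.0 if employed else 1.0  # unemployment penalty
    return score

# Two defendants with IDENTICAL conduct. Defendant A comes from an
# over-policed neighborhood, so more of the same behavior turned into
# recorded arrests.
score_a = risk_score(prior_arrests=4, age=30, employed=True)
score_b = risk_score(prior_arrests=1, age=30, employed=True)
print(score_a, score_b)  # 8.0 2.0 -> same conduct, very different "risk"
```

The same behavior produces a score four times higher, and a judge reading only the number never sees where the difference came from.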
So what should happen? I believe strongly in pursuing an aggressive legal strategy to reclaim our civil liberties and human rights.
The way law enforcement currently uses Facial Recognition, Predictive Risk Models, and Crime Prevention software violates the First Amendment’s implicit privacy protections, the Fourth Amendment’s prohibition on unreasonable searches and seizures, the Fifth Amendment’s Due Process Clause, and the Fourteenth Amendment’s Equal Protection Clause.
In October 2019, the ACLU announced that it was suing the FBI, the Department of Justice, and the Drug Enforcement Administration over secret surveillance tactics using facial recognition technology.
Another immediate remedy would be to “fix” the bad data by decriminalizing non-violent nuisance offenses, which currently constitute the vast majority of “crimes.” Conversely, violent crimes should be excluded from plea-bargain arrangements and made subject to federal sentencing guidelines that are not skewed by a defendant’s income, zip code, or race.
As a long-term remedy, I strongly believe that a coalition of experts should be convened under the Civil Rights Commission to develop ethical and just applications of these technologies. Technology can be re-tooled to mitigate the bad data and instead become an ally in the battle to remedy some of our systemic injustices.
Thank you for giving our generation a voice on this Juneteenth.