
Big data and the rise of the philosopher

September 3, 2020 at 11:28am


Clinton Castro is taking philosophy to places it hasn’t gone before.

Artificial intelligence. Big data. Machine learning. Algorithms. These may sound more like buzzwords than topics devoted to serious philosophical thought, but Castro — an assistant professor of philosophy — knows they present serious moral and ethical dilemmas.

Computer algorithms can find patterns in quantities of data far too large for any person to sift through. This new decision-making technology holds great promise. It also raises serious problems and concerns, especially around the more consequential decisions affecting human lives, from credit scores to hiring practices to the criminal justice system.

Castro is laser-focused on how these technologies could be contributing to injustices in the justice system, especially when it comes to bail, parole and prison sentencing. The question he keeps asking and exploring in his work is whether these systems are unfair or biased, and how they might be made fair.

Castro explores two Florida criminal cases in a paper published in Ergo. The first involved the attempted theft of a bicycle and scooter by two young suspects. The second case involved a man stealing power tools from Home Depot.

In both cases, a computer program — which is used in different jurisdictions throughout the country — predicted the likelihood of each suspect committing a future crime. The two who attempted to steal the bike and scooter were considered high risk. The man who stole the tools, and who also carried previous convictions for multiple armed robberies, was considered low risk.

These risk assessment scores are predictions about the future that inform decisions in the present. In these cases, a higher risk score meant a higher bond amount.

But, the program was far from accurate.

The “low risk” man who stole the tools went on to steal again, racking up 30 felony counts for burglary and grand theft. He was sent to prison. The “high risk” defendants completed probation and haven’t committed other crimes.

The program has a history of misidentifying Black defendants as high risk at nearly twice the rate of white defendants. The suspects who attempted to steal the bike and scooter are Black. The man who stole the tools is white.

This got Castro thinking more about the bias that has crept into the machines we rely on. He began to explore possible ways to detect unfairness and homed in on two measures of fairness: classification parity and calibration.

Classification parity means that a system should not make more mistakes about one group than another. For instance, its predictions shouldn’t be less accurate for Black defendants than for white defendants.

Calibration, as the name suggests, means the “scale” is balanced for everyone. Scores should mean the same thing for everyone. For example, defendants considered “high risk” should reoffend at roughly the same rate, regardless of race.
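
To make these two measures concrete, here is a minimal sketch in Python of how one might check each of them for two groups of defendants. The data, group labels and helper names are hypothetical illustrations, not taken from Castro’s paper or from any real risk assessment tool.

```python
# Illustrative sketch only: hypothetical defendants, not real data.
from dataclasses import dataclass

@dataclass
class Defendant:
    group: str        # demographic group label, e.g. "A" or "B"
    high_risk: bool   # the tool's prediction
    reoffended: bool  # the observed outcome

def false_positive_rate(defendants, group):
    """Classification parity check: share of a group's non-reoffenders
    who were nonetheless labeled high risk."""
    non_reoffenders = [d for d in defendants if d.group == group and not d.reoffended]
    return sum(d.high_risk for d in non_reoffenders) / len(non_reoffenders)

def high_risk_reoffense_rate(defendants, group):
    """Calibration check: share of a group's high-risk defendants
    who actually went on to reoffend."""
    high_risk = [d for d in defendants if d.group == group and d.high_risk]
    return sum(d.reoffended for d in high_risk) / len(high_risk)

# Toy data: classification parity holds if the false positive rates are
# close across groups; calibration holds if "high risk" means a similar
# reoffense rate in every group.
data = [
    Defendant("A", high_risk=True,  reoffended=False),
    Defendant("A", high_risk=False, reoffended=False),
    Defendant("A", high_risk=True,  reoffended=True),
    Defendant("B", high_risk=False, reoffended=False),
    Defendant("B", high_risk=True,  reoffended=True),
    Defendant("B", high_risk=False, reoffended=True),
]
for g in ("A", "B"):
    print(g, false_positive_rate(data, g), high_risk_reoffense_rate(data, g))
```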

“Surprisingly, classification parity and calibration can’t both be satisfied all the time,” Castro said. “Sometimes we must choose one over the other. This raises a host of difficult philosophical questions about fairness.”
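
A rough back-of-the-envelope calculation, with purely hypothetical numbers rather than figures from Castro’s work, shows where that tension comes from: if two groups reoffend at different underlying rates, a tool that is calibrated the same way for both will generally produce different false positive rates, breaking classification parity.

```python
# Hypothetical illustration of the trade-off: assume a binary tool that is
# perfectly calibrated, so 60% of "high risk" and 10% of "low risk"
# defendants reoffend in every group.
def fpr_under_calibration(base_rate, p_high_reoffend=0.6, p_low_reoffend=0.1):
    """False positive rate implied by calibration for a given base reoffense rate."""
    # Fraction labeled high risk that is consistent with the group's base rate:
    #   base_rate = p_high * p_high_reoffend + (1 - p_high) * p_low_reoffend
    p_high = (base_rate - p_low_reoffend) / (p_high_reoffend - p_low_reoffend)
    # False positives: non-reoffenders who were labeled high risk.
    return p_high * (1 - p_high_reoffend) / (1 - base_rate)

print(fpr_under_calibration(0.5))  # group with a 50% reoffense rate -> about 0.64
print(fpr_under_calibration(0.2))  # group with a 20% reoffense rate -> about 0.10
```

Equalizing those false positive rates would require scoring the two groups differently, which is exactly the kind of trade-off Castro describes.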

Castro knows correcting machine bias can’t — and won’t — happen overnight. But, the first step is thinking about the issue. As he says, people can better detect mistakes and pinpoint potential biases by slowing down, taking the time to really notice them and working toward a better understanding of what it even means for a system to be biased or unfair.

For a field like philosophy that is, indeed, ancient, it may seem impossible for anyone to say anything “new.” But, Castro is saying something new — all the while wrestling with age-old questions of right and wrong.

“As philosophers, it’s our job to understand what’s going on. To bring to the surface all of the assumptions that we make about what matters and what’s important,” Castro said. “It’s never easy to figure out the right thing to do. It’s especially hard with newer technologies and systems, but the first step in doing something about the issues is giving them proper attention and thought.”

Clinton Castro