
Why doesn’t computer crime modeling work? 

Nov 21, 2022 | Criminal Defense

In an effort to predict where crimes will occur, some police departments have turned to computer models, an approach often referred to as predictive policing. The algorithm analyzes historical crime data and predicts what types of crimes are likely to occur and where they will take place.

These predictions don’t lead directly to arrests, but they do direct officers to specific areas, either to look for crimes in progress or simply to deter them through a visible presence.

Computer crime modeling is often presented as a way to eliminate racial bias. A computer isn’t going to be prejudiced against members of a minority ethnic group, for example, but a police officer could be. If an officer predicted where a crime would take place, that prejudice might be obvious, whereas a computer is assumed not to reflect the same bias. Why doesn’t this assumption hold true?

Human biases are simply reflected

It doesn’t work because any biases the police officers hold will be reflected in the data they feed into the algorithm. As a result, the predictions the computer makes are still going to be biased. It isn’t the computer that is biased; it’s the officers who generated the data.

For example, an officer who targets a certain type of individual when making arrests feeds that arrest data into the program. The computer may then conclude that members of this ethnic minority are more likely to commit crimes, when all it is actually seeing is that the officer making those arrests treats people unfairly. The computer has no way to separate legitimate data from biased data, as the sketch below illustrates.
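To make the mechanism concrete, here is a minimal Python sketch of that feedback effect. Everything in it is hypothetical: the neighborhood names, the patrol intensities, and the toy “model,” which simply ranks areas by past arrest counts. It assumes two areas with identical underlying crime rates, one of which is patrolled three times as heavily.

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical setup: two neighborhoods with the SAME underlying crime rate.
TRUE_CRIME_RATE = {"north": 0.05, "south": 0.05}

# Biased data collection: officers patrol "south" three times as often,
# so offenses there are three times as likely to enter the arrest records.
PATROL_INTENSITY = {"north": 1, "south": 3}

def simulate_arrests(days=10_000):
    """Generate arrest records whose volume reflects patrol intensity,
    not the underlying crime rate."""
    arrests = Counter()
    for _ in range(days):
        for area, rate in TRUE_CRIME_RATE.items():
            # Each patrol unit independently has a chance to record an arrest.
            for _ in range(PATROL_INTENSITY[area]):
                if random.random() < rate:
                    arrests[area] += 1
    return arrests

def predict_hotspot(arrests):
    """A naive 'predictive policing' model: rank areas by past arrests."""
    return max(arrests, key=arrests.get)

arrests = simulate_arrests()
print(arrests)                   # roughly 3x as many arrests in "south"
print(predict_hotspot(arrests))  # "south" -- despite identical true crime rates
```

Even though both areas offend at the same rate, the over-patrolled area generates roughly three times as many arrest records, so the model flags it as the “hotspot.” Acting on that prediction would send still more patrols there, skewing the next round of data even further.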

It’s clear that arrests are not always made fairly, and those who have been arrested need to know about all their legal options.
