
Thousands Of Mathematicians Join Boycott Against Police Collaboration

Over 2,000 mathematicians have signed a letter pledging to boycott all collaboration with police and urging their colleagues to do the same.

They are organizing a wide base of mathematicians in the hopes of cutting off police technologies at their source. The letter’s authors cite “deep concerns over the use of machine learning, AI, and facial recognition technologies to justify and perpetuate oppression.”

Predictive policing is one key area where some mathematicians and scientists have enabled the racist algorithms now animating broken windows policing; these algorithms tell cops to treat specific areas as “hotspots” for potential crime. Activists have long criticized the bias inherent in these practices: algorithms trained on data produced by racist policing will reproduce that prejudice, “predicting” where crime will be committed and who is potentially criminal.

“The data does not speak for itself, it’s not neutral,” explains Brendan McQuade, author of Pacifying the Homeland: Intelligence Fusion and Mass Supervision. Police data is “dirty data,” because it does not represent crime, but policing and arrests. 

“So what are its predictions going to find? That police should deploy their resources in the same place police have traditionally deployed their resources.”
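McQuade’s feedback loop can be made concrete with a toy simulation. The sketch below is purely illustrative, with made-up numbers and no connection to any vendor’s actual algorithm; it shows how a model trained on arrest counts keeps sending patrols wherever patrols already were, even when the underlying crime rate is identical everywhere.

```python
import random

# Toy simulation of the feedback loop McQuade describes. Illustrative only:
# made-up numbers, not any vendor's actual model. Two neighborhoods have
# identical true crime rates, but neighborhood 0 starts with more patrols
# because of historically biased deployment.
random.seed(0)

TRUE_CRIME_RATE = [0.10, 0.10]  # the same real rate in both neighborhoods
patrols = [8, 2]                # biased starting allocation (10 units total)
arrests = [0, 0]

for year in range(20):
    for hood in (0, 1):
        # An arrest is recorded only where a patrol is present to observe a
        # crime, so arrest counts measure policing intensity, not crime.
        for _ in range(patrols[hood] * 100):
            if random.random() < TRUE_CRIME_RATE[hood]:
                arrests[hood] += 1
    # A "predictive" model trained on arrest counts reallocates patrols to
    # wherever arrests were highest, reinforcing the original bias.
    total = arrests[0] + arrests[1]
    patrols = [round(10 * arrests[0] / total),
               round(10 * arrests[1] / total)]

print("cumulative arrests:", arrests)       # neighborhood 0 dominates
print("final patrol allocation:", patrols)  # the 8-to-2 bias never corrects
```

Under these assumptions the model “discovers” exactly the deployment pattern it was fed: the initial disparity persists indefinitely, despite identical crime on the ground.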

Algorithms Perpetuate Over-Criminalization

Predictive policing of this kind came into use only in 2013, but today its implementation is both vast and obscure, because companies claim their algorithms and customer lists (local governments) are valuable trade secrets. Most U.S. states and major cities are thought to use some type of predictive policing model, with known users including Chicago, Atlanta, Tacoma, New York, and LA, though not without considerable pushback.

Tarik Aougab, an Assistant Professor of Mathematics at Haverford College, was one of many mathematicians who saw the recent uprising as a push to take action against these practices. “If there is already disproportionately large amounts of time and energy being spent criminalizing Black and brown people,” Aougab explains, “the predictions the algorithm puts forth are just going to reflect that. It’s a way to perpetuate that over-criminalization.” 

Black people not only face higher rates of murder at the hands of police but also disproportionately high arrest rates, roughly twice those of their white counterparts. Racial data is not allowed to be used as a factor in predictive policing models, but in the US, location and socioeconomic factors serve as an easy stand-in for race.
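How location works as a stand-in is easy to demonstrate. The sketch below uses a handful of made-up records with hypothetical ZIP codes (no real dataset or vendor model): a scoring rule that never sees race still assigns higher risk to Black residents, because the race it was forbidden to use re-enters through the correlated ZIP code.

```python
# Hypothetical sketch of proxy bias: the model below never sees race, yet
# reproduces racial disparity because ZIP code correlates with race in the
# biased training data. Records and ZIP codes are made up for illustration.
records = [
    # (zip_code, race, arrested) -- arrests track policing, not crime
    ("19121", "Black", 1), ("19121", "Black", 1),
    ("19121", "Black", 0), ("19121", "white", 1),
    ("19103", "white", 0), ("19103", "white", 0),
    ("19103", "white", 1), ("19103", "Black", 0),
]

def arrest_rate(rows):
    return sum(arrested for _, _, arrested in rows) / len(rows)

# A "race-blind" risk score built only from ZIP code...
risk_by_zip = {z: arrest_rate([r for r in records if r[0] == z])
               for z in {"19121", "19103"}}

for zip_code, risk in sorted(risk_by_zip.items()):
    print(f"ZIP {zip_code}: predicted risk {risk:.2f}")
# ZIP 19103: predicted risk 0.25
# ZIP 19121: predicted risk 0.75

# The higher score lands on everyone in the heavily policed ZIP -- where
# Black residents are concentrated in the data -- so race re-enters as a
# proxy even though it was never an input.
```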

The popular use of racist predictive policing technologies is unsurprising to Aougab, who traces the history of American policing to slave patrols and to private security forces hired by the rich to break up labor strikes. Their primary motivation is not serving and protecting, Aougab said, but “preserving a social order that is put forth by the will of elites to protect property. The institution has never drifted away from that basic function over its entire existence in this country.”

McQuade argues that “surveillance isn’t just about policing, it’s about more subtle forms of social control.” Just as reductions in prison populations have led only to an increase in home monitoring, so too a reduction in police on patrol risks leading only to greater use of predictive policing technologies.

Mathematicians Boycott Police

For many in this large collaboration of mathematicians, the call to action was the police killings of George Floyd, Tony McDade, and Breonna Taylor.

“At some point we all reach a breaking point, where what is right in front of our eyes becomes more obvious,” says Jayadev Athreya, a participant in the boycott and Associate Professor of Mathematics at the University of Washington. “Fundamentally, it’s a matter of justice.”

The mathematicians penned an open letter, collecting thousands of signatures for a widespread boycott. Every mathematician within the group’s network pledges to refuse any and all collaboration with police. Called out in particular was a mathematics foundation that allowed the founder of PredPol, a major predictive policing company, to host a sponsored workshop encouraging mathematicians to work with police.

PredPol is not a neutral tool for delivering data but a policing company with an agenda. Internally, it compares its software to the highly criticized broken windows policing strategy, in which a department heavily polices nuisance violations in the belief that doing so will curb more serious crime. There is little evidence that it does.

Broken windows policing has instead become a vector for mass incarceration, sending millions of mostly young Black men to prison for non-violent offenses or, in cases like Eric Garner’s, to death at the hands of police for selling loose cigarettes.

PredPol encourages departments to expand their use of the technology to include predicting “nuisance” crimes. This provides the company with more business, and such crimes are easier for algorithms to predict because non-violent offenses occur with more regularity.

“There’s a big question here: is predictive policing really getting ahead of events, or is it just a self-fulfilling prophecy?” McQuade explains that “crime statistics” are more accurately referred to as “arrest statistics.” They measure police behavior, which is not directly correlated with crime and violence. These arrests justify and perpetuate more arrests. 

Athreya explains the boycotters will confront this by collaborating with criminal justice organizations. 

“We want to work through issues of how various algorithms are used in the criminal justice system, for things from facial recognition to DNA matching algorithms, where community groups and mathematicians can have a say.” 

One step along that route is the creation of ethics materials for courses and conferences that are critical of policing and of the role these technologies play in it. Equally important, however, is developing a culture within the mathematics community in which those who cooperate with police are ostracized.

“We want folks to be politicized,” Aougab explains. “In mathematics that is rare, because there is this widely held and false belief that mathematicians have some sort of superpower to be objective, and not influenced by political considerations like any other human being.”

The open letter, and its upcoming publication in the Notices of the American Mathematical Society, is the first step in that direction. Mathematicians, when confronted with large-scale police brutality and organizing by their peers, must choose a side: that of the people or the police.

Those who chose to side with the police have pushed back against Aougab’s criticisms. “They would say ‘You all are not personally involved in writing these algorithms, so you don’t know enough about the math to be taking such a hard stance.’” 

“The point I want to make is we did not arrive at this stance by finding an error in a formula,” Aougab said. “This is a political stance that anybody can take.” 

Giving Policing A ‘Scientific Veneer’

An algorithm can only be as good as its data, and in this case, the data is derived from the actions of corrupt and biased police departments. 

One study, conducted by the AI Now Institute last year, examined police departments and their predictive policing systems. The result: “in numerous jurisdictions, these systems are built on data produced during documented periods of flawed, racially biased, and sometimes unlawful practices and policies.”

A 2019 audit of predictive policing in Los Angeles found a serious lack of oversight and procedures around the tools, leaving their effectiveness nearly impossible to assess. While some trials have found slight but statistically insignificant improvements in crime prevention, rigorous studies are hard to find. Complicating matters, researchers have noticed that police tend to pursue their own ‘known hotspots’ rather than follow the dictates of the technology.

For Aougab, that is to be expected because police never intend to change their practices. “It’s nefarious,” he explains, “because now there’s a scientific veneer for what the police were going to do anyway. It doesn’t actually have an impact on how the police police, and where they police. But it gives them what appears to be a legitimate justification for what they were already going to do.”

One demand of the mathematics collective is that any algorithm must face a public audit before its use—a process in which mathematicians and community groups must participate. 

But the profit motivations behind tools like PredPol have resulted in fiercely guarded practices. Companies claim both their algorithms and clients are trade secrets, blocking analysis of exactly how these black-box technologies work and are used. 

Police departments are similarly tight-lipped. In 2018, the New York Police Department (NYPD) faced a lawsuit after denying records requests for even the algorithmic inputs and outputs. The NYPD argued that releasing predictive policing data would allow criminals to somehow game the system and would harm its relationships with vendors.

Tech companies are major players in the policing business, lobbying state legislatures to approve bills that allow for use of their technology. In the case of PredPol, this means effectively pouring millions of dollars into bolstering broken windows policing tactics.

Is Bias-Free Predictive Policing Possible?

Newer models have tried to break free from the bias of arrest data by building their algorithms on citizen reports instead, on the premise that 911 calls similarly track where crime is happening.

While this avoids the bias of individual police officers, the bias of callers remains. Only around 40 percent of violent crime victims report their assault to the police in the first place, and many marginalized people avoid doing so out of fear for their safety.

Viral videos depicting white women calling the police on Black neighbors simply for existing in public also illustrate the racist motivations behind police calls. 
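The arithmetic behind this distortion is simple. The toy example below uses assumed reporting rates (the 40 percent figure cited above for one neighborhood, a lower made-up rate for a neighborhood that distrusts police) to show how a model trained on calls mistakes differences in who dials 911 for differences in crime.

```python
# Illustrative arithmetic with assumed numbers: a call-based model sees
# crime only through the filter of who dials 911.
true_incidents = {"neighborhood_A": 100, "neighborhood_B": 100}  # identical

# Assumed reporting rates: A reports at roughly the 40 percent rate cited
# above; B, whose residents avoid police out of fear, at a made-up 15 percent.
report_rate = {"neighborhood_A": 0.40, "neighborhood_B": 0.15}

observed_calls = {n: true_incidents[n] * report_rate[n]
                  for n in true_incidents}
print(observed_calls)  # {'neighborhood_A': 40.0, 'neighborhood_B': 15.0}

# A model trained on calls "learns" that A has nearly three times the crime
# of B, although the underlying incident counts are identical -- and racist
# over-calling, like the viral videos above, skews the picture further.
```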

Critics of predictive policing models also point out that the type of “crime” police address rarely includes far more common white-collar crimes such as wage theft and fraud. These can indirectly but violently cost people their lives, health, and homes. The state and police make an active choice to pursue certain crimes over others, and as a result, the poor are surveilled and protection of property is prioritized over human lives.

Meanwhile, the harm created by this situation is not considered criminal. 

“We don’t usually think of any deprivation of rights or basic necessities as a crime,” McQuade notes. Deprivations including disinvestment from communities, poverty, crumbling infrastructure, failing schools, public health issues, pollution, and voter suppression are all the result of decisions made by those in power. “There’s a strong case to be made that those are crimes, and that’s erased from consideration.”

Police abolitionists suggest other paths forward. Rather than tackling “crime,” society should focus on addressing “harm.” Reducing public urination and graffiti might register as successful crime prevention to PredPol’s algorithms, but such reductions have little to do with the material and political reasons violence occurs, especially in communities constantly subjected to poverty and police violence.

Policing and prisons are not a just or effective way to prevent and heal damage, Aougab explains. “None of these algorithms, even if they worked perfectly—which they don’t—would be addressing harm.”

The Threat Of AI In An Uprising

As calls for defunding police echo across the U.S., mathematicians and activists are keeping a watchful eye out for increases in algorithm-based policing and surveillance. Using AI is cheaper than paying the salaries of patrolling officers, but it is just as biased, with potentially just as deadly outcomes.

Amid the Black Lives Matter uprisings, police are doubling down on AI technologies as a deadly weapon. Facial recognition is being used across the country to identify, track down, and arrest protesters accused of criminal activity, and to dissuade others from taking to the streets.

As police surveillance and AI use escalate, Aougab insists, “The mathematics community is going to have to figure out how to play a bigger role in monitoring and ultimately dismantling the collaborations that exist between us, and the institution of policing.”

Maddie Rose

Maddie Rose is a freelance journalist, housing organizer and proud troublemaker based in Philadelphia. Their work has previously appeared in Teen Vogue.