
NYPD Chief Bill Bratton: ‘Minority Report’ Is Modern Fact, Not Fiction

In Philip K. Dick’s “Minority Report,” the authoritarian system in place to predict crime and catch individuals before they commit crimes is dystopian fantasy. In the mind of New York Police Department Commissioner Bill Bratton, this story is part of today’s reality, one the NYPD is fueling through experiments with predictive policing.

Bratton participated in a panel hosted by The New York Times called "Data Mining the Modern City." During the panel, Bratton referred to the film adaptation directed by Steven Spielberg, declaring, "The 'Minority Report' of 2002 is the reality of today."

The commissioner called predictive policing the next phase of American policing, which the NYPD will be a leader in implementing through the data mining of “huge amounts of information” and the development of “algorithms that will effectively mine that data in a way that the human brain cannot.”


Predictive policing is a militarized concept of policing that Bratton developed while he was chief of the Los Angeles Police Department. It treats crime in cities like terrorism or violence committed by insurgents in war zones.

LA Weekly reported in February 2014 that two UCLA professors, whose research was funded by the US Army Research Office, had worked with the LAPD. Their technique had been developed to "provide the Army with a plethora of new data-intensive predictive algorithms for dealing with insurgents and terrorists abroad."

As journalist Raven Rakia explained in Medium on July 22, “Using past crime data to justify occupying specific neighborhoods with police officers, the system simply tracks past arrests of minor property offenses and contributes nothing to predicting or preventing violent crimes such as murder. These surveillance tools help make broken windows policing easier to implement in poor neighborhoods, but won’t actually make the community any safer.”

Broken windows policing, which Bratton also championed, is the theory that crackdowns on petty crime will prevent increases in violent crime in a neighborhood. It was first introduced by George Kelling and James Q. Wilson in a 1982 essay in The Atlantic.

One key problem, as Rakia has described, is that it treats human beings as property:

… “[T]he unchecked panhandler is, in effect, the first broken window.” The United States has a long history of implementing systems of control for the black population — from auction blocks and slave patrols, to black codes and sharecropping — resting on the notion that black people are not free or human but rather, someone’s property. These systems of control, depending on free or cheap labor, create great profits for the ownership class. It is quite revealing that the theory behind contemporary urban policing across America still rests on the concept that black people are property — and should be ‘handled’ as such.

Given the racism underlying the theory of broken windows policing, it is impossible not to fear that predictive policing is driven by a similar set of prejudices. However, Bratton is not one bit concerned about the NYPD abusing its authority in a prejudicial and intrusive manner. In fact, during the panel, he bluntly stated, "There are no secrets."

“If two people share a piece of information, it is no longer a secret,” Bratton suggested. “And whether you want that second person to know that information or not, the likelihood is that they are going to get it.”

Such statements represent a flippant attitude toward the right to privacy and toward communities disproportionately impacted by policing.

In January, Josmar Trujillo, a writer and activist who has organized with New Yorkers Against Bratton, wrote, “Last year in Harlem, the city’s largest-ever gang raid resulted in 103 indictments stemming from two murders. The raid was buoyed by Operation Crew Cut, the NYPD program where social media interactions play a significant role in determining guilt and building cases oftentimes by mere association.”

He described how NYPD detectives and intelligence analysts monitored “dozens of public housing residents for years, including the collection of more than a million Facebook posts, leading up to the military-style raid.” With help from the NYPD, District Attorney Cyrus Vance created “complex conspiracy charges” that saw dozens of young men facing sentences of up to 15 years for crimes that, in most cases, they had yet to commit.

This is what Bratton sees as the "wave of the future": slapping vulnerable teenagers and young adults, who can barely afford legal representation, with complex conspiracy charges.

Trujillo appropriately posed the question: "In an effort [to] get out ahead of crime were we locking people up for breaking the law—or for the future dangers they posed?"

Bratton believes in the infallible ability of "reliable data" to aid in the development of algorithms, which would allow officers to access "phenomenal amounts of information" that can be collated and analyzed for emerging patterns and trends. He supports criminalizing people for the perceived future dangers they may pose to society.

Yet what happens when police are only enforcing laws in certain neighborhoods deemed to be at risk?

ACLU of Massachusetts' Kade Crockford previously argued that blacks and Latinos are arrested, prosecuted, and convicted of marijuana offenses at much higher rates than whites are.

… Now consider that these arrest data are put into computer programs instructed to spit out information to officers about where to target police patrols — what’s called predictive policing. The returned intelligence telling police departments where to target their patrols is supposedly accurate because arrest data fed into a computer algorithm produced it.

But if historical arrest data shows that the majority of arrests for marijuana crimes in a city are made in a predominately black area, instead of in a predominately white area, predictive policing algorithms working off of this problematic data will recommend that officers deploy resources to the predominately black area — even if there is other information to show that people in the white area violate marijuana laws at about the same rate as their black counterparts.

In other words, the data used to predict crimes reflects the priorities of a police department. It also means some areas of a city become occupied territories where police conduct intense patrols while other areas are ignored by police.
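The feedback loop Crockford describes can be sketched as a toy model in a few lines of Python. Every number here is an illustrative assumption, not data from any real department: two areas with identical true offense rates, where area A simply starts with more recorded arrests because it was patrolled more heavily. Patrols are allocated in proportion to past arrests, and patrols in turn generate the next round of arrests, so the initial disparity is locked in and the equal underlying offense rates never surface in the data.

```python
# Toy model of the predictive-policing feedback loop described above.
# Two areas with EQUAL true offense rates; area "A" starts with more
# recorded arrests only because it was patrolled more heavily.
# All numbers are hypothetical and purely illustrative.

TRUE_OFFENSE_RATE = 0.05   # identical in both areas
TOTAL_PATROLS = 100

arrests = {"A": 60, "B": 40}  # historical arrest counts (the biased input)

for year in range(5):
    total = arrests["A"] + arrests["B"]
    # "Predictive" step: allocate patrols in proportion to past arrests.
    patrols = {area: TOTAL_PATROLS * arrests[area] / total for area in arrests}
    # More patrols produce more recorded arrests, at identical offense rates.
    for area in arrests:
        arrests[area] += patrols[area] * TRUE_OFFENSE_RATE * 100

share_a = arrests["A"] / (arrests["A"] + arrests["B"])
print(f"Area A's share of recorded arrests after 5 rounds: {share_a:.0%}")
# prints: Area A's share of recorded arrests after 5 rounds: 60%
```

The disparity never corrects itself: because each round's patrol allocation reproduces the existing arrest shares, the algorithm keeps recommending heavier deployment to area A no matter how long it runs.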

How officers decide what streams of data are reliable inherently depends on the set of prejudices with which officers carry out their day-to-day policing of neighborhoods. With those prejudices, predictive policing becomes an authoritarian system for treating citizens like potential insurgents, who could threaten the stability of a military occupation.

Kevin Gosztola

Kevin Gosztola is managing editor of Shadowproof. He also produces and co-hosts the weekly podcast, "Unauthorized Disclosure."