November 3, 2020
Last week, the City of Buffalo announced a proposed 3-year contract with SAS Institute to create a data analytics center and “use predictive modeling to determine best practices,” with “police training and practices” as the first use of the technology. We urge the Buffalo Common Council to consider the impact of this proposal. Read the full letter below.
October 29, 2020
To: Buffalo Common Council
From: Partnership for the Public Good
Re: Proposed contract with SAS for police data analytics system
Dear Buffalo Common Council Members:
Last week, the City of Buffalo announced a proposed 3-year contract with SAS Institute to create a data analytics center and “use predictive modeling to determine best practices,” with “police training and practices” as the first use of the technology.
1. Data Transparency
This announcement raises many basic questions. Before any data system is approved, the Common Council should request and share answers to those questions, and provide time for public comment and consultation with the Buffalo Police Advisory Board.
The proposed contract with SAS explicitly limits all access to the system and its data to city staff and contractors. The system will run on SAS servers, subject to SAS's strict access controls. As currently written, it appears that residents will not be able to review the data collected in the system. Importantly, the Common Council's own Police Advisory Board will not have access to the system either. Rather than data being collected and released to the public for ongoing review and input by Buffalo residents, the proposed system will be run by a private company using an algorithm protected by intellectual property rights.
Buffalo residents and community organizations have called for improved public data on policing in Buffalo for years. For example, they have requested the public release of “stop data” in a searchable database to monitor for racial bias in police stops. That step has not been taken to date, despite Mayor Brown adding “stop tickets” to the Buffalo Reform Agenda in June, a policy that should make stop data readily available. Data on stops, on-site appearance tickets, and desk appearance tickets should all be made available to the public. Will the new system make this transparency possible? It does not appear so.
2. Public Dollars, Public Benefit
The 3-year data contract comes at a cost of $1.3 million for the first year and $800,000 per year for years two and three, as reported in the Buffalo News. With a public investment of this size, the City must ensure that residents have access to the data to evaluate police performance, and that the Police Advisory Board is empowered to use the database in its work.
The City should not fund a data system that is solely the property of the police department; any such system must be accessible to the people of Buffalo.
3. A Shift to “Predictive Policing”?
The new software, according to city officials, will “use predictive modeling to determine best practices” and influence the direction of “police training and practices.”
In recent years, “predictive modeling” in policing and criminal justice has raised a great deal of concern across the United States. The potential for harm and racial bias when algorithms are relied upon to direct police operations, to determine who should have bail set, or to decide who is eligible for parole has been well documented.
Predictive technologies bring even greater risk of racial bias when the algorithms are part of proprietary systems, like the one currently proposed. Residents have no access to the data or the analysis that drives policing decisions, leaving them unable to assess or challenge decisions that affect them. “Predictive policing” would strongly impact particular residents of Buffalo, yet the data, rules, and analysis behind the system would be invisible to them.
A July 2020 article from the MIT Technology Review, “Predictive policing algorithms are racist. They need to be dismantled,” outlines common predictive policing tools and the problems with them. In short, “increasing evidence suggests that human prejudices have been baked into these tools because the machine-learning models are trained on biased police data. Far from avoiding racism, they may simply be better at hiding it.” … “The problem lies with the data the algorithms feed upon. For one thing, predictive algorithms are easily skewed by arrest rates,” which are racially disparate across the US and in Buffalo. Because the algorithms are trained and built on biased data, they perpetuate that bias, making it even more likely that Black people will continue to be arrested at higher rates.
In Buffalo, will “predictive modeling” be used to forecast where crimes will occur, and where officers should patrol? As MIT Technology Review describes, some predictive policing systems “draw on data about people, such as their age, gender, marital status, history of substance abuse, and criminal record, to predict who has a high chance of being involved in future criminal activity.”
If the SAS system will generate this kind of analysis to inform police operations, the contract does far more than “create an evaluation system to determine the effectiveness of our police reform,” as Mayor Brown has described it. It would mark a major shift in the overall approach to policing in Buffalo, including how operational decisions are made each day, which neighborhoods are policed, and who is patrolled and ultimately arrested.
Before making any decision or granting approval, we urge the Common Council to obtain far more detailed information about the proposed SAS contract and data system, including precisely what “predictive policing” approaches would be brought to Buffalo.
The Council’s findings should be shared during public meetings on the proposed system, and any decision to launch “predictive modeling” and algorithm-based policing in Buffalo should be reviewed in depth.