One City Rejected a Policing Algorithm. Could It Be Used For Other Purposes?

Pittsburgh announced that it would stop using a "hot spot" algorithm to deploy police to places suspected of being future crime sites.


In Pittsburgh, an algorithm that deployed law enforcement officers to predicted crime “hot spots” might be repurposed to send social services to areas in need instead.

A controversial policing tactic in Pittsburgh has been discontinued following concerns that it might help perpetuate systemic inequalities by increasing police presence in neighborhoods that are largely Black and Latino.

The “hot spot” prediction program was an algorithm-informed system that alerted law enforcement to certain areas identified as places where crimes are likely to be committed, prompting proactive law enforcement deployments. In a statement confirming that police would no longer use the tool, Pittsburgh Mayor Bill Peduto said that he shared concerns “about the potential for predictive policing programs to exacerbate implicit bias and racial inequities in our communities.”

The project had been on pause since December 2019, and Peduto confirmed recently that there are no plans to restart it. Instead, the mayor has suggested that the city might rethink how it responds to places identified as “hot spots,” perhaps using the information to guide how it delivers services instead.

Pittsburgh is far from the only city to use “hot spot” algorithms to proactively deploy law enforcement to areas with potential criminal activity. Often called “predictive policing,” the programs sometimes use a mixture of gunshot detection technology, data about the recent locations and times of day of property crimes, and even information from the Facebook profiles of people with convictions. From Chicago to Los Angeles, technologies that give law enforcement potential insight into future crimes are being tested.
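The mechanics behind such tools vary by vendor and are often proprietary, but the core idea of "hot spot" ranking can be illustrated with a toy sketch: tally recent incidents by location and flag the areas with the highest counts. The example below is purely illustrative, with made-up grid-cell identifiers and incident records; it is not how Pittsburgh's or any other city's system actually worked.

```python
# Illustrative toy sketch of "hot spot" ranking, NOT any real system:
# count hypothetical incident records per grid cell and flag the top cells.
from collections import Counter

# Hypothetical incidents as (grid_cell_id, hour_of_day) tuples.
incidents = [
    ("cell_12", 22), ("cell_12", 23), ("cell_07", 14),
    ("cell_12", 21), ("cell_07", 15), ("cell_03", 2),
]

def top_hot_spots(records, k=2):
    """Rank grid cells by recent incident count and return the top k."""
    counts = Counter(cell for cell, _ in records)
    return [cell for cell, _ in counts.most_common(k)]

print(top_hot_spots(incidents))  # ['cell_12', 'cell_07']
```

Even this trivial version shows why critics worry: the ranking simply reflects where incidents were recorded, so any bias in where police already patrol and make reports feeds directly back into where the tool sends them next.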

But where they’ve been deployed, they’ve also raised serious concerns from civil rights groups, privacy advocates, and Black and Latino advocates, who say their communities are disproportionately represented in crime data. Many fear that the technologies are in fact akin to a crystal ball—murky and sometimes wrong—and that without any oversight or community input, the algorithms could end up perpetuating racial biases in policing under the guise of data-driven policy making or “smart city” innovations.

Metro21: Smart Cities Institute at Carnegie Mellon University, which developed the Pittsburgh program, noted in a June statement that its project had concluded in December 2019 and it was no longer sharing information with the police. The institute emphasized that the tool targeted locations, not people, saying an evaluation found a 34% drop in serious violent crime in areas identified as “hot spots,” while only four arrests were made during patrols sent out because of the tool.

In Pittsburgh, advocates raised concerns about both the oversight of the program and the lack of community engagement prior to the tool’s deployment. The Pittsburgh Task Force on Public Algorithms, which is independent from the city and run out of the Institute for Cyber Law, Policy, & Security at the University of Pittsburgh, is now looking for ways the city could further engage residents around the use of algorithms. Task force members are looking beyond what the city is using in policing to also evaluate algorithms like those that determine bail conditions for pretrial release and others that use data to trigger child welfare interventions in some scenarios.

“We convened the task force with an eye towards scrutinizing algorithmic systems partly because our county has been a leader in doing things with algorithms,” said Christopher Deluzio, a task force member and the policy director at the Institute for Cyber Law, Policy, & Security. “A lot of these systems are deployed without public input or oversight—but with these types of systems, we really need to make sure the public knows what’s going on, is a partner in developing it, and has the means to scrutinize it.”

The group plans to issue a report next year, providing the city government with suggested oversight frameworks, ways to engage the community, and ways to fix situations where an algorithm may have perpetuated systemic inequalities. There are other cities Pittsburgh can look to for models in all these areas, Deluzio said. Seattle, for instance, requires the city council to approve all new uses of surveillance technology, a process that gives members of the public ample time to voice their thoughts and ask questions about how a new technology will be used.

Providing the public with a clear picture of how algorithms are overseen may prove trickier. Many algorithms used in the criminal justice system are black boxes even to the policymakers who implement them, because they are bought from third-party vendors that are allowed to hide facets of how they work to protect intellectual property. Predictive policing algorithms and other machine learning tools that aren’t transparent about their source code have been challenged in court in recent years, and some city councils have tried to ban them.

Not exposing that data to the public is “inconsistent with meaningful oversight,” said Deluzio. “I think it ought to be more difficult for a police department to just procure something off the shelf and present it to the public as a black box,” he said. “If you can’t open the system to auditors … that does not engender public trust.”

Efforts to dig into algorithms can get contentious. In 2017, the New York City Council approved a bill creating a task force to study the city’s use of algorithms, which released a report on the subject two years later. But critics derided the process, saying dissenting voices were sidelined.

At a task force community meeting in March, several Pittsburgh residents said that they would like to learn more about how the local government is using algorithms to see if they could possibly be repurposed. The “hot spot” prediction program, some suggested, could be refashioned to deploy resources to address root causes of crime like housing instability, poverty, and joblessness, instead of branding certain neighborhoods and the residents in them as potential sources of criminal activity. 

The mayor seems open to such an idea, saying “hot spots” could be managed by the newly created Office of Community Health and Safety, an agency that “will allow public safety to step back and determine what kind of support an individual or family needs.”

“‘Hot spots’ may benefit from the aid of a social worker, service provider or outreach team, not traditional policing,” Peduto said in a letter to the task force on June 16.

The Mayor’s Office did not respond to a request for comment as to whether progress has been made in converting the hot spot program to one focused on providing social services. But elsewhere in Allegheny County, where Pittsburgh is located, a county department is working on new algorithms that officials say will allow them to better allocate resources to communities most in need.

Erin Dalton, the deputy director for the Office of Analytics, Technology and Planning at the Allegheny County Department of Health and Human Services, said that her agency is debuting a new algorithmic tool next month to help determine who in the county’s population of homeless people should be prioritized for rapid rehousing or permanent supportive housing. Those at high risk of four or more emergency room visits in the next year, an in-patient mental health stay, or a jail booking will be at the top of the list for services.
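The prioritization logic described above can be sketched in miniature: given predicted risks of each adverse outcome, rank people by their highest risk. This is a hypothetical illustration only, with invented names, risk categories, and a made-up ranking rule; the county's actual tool and its scoring method are not public in this level of detail.

```python
# Illustrative sketch only, NOT the county's actual tool: rank people for
# housing services by hypothetical model-predicted risks of adverse events
# (4+ ER visits, an inpatient mental health stay, a jail booking) next year.
candidates = {
    "person_a": {"er_visits": 0.7, "mental_health_stay": 0.2, "jail_booking": 0.1},
    "person_b": {"er_visits": 0.3, "mental_health_stay": 0.6, "jail_booking": 0.5},
    "person_c": {"er_visits": 0.1, "mental_health_stay": 0.1, "jail_booking": 0.2},
}

def prioritize(people):
    """Sort candidates by their single highest predicted risk (toy rule)."""
    return sorted(people, key=lambda p: max(people[p].values()), reverse=True)

print(prioritize(candidates))  # ['person_a', 'person_b', 'person_c']
```

The key design difference from predictive policing is what the ranking triggers: here a high score routes someone toward services rather than toward enforcement, which is precisely the repurposing advocates have proposed.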

“We’re trying to reduce the harms of being left unhoused,” Dalton said. “Our job is to use scarce resources for the most vulnerable … [with this] we’ll be able to better prioritize.” 

Dalton said she hopes the new process will be “faster, better, and less traumatic” than the current process. Though the department has faced heavy criticism for its use of algorithms in the past (particularly one used to help decide whether or not to start a child welfare investigation), Dalton said that this algorithm is different because it will “help us decide who gets a set of supportive services.”

Using algorithms to deploy support instead of trigger investigations or law enforcement involvement could give communities more faith in their use, Deluzio said. “If there are tools that can tell us where things are happening that require interventions, why not send services?” he said. “That’s a shrewd way to rethink these tools as they expand. How can we use them to help people?”

Emma Coleman is the assistant editor for Route Fifty.
