How to Fairly Use Algorithms to Make Tough Decisions

COMMENTARY | With computing power increasingly used to guide the policies adopted by state and local leaders, governments need to take steps to ensure the underlying algorithms aren’t biased.

The power of data to help states and localities make decisions more effectively has long been a given. But the use of complex algorithms in government complicates that picture, raising essential questions about whether the data being used is free from the biases and inequities so often present in our communities.

What exactly are algorithms? Dan Chenok, executive director of the IBM Center for the Business of Government, defines them as the complex equations built into a computer system’s software, “which enables interpretation of data in a much more powerful way than people could on their own.”

The question of how to weigh the potential promise of these tools against the risk of hidden, built-in biases has been debated for some time. But it came into sharp relief after Election Day, when Californians voted against Proposition 25, which would have replaced the longstanding practice of judges setting bail according to a general set of guidelines with a system of “pre-trial risk assessments.”

The notion was to use a variety of factors to create an algorithm that would help a judge determine whether a person should be released or kept in jail, based largely on the degree to which their release represented a threat to society. This would have eliminated cash bail entirely and saved accused individuals the money they typically pay bail bondsmen to provide the cash necessary for release.

“These particular algorithms would look at the same variables, the same information, for all people and weight those variables so they can be scored,” says Heather Harris, research fellow at the Public Policy Institute of California.
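
To make that concrete, here is a minimal sketch of the kind of uniformly applied, transparently weighted score Harris describes. The variables and weights below are invented for illustration; they are not the factors an actual pretrial assessment would use.

```python
# A minimal sketch of a transparent pretrial risk score: every person is
# scored on the same variables with the same published weights.
# The variables and weights here are hypothetical, for illustration only.

PUBLISHED_WEIGHTS = {
    "prior_failures_to_appear": 2.0,
    "pending_charges": 1.5,
    "age_under_25": 1.0,
}

def risk_score(person: dict) -> float:
    """Weighted sum of the same variables, applied identically to everyone."""
    return sum(weight * person.get(variable, 0)
               for variable, weight in PUBLISHED_WEIGHTS.items())

# Example: two prior failures to appear and one pending charge.
print(risk_score({"prior_failures_to_appear": 2, "pending_charges": 1}))  # 5.5
```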

The bail industry fought back, launching a campaign against the measure. But critics also emerged from the civil rights community, raising concerns about algorithms used in areas that have direct impact on human lives, such as bail, parole and sentencing. These concerns aren’t new. Nearly five years ago, ProPublica took a deep dive into the issues algorithms raise and found, for example, that in Broward County, Florida, where they were used in sentencing, “the formula was particularly likely to falsely flag Black defendants as future criminals, wrongly labeling them this way at about twice the rate of white defendants.”
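
The disparity ProPublica measured is, at bottom, a difference in false positive rates: among defendants who did not go on to reoffend, how often each group was nonetheless flagged as high risk. A rough sketch of that calculation, using invented records:

```python
# Sketch of the false-positive-rate comparison behind ProPublica's finding:
# among defendants who did NOT reoffend, how often was each group still
# flagged as high risk? The records below are invented for illustration.

records = [
    # (group, flagged_high_risk, reoffended)
    ("A", True,  False), ("A", True,  False), ("A", False, False),
    ("B", True,  False), ("B", False, False), ("B", False, False),
]

def false_positive_rate(group: str) -> float:
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

for group in ("A", "B"):
    print(group, false_positive_rate(group))
# Roughly: A 0.67, B 0.33 -- group A is wrongly flagged at twice B's rate.
```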

Josh Martin, the chief data officer in Indiana, says states need to consider these dangers when building and using algorithms. “There have been numerous case studies that highlight how using race in machine-learning algorithms can lead to biases in unrepresentative training data or the unconscious/conscious biases held by the people writing and tuning the algorithms,” Martin writes in an email.

Indiana has been particularly successful in using algorithms in the realms of education and workforce development. For example, algorithms are used to determine which education pathways, from high school graduation through higher education, lead to the highest wage outcomes for residents. This information helps students in the state understand college costs, how to borrow wisely, and what to expect from the choices they make—for example, whether to pursue an associate’s, bachelor’s or more advanced degree, and how a chosen field of study will affect future earnings.
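
Indiana’s actual model isn’t shown here, but the underlying comparison can be sketched simply: line up credential pathways against typical debt and later wages. All figures below are invented for illustration.

```python
# Sketch of the pathway-comparison idea: rank hypothetical credential paths
# by reported median wage against the debt needed to get there. All numbers
# are invented; Indiana's real model and data sources are not shown here.

pathways = [
    # (credential, typical_debt, median_wage_5yr_out)
    ("high school diploma only", 0,      32_000),
    ("associate degree",         10_000, 42_000),
    ("bachelor's degree",        27_000, 55_000),
]

for name, debt, wage in sorted(pathways, key=lambda p: p[2], reverse=True):
    # Crude affordability signal: years of 10%-of-wage payments to clear debt.
    payoff_years = debt / (0.10 * wage) if debt else 0.0
    print(f"{name}: median wage ${wage:,}, ~{payoff_years:.1f} yrs to repay debt")
```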

Based on our research, the key is to recognize the risks that algorithms present and to be sure that they are used in a transparent way. This means making public the variables used and the weightings applied to them.

This kind of transparency is particularly important when decisions are being made in areas like criminal justice. Things get much easier in fields like transportation. Atlanta, for example, saw a 25% reduction in the crash rate along a 2.3-mile stretch known as the North Avenue Corridor after it piloted a data application to analyze key risk factors. This allowed city engineers to anticipate crashes before they happened and take preventive steps, such as adjusting the timing of traffic lights.
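
Atlanta’s application isn’t described in detail, but the general pattern is to score road segments on observed risk factors and flag the riskiest for intervention. A minimal sketch of that pattern, assuming scikit-learn and invented features (traffic volume, speed, signal cycles):

```python
# A minimal sketch of the general crash-prediction pattern: score road
# segments on observed risk factors, then flag the riskiest ones for
# intervention (e.g., retiming signals). Features and data are invented.

from sklearn.linear_model import LogisticRegression

# Hypothetical features per segment: [traffic volume (1000s/day),
# average speed (mph), signal cycles per hour]
X = [[30, 35, 60], [12, 25, 90], [45, 40, 45], [8, 20, 120], [38, 38, 50]]
y = [1, 0, 1, 0, 1]  # 1 = a crash occurred on the segment that month

model = LogisticRegression().fit(X, y)

# Rank segments by predicted crash probability; engineers would inspect
# the top of this list and adjust signal timing or geometry there.
risks = model.predict_proba(X)[:, 1]
for segment, risk in sorted(enumerate(risks), key=lambda t: -t[1]):
    print(f"segment {segment}: crash risk {risk:.2f}")
```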

A couple of localities stand out for the great care they take in using algorithms: Cook County, Illinois, the second most populous county in the United States, and New York, the nation’s largest city.

Says Dessa Gypalo, Cook County’s chief data officer, “Data analytics has typically been opaque. It’s been a black box. But everything we do has to be done transparently and with caution.”

One example she cites was the use of algorithms to help parcel out the money from the federal government’s CARES Act, intended to provide economic assistance for workers, families, businesses, and local governments to help them get through the coronavirus-created economic downturn. This was a particularly tricky endeavor in Cook County, which includes the city of Chicago and more than 130 other municipalities.

The algorithm used five data points: the population of each municipality; median income; Covid-19 deaths per 100,000 people; tax base per capita; and the percentage of the population located in an economically disconnected or disinvested area. The last data point was developed by a regional partner, CMAP—the Chicago Metropolitan Agency for Planning—in an effort to quantify inequitable regional investment practices over the years. Using these variables, the county was able to create a score for each municipality and then base funding allocations on that score.
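
The county’s actual normalization and weightings aren’t spelled out here, so the sketch below simply scales each of the five published data points to a common range, combines them with equal, illustrative weights (inverting income and tax base, where lower values mean greater need), and allocates a hypothetical pot in proportion to the score. The municipalities and figures are invented.

```python
# Sketch of a composite-need score over Cook County's five published data
# points. Weights, normalization and all figures below are illustrative
# assumptions, not the county's actual method.

munis = {
    # population, median income, Covid-19 deaths per 100k,
    # tax base per capita, share of residents in a disinvested area
    "Alpha": dict(pop=50_000, income=45_000, deaths=80, tax_base=1_200, disinvested=0.40),
    "Beta":  dict(pop=20_000, income=90_000, deaths=30, tax_base=3_500, disinvested=0.05),
    "Gamma": dict(pop=80_000, income=60_000, deaths=55, tax_base=2_000, disinvested=0.20),
}

def normalize(metric, invert=False):
    """Scale one metric to 0..1 across municipalities; invert when a LOWER
    raw value (median income, tax base) indicates HIGHER need."""
    vals = {name: m[metric] for name, m in munis.items()}
    lo, hi = min(vals.values()), max(vals.values())
    return {name: ((hi - v) if invert else (v - lo)) / (hi - lo)
            for name, v in vals.items()}

components = [
    normalize("pop"),
    normalize("income", invert=True),
    normalize("deaths"),
    normalize("tax_base", invert=True),
    normalize("disinvested"),
]
scores = {name: sum(c[name] for c in components) for name in munis}

# Allocate a hypothetical $10 million pot in proportion to each score.
pot, total = 10_000_000, sum(scores.values())
for name, score in scores.items():
    print(f"{name}: score {score:.2f} -> ${pot * score / total:,.0f}")
```

A real allocation would presumably layer floors, caps and the human review the county describes next on top of a raw score like this.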

But here’s the important part: The county didn’t stop there. It ran the computerized results through a rigorous evaluation process that included a great many conversations with people with longstanding institutional knowledge about the differences among the municipalities in Cook County. “We are so large, and we have a lot of different stakeholders,” says Gypalo, “and they were able to validate the findings.” The input from knowledgeable human beings led to a few tweaks, and Gypalo felt comfortable that the computerized recommendations weren’t just mathematically defensible, but fit with real-world considerations, as understood by human beings.

In New York City, on November 19, 2019, Mayor Bill de Blasio signed an executive order establishing the position of Algorithms Management and Policy Officer (AMPO). As the mayor said that day, the position was established to “ensure the tools we use to make decisions are fair and transparent.”

Such was the importance of this new effort that the mayor appointed Jeff Thamkittikasem, his director of operations, to be acting AMPO for the city.

Thamkittikasem is abundantly aware of the risks of bias that algorithms can present. “Systemic racism exists,” he says, “and sometimes it can come into tools like algorithms, which are often based on previously gathered data . . . If you’re using bad data, you’ll get bad outcomes.”

The first step the city is taking is to go back to algorithms that are already in use. By January, New York City will have published its first list of the systems that qualify for review. Then agencies will carefully assess these tools to ascertain that “there isn’t any bias or disproportionality in how they were developed,” says Thamkittikasem.
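
In its simplest form, assessing for disproportionality can mean comparing the rate at which a tool flags each group. The toy check below borrows the “four-fifths rule” threshold from U.S. employment law purely as an illustrative yardstick; the city’s actual review criteria aren’t described here.

```python
# Toy disproportionality check: compare the rate at which an algorithm
# selects (flags, approves, prioritizes) each group. The four-fifths rule
# threshold is used only as an illustrative yardstick; the counts are made up.

outcomes = {  # group -> (number selected by the tool, number evaluated)
    "group_a": (45, 100),
    "group_b": (28, 100),
}

rates = {g: sel / n for g, (sel, n) in outcomes.items()}
baseline = max(rates.values())

for group, rate in rates.items():
    ratio = rate / baseline
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, ratio {ratio:.2f} -> {flag}")
```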

One of Thamkittikasem’s goals is to empower the individual agencies to address their own use of the algorithms they’re putting in place by setting policies and practices that the agency leaders all understand. He’s savvy enough about the workings of government to know that different agencies will need to use different tools. “Human services are different than financial practices,” he says.

Will the kind of care New York City and Cook County are taking allow algorithms to be used in the complicated areas of criminal justice? That’s hard to say. One observer, arguing that algorithms can be at least as fair as judges in making decisions, noted that “judges are also black boxes.”

Ultimately, wherever algorithms are used, Cook County’s Gypalo makes a strong point by quoting Uncle Ben from Marvel Comics’ Spider-Man, “With great power comes great responsibility.”
