Programmers, Lawmakers Want A.I. to Eliminate Bias, Not Promote It

Artificial intelligence can worsen systemic racism.

This story was originally posted by Stateline, an initiative of the Pew Charitable Trusts.

DALLAS — When software engineer Bejoy Narayana was developing Bob.ai, an application to help automate Dallas-Fort Worth's Section 8 voucher program, he stopped and asked himself, "Could this system be used to help some people more than others?"

Bob.ai uses artificial intelligence, known as AI, and automation to help voucher holders find rental units, property owners complete contracting and housing authorities conduct inspections. The software and mobile app were released in 2018 in partnership with the Dallas Housing Authority, which gave Narayana access to data from some 16,000 Section 8 voucher holders.

Artificial intelligence is used in a host of algorithms in medicine, banking and other major industries. But as it has proliferated, studies have shown that AI can be biased against people of color. In housing, AI has helped perpetuate segregation, redlining and other forms of racial discrimination against Black families, who disproportionately rely on vouchers.

Narayana worried that Bob.ai would do the same, so he tweaked his app so that tenants could search for apartments using their voucher number alone, without providing any other identifying information.

As an Indian immigrant overseeing a team largely made up of people of color, Narayana was especially sensitive to the threat of racial bias. But lawmakers in a growing number of states don’t want to rely on the goodwill of AI developers. Instead, as AI is adopted by more industries and government agencies, they want to strengthen and update laws to guard against racially discriminatory algorithms—especially in the absence of federal rules.

Since 2019, more than 100 bills related to artificial intelligence and automated decision systems have been introduced in nearly two dozen states, according to the National Conference of State Legislatures. This year, lawmakers in at least 16 states proposed creating panels to review AI’s impact, promote public and private investment in AI, or address transparency and fairness in AI development.

A bill in California would be the first to require developers to evaluate the privacy and security risks of their software, as well as assess their products’ potential to generate inaccurate, unfair, biased or discriminatory decisions. Under the proposed law, the California Department of Technology would have to approve software before it could be used in the public sector.

The bill, introduced by Assembly Member Ed Chau, a Democrat and chair of the Committee on Privacy and Consumer Protection, passed the California State Assembly earlier this month and was pending in the state Senate at publication time. Chau’s office did not respond to multiple requests for comment.

Vinhcent Le, a lawyer at the Greenlining Institute, an advocacy group focused on racial economic justice, helped write the California legislation. Le described algorithms such as Bob.ai as gatekeepers to opportunity that can either perpetuate segregation and redlining or help to end them.

“It’s great that the developers of Bob.ai decided to omit a person’s name, but we can’t rely on small groups of people making decisions that can essentially affect thousands,” Le said. “We need an agreed way to audit these systems to ensure they are integrating equity metrics in ways that don’t unfairly disadvantage people.”

Automated Discrimination

According to an October report by the Massachusetts Institute of Technology, AI often has exacerbated racial bias in housing. A 2019 report from the University of California, Berkeley, showed that an AI-based mortgage lending system charged Black and Hispanic borrowers higher rates than White people for the same loans.

In 2019, U.S. Sen. Cory Booker, a New Jersey Democrat, introduced a bill like the one under consideration in California, but it died in committee and has not been reintroduced.

"Fifty years ago, my parents encountered a practice called 'real estate steering' where black couples were steered away from certain neighborhoods in New Jersey. With the help of local advocates and the backing of federal legislation, they prevailed,” Booker said in a news release introducing the bill.

“However, the discrimination that my family faced in 1969 can be significantly harder to detect in 2019: houses that you never know are for sale, job opportunities that never present themselves, and financing that you never become aware of—all due to biased algorithms."

Governments and companies alike have struggled in recent years with problematic software.

Facebook overhauled its ad-targeting system to prevent discrimination in housing, credit and job ads in 2019 as part of a settlement to resolve legal challenges filed by the National Fair Housing Alliance, the American Civil Liberties Union, the Communications Workers of America and other advocacy groups.

In Michigan, an AI system that cost the state $47 million to build in 2013 falsely accused as many as 40,000 people of unemployment insurance fraud, forcing some people into bankruptcy, according to the Detroit Free Press.

In Pennsylvania, a child abuse prediction model unfairly targets low-income families because it relies on data that is collected only on families using public resources, according to Virginia Eubanks' 2018 book "Automating Inequality."

“Automated decision-making shatters the social safety net, criminalizes the poor, intensifies discrimination, and compromises our deepest national values,” Eubanks wrote. “And while the most sweeping digital decision-making tools are tested in what could be called ‘low rights environments’ where there are few expectations of political accountability and transparency, systems first designed for the poor will eventually be used on everyone.”

The Sacramento Housing and Redevelopment Agency began using Bob.ai in March. Laila Darby, assistant director of the housing voucher program, said the agency vetted Bob.ai before using it to make sure it didn't raise privacy and discrimination concerns.

Narayana said he’s sure Bob.ai would pass any state-mandated test for algorithmic discrimination.

“We’re a company that is fighting discrimination and doing everything possible to expand housing for voucher holders,” Narayana said. “Vetting these systems is beneficial because discrimination and inequality is something everyone should be concerned about.”

Automating Solutions

Narayana worked as an engineer at IBM until he decided to start his own company with the mission of rethinking government functions. He founded BoodsKapper in 2016 and began developing Bob.ai out of a co-working space near the Dallas-Fort Worth airport.

Narayana’s creation has been a huge success—in Dallas and beyond. The Dallas Housing Authority has used Bob.ai to cut the average wait time for an apartment inspection from 15 days to one. Since the launch of Bob.ai, Dallas and more than a dozen other housing agencies have added some 20,000 Section 8 units from landlords who were not participating in the program because of the long inspection wait times.

“We partnered with [Narayana] to come up with some technology advancements to our workflows and automation so that we could more timely respond to our business partners so that they didn’t see this as a lost lead in terms of working with the voucher program,” said Troy Broussard, Dallas Housing Authority CEO.

Marian Russo, executive director of the Village of Patchogue Community Development Agency on Long Island, New York, said she hopes Bob.ai can help the agency reverse the area’s long history of redlining. The authority plans to begin using Bob.ai to manage its 173 housing vouchers later this year.

“We’re one of the most segregated parts of the country,” Russo said of Long Island. “We have 25 housing authorities, so if we could just have a central place with all the landlords who are renting through the program and all the individuals who are looking for housing in one place, that could be a part of equalizing the housing issues on Long Island.”

U.S. Rep. Bill Foster, an Illinois Democrat, has similar hopes for AI. In a May 7 hearing, members of the Task Force on Artificial Intelligence of the U.S. House Committee on Financial Services discussed how AI could expand lending, housing and other opportunities. But they also warned that training AI on historical data can create models that are racist or sexist. Foster's office did not respond to multiple requests for comment.

“The real promise of AI in this space is that it may eventually produce greater fairness and equity in ways that we may not have contemplated ourselves,” said Foster, chair of the task force, in the hearing. “So, we want to make sure that the biases of the analog world are not repeated in the AI and machine-learning world.”
