Michigan’s use of AI to process SNAP applications draws concerns about past automation failures


Given the state’s track record in using an algorithmic fraud detection system, the Michigan Department of Health and Human Services’ use of AI in SNAP determinations holds a lot of reason for caution and concern, an expert says.

This story was originally published by Michigan Advance.

The Michigan Department of Health and Human Services has begun using artificial intelligence to help boost the number of Supplemental Nutrition Assistance Program cases it can review, a department official told members of the Senate Appropriations Subcommittee on DHHS last week.

While discussing efforts to comply with new federal requirements, David Knezek, the department’s chief operating officer, said the agency has deployed an AI case reading tool to help employees go through cases line-by-line to ensure the department is making accurate determinations on payments before that money goes out the door.

Under H.R. 1, also known as the One Big Beautiful Bill Act, states are required to pay for a portion of SNAP benefits based on their payment error rate, or how accurately states make eligibility and benefit determinations among households participating in the program. In analyzing the changes, the nonpartisan Brookings Institution notes that wrongly rejecting an applicant is not considered an error under the measure, and that the rate is not a measure of fraud. 

Knezek told members of the committee that the department is only able to review a relatively small number of cases manually.

“Using this AI case reading tool, we’re now not only going to be able to scan every single case in a perfect environment before that money goes out the door, we’re also going to be able to target it to the cases that have the highest likelihood of resulting in a payment error rate,” Knezek said.

David Knezek (at right), the chief operating officer for MDHHS, speaks to the Senate Appropriations Subcommittee on DHHS. March 17, 2026. | Screenshot

He noted that the largest number of errors come from single and dual person households, while the largest dollar errors come from households with larger numbers of individuals.

“Using that AI case reading tool, we’re able to target the ones that are most likely for fraud,” Knezek said.

Knezek said the department is also deploying an optical character recognition tool to scan documents and input information such as pay stubs submitted to the department, to avoid human error up front, while allowing for human verification on the back end. 

On Monday the Michigan Advance asked the department when the AI case reading tool and character recognition tool were deployed, what programs are being used, whether there was any disclosure to applicants that AI was being used to review their cases and what safeguards were in place.


After two days, Erin Stover, a public information officer for the department, said that the agency had used optical character recognition tools for several years, and had more recently begun using AI-assisted case reading to support case review.

Eligibility staff remains responsible for all case decisions while using the tools to flag discrepancies, Stover said in an emailed statement. 

“AI-assisted case reading capability is part of our broader efforts to strengthen accuracy and prepare for federal policy changes under H.R. 1, which increase the importance of accurate eligibility determinations,” Stover said. 

The department uses tools approved by the Department of Technology, Management and Budget within its secure system, and does not use public-facing generative AI to process cases, Stover said. 

“Safeguards are in place to protect applicant data, which is only accessible to authorized personnel and is handled in accordance with state and federal privacy requirements,” Stover said. 

Applicants are also informed that their information may be verified through data matching and review processes as part of determining eligibility, Stover said, with all applications subject to review to determine eligibility, consistent with federal requirements.

Stover later told Michigan Advance the state’s case reader uses Google Vertex AI, which the company describes as a “unified, open platform for building, deploying, and scaling generative AI and machine learning models and AI applications.”

New AI Tools Aim To Reduce Errors, but Raise Familiar Concerns

The agency’s decision to incorporate artificial intelligence into its case determinations calls to mind the state’s 2013 effort to automate review of its unemployment cases through the Michigan Integrated Data Automated System, or MiDAS, which led to multiple lawsuits and settlements providing repayments and damages to many individuals wrongfully accused of fraud.

According to reporting from Undark Magazine, more than 40,000 individuals were charged with misrepresentation within the first two years of the system’s rollout, with the agency demanding repayments of roughly five times what they had received in benefits.

The Michigan Auditor General later reviewed 22,000 cases marked as fraudulent, determining that 93% did not actually involve fraud. 

Given the state’s track record in using an algorithmic fraud detection system, the Department of Health and Human Services’ use of AI in SNAP determinations holds a lot of reason for caution and concern, Michele Gilman, the Venable professor of law at the University of Baltimore Law School, told Michigan Advance. 

One key question on Gilman’s mind: How well has the case reader tool been tested and vetted?

“We see too many times with these AI systems that they’re rolled out without adequate testing, and then it turns recipients into guinea pigs in an AI experiment, and that is not acceptable,” Gilman said.

Michele Gilman, Venable professor of law, director of the Saul Ewing Advocacy Clinic, and co-director of the Center on Applied Feminism at the University of Baltimore School of Law. | University of Baltimore School of Law

One of the challenges of working with fraud detection systems is that actual rates of fraud are low, whether you’re looking at public benefits, banks or credit cards, Gilman noted. As a result, developers struggle to build tools that detect fraud reliably, because they lack robust data to train them on, leading to high rates of false positives and false negatives, she said.
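The base-rate problem Gilman describes can be sketched with hypothetical numbers: when genuine fraud is rare, even a tool that catches most fraud and rarely flags honest cases still produces far more false alarms than true hits. The figures below are illustrative only, not drawn from any Michigan data.

```python
# Illustrative sketch (hypothetical numbers): why low base rates of fraud
# mean most flagged cases are not actually fraudulent.

def flag_counts(population, fraud_rate, sensitivity, false_positive_rate):
    """Return (true_positives, false_positives) for a screening tool."""
    fraud_cases = population * fraud_rate
    honest_cases = population - fraud_cases
    true_positives = fraud_cases * sensitivity          # fraud correctly flagged
    false_positives = honest_cases * false_positive_rate  # honest cases wrongly flagged
    return true_positives, false_positives

# 100,000 cases, 1% actual fraud, a tool that catches 90% of fraud
# but wrongly flags just 5% of honest cases:
tp, fp = flag_counts(100_000, 0.01, 0.90, 0.05)
print(round(tp), round(fp))       # 900 true flags vs 4950 false flags
print(f"{tp / (tp + fp):.2f}")    # 0.15: only ~15% of flagged cases are fraud
```

Even with optimistic accuracy assumptions, false flags outnumber true ones more than five to one, which is why human review of flagged cases matters.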

According to the Benefits Technology Advocacy Hub, the MiDAS system would flag any data discrepancy, no matter how minor, as fraud, requiring follow-up from the applicant within 10 days. The system also averaged an applicant’s income over an entire period rather than looking at individual paychecks, creating discrepancies between system-determined and reported income, which led to more fraud determinations.
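That averaging flaw can be illustrated with made-up figures: a worker paid irregularly reports each paycheck truthfully, but an average spread evenly across every period matches none of them, so a naive discrepancy check flags every period. The pay amounts below are hypothetical, not from any MiDAS case.

```python
# Illustrative sketch (hypothetical figures): how averaging income over a
# period, instead of checking individual paychecks, manufactures
# "discrepancies" for workers with irregular pay.

paychecks = [0, 0, 1200, 0, 0, 1200]  # per-period pay as truthfully reported

# A MiDAS-style average spreads total income evenly across every period:
averaged = sum(paychecks) / len(paychecks)  # 400.0 per period

# Every period where reported pay differs from the average looks like a
# mismatch to a naive discrepancy check, even though nothing was misreported:
mismatches = sum(1 for p in paychecks if p != averaged)
print(averaged, mismatches)  # 400.0 6 -> all six periods flagged
```

The applicant's reporting is accurate in every period, yet the averaged figure agrees with none of them, turning honest irregular income into six apparent discrepancies.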

Given the false positives and negatives that arise in these systems, it’s all the more important to have some layer of human review, Gilman said. However, those reviewers need to be knowledgeable about the limits of AI systems, to avoid deferring too much to the system’s determinations.

While there is a role technology can play in tandem with staff, Gilman said, the ultimate accountability has to lie with the agency.

“It can’t be ‘our vendor screwed up’ or ‘the AI went haywire’ like the actual accountability has to ultimately be with agency officials,” Gilman said.

She pointed to the AI risk management framework released by the National Institute of Standards and Technology, which emphasizes the broad integration of humans at all phases of the AI lifecycle.

Jennifer Lord represented individuals falsely accused of fraud in a class action lawsuit against the Michigan Unemployment Insurance Agency. While working on Bauserman v. Unemployment Insurance Agency, Lord said she also advocated for guardrails on the use of AI in government services, though those efforts have yet to bear fruit.

Under former President Joe Biden’s administration, Gilman said, there was a lot of attention to the ways that AI can go wrong. In 2023, Biden issued an executive order placing guardrails on AI development and tasking the U.S. Department of Agriculture and the Department of Health and Human Services with issuing guidelines on the use of AI in programs like SNAP and Medicaid. The guidelines discussed issues with AI impacting civil rights and safety and acknowledged data privacy concerns.

Due Process Concerns Loom for Benefit Recipients

However, the Biden Administration’s emphasis on fairness, equity and accountability has been thrown out the window, Gilman said, as the Trump Administration places its faith in AI companies with less emphasis on consumer rights.

“There’s a lot of faith in AI for cost savings and efficiency that is unwarranted,” Gilman said.

Jennifer Lord | Sterling Employment Law

As a lawyer representing low-income people receiving public benefits, Gilman said she doesn’t have many hooks to hang her hat on outside of due process rights guaranteed by the U.S. Constitution.

“As a Constitutional matter, you’re entitled to human review at some point,” Gilman said, explaining that the problem with the state’s unemployment system was that the only way to get human eyes on a case was through filing an appeal and appearing before an administrative law judge. However, the system’s determinations could not be explained, creating a “black box” problem and rendering that human review meaningless, she explained. 

Lord noted that programs written to detect fraud typically overcorrect, raising further concerns about the role program developers play in public benefits determinations. 

“We’ve got private companies who are now basically writing regulations, implementing the law, and their goal is ‘save us as much money as possible,’” Lord said. 

If the state turns over a government function to a private entity designing and implementing a system without checks and balances, it will have another disaster like the MiDAS system on its hands, Lord said.

Additionally, the individuals who rely on public benefits are the ones who have the least access to legal assistance, Lord said.

“They are already in dire financial straits, otherwise they wouldn’t be applying for the benefits,” Lord said, noting that some individuals may not have a computer, or the ability to meet tight timelines for challenging administrative decisions.

Michigan Advance is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501c(3) public charity. Michigan Advance maintains editorial independence. Contact Editor Jon King for questions: info@michiganadvance.com.
