Will Connecticut pass AI legislation this year?


As AI technology stands to see increased adoption in the near future, legislators in the state say waiting any longer would be a mistake.

This article was originally published by CT Mirror.

When Connecticut lawmakers exited the state Capitol at the end of the 2025 session, they left behind some unfinished business, particularly around the state’s plan for regulating companies’ use of artificial intelligence, ensuring data privacy and establishing consumer protections around emerging technologies. 

For a second year in a row, legislators were unable to agree on the direction of state AI policy, with pro-regulation lawmakers in the state Senate and the more regulation-shy Lamont administration disagreeing over the best course of action.   

In the months since, the question of what Connecticut should do about AI has only become more pressing. In December, the Trump administration issued an executive order in the hopes of discouraging states from regulating the technology. Meanwhile, a growing number of businesses are incorporating artificial intelligence into their operations, and investment in the global AI market has reached hundreds of billions of dollars.

Without federal legislation, state legislatures — in Connecticut and elsewhere — are facing pressure to address everything from the ethics of AI use to the environmental impact of data centers and concerns over a dot-com-like “bubble.”

And as "generative AI" — programs that use datasets and already-available information to power technologies like ChatGPT, Google’s Gemini, and Microsoft’s Copilot — is increasingly used in everyday life, the task facing regulators is only getting more complicated.

So far, few states have reached common ground on how to write the rules.

Pro-regulation lawmakers have proposed a wave of new measures, arguing that guardrails on the rapidly-changing technology will provide necessary protection to constituents worried about losing their privacy and intellectual property.  

Opponents say the ever-growing list of AI "dos and don'ts" could have a chilling effect on local economies, curbing AI adoption and encouraging technology companies and innovation-focused businesses to move to friendlier markets.

In Connecticut, the debate is unfolding just as state economic development officials launch multiple efforts to invest in artificial intelligence and emerging technologies, likening the initiative to a second industrial revolution. 

With the 2026 legislative session quickly approaching, state lawmakers believe that the coming months provide a chance to define how Connecticut will approach the technology moving forward. Leaders of last year’s regulation efforts say the state can’t afford to miss its next chance.

"There's definitely a debate over how strong our AI laws should be," said Senate Majority Leader Bob Duff, D-Norwalk. "But I will tell you that if you talk to average people on the streets, they're very concerned about AI and how it's going to impact them."

Lawmakers Are Gearing Up for Another Swing at AI Regulations

The Connecticut General Assembly’s record on passing AI-related measures is mixed. In recent years, state lawmakers have been able to push through a number of proposals, including data privacy regulation, new funding for AI training and education programs, and the criminalization of deepfake revenge porn. 

Comprehensive legislation, however, has been harder to get over the finish line.

Take Senate Bill 2, a wide-ranging proposal that sought to regulate how businesses use artificial intelligence in various ways, calling for the Department of Economic and Community Development to create a "regulatory sandbox" and seeking to limit the effects of algorithm-based discrimination. The bill was supported by Democratic leadership in the state Senate, and first emerged in 2024 after a state task force released a 255-page report on AI. 

Gov. Ned Lamont opposed the bill, arguing that the measure would contribute to a fractured landscape of state AI regulations. State officials also suggested that lawmakers were acting too early, potentially scaring off future innovation in Connecticut. 

Ultimately, S.B. 2 was amended to remove many of the business-related provisions and completely pulled references to algorithmic discrimination. While the amended bill passed the Senate with bipartisan support, the measure did not receive a vote in the House before the end of last year's session. 

For supporters of the legislation, the failure was frustrating, especially after last-minute amendments shifted the bill away from some of its original intent for the sake of broader appeal. "The bill had changed and become, I would say, more scaled back in the protections," said state Sen. James Maroney, D-Milford, the author of Senate Bill 2 and a leading voice in the legislature on data privacy and AI. 

"By the end of last year, [S.B. 2] was more of a disclosure bill, to use if AI was being used to make an important decision about your life," he said. 

In a December interview with the Connecticut Mirror, Maroney outlined his views on the state’s AI needs. He noted that he is far from an opponent of artificial intelligence, instead casting his desire for regulation as supporting the guardrails that will help structure the state’s future innovation efforts. 

He said last year's proposal would have provided those guardrails by accomplishing a multifaceted goal: "protecting" state residents, "promoting" responsible AI development, and "empowering" state government to use AI in ways that will benefit constituents. 

Senate lawmakers intend to continue their efforts to "protect, promote, and empower" this year, planning a package of data privacy and consumer protection reforms alongside support for AI training and workforce development. One such bill has already been announced: a ban on facial recognition software in retail stores. 

Both Maroney and Duff, the bill’s expected sponsors, said the measure was inspired by news that Wegmans Food Markets, a popular grocery chain, is using facial recognition software at some of its locations, including in its New York City grocery store. While the company said it's not sharing the data with any third parties, the news still sparked concern over the use and storage of biometric data.

"The facial recognition and the biometrics and voice recognition, I think, are issues that are really much different than a camera looking for a shoplifter," Duff said. 

The bill’s sponsors say they hope to enact the ban before facial recognition software becomes widely used in the state. Earlier in January, reporting from CT Insider found that ShopRite, a New Jersey-based grocery chain with several locations in Connecticut, was using facial recognition software in several local stores.

As New Legislation Comes Into Shape, Businesses in CT are Wary

A spokesperson for the governor said he wants to focus on regulations that "protect the privacy and safety of Connecticut residents."

"Governor Lamont continues to be supportive of any measures that protect the safety of residents when using AI, as well initiatives to upskill AI research and job training," Rob Blanchard, Lamont's spokesperson, said in an emailed statement. "While the federal landscape surrounding AI regulation continues to evolve, the Governor will continue to prioritize safety and education." 

State lawmakers who support regulating AI and data privacy told the Connecticut Mirror that their efforts are about ensuring state residents can engage with artificial intelligence on their own terms. In their view, regulation is both commonsense and necessary, and does not have to result in serious negative impacts for local businesses.

Some business leaders see things differently. The Connecticut Business and Industry Association, the state's largest trade group, has been critical of efforts to strongly regulate AI use, arguing that at a time when the economy is stagnant, energy and other costs continue to impact companies, and small business owners voice concern and frustration over the state’s business climate, new AI policy could hinder innovation. 

The adoption of new regulations on businesses "puts us at much more of a risk of being a less business-friendly state, and can really impact investment in the state, and the ability for small businesses to want to operate here in the state," said Chris Davis, CBIA’s vice president of public policy. "That can really hinder willingness to take advantage of the beneficial sides of artificial intelligence, the efficiencies that improve productivity and increase tax revenue for the state and really grow our economy." 

Davis said his concerns largely boil down to three points. First, there is a concern that proposed regulations in the state are blurring the lines between artificial intelligence and data privacy, creating a consistent "creep" of new regulations. 

Next is the question of how the state might enact and enforce policies, particularly those addressing algorithmic discrimination and requiring impact assessments to track business employment outcomes.

Research has found that because AI systems draw on existing information, some of which contains biased or inaccurate data, they can produce outputs that reinforce discrimination against marginalized communities, harming people based on their age, race and gender. Debate over the issue is also playing out in the courts through a lawsuit, Mobley v. Workday, which challenges some AI-based hiring systems as discriminatory.

Concerns over AI bias were a component of last year's legislative debate, with some lawmakers arguing that failing to address algorithmic bias would leave a massive "hole" in any state legislation.

Addressing algorithmic bias has proven to be a major focus in statehouses; lawmakers in more than 20 states introduced such measures in 2025. 

The push to address algorithmic discrimination through specific and repeated assessment was of particular concern to businesses in Connecticut, Davis said, because it suggested that “every business is discriminating unless they can somehow prove that they're not.” Davis said federal policy and state law — the Connecticut Fair Employment Practices Act, in particular — already require businesses not to discriminate.

Ultimately Connecticut lawmakers removed references to algorithmic discrimination from last year’s bill. 

Davis' final concern is the direct result of the other two: that by creating a wave of new regulations and then requiring businesses to keep track of how they are complying with them, the state could inadvertently limit AI growth by creating a system that is overly complicated, expensive and mired in paperwork. 

Some of these concerns, along with a growing business interest in having input on new state policies, are part of why CBIA recently launched a Technology Council, a group that will review and offer business industry perspectives on proposed state technology policy. The group is expected to be active in the coming year. 

Davis declined to discuss the pending facial recognition bill or other possible legislation that could emerge in the session, noting CBIA would prefer to comment after bills are introduced. Still, he said he hopes lawmakers will avoid enacting anything too rigid so that businesses have flexibility.

“We're in a situation where we need to be able to find ways to be more productive and more efficient here in the state,” he said. “And AI has that opportunity.” 

States are Leading the Way on AI Regulation. The Federal Government Wants to Change That.

Asked about the ideal form of AI regulation for the business community, Davis said business and industry concerns are largely rooted in the piecemeal nature of state action. If each state adopts differing levels of regulation around AI and data privacy measures, that would make it difficult for businesses and consumers alike to navigate issues across state lines.

More specifically, there is concern that Connecticut could end up on the stricter side of the regulatory divide, and that companies looking for looser standards might move somewhere else. This is part of why S.B. 2 proved controversial last year, and was a factor in why the pro-business Lamont voiced his preference for other states to take the lead on adopting AI regulations.

The earliest adopters of comprehensive AI regulation have also run into their own troubles. 

Colorado — for one — has emerged as a sort of national test case. In 2024, the state enacted the Colorado Artificial Intelligence Act, a comprehensive regulatory measure that addressed algorithmic discrimination. The law was the first broad measure approved at the state level, and the Colorado bill has been viewed as a model that could potentially influence other states looking to adopt regulations. 

The fairly new law continues to be a source of controversy ahead of its expected implementation later this year, with supporters and opponents remaining at odds as the state braces for higher than expected implementation costs. Colorado lawmakers are currently looking to revise the law in the current 2026 session. 

The continued discussion and delays in Colorado offer an early lesson: lawmakers in other states hoping to adopt comprehensive AI regulation will need to establish a variety of technical standards, from concise regulatory definitions to easily navigated financial frameworks and clearly structured review processes for businesses. 

At this point, many states seem more interested in adopting smaller, more incremental bills over large legislative packages. According to the National Conference of State Legislatures, almost every state considered an AI or consumer privacy bill in 2025, with further action expected in statehouses this year. 

For now, Connecticut also seems likely to take a more targeted approach in 2026, and early discussions at the start of the session seem likely to focus on data privacy.

“Whenever you bring up privacy issues, there's a lot of things that we can talk about,” Duff said.

As Connecticut lawmakers work through these questions, the federal government is looking to have its own say on state AI efforts. The Trump administration's December executive order warned states away from AI regulations, arguing that a patchwork of regulation could negatively affect interstate commerce. The administration instead supported a "carefully crafted national framework," a single federal standard establishing national rules on AI and related consumer protections.

The order also threatened to pull back leftover broadband deployment funds from states that have passed "onerous" laws around AI. 

The executive order arrived months after a previous effort to curtail state AI efforts failed in Congress, with lawmakers removing a proposed ten-year moratorium on state AI regulations from an earlier version of the president’s One Big Beautiful Bill Act over the summer.

Still, federal efforts to cut off state legislation may not have much of a chance. "A lot of the things in the executive order are — I don't want to say they're not enforceable, but they don't actually do that much," said Gowri Ramachandran, the director of elections and security for the Brennan Center for Justice, a legal and policy think tank housed at New York University's School of Law. She noted that state AI laws are likely on solid legal ground, adding that the administration lacks the power to directly take legal action against state measures. 

In Connecticut, lawmakers supporting new regulations say that in the absence of federal leadership, it is up to states to help put boundaries on artificial intelligence technologies. "Based on past precedent, there will not be a national standard," said Maroney, who joined Duff and other state lawmakers in signing a letter criticizing the president's AI executive order last month. "We haven't seen any federal laws between 1998 and last year." 

And as AI technology stands to see increased adoption in the near future, legislators say waiting any longer would be a mistake. 

"By not addressing or regulating in some way, shape, or form artificial intelligence, we make the same mistake that we did 30 years ago, when we did not put any kind of regulation or boundaries around the internet," Duff said. "That's a mistake to our society, to our country."
