California investigates Elon Musk’s AI company after ‘avalanche’ of complaints about sexual content

Attorney General Rob Bonta speaks during a press conference at the Office of the Attorney General in Sacramento, California, on Oct. 28, 2025. Tayfun Coskun/Anadolu via Getty Images

Attorney General Rob Bonta said his office is looking into whether a new AI image editing tool from Elon Musk's company violates California law.

This article was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.

California Attorney General Rob Bonta today announced an investigation into whether and how Elon Musk’s X and xAI broke the law in the past few weeks by enabling the spread of naked or sexual imagery without consent.

xAI reportedly updated its Grok artificial intelligence tool last month to allow image editing. Users on the social media platform X, which is connected to the tool, began using Grok to remove clothing in pictures of women and children.

“The avalanche of reports detailing the non-consensual sexually explicit material that xAI has produced and posted online in recent weeks is shocking,” Bonta said in a written statement. “This material, which depicts women and children in nude and sexually explicit situations, has been used to harass people across the internet. I urge xAI to take immediate action to ensure this goes no further.”

Bonta urged Californians who want to report depictions of them or their children undressed or committing sexual acts to visit oag.ca.gov/report. In an emailed response, xAI did not address questions about the investigation.

Research obtained by Bloomberg found that X users posted more non-consensual naked or sexual imagery than users of any other website. In a post on X, Musk promised “consequences” for people who made illegal content with the tool. On Friday, Grok limited image editing to paying subscribers.

One potential route for Bonta to prosecute xAI is a law that took effect just two weeks ago, which creates legal liability for the creation and distribution of “deepfake” pornography.

X and xAI appear to be violating the provisions of that law, known as AB 621, said Sam Dordulian, who previously worked in the sex crimes unit of the Los Angeles District Attorney’s Office and now works in private practice representing people in cases involving deepfakes or revenge porn.

Assemblymember Rebecca Bauer-Kahan, author of the law, told CalMatters in a statement last week that she reached out to prosecutors, including the attorney general’s office and the city attorney of San Francisco, to remind them that they can act under the law. What's happening on X, Bauer-Kahan said, is what AB 621 was designed to address. 

“Real women are having their images manipulated without consent, and the psychological and reputational harm is devastating,” the San Ramon Democrat said in an emailed statement. “Underage children are having their images used to create child sexual abuse material, and these websites are knowingly facilitating it.”

A Global Concern

Bonta’s inquiry also comes shortly after a call for an investigation by Gov. Gavin Newsom, backlash from regulators in the European Union and India and bans on X in Malaysia, Indonesia, and potentially the United Kingdom. As Grok app downloads rise in Apple and Google app stores, lawmakers and advocates are calling for the smartphone makers to prohibit the application.

Why xAI built the feature the way it did and how it will respond to the controversy around it is unclear, and answers may not be forthcoming: an analysis recently concluded that Grok is the least transparent of the major AI systems available today.

Evidence of concrete harm from deepfakes is piling up. In 2024, the FBI warned that the use of deepfake tools to extort young people is a growing problem that has led to instances of self-harm and suicide. Multiple audits have found child sexual abuse material inside the training data of AI models, making the models capable of generating explicit photos. A 2024 Center for Democracy and Technology survey found that 15% of high school students had heard of or seen sexually explicit imagery of someone they know at school in the past year.

The investigation announced today is the latest action by the attorney general to push AI companies to keep kids safe. Late last year, Bonta endorsed a bill that would have prevented chatbots that discuss self-harm and engage in sexually explicit conversations from interacting with people under 18. He also joined attorneys general from 44 other states in sending a letter questioning why companies like Meta and OpenAI allow their chatbots to have sexually inappropriate conversations with minors.

California has passed roughly half a dozen laws since 2019 to protect people from deepfakes. The new law by Bauer-Kahan amends and strengthens a 2019 law, most significantly by allowing district attorneys to bring cases against companies that “recklessly aid and abet” the distribution of deepfakes without the consent of the person depicted nude or committing sexual acts. That means the average person can ask the attorney general or the district attorney where they live to file a case on their behalf. It also increases the maximum amount that a judge can award a person from $150,000 to $250,000. Under the law, a public prosecutor is not required to prove that an individual depicted in an AI-generated nude or sexual image suffered actual harm in order to bring a case to court. Websites that refuse to comply within 30 days can face penalties of $25,000 per violation.

In addition to those measures, two 2024 laws (AB 1831 and SB 1381) expand the state’s definition of child pornography to make possession or distribution of artificially generated child sexual abuse material illegal. Another requires social media platforms to give people an easy way to request the immediate removal of a deepfake and defines the posting of such material as a form of digital identity theft. A California law limiting the use of deepfakes in elections was signed into law last year but was struck down by a federal judge last summer following a lawsuit by X and Elon Musk.

Future Reforms

Every new state law gives lawyers like Dordulian another avenue to address harmful uses of deepfakes, but he said more needs to be done to help people protect themselves. His clients face challenges proving violations of existing laws, he said, because those laws require distribution of explicit material, for example through a messaging app or social media platform, before protections kick in. In his experience, people who use nudify apps typically know each other, so distribution doesn’t always take place, and when it does, it can be hard to prove.

For example, he said, he has a client who works as a nanny and alleges that the father of the children she cares for made images of her using photos she posted on Instagram. The nanny found the images on his iPad. The discovery was disturbing and caused her emotional trauma, but because he can’t use the deepfake laws, he has to sue on the basis of negligence or emotional distress, relying on laws that were never created to address deepfakes. Similarly, victims told CNBC last year that the distinction between creating and distributing deepfakes leaves a gap in the law in a number of U.S. states.

“The law needs to keep up with what’s really happening on the ground and what women are experiencing, which is just the simple act of creation itself is the problem,” Dordulian said.

California is at the forefront of passing laws to protect people from deepfakes, but existing law isn’t meeting the moment, said Jennifer Gibson, cofounder and director of Psst, a group created a little over a year ago that provides pro bono legal services to tech and AI workers interested in whistleblowing. A California law that went into effect Jan. 1 protects whistleblowers inside AI companies, but only if they work on catastrophic risks that could kill more than 50 people or cause more than $1 billion in damages. If the law also covered people who work on deepfakes, Gibson said, the former X employees who told Business Insider last year that they witnessed Grok generating illegal sexually explicit material would have had protections had they shared the information with authorities.

“There needs to be a lot more protection for exactly this kind of scenario in which an insider sees that this is foreseeable, knows that this is going to happen, and they need somewhere to go to report, both to keep the company accountable and to protect the public,” Gibson said.

Originally published on themarkup.org
