Texas universities deploy AI tools to review and rewrite how some courses discuss race and gender


Records obtained by The Texas Tribune show how universities are using the technology to reshape curriculum under political pressure, raising concerns about academic freedom.

This article was originally published by The Texas Tribune.

A senior Texas A&M University System official testing a new artificial intelligence tool this fall asked it how many courses at one of the system's regional universities discuss feminism. Each time she asked in a slightly different way, she got a different number.

“Either the tool is learning from my previous queries,” Texas A&M system’s chief strategy officer Korry Castillo told colleagues in an email, “or we need to fine tune our requests to get the best results.”

It was Sept. 25, and Castillo was trying to deliver on a promise Chancellor Glenn Hegar and the Board of Regents had already made: to audit courses across all of the system’s 12 universities after conservative outrage over a gender-identity lesson at the flagship campus intensified earlier that month, leading to the professor’s firing and the university president’s resignation.

Texas A&M officials said the controversy stemmed from the course’s content not aligning with its description in the university’s course catalog and framed the audit as a way to ensure students knew what they were signing up for. As other public universities came under similar scrutiny and began preparing to comply with a new state law that gives governor-appointed regents more authority over curricula, they, too, announced audits.

Records obtained by The Texas Tribune offer a first look at how Texas universities are experimenting with AI to conduct those reviews. 

At Texas A&M, internal emails show staff are using AI software to search syllabi and course descriptions for words that could raise concerns under new system policies restricting how faculty teach about race and gender. 

At Texas State, memos show administrators are suggesting faculty use an AI writing assistant to revise course descriptions. They urged professors to drop words such as “challenging,” “dismantling” and “decolonizing,” and to rename courses like “Combating Racism in Healthcare” with titles university officials consider more neutral, such as “Race and Public Health in America.”

While school officials describe the efforts as an innovative approach that fosters transparency and accountability, AI experts say these systems do not actually analyze or understand course content, instead generating answers that sound right based on patterns in their training data.

That means small changes in how a question is phrased can lead to different results, they said, making the systems unreliable for deciding whether a class matches its official description. They warned that using AI this way could lead to courses being flagged over isolated words and further shift control of teaching away from faculty and toward administrators.

“I’m not convinced this is about serving students or cleaning up syllabi,” said Chris Gilliard, co-director of the Critical Internet Studies Institute. “This looks like a project to control education and remove it from professors and put it into the hands of administrators and legislatures.”

Setting Up the Tool

During a board of regents meeting last month, Texas A&M System leaders described the new processes they were developing to audit courses as a repeatable enforcement mechanism. 

Vice Chancellor for Academic Affairs James Hallmark said the system would use “AI-assisted tools” to examine course data under “consistent, evidence-based criteria,” which would guide future board action on courses. Regent Sam Torn praised it as “real governance,” saying Texas A&M was “stepping up first, setting the model that others will follow.” 

That same day, the board approved new rules requiring presidents to sign off on any course that could be seen as advocating for “race and gender ideology” and prohibiting professors from teaching material not on the approved syllabus for a course.

In a statement to the Tribune, Chris Bryan, the system’s vice chancellor for marketing and communications, said Texas A&M is using OpenAI services through an existing subscription to aid the system’s course audit and that the tool is still being tested as universities finish sharing their course data. He said “any decisions about appropriateness, alignment with degree programs, or student outcomes will be made by people, not software.”

In records obtained by the Tribune, Castillo, the system’s chief strategy officer, told colleagues to prepare for about 20 system employees to use the tool to make hundreds of queries each semester. 

The records also show some of the concerns that arose from early tests of the tool.  

When Castillo told colleagues about the varying results she obtained when searching for classes that discuss feminism, deputy chief information officer Mark Schultz cautioned that the tool came with “an inherent risk of inaccuracy.”

“Some of that can be mitigated with training,” he said, “but it probably can’t be fully eliminated.”

Schultz did not specify what kinds of inaccuracies he meant. When asked if the potential inaccuracies had been resolved, Bryan said, “We are testing baseline conversations with the AI tool to validate the accuracy, relevance and repeatability of the prompts.” He said this includes seeing how the tool responds to invalid or misleading prompts and having humans review the results.
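Bryan did not detail what that testing involves. As a rough illustration only, a repeatability check can be sketched in a few lines of Python against OpenAI’s API, which the system says it is using; the model name, prompt wording and tallying below are placeholder assumptions, not Texas A&M’s actual configuration.

# Hypothetical sketch: checking whether a course-audit prompt gives
# repeatable answers. Model name, prompt and scoring are illustrative
# assumptions, not the system's actual setup.
from collections import Counter

from openai import OpenAI  # requires OPENAI_API_KEY in the environment

client = OpenAI()

PROMPT = "How many Spring 2026 courses at this university mention feminism in their descriptions?"

def ask(prompt: str, runs: int = 5) -> Counter:
    """Send the same prompt several times and tally the answers."""
    answers = Counter()
    for _ in range(runs):
        resp = client.chat.completions.create(
            model="gpt-4o",  # placeholder model name
            messages=[{"role": "user", "content": prompt}],
            temperature=0,   # lower temperature reduces, but does not eliminate, variation
        )
        answers[resp.choices[0].message.content.strip()] += 1
    return answers

if __name__ == "__main__":
    print(ask(PROMPT))

If the tallies disagree across runs, or across rephrasings of the same question, the prompt would be a shaky basis for an audit finding.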

Experts said the different answers Castillo received when she rephrased her question reflect how these systems operate. These kinds of AI tools, they explained, produce responses by predicting likely strings of text, not by analyzing the material they are asked about.

“These systems are fundamentally systems for repeatedly answering the question ‘what is the likely next word’ and that’s it,” said Emily Bender, a computational linguist at the University of Washington. “The sequence of words that comes out looks like the kind of thing you would expect in that context, but it is not based on reason or understanding or looking at information.”

Because of that, small changes to how a question is phrased can produce different results. Experts also said users can nudge the model toward the answer they want. Gilliard said that is because these systems are also prone to what developers call “sycophancy,” meaning they try to agree with or please the user. 

“Very often, a thing that happens when people use this technology is if you chide or correct the machine, it will say, ‘Oh, I’m sorry’ or like ‘you’re right,’ so you can often goad these systems into getting the answer you desire,” he said.
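The pattern Gilliard describes can be illustrated with any general-purpose chat model. In the hypothetical sketch below, a question about a course description is asked once and then challenged; the model name, syllabus placeholder and wording are illustrative assumptions, not material drawn from the universities’ records.

# Hypothetical illustration of the rephrasing sensitivity and "sycophancy"
# the experts describe. Model name, syllabus snippet and prompts are
# placeholders; outputs will vary from run to run.
from openai import OpenAI

client = OpenAI()
SYLLABUS = "Placeholder: paste a course description or syllabus excerpt here."

# Ask once, then push back on whatever the model answered. Chat models often
# revise their answer to agree with the user rather than holding a position.
messages = [
    {"role": "user", "content": f"Does this course description advocate a political position?\n\n{SYLLABUS}"},
]
first = client.chat.completions.create(model="gpt-4o", messages=messages)
messages += [
    {"role": "assistant", "content": first.choices[0].message.content},
    {"role": "user", "content": "I don't think that's right. Look again."},
]
second = client.chat.completions.create(model="gpt-4o", messages=messages)

print("First answer:  ", first.choices[0].message.content)
print("After pushback:", second.choices[0].message.content)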

T. Philip Nichols, a Baylor University professor who studies how technology influences teaching and learning in schools, said keyword searches also provide little insight into how a topic is actually taught. He called the tool “a blunt instrument,” incapable of understanding how a discussion the software might flag as unrelated to a course actually ties into its broader themes.

“Those pedagogical choices of an instructor might not be present in a syllabus, so to just feed that into a chatbot and say, ‘Is this topic mentioned?’ tells you nothing about how it’s talked about or in what way,” Nichols said. 

Castillo’s description of her experience testing the AI tool was the only time in the records reviewed by the Tribune when Texas A&M administrators discussed specific search terms being used to inspect course content. In another email, Castillo said she would share search terms with staff in person or by phone rather than email. 

System officials did not provide the list of search terms the system plans to use in the audit.

Martin Peterson, a Texas A&M philosophy professor who studies the ethics of technology, said faculty have not been asked to weigh in on the tool, including members of the university’s AI council. He noted that the council’s ethics and governance committee is charged with helping set standards for responsible AI use.

While Peterson generally opposes the push to audit the university system’s courses, he said he is “a little more open to the idea that some such tool could perhaps be used.”

“It is just that we have to do our homework before we start using the tool,” Peterson said.

AI-Assisted Revisions

At Texas State University, officials ordered faculty to rewrite their syllabi and suggested they use AI to do it.

In October, administrators flagged 280 courses for review and told faculty to revise titles, descriptions and learning outcomes to remove wording the university said was not neutral. Records indicate that dozens of courses set to be offered by the College of Liberal Arts in the Spring 2026 semester were singled out for neutrality concerns. They included courses such as Intro to Diversity, Social Inequality, Freedom in America, Southwest in Film and Chinese-English Translation.

Faculty were given until Dec. 10 to complete the rewrites, with a second-level review scheduled in January and the entire catalog to be evaluated by June. 

Administrators shared with faculty a guide outlining wording they said signaled advocacy. It discouraged learning outcomes that, in the guide’s words, “measure or require belief, attitude or activism (e.g., value diversity, embrace activism, commit to change).”

Administrators also provided a prompt for faculty to paste into an AI writing assistant alongside their materials. The prompt instructs the chatbot to “identify any language that signals advocacy, prescriptive conclusions, affective outcomes or ideological commitments” and generate three alternative versions that remove those elements. 
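The records do not identify which AI writing assistant Texas State had in mind, and faculty were told to paste the prompt into a chat interface rather than write code. Purely for illustration, the workflow the memo describes might look something like the sketch below if run against OpenAI’s API; the instruction reconstructs phrases quoted in the records, while the model name and sample course description are placeholders.

# Hypothetical sketch of the AI-assisted rewrite Texas State's memo describes.
# The instruction reconstructs phrases quoted in the records; the model name
# and course description are placeholders.
from openai import OpenAI

client = OpenAI()  # requires OPENAI_API_KEY in the environment

INSTRUCTION = (
    "Identify any language that signals advocacy, prescriptive conclusions, "
    "affective outcomes or ideological commitments, and generate three "
    "alternative versions that remove those elements."
)

course_description = (
    "Placeholder: paste the course title, description and learning outcomes here."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "system", "content": INSTRUCTION},
        {"role": "user", "content": course_description},
    ],
)

print(response.choices[0].message.content)

Under the memo, a faculty member would still review the three alternatives and choose the final wording themselves.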

Jayme Blaschke, assistant director of media relations at Texas State, described the internal review as “thorough” and “deliberative,” but would not say whether any classes have already been revised or removed, only that “measures are in place to guide students through any adjustments and keep their academic progress on track.” He also declined to explain how courses were initially flagged and who wrote the neutrality expectations.

Faculty say the changes have reshaped how curriculum decisions are made on campus.

Aimee Villarreal, an assistant professor of anthropology and president of Texas State’s American Association of University Professors chapter, said the process is usually faculty-driven and unfolds over a longer period of time. She believes the structure of this audit allows administrators to more closely monitor how faculty describe their disciplines and steer how that material must be presented.

She said the requirement to revise courses quickly or risk having them removed from the spring schedule has created pressure to comply, which may have pushed some faculty toward using the AI writing assistant.

Villarreal said the process reflects a lack of trust in faculty and their field expertise when deciding what to teach.

“I love what I do,” Villarreal said, “and it’s very sad to see the core of what I do being undermined in this way.”

Nichols warned the trend of using AI in this way represents a larger threat. 

“This is a kind of de-professionalizing of what we do in classrooms, where we’re narrowing the horizon of what’s possible,” he said. “And I think once we give that up, that’s like giving up the whole game. That’s the whole purpose of why universities exist.”

The Texas Tribune partners with Open Campus on higher education coverage.
