AI as the power beast — can we tame it?

COMMENTARY | Companies are pledging to spend billions of dollars on data centers to meet AI's growing demand for computing power. But the buildout will strain resources, and other questions remain.
Today, humans and machines find themselves competing for both intellectual dominance and electric power. We know that artificial intelligence, in all its many applications, has an almost unlimited appetite for quality data, wherever it can be found. With more data, AI becomes more powerful.
AI is a resource-intensive technology: the data centers that run it require constant cooling and significant amounts of water to maintain a controlled internal environment. As AI capacity grows, so does the demand for electricity and water. Two examples of data center energy usage:
- According to the International Energy Agency, a typical AI data center today uses as much power as 100,000 households, and the largest centers now under construction will consume 20 times that amount.
- AI-specific servers in these data centers are estimated to have used between 53 and 76 terawatt-hours of electricity. On the high end, this is enough to power more than 7.2 million U.S. homes for a year.
As global energy demand rises on a warming planet and the appetite for ever-larger data centers grows, headline-grabbing announcements of new initiatives to develop or expand massive facilities keep arriving.
Coal-fired power plants that were about to be decommissioned have been granted long-term extensions. The Three Mile Island Nuclear Power Plant, site of the worst commercial nuclear accident in U.S. history, is bringing one of its undamaged reactors back online, with Microsoft serving as the primary customer. Microsoft frames the deal as a win: it secures much-needed energy while producing lower carbon emissions than the alternatives.
Responding to millions of AI chatbot queries daily puts sustained energy pressure on cloud infrastructure. Taken together, these basic AI inquiries can consume as much energy as powering approximately 150 average U.S. homes for a year, or driving a gasoline-powered car more than 1 million miles.
Some savvy tech leaders are urging people to stop saying things like “please” and “thank you” to chatbots. The rationale is that such pleasantries add to the text the model must process, and the AI often responds to the politeness in more detail than necessary, consuming precious computing resources and energy. A deeper dive into per-query usage reveals some interesting statistics, for example:
- Google Search: roughly 0.0003 kilowatt-hours (0.3 watt-hours) per query
- ChatGPT: roughly 0.0029 kWh (2.9 watt-hours) per query, about 10 times as much
These comparisons illustrate two things: an individual AI query uses roughly 10 times the energy of a traditional web search, and the broader infrastructure supporting AI consumes energy on a scale comparable to powering millions of homes, or a significant share of a national electricity grid.
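For readers who want to see how those per-query figures scale, here is a back-of-envelope sketch in Python. The per-query energies are the ones cited above; the daily query volume and the average U.S. household consumption (about 10,500 kWh per year) are illustrative assumptions, not reported figures.

```python
# Back-of-envelope scaling of per-query energy to daily and annual totals.
# Per-query figures come from the comparison above; the query volume and
# household consumption are illustrative assumptions only.

GOOGLE_KWH_PER_QUERY = 0.0003    # ~0.3 watt-hours per search
CHATGPT_KWH_PER_QUERY = 0.0029   # ~2.9 watt-hours per chatbot request

ASSUMED_QUERIES_PER_DAY = 100_000_000  # assumption: 100 million chatbot queries per day
US_HOME_KWH_PER_YEAR = 10_500          # assumption: average U.S. household use per year

daily_kwh = CHATGPT_KWH_PER_QUERY * ASSUMED_QUERIES_PER_DAY
homes_equivalent = daily_kwh * 365 / US_HOME_KWH_PER_YEAR
ratio = CHATGPT_KWH_PER_QUERY / GOOGLE_KWH_PER_QUERY

print(f"Energy for {ASSUMED_QUERIES_PER_DAY:,} chatbot queries: {daily_kwh:,.0f} kWh per day")
print(f"Sustained year-round, that is roughly {homes_equivalent:,.0f} U.S. homes' annual use")
print(f"One chatbot query uses about {ratio:.1f}x the energy of a web search")
```

Even under these rough assumptions, the arithmetic shows how quickly per-query costs compound: a hundred million daily queries translate into the annual electricity consumption of roughly ten thousand homes.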
As the planet warms, air conditioning already accounts for about 7% of global electricity use, a share that is expected to grow year over year. Direct comparisons with AI are difficult, because data centers often serve multiple purposes. Still, some estimates suggest that by 2030 to 2035, data centers could account for as much as 20% to 30% of global electricity consumption.
Meanwhile, the world’s largest tech companies are executing unprecedented expansions of data center infrastructure, driven by the explosive growth of artificial intelligence, cloud computing and digital services. Recent corporate announcements, when combined, represent trillions of dollars in planned investments in U.S. data center and technology infrastructure.
Some examples: Apple has committed to investing more than $500 billion in the U.S. over four years, with a portion allocated to AI infrastructure, data centers, and silicon research and development. Microsoft is building a $1 billion data center campus in Mount Pleasant, Wisconsin, as part of a larger plan to boost cloud and AI capabilities; the site was chosen for its reliable power infrastructure and skilled workforce, and is expected to be operational by the end of 2025.
Others are doing similar work. Oracle, in partnership with OpenAI and SoftBank, is developing the $100 billion “Stargate” project in Texas, focused on AI infrastructure and integrating renewable energy sources such as solar and wind. This is part of a $500 billion, five-year strategy to expand AI and cloud computing.
And Facebook parent company Meta is constructing an $800 million data center in Kansas City, Missouri, with a commitment to 100% renewable energy and advanced cooling technologies. Meta is also planning a facility larger than New York’s Central Park.
Meanwhile, Amazon Web Services has committed $100 billion for global expansion, including dozens of new hyperscale data centers, with a strong focus on renewable energy and high-efficiency cooling. And Google parent Alphabet has committed $75 billion in capital spending, mainly on AI infrastructure and data centers. The company is also investing in eco-friendly data centers in Europe, especially in Germany and the Netherlands, using geothermal and solar cooling systems.
Despite the positive news about private-sector commitments to meet current and future energy demand, most of the projects mentioned above will take at least three to 10 years to complete. Meanwhile, existing energy demands are already leading to higher prices for everyone.
Some key questions still require answers:
- Will AI applications and services remain affordable, or will a widening gap emerge between those with access to AI and those without?
- How could AI itself be used to improve data center efficiency and reduce power consumption?
- To what extent might companies' go-it-alone approach backfire?
- Given the massive investments in AI resources in a largely unregulated environment, could the result be lower returns on investment?
- Who is responsible for the costs if and when failures happen?
Ultimately, even if humans eventually succeed in securing long-term, sustainable energy, we keep fueling the AI beast, and it keeps growing more powerful. It is in constant learning mode, absorbing everything related to humans: our history, language, events, nuances and energy consumption, among other things. And we keep feeding it with little concern for how its powers could soon surpass our own.
Alan R. Shark is an associate professor at the Schar School for Policy and Government, George Mason University, where he also serves as a faculty member at the Center for Human AI Innovation in Society (CHAIS). Shark is also a senior fellow and former Executive Director of the Public Technology Institute (PTI). He is a National Academy of Public Administration Fellow and Founder and Co-Chair of the Standing Panel on Technology Leadership. Shark is the host of the podcast Sharkbytes.net.




