Slow Digest: AI, Environmental Problem or Miracle Solution?

This week’s edition of Slow Digest is written by C21 Graduate Fellows Yuchen Zhao and Ceceilia Loeschmann.

Artificial intelligence (AI) is everywhere these days, used for everything from writing emails to creating images and powering research. But behind the convenience is a growing question: What does all this computing cost the planet? As concerns about energy use, carbon emissions, and sustainability continue to grow, the environmental impact of AI has become a hot topic. In the Q&A that follows, two C21 Graduate Fellows — Yuchen Zhao and Ceceilia Loeschmann — go head-to-head, taking opposing sides on whether AI is an environmental problem…or part of the solution. 

Q1: How does AI impact the environment? 

Yuchen: Gibson’s Affordance Theory, introduced in 1979, proposes that environments are full of affordances: action possibilities available to an organism, relative to its capabilities. He argues that what we perceive first are affordances, the “values and meanings” built into an environment, what it offers us “for good or ill.” From this perspective, AI changes the affordances of our socio-technical environment: it doesn’t just sit there as “new technology,” it opens up new possibilities for perception and action. It lets planners, policymakers, and researchers “see” patterns in air pollution, water quality, or land use that would otherwise remain invisible, giving governments, NGOs, and communities earlier warning and more precise targets for intervention. It can also enhance efficiency. The United Nations Environment Programme, for example, uses AI to detect when oil and gas installations vent methane, a greenhouse gas that drives climate change. In cities, AI-enhanced tools can help planners simulate different land-use or transit scenarios and choose options that reduce emissions and protect vulnerable communities. 

So I don’t see AI as simply “good” or “bad” for the environment. It’s a technology that introduces new environmental risks and new environmental affordances. The key question is how we steer it: can we design and regulate AI so that the dominant uses are those that support decarbonization, environmental justice, and better decision-making, while actively working to shrink its own material footprint? 

Ceceilia: AI technology, particularly generative AI technologies (such as ChatGPT and Claude), harms the environment in several ways. When we think of AI and the environment, we generally envision AI data centers. These data centers are specialized facilities that train and run AI models using specialized Tensor Processing Units (TPUs) and/or Graphics Processing Units (GPUs) within servers.  They require a huge amount of electricity and water in order to operate.   

Researchers from the Lawrence Berkeley National Laboratory say that a single medium-sized data center consumes around 300,000 gallons of water a day, or about as much as 1,000 U.S. households. Cornell researchers found that the rapid growth of AI data centers is projected to consume 731 to 1,125 million cubic meters of water annually by 2030. This massive consumption is driven by the evaporative cooling systems that are often used to cool the processor chips within the data centers. The water for these cooling systems can come from either potable (drinkable) or non-potable freshwater sources, depending on the location of the data center. Many data centers operate where water is scarce, due to lower land costs, which causes a range of problems. Often the costs for cooling these data centers fall on local communities through higher utility rates and infrastructure strain. Locally, there is currently a bill moving forward in the Wisconsin Assembly proposing water recycling and utility rate requirements for data centers. Wisconsin has recently emerged as an area for data center development due to the state’s access to water and available land.  

Powering data centers and generative AI models also requires a large amount of electricity. According to the Pew Research Center, data centers accounted for 4% of total U.S. electricity use in 2024. The International Energy Agency found that a typical AI data center consumes as much electricity as 100,000 households, but the largest ones under construction today are estimated to consume 20 times as much. Additionally, some data centers rely on on-site diesel generator fleets or even methane gas generators for backup power to meet the immense energy demands of running AI models, which contributes to noise and air pollution. There are also negative environmental impacts from creating TPUs and GPUs, which require the mining of rare earth minerals. Overall, the current path that AI technologies are taking is not environmentally sustainable. 


Q2: Can AI play a role in fighting climate change? 

Yuchen: Yes, especially if we treat AI as an affordance for better environmental governance rather than as a panacea. In environmental monitoring, AI systems can process huge volumes of data from satellites, drones, and ground sensors to give a much more accurate, real-time picture of environmental conditions. This affords quicker, more targeted actions—for example, predicting floods, tracking air pollution plumes, or detecting illegal land-use change. 

In cities, AI-driven urban decision support systems can help optimize traffic flows, building energy use, and infrastructure planning, allowing planners to test low-carbon scenarios before they are built.  In Gibson’s terms, AI enriches the environment with new informational structures that afford more sustainable decisions—if institutions actually pick up and use those possibilities. The problem is not that AI can’t help; it’s that governance, equity, and policy frameworks often fall behind the technical capabilities. 

The same AI infrastructure that powers chatbots can also afford large-scale efficiency gains in the background of everyday life. For example, Google reports that just five AI-enabled products, including fuel-efficient routing in Google Maps and its Green Light traffic signal optimization, helped avoid an estimated 26 million metric tons of CO₂ in 2024—more than double Google’s own total emissions that year. 

Ceceilia: Yes, if the tech giants behind these AI technologies can successfully transition toward sustainable energy, and if regulation frameworks can be put into place. Companies like Amazon, Meta, Microsoft, and Google have all set climate goals, hoping to achieve net-zero emissions within the next couple of decades. But these goals aren’t being met. For example, Google’s total emissions have actually grown in the last few years, partly due to the company’s expansion in AI, as it admitted in its 2025 Environmental Report. The company reported that its carbon emissions have grown 51% since 2019, but some accounts say the real number may be as high as 65%. 


Q3: How does everyday AI use contribute to energy consumption? 

Yuchen: Everyday AI use does require energy. Each chatbot query, image generation, or AI-assisted email runs on servers in data centers. But I think it’s more useful to see this not just as “extra consumption,” but as an investment in systems that can dramatically reduce energy use elsewhere. It opens up possibilities for more efficient action by offering lower-carbon choices: routing computation to cleaner grids, using more efficient models by default, or nudging users toward high-impact uses (like environmental monitoring, planning, and public services) rather than pure convenience or novelty. 

For instance, AI already sits behind tools many people use every day, like navigation apps that suggest fuel-efficient routes or real-time transit options. When millions of drivers follow routes that reduce idling and congestion, the net effect can be lower fuel use and emissions—even though each routing request uses some electricity in a data center. Similarly, AI systems embedded in smart buildings continuously adjust heating, cooling, and lighting based on occupancy and weather. That means everyday AI decisions in the background can trim energy use in offices, campuses, and public buildings far more than they consume. And AI-based environmental monitoring, like air-quality alerts or flood-risk dashboards that people check on their phones, relies on data centers too, but supports earlier, more targeted responses that protect both people and infrastructure. 

Everyday AI use has an energy cost, but if we design and govern these systems with sustainability in mind, the actions they enable, from more efficient mobility to smarter buildings and infrastructure, can produce net reductions in energy use and emissions that outweigh the footprint of individual queries. 

Ceceilia: This depends on the type and size of the AI model an individual uses day to day, and on what they are asking AI to do. A survey by the Pew Research Center found that 73% of Americans would be willing to let AI assist them at least a little with their day-to-day activities. Depending on what people ask, and on which AI platform, one query can be far more energy-intensive and emissions-producing than another. 

The Washington Post and the University of California recently released a study showing that using OpenAI’s ChatGPT-4 model to generate a 100-word email requires 519 milliliters of water and 0.14 kilowatt-hours (kWh) of electricity. That’s slightly more than a full water bottle, and enough energy to power 14 LED light bulbs for an hour. 

According to another analysis done by the MIT Technology Review, generating a single standard-quality image (1024 x 1024 pixels) with Stable Diffusion 3 Medium uses an estimated 2,282 joules total. The same report estimates creating a high-quality five-second video on one of the best AI video models can consume over 3.4 million joules. That’s equivalent to running a microwave for over an hour.   

But, before these models can even fulfill a request, they need to be trained. The same Washington Post study found that Meta used 22 million liters of water training its LLaMA-3 AI model — that’s the volume of 8.8 Olympic-sized swimming pools!  
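Curious readers can sanity-check the equivalences quoted above. Here is a quick back-of-envelope calculation in Python; the reference values (a ~10-watt LED bulb, a ~900-watt microwave, a 2.5-million-liter Olympic pool) are common ballpark figures we supply for illustration, not numbers taken from the studies themselves:

```python
# Back-of-envelope checks for the figures quoted above.
# Reference values are illustrative assumptions, not study data.
LED_BULB_W = 10             # a typical LED bulb draws roughly 10 watts
MICROWAVE_W = 900           # a typical microwave draws roughly 900 watts
OLYMPIC_POOL_L = 2_500_000  # an Olympic pool holds about 2.5 million liters

# 100-word email: 0.14 kWh = 140 watt-hours -> how long it runs 14 LED bulbs
email_wh = 0.14 * 1000
print(email_wh / (14 * LED_BULB_W), "hours of light from 14 bulbs")  # ~1.0

# Five-second AI video: 3.4 million joules -> minutes of microwave use
video_j = 3.4e6
print(video_j / MICROWAVE_W / 60, "microwave minutes")  # ~63, i.e. over an hour

# LLaMA-3 training water: 22 million liters -> Olympic pools
print(22_000_000 / OLYMPIC_POOL_L, "Olympic pools")  # 8.8
```

Each of the article’s comparisons comes out roughly right under these assumptions, which is the point of such equivalences: they translate unfamiliar units into everyday quantities.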


Q4: Is there such a thing as “green AI”? 

Yuchen: I’d say “green AI” is possible, but it’s not automatic. AI doesn’t become green just because it’s “clever”; it becomes green when we deliberately change how it’s built, where it runs, and what we use it for. 

On the technical side, green AI means, as one line of sustainable-AI research defines it, incorporating sustainable practices and techniques into model design, training, and deployment to reduce the associated environmental cost and carbon footprint. That means designing models and infrastructure to use less energy and cleaner energy: more efficient algorithms, smaller or better-optimized models instead of defaulting to the biggest ones, data centers powered by renewables, and careful scheduling of workloads to times and places with low-carbon electricity. 

There’s already work on “energy-aware” training, model compression, and reporting the carbon footprint of major training runs so that researchers can compare impact, not just accuracy. The most promising approaches include algorithm optimization, hardware optimization, data center optimization, and pragmatic reductions in model scale. 
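To make the workload-scheduling idea concrete, here is a minimal sketch of carbon-aware scheduling: given an hourly forecast of grid carbon intensity, run a flexible job during the cleanest hours. The forecast numbers and the function are hypothetical illustrations, not a real grid API:

```python
# Minimal sketch of carbon-aware workload scheduling.
# Hypothetical hourly forecast of grid carbon intensity (gCO2 per kWh);
# a real system would pull this from a grid-data provider.
forecast = {0: 420, 3: 390, 6: 310, 9: 220, 12: 180, 15: 210, 18: 350, 21: 410}

def greenest_hours(forecast, hours_needed):
    """Return the forecast hours with the lowest carbon intensity."""
    ranked = sorted(forecast, key=forecast.get)  # cleanest hours first
    return sorted(ranked[:hours_needed])

# Schedule a flexible 3-hour training job into the cleanest window of the day.
print(greenest_hours(forecast, 3))
```

The same logic generalizes to places as well as times: routing computation to data centers on cleaner regional grids is the spatial version of this temporal shifting.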

Equally important is the purpose of the AI system. An AI tool that helps a city cut building energy use, optimize transit, or target climate adaptation funding has a very different environmental profile than one used only for generating endless novelty images. “Green AI” isn’t just about efficient chips; it’s about prioritizing high-impact, climate-relevant applications while pushing companies to shrink and clean up the underlying infrastructure. 

So yes, green AI can exist—but only if we treat low energy use and real-world climate benefits as core design goals, not as an afterthought. 

Ceceilia: Right now, we have no large-scale “green” AI. It could become possible if data centers moved away from fossil fuels and were instead powered by renewable energy sources like solar, wind, or nuclear. However, these sources take time to build and can’t keep up with the current AI boom. For example, the U.S. Department of Energy states that widespread commercial nuclear reactors are not likely to arrive until the 2030s. 

Additionally, the regulations requiring data centers to disclose their water and energy consumption are currently murky. This lack of standardized reporting makes it difficult to assess progress toward sustainability goals. 

There’s also a lack of business incentives, as James Temple puts it: 

“We’ve built and paid for a global economy that spews out planet-warming gases, investing trillions of dollars in power plants, steel mills, factories, jets, boilers, water heaters, stoves, and SUVs that run on fossil fuels. And few people or companies will happily write off those investments so long as those products and plants still work. AI can’t remedy all that just by generating better ideas. To raze and replace the machinery of every industry around the world at the speed now required, we will need increasingly aggressive climate policies that incentivize or force everyone to switch to cleaner plants, products, and practices.” 


Q5: How does AI change how we imagine the future of the planet? 

Yuchen: We increasingly use AI to see the future through simulations, visualizations, and stories. Those AI-powered visions are starting to shape what we believe is possible, urgent, or inevitable for the planet. On one hand, it feeds a story that technology will save us: glossy visuals of smart cities, perfectly optimized energy systems, and climate models that can predict everything. On the other hand, it also fuels a darker imagination—of endless data centers, runaway consumption, and automated systems deepening surveillance and extraction. So AI doesn’t give us one future; it sharpens the contrast between competing ones. 

For me, the hopeful side is that AI can make complex climate futures more visible and concrete. When AI helps us simulate flood risks in a neighborhood, map heat islands, or test different transit and land-use scenarios, it turns abstract climate data into something people can see, argue with, and plan around. It can support more informed, participatory decision-making—letting communities explore “what if” questions about energy, housing, and adaptation in ways that were previously only available to specialists. 

Despite the underlying risks, whether AI expands or narrows our imagination still depends on how we choose to use it. If we pair AI with climate justice movements, local knowledge, and public debate, it can help us imagine futures that are not only more efficient, but also more equitable and livable. If we don’t, it can just as easily reinforce old patterns in a faster, more obscure way.  

Ceceilia: AI is currently accelerating ecological harm and worsening climate change. Unfortunately, unless proper sustainability measures and regulations are put into place, I do not see much of a future for our planet, especially because AI has been put into the hands of tech giants who seem unconcerned with the laypeople of planet Earth. Take OpenAI CEO Sam Altman, for example, who has said that “a significant fraction of the power on Earth should be spent running AI compute.” Altman also believes that concerns over AI’s water usage are “totally fake.” 

Besides the future of our Earth, we are beginning to see the negative sociocultural implications of AI. For example: 

  • A number of layoffs have been tied to AI adoption, with human workers being replaced by machines 
  • A recent study also found negative consequences of AI for student learning 
  • As more people use AI to generate images from models trained on copyrighted works, real artists are not being credited 

Now that you’ve heard two sides to the same question – what do you think? Is AI an environmental problem, or a miracle solution? 


The views, information, and opinions expressed in Slow Digest do not necessarily represent the views, policies, or positions of the Center for 21st Century Studies, the University of Wisconsin–Milwaukee, or the University of Wisconsin System. The Center for 21st Century Studies supports scholarly debate about, and engagement with, the pressing issues of our time.