Artificial intelligence (AI) is revolutionizing our world, from helping us choose what to cook for dinner to enabling advanced data analysis. For us students, AI has become part of the academic toolkit, whether for writing assistance, article and lecture summaries, or access to more personalized learning resources. However, what many don’t realize is that our growing reliance on AI comes at a hidden cost – one that is largely invisible yet increasingly significant: water consumption. AI’s environmental impact is usually discussed in terms of energy usage and carbon emissions, but few of us realize that water also plays a major role in keeping AI running.
Where does the water go?
When thinking of AI’s environmental cost, water might not be the first thing that comes to mind. However, it plays a critical role in AI systems, both directly, through the data centers that run them, and indirectly, through supply-chain processes such as the production of the semiconductors and microchips used in AI models. Popular large language models (LLMs) like OpenAI’s ChatGPT and Google’s Bard are energy-intensive, requiring massive server farms to train and run these powerful programs (DeGeurin et al., 2023).
1. Direct Water Usage:
Data centers – the backbone of AI – require immense cooling systems to prevent overheating. These centers house thousands of servers that generate tremendous amounts of heat while running (Clancy, 2022). Water is commonly used in cooling systems to regulate the temperature of these servers, as the optimal range to prevent equipment from malfunctioning is typically between 10 and 25 degrees Celsius (DeGeurin et al., 2023). Cooling mechanisms vary, but one of the most popular methods is evaporative cooling, which directly consumes significant quantities of water (Digital Realty, 2023). Researchers estimate that around a gallon of water is consumed for every kilowatt-hour expended in an average data center (Farfan & Lohrmann, 2023). Not just any water will do, either: data centers draw on clean freshwater sources to avoid the corrosion and bacterial growth that can come with seawater (DeGeurin et al., 2023).
(Image source: Li et al., 2023)
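To put that ratio in perspective, here is a minimal back-of-envelope sketch in Python. It simply multiplies a data center’s electricity use by the roughly one-gallon-per-kilowatt-hour figure cited above; the 10 MW facility is a hypothetical example, and the real ratio, which operators report as Water Usage Effectiveness (WUE), varies with cooling design and climate.

```python
# Back-of-envelope estimate of on-site cooling water from electricity use.
# Assumes ~1 gallon per kWh, per the average estimate cited above
# (Farfan & Lohrmann, 2023); actual ratios vary widely between facilities.

GALLONS_PER_KWH = 1.0        # cited average estimate (assumption)
LITERS_PER_GALLON = 3.785    # one US gallon in liters

def cooling_water_liters(energy_kwh: float) -> float:
    """Rough cooling-water consumption (liters) for a given electricity use (kWh)."""
    return energy_kwh * GALLONS_PER_KWH * LITERS_PER_GALLON

# Hypothetical example: a 10 MW facility running for one day (240,000 kWh).
daily_kwh = 10_000 * 24
print(f"{cooling_water_liters(daily_kwh):,.0f} liters of cooling water per day")
# -> roughly 908,400 liters per day under these assumptions
```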
2. Indirect Water Usage:
The electricity that powers AI also has a water footprint, especially when it comes from thermoelectric power plants, which rely on water for steam generation and cooling (Petrakopoulou, 2021; Torcellini et al., 2003). Even when data centers run on renewable energy, building and operating that renewable infrastructure still has a water impact. On top of that come other, often-omitted factors such as the water embodied in supply chains, for example the water used in chip manufacturing (Li et al., 2023). To put this in perspective: an average chip manufacturing facility today can use up to 10 million gallons of ultrapure water per day, as much water as 33,000 US households use daily (James, 2024). Need more examples? Globally, semiconductor factories already consume as much water as Hong Kong, a city of 7.5 million people (Robinson, 2024).
(Image source: James, 2024)
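The household comparison above can be sanity-checked with a single line of arithmetic: spreading 10 million gallons per day across 33,000 households works out to roughly 300 gallons per household per day. A tiny sketch, using only the figures quoted above:

```python
# Quick check of the comparison cited above (James, 2024):
# 10 million gallons of ultrapure water per day vs. 33,000 US households.
fab_gallons_per_day = 10_000_000
households = 33_000

gallons_per_household = fab_gallons_per_day / households
print(f"{gallons_per_household:.0f} gallons per household per day")  # ~303
```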
How thirsty is AI?
Just how much water does AI consume? The numbers are staggering: in 2021, Google’s US data centers alone consumed 16.3 billion liters of water, including 12.7 billion liters of freshwater (Clancy, 2022; Li et al., 2023). That is roughly the annual consumption of a mid-sized city. According to data published in 2023, a single conversation with ChatGPT (spanning 20 to 50 interactions) consumes the equivalent of a 500 ml bottle of water (DeGeurin et al., 2023). That may not seem significant on an individual scale, but ChatGPT currently has over 200 million active users, many engaging in multiple conversations daily (Singh, 2024). GPT-3, an AI model developed by OpenAI, reportedly consumed approximately 700,000 liters of water during its training phase alone (Li et al., 2023). Scaled up across all AI models in operation and development, along with their data centers, this adds up to billions of liters of water consumed for cooling alone. However, not all AI models are equally thirsty. Smaller models require less computational power, and thus less water for cooling, while larger, more advanced models like GPT-4 demand significantly more resources. And as AI models become more sophisticated and more widely used, they also become more resource-intensive, in terms of both energy and water.
(Image source: Cruchet & MacDiarmid, 2023)
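To see how quickly the per-conversation figure adds up, here is an illustrative scale-up in Python using only the numbers quoted above. The assumption of one 20-to-50-interaction conversation per user per day is made purely for illustration and is not a reported statistic.

```python
# Illustrative scale-up of ChatGPT's cooling-water footprint.
# The per-conversation figure and user count come from the sources cited above;
# the "one conversation per user per day" assumption is hypothetical.
ml_per_conversation = 500            # ~500 ml per 20-50 interactions (DeGeurin et al., 2023)
active_users = 200_000_000           # reported active users (Singh, 2024)
conversations_per_user_per_day = 1   # assumed for illustration

daily_liters = active_users * conversations_per_user_per_day * ml_per_conversation / 1000
print(f"{daily_liters:,.0f} liters per day")  # 100,000,000 liters under these assumptions
```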
AI’s Water Crisis: Implications
The high water consumption of AI systems and data centers has significant environmental and societal consequences, particularly in water-scarce regions and less developed countries.
- Escalating Water Scarcity: In regions where water is already scarce, data centers add to the problem. A clear example is Google’s data center in South Carolina, which raised alarms over its massive water withdrawals in an area often hit by droughts (Moss, 2017). As AI’s growth drives up demand for these centers, we’re likely to see more conflicts between tech giants and local communities fighting for the same limited resources.
- Strain on Ecosystems: Data centers don’t just impact human communities; they affect nature too. When large amounts of water are diverted for industrial use, natural ecosystems suffer. Less water means habitat loss for animals and severe disruptions to the local environment, throwing entire ecosystems out of balance (Balova & Kolbas, 2023).
- Widening the Digital Divide: The high water and energy demands of AI data centers mean they are usually built in resource-rich regions, close to users, to reduce latency and cut data transmission costs. That makes sense from a business perspective: faster data, lower costs. But what happens to the areas that lack water, energy, and infrastructure? They get left behind, further widening the existing digital divide.
Drying Out AI: Smart Solutions for Water Use
While the current water consumption rates may seem unsustainable, there are solutions – though their plausibility and long-term impact vary.
1. Water-Efficient Cooling Technologies: One promising solution is the adoption of more water-efficient cooling technologies. Some companies are experimenting with air cooling or closed-loop liquid cooling systems that consume little or no fresh water. For example, Google’s data center in Finland introduced the first system of its kind to use cold seawater for cooling, drastically reducing freshwater consumption (Miller, 2011). However, not every data center can be located near a natural water source that can be sustainably tapped.
2. Renewable Energy Transitions: Because much of AI’s indirect water footprint comes from electricity generation, transitioning data centers to renewable energy sources like wind and solar could reduce the water use associated with thermoelectric plants (Arts, 2024).
(Image source: Ziegler, 2024)
3. Transparency and Accountability: One of the most plausible and immediately impactful steps is for tech companies to be more transparent about their water usage. Publicly reporting on their water consumption and environmental impact could put pressure on companies to adopt more sustainable practices. Microsoft and Google have already pledged to become “water positive” by 2030, meaning they aim to replenish more water than they consume (Clancy, 2021). While this goal is ambitious, its success will depend on innovations in both technology and infrastructure.
Other specialists have proposed relocating data centers to Nordic countries like Iceland or Sweden to take advantage of cool ambient air and minimize carbon footprint, a technique called “free cooling” (Monserrate, 2022). However, network latency makes this vision of a haven for green data centers largely untenable for meeting the computing and data storage demands of the wider world.
Will AI ever be sustainable?
AI’s water footprint is a pressing environmental issue that must be addressed alongside energy and carbon concerns. Though constant advancements are being made, there is still much to explore regarding AI’s water consumption. To ensure the long-term sustainability of AI technologies, further research is needed in areas such as:
- the environmental trade-offs of AI usage;
- alternative cooling methods for data centers;
- the feasibility of building AI systems that are less resource-intensive;
- the scalability of current solutions such as seawater cooling and closed-loop cooling systems.
For us as students and future innovators, understanding these invisible costs is the first step toward making informed and conscious choices. Whether by adjusting our daily digital habits, supporting companies with sustainable practices, or advocating for responsible AI development, we all have a role to play in ensuring that AI can thrive without draining the planet’s resources. By demanding more transparency from the tech industry and pushing for the adoption of more water-efficient technologies, we can help steer the future of AI toward a more sustainable and unbiased path.
References
Arts, M. (2024). Designing green energy data centres. Royal HaskoningDHV. https://www.royalhaskoningdhv.com/en/newsroom/blogs/2023/designing-green-energy-data-centres
Balova, A., & Kolbas, N. (2023, August 20). Biodiversity and Data Centers: What’s the connection? Ramboll. https://www.ramboll.com/galago/biodiversity-and-data-centers-what-s-the-connection
Clancy, H. (2021). Diving into ‘water positive’ pledges by Facebook, Google. Trellis. https://trellis.net/article/diving-water-positive-pledges-facebook-google/
Clancy, H. (2022, November 22). Sip or guzzle? Here’s how Google’s data centers use water. Trellis. Retrieved September 15, 2024, from https://trellis.net/article/sip-or-guzzle-heres-how-googles-data-centers-use-water/
Cruchet, N., & MacDiarmid, A. (2023, November 21). Datacenter Water Usage: Where Does It All Go? Submer. Retrieved September 16, 2024, from https://submer.com/blog/datacenter-water-usage/
DeGeurin, M., Ropek, L., Gault, M., Feathers, T., & Barr, K. (2023). ‘Thirsty’ AI: Training ChatGPT Required Enough Water to Fill a Nuclear Reactor’s Cooling Tower, Study Finds. Gizmodo. https://gizmodo.com/chatgpt-ai-water-185000-gallons-training-nuclear-1850324249
Digital Realty. (2023). The Future of Data Center Cooling: Innovations for Sustainability. Digital Realty. https://www.digitalrealty.com/resources/articles/future-of-data-center-cooling
Farfan, J., & Lohrmann, A. (2023). Gone with the clouds: Estimating the electricity and water footprint of digital data services in Europe. Energy Conversion and Management. https://www.sciencedirect.com/science/article/pii/S019689042300571X
James, K. (2024, July 19). Semiconductor manufacturing and big tech’s water challenge. World Economic Forum. Retrieved September 16, 2024, from https://www.weforum.org/agenda/2024/07/the-water-challenge-for-semiconductor-manufacturing-and-big-tech-what-needs-to-be-done/
Ziegler, M. T. (2024, March). The world’s AI generators: Rethinking water usage in data centers to build a more sustainable future. Lenovo StoryHub. https://news.lenovo.com/data-centers-worlds-ai-generators-water-usage/
Li, P., Ren, S., Yang, J., & Islam, M. (2023, October 29). Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models. arXiv. http://arxiv.org/pdf/2304.03271
Miller, R. (2011). Google using sea water to cool Finland project. Data Center Knowledge. https://www.datacenterknowledge.com/hyperscalers/google-using-sea-water-to-cool-finland-project
Monserrate, S. G. (2022, February 14). The staggering ecological impacts of computation and the cloud. MIT Schwarzman College of Computing. Retrieved September 16, 2024, from https://computing.mit.edu/news/the-staggering-ecological-impacts-of-computation-and-the-cloud/
Moss, S. (2017). Google’s plan to use aquifer for cooling in South Carolina raises concerns. Data Center Dynamics. https://www.datacenterdynamics.com/en/news/googles-plan-to-use-aquifer-for-cooling-in-south-carolina-raises-concerns/
Petrakopoulou, F. (2021). Defining the cost of water impact for thermoelectric power generation. Energy Reports. https://www.sciencedirect.com/science/article/pii/S2352484721002158
Robinson, D. (2024, February 29). Growing water use a concern for chip industry and AI models. The Register. Retrieved September 16, 2024, from https://www.theregister.com/2024/02/29/growing_water_use_ai_semis_concern/
Singh, S. (2024). ChatGPT Statistics (SEP. 2024) – 200 Million Active Users. DemandSage. Retrieved September 15, 2024, from https://www.demandsage.com/chatgpt-statistics/
Torcellini, P., Long, N., & Judkoff, R. (2003). Consumptive water use for U.S. power production. National Renewable Energy Laboratory. https://www.nrel.gov/docs/fy04osti/33905.pdf
Hi Maria,
Wow! I had heard that the computing power and energy needed to make generative AI work was substantial, but I had no idea it was this much! It really makes me reconsider using LLMs for mundane queries, like asking for dinner inspiration or movie recommendations.
One thing I hope will make a big difference in the future is the introduction of mass-produced photonic computer chips. Smart Photonics (smartphotonics.nl) is a company based here in the Netherlands that is developing these chips in collaboration with ASML. They mention that using light instead of electrons could be the big breakthrough we’ve been looking for when it comes to energy-intensive data centers.
I also agree with you that making informed and conscious choices by users is essential to making AI a sustainable technology ready for future use.