Artificial intelligence (AI) has made remarkable progress, but a recent study highlights the significant water consumption of AI models in data centres, flagging it as an environmental concern. The study uncovers the hidden cost of training and deploying large AI models and emphasizes the need for sustainable practices.
The Thirsty Reality of AI Models:
Large AI models such as GPT-3 and GPT-4 have a hidden water footprint. Training GPT-3 in Microsoft's US data centres can consume around 700,000 liters of fresh water, roughly what it takes to manufacture hundreds of cars. And a simple conversation with ChatGPT "drinks" the equivalent of a 500 ml bottle of water, a footprint that multiplies across its enormous user base.
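To see how a 500 ml bottle per conversation adds up, here is a back-of-envelope sketch. The per-conversation figure comes from the study; the daily conversation count is a hypothetical assumption chosen purely for illustration, not a measured number.

```python
# Rough aggregate water footprint from per-conversation figures.
LITERS_PER_CONVERSATION = 0.5              # ~one 500 ml bottle per conversation (from the study)
ASSUMED_DAILY_CONVERSATIONS = 100_000_000  # hypothetical load, not a measured figure

daily_liters = LITERS_PER_CONVERSATION * ASSUMED_DAILY_CONVERSATIONS
print(f"Estimated daily water use: {daily_liters / 1e6:.0f} million liters")
# -> Estimated daily water use: 50 million liters
```

Even with a conservative guess at daily usage, the aggregate runs to tens of millions of liters per day, which is why per-query footprints matter at scale.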
Data Centres’ Water Consumption:
Data centres, where AI models run, are both energy-intensive and water-thirsty. Google's US data centres alone consumed 12.7 billion liters of fresh water for cooling in 2021. These facilities rely on water-intensive cooling systems and also draw on water indirectly through power generation.
Why Are AI Models Water Intensive?
The servers in data centres generate heat, which must be removed by cooling systems. Water-intensive evaporative cooling towers are commonly used, and they require clean fresh water to prevent corrosion and microbial growth. On top of that, generating the electricity that powers data centres consumes water as well.
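The physics behind evaporative cooling's thirst is straightforward: every joule of server heat rejected by evaporation boils off water at its latent heat of vaporization (about 2.26 MJ per kilogram under typical conditions). A quick calculation shows how much water that implies per megawatt-hour of heat:

```python
# How much water does evaporative cooling consume per MWh of rejected heat?
LATENT_HEAT_MJ_PER_KG = 2.26   # latent heat of vaporization of water, ~typical conditions
MJ_PER_MWH = 3600              # 1 MWh = 3,600 MJ

# 1 kg of water is roughly 1 liter.
liters_per_mwh = MJ_PER_MWH / LATENT_HEAT_MJ_PER_KG
print(f"~{liters_per_mwh:.0f} liters evaporated per MWh of heat rejected")
# -> ~1593 liters evaporated per MWh of heat rejected
```

This is an idealized upper-level estimate, since real towers also lose water to blowdown and drift, but it illustrates why cooling a multi-megawatt facility evaporates water by the tanker-load.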
Looking for Solutions:
The study suggests training AI models during cooler hours, when less water is lost to evaporation, and calls on the industry to develop more environmentally friendly AI models. Uncovering and addressing the water footprint of AI is crucial, given growing freshwater scarcity and aging water infrastructure.
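The "train during cooler hours" idea can be sketched as a simple scheduling problem: given a forecast of water-usage effectiveness (WUE, liters of water per kWh, which rises with outside temperature), pack a fixed training budget into the lowest-WUE hours. The WUE profile below is illustrative, not measured data.

```python
# Minimal sketch of water-aware training scheduling, assuming an hourly
# WUE forecast is available. Values are hypothetical.
def pick_training_hours(hourly_wue, hours_needed):
    """Return the indices of the `hours_needed` hours with the lowest WUE."""
    ranked = sorted(range(len(hourly_wue)), key=lambda h: hourly_wue[h])
    return sorted(ranked[:hours_needed])

# Hypothetical 24-hour WUE profile (liters/kWh): cooler nights evaporate less.
wue = [0.9, 0.8, 0.8, 0.7, 0.7, 0.8, 1.0, 1.3, 1.6, 1.8, 2.0, 2.1,
       2.2, 2.2, 2.1, 2.0, 1.8, 1.6, 1.4, 1.2, 1.1, 1.0, 0.9, 0.9]
print(pick_training_hours(wue, 6))  # -> [0, 1, 2, 3, 4, 5]
```

Under this toy profile the scheduler concentrates training in the pre-dawn hours, which is exactly the behavior the study's recommendation aims at.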
As we explore the potential of AI and models like ChatGPT, we must stay mindful of their environmental impact. With freshwater scarcity and droughts on the rise, sustainable practices are essential, and addressing AI's hidden water footprint is a key part of building a sustainable future.