In recent news, Microsoft’s water consumption has skyrocketed by a whopping 34 percent, reaching a staggering 6.4 million cubic meters in 2022. And what’s to blame? Well, it seems like generative AI might be the culprit. According to Shaolei Ren, a researcher at the University of California, Riverside, Microsoft’s close association with OpenAI and its significant investments in generative AI products have contributed to this surge in water usage. Ren even published a paper on this topic, delving into the impact of generative AI adoption on datacenter water consumption.
In Microsoft’s latest report on environmental, social, and governance (ESG) factors, the company stated that the increased water consumption aligns with its business growth. Water consumption shot up by a third, from 4.8 million cubic meters in 2021 to 6.4 million cubic meters last year. That's a substantial jump compared to the 14 percent rise the software giant reported between 2020 and 2021.
While Microsoft hasn’t explicitly listed AI adoption as the exact cause of this surge, we do know that they’ve been deploying thousands upon thousands of GPUs to power their large language models, which are behind Bing Chat and GitHub Copilot, among others. Plus, let’s not forget that Microsoft is in tight collaboration with OpenAI, the mastermind behind the impressive ChatGPT.
To shed more light on the situation, The Register reached out to Microsoft for comment. According to a representative, the company acknowledges the concerns surrounding datacenter water consumption, especially in America, where scientists have warned of potential drought conditions driven by changing weather patterns. Climate models developed by the US Department of Energy’s Argonne National Laboratory predict that large parts of the country could experience persistent drought by the middle of this century, followed by devastating floods.
Now, let’s get down to the specifics of Microsoft’s water usage. Although the water consumption figures in the ESG report cover the entire company and not just its datacenters, it’s no secret that these facilities gulp down large amounts of water. The GPUs that power Microsoft’s machine learning models are typically deployed in systems of eight, and they draw far more power than traditional datacenter kit: an eight-GPU system can guzzle between 6 kW and 10 kW under full load, roughly what an entire conventional cloud rack consumes. All that heat needs to go somewhere, and depending on the cooling technology employed, hotter systems can mean higher water usage.
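For a rough sense of scale, here’s a minimal back-of-the-envelope sketch in Python using the 6-10 kW per eight-GPU system figure above; the node count and the 8 kW "typical rack" budget are illustrative assumptions, not Microsoft numbers.

```python
# Back-of-the-envelope power estimate for eight-GPU systems, using the
# 6-10 kW full-load figure cited above. The node count and the 8 kW
# "typical rack" budget are illustrative assumptions, not Microsoft numbers.

NODE_POWER_KW = (6.0, 10.0)   # full-load draw of one eight-GPU system (from the article)
TYPICAL_RACK_KW = 8.0         # rough power budget of a conventional cloud rack (assumption)

def cluster_power_kw(num_nodes: int) -> tuple[float, float]:
    """Return the (low, high) full-load power draw of a GPU cluster, in kW."""
    return num_nodes * NODE_POWER_KW[0], num_nodes * NODE_POWER_KW[1]

low_kw, high_kw = cluster_power_kw(1_000)   # hypothetical 1,000-node training cluster
print(f"1,000 eight-GPU nodes draw roughly {low_kw / 1000:.0f}-{high_kw / 1000:.0f} MW at full load")
print(f"That is about {low_kw / TYPICAL_RACK_KW:.0f}-{high_kw / TYPICAL_RACK_KW:.0f} conventional racks' worth of heat to remove")
```

However the cluster is sized, all of that electricity ultimately leaves the building as heat, which is where the cooling plant comes in.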
Datacenters employ various techniques to cool their systems, from direct liquid cooling (DLC) using deionized water or other fluids to plain air cooling. DLC, which is highly efficient, pipes coolant through cold plates attached to hotspots inside the system; Google uses this approach to cool the Tensor Processing Units (TPUs) behind its AI training and inference workloads. Many modern GPU systems, however, still rely on air cooling. Whichever method is used, datacenters need sophisticated air-handling and thermal-management systems to expel the heat. Evaporative cooling is a popular choice: hot air leaving the systems is passed over water, and as the water evaporates it carries heat away, cooling the air so the process can repeat. The advantage of evaporative cooling is its energy efficiency, particularly in climates where it’s only needed during the hottest months of the year. However, the approach is a problem in regions with limited water access or treatment capacity. Datacenters in Phoenix, Arizona, for instance, have transitioned to alternative cooling technologies that don’t directly consume water, but those consume more power and generate more noise.
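To see why evaporative cooling ties heat load directly to water use, here’s a minimal sketch based on the latent heat of vaporization of water, roughly 2.45 MJ per kilogram at ambient temperature. This is only a physical lower bound; real facilities lose additional water to drift and blowdown, and the 8 kW node load in the example is an assumption.

```python
# A minimal sketch of the energy-for-water trade in evaporative cooling.
# Evaporating water absorbs roughly 2.45 MJ per kilogram at ambient
# temperature, which sets a physical floor on the water needed to reject a
# given heat load; real facilities use more due to drift and blowdown.
# The 8 kW node load below is an illustrative assumption.

LATENT_HEAT_MJ_PER_KG = 2.45   # heat absorbed per kg of water evaporated
MJ_PER_KWH = 3.6               # 1 kWh = 3.6 MJ

def min_water_evaporated_liters(heat_kwh: float) -> float:
    """Lower-bound liters of water evaporated to reject `heat_kwh` of heat (1 kg ~= 1 L)."""
    return heat_kwh * MJ_PER_KWH / LATENT_HEAT_MJ_PER_KG

daily_heat_kwh = 8.0 * 24      # one eight-GPU node at 8 kW, running flat out for a day
print(f"~{min_water_evaporated_liters(daily_heat_kwh):.0f} liters of water per node per day, at minimum")
```

Multiply a few hundred liters per node per day across thousands of nodes and the corporate totals below start to look less surprising.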
Now, let’s dive into the numbers. Microsoft’s water consumption in 2022 increased by approximately 1.6 million cubic meters compared to the previous year. To put that into perspective, it’s enough to fill roughly 640 Olympic-sized swimming pools, or the equivalent of over three billion grapefruits. And that’s just the increase: the total for 2022 works out to about 2,560 Olympic pools. In 2021, researchers at the University of Oxford estimated that US datacenters collectively consumed 1.7 billion liters of water per day, though they also pointed out that measuring datacenter water consumption is often hindered by a lack of transparency.
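The swimming-pool comparisons are easy to sanity-check, assuming the conventional figure of roughly 2,500 cubic meters for an Olympic pool:

```python
# Sanity-checking the pool comparisons: an Olympic pool holds ~2,500 cubic meters.
OLYMPIC_POOL_M3 = 2_500

increase_m3 = 6.4e6 - 4.8e6               # year-over-year increase reported by Microsoft
print(increase_m3 / OLYMPIC_POOL_M3)      # ~640 pools
print(6.4e6 / OLYMPIC_POOL_M3)            # ~2,560 pools for the 2022 total
```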
Despite these challenges, researchers at the University of California, Riverside and the University of Texas at Arlington have attempted to assess the water usage of generative AI. Their estimate is striking: a conversation of 20-50 questions and answers with ChatGPT, a large language model, consumes roughly a 500 ml bottle’s worth of water. The actual water consumption associated with running these models depends on many factors, however, including the cooling techniques used by the facility, where the models are trained and deployed, and when those tasks run. Datacenters in colder climates, for example, can take advantage of lower ambient temperatures and tend to consume less water than those in hot, arid desert climates. According to reports, the Iowa datacenter where Microsoft and OpenAI trained GPT-4 only consumes water when the outside temperature exceeds 29.3 degrees Celsius.
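Taken at face value, that estimate implies a per-query figure in the tens of milliliters. A minimal sketch, assuming the researchers’ 500 ml per 20-50 exchanges and a purely hypothetical query volume:

```python
# Per-exchange water implied by the 500 ml per 20-50 questions-and-answers estimate.
# The 10-million-exchanges-per-day figure below is a hypothetical illustration.

BOTTLE_ML = 500.0

def water_per_exchange_ml(exchanges_per_bottle: int) -> float:
    """Milliliters of water attributed to one prompt/response pair."""
    return BOTTLE_ML / exchanges_per_bottle

low, high = water_per_exchange_ml(50), water_per_exchange_ml(20)
print(f"~{low:.0f}-{high:.0f} ml of water per exchange")

daily_exchanges = 10_000_000   # hypothetical daily query volume, for illustration only
ml_per_m3 = 1_000_000          # 1 cubic meter = 1,000,000 ml
print(f"~{low * daily_exchanges / ml_per_m3:.0f}-"
      f"{high * daily_exchanges / ml_per_m3:.0f} cubic meters per day at that volume")
```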
So, what is Microsoft doing to tackle this water consumption issue? In its ESG report, Microsoft says it uses water consumption metrics to guide its water reclamation efforts, with a goal of becoming net “water positive” by 2030. Achieving that status typically involves funding projects that protect watersheds, restore wetlands, and improve infrastructure in water-stressed regions. In Chennai, India, where Microsoft operates a datacenter, the company partnered with The Nature Conservancy to revive the wetlands surrounding Lake Sembakkam. To date, Microsoft has committed to 35 million cubic meters of water reclamation projects. The company is also working on thermal management technologies to reduce water consumption in its facilities. One such technology is the geoexchange system at its Thermal Energy Center in Redmond, which reroutes heat into the ground instead of relying on cooling towers; it is expected to cut water consumption at the facility by around 30,280 cubic meters, or about 12 Olympic swimming pools. Microsoft has also begun phasing out evaporative coolers in some of its water-challenged datacenters, and earlier this year agreed to transition its final two datacenters outside Phoenix to “zero water” cooling infrastructure.
But let’s not forget that Microsoft isn’t the only cloud provider grappling with this water consumption predicament. Both Amazon Web Services (AWS) and Google Cloud face similar challenges as they continue to expand their cloud empires, and both have made efforts to address the issue. Last year, AWS committed to becoming “water positive” by 2030, aiming to return more water to communities than its operations consume. Google has made similar promises, though it admitted in April that AI adoption has made things more difficult.
So, there you have it. Water consumption in the tech world, fueled by generative AI and the exponential growth of datacenters, is a pressing issue. As concerns surrounding water scarcity and climate change intensify, it is imperative for companies like Microsoft to take proactive measures to limit their ecological footprint. With initiatives like water reclamation projects and advanced thermal management technologies, they might just be on the right track. But let’s hope that innovation and conservation go hand in hand as we navigate this era of rapidly evolving technology.