For Google, meanwhile, total water consumption at its data centers and offices came in at 5.6 billion gallons in 2022, a 21% increase on the year before.
Both companies are working to reduce their water footprint and become “water positive” by the end of the decade, meaning that they aim to replenish more water than they use.
It’s notable, however, that their latest water consumption figures were disclosed before the launch of their respective ChatGPT competitors. The computing power needed to run Microsoft’s Bing Chat and Google Bard could mean significantly higher levels of water use over the coming months.
“With AI, we’re seeing the classic problem with technology in that you have efficiency gains but then you have rebound effects with more energy and more resources being used,” said Somya Joshi, head of division: global agendas, climate and systems at the Stockholm Environment Institute.
“And when it comes to water, we’re seeing an exponential rise in water use just for supplying cooling to some of the machines that are needed, like heavy computation servers, and large-language models using larger and larger amounts of data,” Joshi told CNBC during the COP28 climate summit in the United Arab Emirates.
“So, on one hand, companies are promising to their customers more efficient models … but this comes with a hidden cost when it comes to energy, carbon and water,” she added.
How are tech firms reducing their water footprint?
A spokesperson for Microsoft told CNBC that the company is investing in research to measure the energy and water use and carbon impact of AI, while working on ways to make large systems more efficient.
“AI will be a powerful tool for advancing sustainability solutions, but we need a plentiful clean energy supply globally to power this new technology, which has increased consumption demands,” the spokesperson said via email.
“We will continue to monitor our emissions, accelerate progress while increasing our use of clean energy to power datacenters, purchasing renewable energy, and other efforts to meet our sustainability goals of being carbon negative, water positive and zero waste by 2030,” they added.
Separately, a Google spokesperson told CNBC that research shows that while AI computing demand has dramatically increased, the energy needed to power this technology is rising “at a much slower rate than many forecasts have predicted.”
“We are using tested practices to reduce the carbon footprint of workloads by large margins; together these principles can reduce the energy of training a model by up to 100x and emissions by up to 1000x,” the spokesperson said.
“Google data centers are designed, built and operated to maximize efficiency — compared with five years ago, Google now delivers around 5X as much computing power with the same amount of electrical power,” they continued.
“To support the next generation of fundamental advances in AI, our latest TPU v4 [supercomputer] is proven to be one of the fastest, most efficient, and most sustainable ML [machine learning] infrastructure hubs in the world.”