Big tech companies are excellent at hiding how the sausage gets made behind sleek user interfaces and neat functionality. 

A bit more convenience for us can have huge environmental and societal costs. Indeed, just recently, we learned that TikTok’s huge energy demand may even be holding back weapons production for Ukraine.

Now, a new paper from the University of California, Riverside and the University of Texas at Arlington demonstrates that the sudden leap in commercial artificial intelligence isn’t without its problems either.

While companies are opaque about the quantity of natural resources they use in the name of making things a bit more convenient for lazy humans, the researchers have had a go at calculating the amount of water used to make ChatGPT and Google’s Bard so unnervingly intelligent.

If Microsoft used its state-of-the-art US data centres to train GPT-3, the paper estimates, 700,000 litres of clean fresh water would have been guzzled up. That’s the kind of cooling you need for server rooms filled with computers packing 10,000 GPUs and over 285,000 processor cores.

But that’s a big “if”. Had the training been carried out at Microsoft’s less efficient Asian data centres instead, the quantity would likely have tripled.

For consumers, it means that your average 20-50 question chat with ChatGPT is the equivalent of emptying a 500ml bottle of water on the ground, not even cleaning your shoes in the process. That quickly adds up when you consider ChatGPT hit 100 million users after just two months of public availability. 
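The article’s figures lend themselves to a quick back-of-envelope calculation. A minimal Python sketch of the per-question cost and the scale-up (note: the one-chat-per-user assumption in the final step is illustrative, not a figure from the paper):

```python
# Back-of-envelope water footprint per ChatGPT question, using the
# article's figures: a 20-50 question chat ~ one 500 ml bottle of water.
CHAT_WATER_ML = 500                      # water per average chat (article figure)
QUESTIONS_LOW, QUESTIONS_HIGH = 20, 50   # questions per chat (article figure)

per_question_high_ml = CHAT_WATER_ML / QUESTIONS_LOW   # short chats: 25 ml/question
per_question_low_ml = CHAT_WATER_ML / QUESTIONS_HIGH   # long chats: 10 ml/question

# Illustrative scale-up (assumption: one such chat per user, which the
# paper does not claim): 100 million users, one 500 ml chat each.
USERS = 100_000_000
total_litres = USERS * CHAT_WATER_ML / 1000  # ml -> litres

print(f"Per question: {per_question_low_ml:.0f}-{per_question_high_ml:.0f} ml")
print(f"One chat per 100M users: {total_litres:,.0f} litres")
```

Even at the optimistic end of 10 ml per question, the totals climb fast at ChatGPT’s scale.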

That’s a terrible look in a world where droughts are going to become increasingly common as climate change bites.

The paper says that AI data centres — and indeed server rooms for all other big tech companies — could be more efficient based on where they’re built and when they’re active. Cooler climates require less water for cooling, and if AI models did their training during the night when it’s naturally colder, the water demands would be lower too.

But there’s a sustainability catch-22 here. Big tech also needs to reduce its carbon footprint, and that’s easier to do in hotter climates because data centres can rely on clean solar power. “In other words, only focusing on AI models’ carbon footprint alone is far from enough to enable truly sustainable AI,” the researchers write.

What about consumers? Well, they could bug artificial intelligence chatbots at more “water-efficient” hours, but that feels like wishful thinking for a number of reasons — not least of which is that AI companies aren’t at all transparent about their water and energy usage.

“When and where are the AI models trained? What about the AI models trained and/or deployed in third-party colocation data centres or public clouds?” the researchers ask.

Big tech companies spend a lot of their time ensuring that apps are effortless to use for consumers. This is a helpful reminder that, behind the scenes, a lot is being sacrificed for our convenience. 

Image: Airam Dato-on / Pexels

Alan Martin
Alan is an experienced and versatile writer with the unique distinction of having written for both The New Statesman and Nuts. The list of publications Alan has written for doesn't stop there. His work has also been published in: Wired, CNET, Gizmodo UK, ShortList, NME, TechRadar, The i, The Independent, The Evening Standard, City Metric, Macworld, Pocket Gamer, Expert Reviews, Coach, The Inquirer, Rock Paper Shotgun, Tom's Guide, T3, PC Pro, IT Pro, Ideal Home, Livingetc, Stuff, Business Insider, theBit, Wareable, and Trusted Reviews. Alan now covers a range of subjects for ReviewsFire, with a focus on news - his unique style of covering technology news is a key part of ReviewsFire's success.
