Energy Efficiency Not Enough in Push for Data Center Sustainability

This article originally appeared on Light Reading.

Energy efficiency is part of the solution to reduce emissions from data centers, but it's not enough in and of itself. That was a message echoed by Hannah Brier, senior sustainable transformation technologist at Hewlett Packard Enterprise (HPE), during the company's recent event in London.

While the issue of data center energy use has been around for years now, it is being exacerbated by the AI boom, according to Brier.

"When you think about your infrastructure and you think about your data centers, today, you've got increased power density. So, that rack that you have in those data centers, there is more compute power going on in there, which also means there's going to be more heat generated as well," she said.

This is relevant for the telecommunications industry, which has not eschewed the AI hype. Indeed, quite the opposite is true, judging by recent industry events where generative AI (GenAI) has been a prominent topic. Yet many telcos now have their own emissions targets to consider.

AI is highly emissions intensive, due to both training the models and using them. For example, Google's 2024 environmental report showed its emissions have risen by a whopping 48% since 2019, due primarily to AI. Perhaps even more worryingly, the company notes that reducing emissions will be difficult because of the energy needed to power "the greater intensity of AI compute, and the emissions associated with the expected increases in our technical infrastructure investment."


Along with cryptocurrencies, AI is part of the reason why energy use from data centers could double by 2026, according to the International Energy Agency (IEA). At the same time, some countries have started to restrict new data centers because of concerns over their impact on the grid.

Too Much Data?

During her presentation, Brier highlighted HPE's approach to data center sustainability, which includes energy efficiency but goes beyond it to areas like software. Here, she noted, tweaks can be made to lower energy consumption.

Hardware efficiency is another factor. Companies often have more equipment than they need, which means some of it goes unused. This not only goes against sustainability principles, but also drives up costs.

Meanwhile, Brier also acknowledged that sometimes AI isn't the right solution to a problem. She said that while there is a lot of hype around AI at the moment, it won't solve every problem.

Some telcos have made similar observations. For example, Orange's chief AI officer, Steve Jarrett, told Light Reading earlier this year that the operator is looking to other solutions where applicable. "You don't want to use the large language model sledgehammer to hit every nail," he said.


Sue Preston, vice president, WW advisory and professional services for HPE global sales, meanwhile added that not all AI is made equal, noting that there are predictive algorithms that have been used for years without causing significant energy problems.

When companies do train AI, they often collect extra data "for a rainy day," without a clear purpose, Brier said. This echoes a recent report from Omdia (a sister company of Light Reading), created in partnership with NTT DATA and NetApp, which found that around 60% of the data companies store goes unused.

Earlier this year, Rika Nakazawa, group vice president for connected industry and head of sustainability for Americas at NTT DATA, told Light Reading that a possible solution is tagging data to indicate an expiration date.

Right-sizing the dataset is also important when it comes to training AI, according to Brier. "When you're looking at your datasets, you need to make sure that they're adequately sized before you train the model," she said.


Making Data Centers Cool

That's not to say energy efficiency doesn't matter. The energy used to power computing is roughly equal to that needed for cooling, according to the IEA. Here, Brier said, HPE is working to reduce the impact by using liquid cooling, which is more energy efficient and was originally developed by HPE for its supercomputers.

HPE is also taking this a step further in its recently announced partnership with Danfoss to reuse the heat generated by data centers.

The collaboration started with a modular data center built by HPE for Danfoss when it became a customer of HPE's GreenLake cloud platform, Preston said during the event. The equipment was integrated to heat buildings on the company's property.

In June, both companies announced a collaboration pairing HPE's modular data centers with Danfoss' heat reuse technology. They now want to work with joint customers to deploy the solution, Preston said, targeting use cases including greenhouses, agriculture and district heating.

It's worth noting that this idea isn't completely new. Companies are using similar approaches to heat anything from homes to swimming pools, and even to farm eels.

Despite all these efforts, however, no predictions suggest that the carbon footprint associated with data centers and AI will go down anytime soon.