Tackling the thermal challenges necessitates a collaborative approach that pairs technological innovation with environmental sustainability
By Johan Steyn, 27 September 2023
Beneath the sleek exteriors of our digital devices lies a narrative less glamorous than the utility it provides — a tale of environmental impact told in the language of heat.
The relentless cadence of computations executed by the central processing unit of our smartphones and laptops is a testament to human ingenuity, albeit with a thermal consequence. This heat generation is the result of the electrical energy coursing through the circuits, meeting the resistance of materials and morphing into thermal emissions.
Within the contemporary digital framework, large data centres serve as the critical backbones supporting the seamless functionality of the interconnected domain. These expansive facilities harbour a multitude of servers dedicated to the relentless task of processing, storing, and disseminating a vast array of data, ensuring the continuous flow of information that modern society has come to rely upon.
With the escalating demand for real-time data access, cloud computing, and online services, the operational intensity of these data centres has witnessed a corresponding surge. However, a significant repercussion of this incessant operational cadence is the substantial generation of heat, a by-product of the colossal data processing and computational activities occurring around the clock.
Generative AI, an avant-garde frontier in the digital realm, brings forth a new dimension to the heating conundrum associated with data centres.
At the core of generative AI lies the ability to create intricate models, simulate complex scenarios, and generate new data, which, while revolutionary, demands substantial computational prowess. The algorithms driving generative AI are resource-intensive, often requiring the coordinated effort of numerous servers housed within data centres to perform myriad calculations at an expeditious pace.
Each stride in advancing generative AI capabilities seemingly walks hand in hand with a rise in environmental ramifications, painting a complex picture of progress intertwined with ecological responsibility.
It is estimated, for instance, that before the launch of ChatGPT by its creators, OpenAI, the training of this large language model consumed 1,287MWh of electricity and generated 552 tonnes of carbon dioxide. Imagine the impact after its release and the global-scale consumer uptake that resulted. This platform boasts an impressive 175-billion parameters (the internal variables that the model uses to generate predictions or responses).
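For a rough sense of scale, the two training figures quoted above imply a grid carbon intensity that can be checked with simple arithmetic. The sketch below is only as reliable as those estimates; it simply converts units and divides:

```python
# Rough carbon-intensity check for the quoted training estimates:
# ~1,287 MWh of electricity and ~552 tonnes of CO2.

energy_mwh = 1_287          # estimated training energy, megawatt-hours
co2_tonnes = 552            # estimated emissions, tonnes of CO2

energy_kwh = energy_mwh * 1_000       # 1 MWh = 1,000 kWh
co2_grams = co2_tonnes * 1_000_000    # 1 tonne = 1,000,000 g

intensity_g_per_kwh = co2_grams / energy_kwh
print(f"Implied carbon intensity: {intensity_g_per_kwh:.0f} g CO2/kWh")
# → Implied carbon intensity: 429 g CO2/kWh
```

That figure, roughly 429g of CO2 per kilowatt-hour, is in the range of a fossil-heavy electricity mix, which is consistent with the article's broader point about the environmental cost of large-scale training.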
The next platform released by OpenAI, GPT-4, is estimated to have a staggering 1-trillion parameters. Can you imagine the impact on our environment with this model? It is further estimated that platforms such as these will grow in computational power over the next five years to be a thousand times more powerful. Will that result in a similar increase in resource demand and heat generation? The competitive landscape for this technology has seen an explosion of new tools and products. Companies such as Alphabet’s Google, Microsoft and myriad others are deploying similar tools.
As generative AI continues to burgeon and finds its footing in an array of sectors, the discourse around its environmental impact becomes an imperative dialogue.
Addressing the thermal challenges posed by the proliferation of generative AI necessitates a collaborative approach, melding the brilliance of technological innovation with the prudence of environmental sustainability. The intersection of these domains beckons a new era of conscious innovation, one that is cognisant of its environmental footprints and is geared towards minimising the adverse effects while maximising the beneficial potentials of generative AI.