AS ARTIFICIAL intelligence (AI) technologies continue to advance, their potential to revolutionize various industries is becoming increasingly apparent. However, these advancements come with significant environmental costs, particularly in terms of energy consumption.
The rise of generative AI models, such as OpenAI’s ChatGPT and Google’s Gemini, has led to an increased demand for computational power, which directly impacts global sustainability efforts.
Generative AI models like ChatGPT and Gemini are built on large-scale neural networks that require vast amounts of computational resources. The training of these models is particularly energy-intensive. By some estimates, for example, training a single large model like GPT-3 can consume as much energy as a car does over its entire lifetime, including its manufacturing. This energy consumption is primarily due to the need for extensive data processing, which involves running complex algorithms on powerful hardware, often for weeks or even months.
Once trained, these models continue to consume significant power during the inference phase, where they generate responses to user queries. The widespread use of AI-powered applications means that these models are constantly in operation, leading to continuous energy consumption. As AI becomes more integrated into everyday technologies, the cumulative energy demand of these systems will only increase.
The energy consumption of AI is closely tied to the data centers where these models are hosted. Both OpenAI and Google rely on extensive global networks of data centers to run their AI models. OpenAI, for instance, uses Microsoft’s Azure cloud infrastructure, which includes over 60 data centers worldwide.
These centers are equipped with thousands of GPUs (Graphics Processing Units) and TPUs (Tensor Processing Units) designed specifically for AI workloads. Google’s Gemini AI operates similarly, utilizing Google’s proprietary data centers, which are optimized for large-scale AI computations.
Data centers are notorious for their high energy usage, as they require constant cooling and power to maintain optimal operating conditions for the servers. According to a 2021 report by the International Energy Agency (IEA), data centers globally consume about 1 percent of the world’s electricity. As AI models grow in complexity and usage, this percentage is expected to rise, contributing to a larger carbon footprint.
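A rough back-of-envelope calculation shows why training and cooling dominate these bills. Every number below is an illustrative assumption, not a measured figure for any real model or data center; the "power usage effectiveness" (PUE) factor captures the cooling and facility overhead described above.

```python
# Back-of-envelope estimate of a training cluster's electricity use.
# All figures are illustrative assumptions, not measured values.
GPU_POWER_KW = 0.7    # assumed draw per accelerator, in kilowatts
NUM_GPUS = 1000       # assumed cluster size
HOURS = 30 * 24       # one month of continuous training
PUE = 1.5             # power usage effectiveness: total facility power
                      # divided by IT power (cooling, etc. overhead)

it_energy_kwh = GPU_POWER_KW * NUM_GPUS * HOURS   # energy of the chips alone
total_energy_kwh = it_energy_kwh * PUE            # add facility overhead

print(f"IT load: {it_energy_kwh:,.0f} kWh")
print(f"With cooling overhead: {total_energy_kwh:,.0f} kWh")
```

Under these assumptions the cluster draws roughly 756,000 kWh in a month, about what several hundred average households use in a year, which is why even small improvements in chip efficiency or cooling (a lower PUE) translate into large absolute savings.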
But AI can save itself.
AI can also be used to promote sustainability across various sectors. Its integration into supply chains, for instance, enhances sustainable production by optimizing processes like predictive maintenance and production planning, which reduces waste and energy consumption. This is according to a study titled “The Impact of AI in Sustainable Development Goal Implementation: A Delphi Study” by Simon Ofori Ametepey and his team.
The study describes how AI is increasingly recognized as a catalyst for achieving the Sustainable Development Goals (SDGs). Experts highlight AI’s positive impact on goals related to clean energy, sustainable cities, and climate action, among others. AI-driven solutions can help tackle complex challenges such as climate change by improving energy efficiency, reducing emissions, and promoting the sustainable use of resources.
This integration is aligned with the principles of Industry 5.0, emphasizing ecological materials, digital twins, and renewable energy sources to create a more resilient and sustainable manufacturing environment.
To address the sustainability challenges posed by AI, several strategies are being explored and implemented by leading tech companies:
1. Energy efficiency improvements: Researchers and engineers are continuously working on optimizing AI algorithms to make them more energy efficient. This includes developing more efficient neural network architectures and using techniques such as model distillation, where smaller, less resource-intensive models are created from larger ones.
2. Transition to renewable energy: A significant step towards reducing the carbon footprint of AI is the transition to renewable energy sources. Companies like Google have committed to running their data centers on 100 percent renewable energy. As of 2020, Google claims to have matched its energy usage with 100 percent renewable energy purchases, though actual energy consumption still includes non-renewable sources due to grid constraints.
3. Carbon offsetting: In addition to improving energy efficiency and using renewable energy, some companies are investing in carbon offsetting initiatives. These initiatives involve investing in projects that reduce or capture emissions elsewhere, such as reforestation or renewable energy projects, to compensate for the emissions generated by AI operations.
4. Localized AI processing: Another emerging trend is the development of smaller AI models that can run on local devices rather than relying on cloud-based data centers. This reduces the need for constant data transmission and can significantly lower the energy required for AI processing.
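The model-distillation idea in point 1 can be sketched in a few lines. The core of the technique, as introduced by Hinton and colleagues, is training a small "student" model to match the softened output distribution of a large "teacher", so that the cheap model can handle inference. The minimal sketch below computes that distillation loss for a single example; the logits are illustrative placeholders, not outputs of any real model.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: a higher temperature yields a
    softer distribution that exposes the teacher's 'dark knowledge'
    about similarities between classes."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between the teacher's and student's softened
    outputs; minimizing it over a dataset trains the small student
    to mimic the large teacher."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# A student that already matches the teacher incurs zero loss;
# a student that disagrees incurs a positive loss.
perfect = distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
wrong = distillation_loss([1.0, 2.0, 3.0], [3.0, 2.0, 1.0])
```

Once trained this way, only the student runs in production, which is exactly how a less resource-intensive model can stand in for a larger one at inference time.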
However, this approach is currently limited to less complex tasks and smaller models. By improving energy efficiency, transitioning to renewable energy, and exploring innovative solutions like localized AI processing, it is possible to reduce the carbon footprint of AI and align technological advancement with sustainability goals. The future of AI must be shaped not only by its capabilities but also by its commitment to a sustainable and environmentally responsible trajectory.