AI’s Growing Energy Demand: Will AI Reduce Emissions or Create New Energy Challenges?

Editorial INTI · 1 month ago

Jakarta, INTI - As artificial intelligence (AI) continues to evolve and offer promising solutions across industries, its rapid growth also carries a hidden cost: energy consumption. While AI is hailed as a tool that can help drive the global energy transition, especially in the pursuit of net-zero emissions, the electricity required to power AI operations is itself becoming a challenge. The debate over AI's role in the future of energy is heating up as experts weigh its benefits against its environmental footprint.

The Energy Burden of AI
AI systems require immense computational power, and that need is expanding rapidly. Running a simple query on an AI platform such as ChatGPT, for example, consumes significantly more electricity than performing a Google search. With more than 100 million weekly users engaging with such models, the energy used by AI-based platforms is skyrocketing. Behind these operations are vast data centers that consume large amounts of power, primarily to train and run AI models.

According to Microsoft, its carbon emissions have increased by nearly 30% since 2020, largely due to the expansion of the data centers that power its AI tools. Google has reported a similar trend, with emissions up around 50% since 2019, driven by the energy demands of AI systems. This puts pressure on an already stretched global grid as AI's energy needs continue to surge.

What's Driving AI's Energy Appetite?
AI's energy use, though still a fraction of the technology sector's overall consumption, is projected to grow exponentially. Estimates suggest that data centers worldwide account for 2-3% of total greenhouse gas (GHG) emissions, a share AI is likely to push higher as its use becomes more widespread. The problem is compounded by the need to train ever more sophisticated models such as GPT-4, which demand far more electricity than their predecessors.

Generative AI in particular has been identified as a key driver of this demand. Training an advanced model can require thousands of megawatt-hours of electricity, with a correspondingly large carbon footprint. Furthermore, because AI models must run around the clock to serve users, the energy drain is relentless. As one AI expert put it, "AI never sleeps," reflecting the constant power needed to sustain these systems.

Balancing AI's Potential with Its Environmental Cost
Given AI’s ability to optimize systems, reduce inefficiencies, and model climate solutions, many hope AI will become a key player in reducing emissions. Some studies even predict AI could help mitigate 5-10% of global GHG emissions by 2030. However, this potential must be balanced against the energy-intensive nature of AI, particularly as it continues to grow.

Governments and tech companies are taking steps to address this issue. The European Parliament, for instance, is introducing requirements for AI systems to log their energy consumption, ensuring greater accountability. Meanwhile, chipmakers like Nvidia are developing more efficient hardware, and data centers are exploring cutting-edge cooling technologies and sustainable energy sources to minimize their environmental impact.

Finding ways to limit unnecessary energy consumption, such as reducing “dark data” (data that is generated but never used), will also be key. Selective use of AI, such as deploying smaller, less energy-demanding models for specific tasks, may help balance the growing energy needs with sustainability goals.
