Generative AI: A Major Energy Hog, Warns Expert at Hugging Face

Generative AI's computational inefficiency demands huge amounts of energy, driving data center growth and environmental concerns. New chips promise both performance and energy efficiency.

Generative AI has been hailed for its ability to produce human-like responses and create content from scratch. Behind these impressive capabilities, however, lies a significant issue that cannot be overlooked: immense energy consumption. According to Sasha Luccioni, a machine-learning expert at Hugging Face, the computational inefficiency of Generative AI poses a major problem. Luccioni points out that every time a model is queried, the entire system is activated, wasting a substantial amount of energy.

The Energy Hog: Large Language Models (LLMs)

Large Language Models (LLMs) sit at the core of many Generative AI systems; they are trained on extensive datasets so they can generate text in response to a wide range of queries. Dr. Luccioni emphasizes that creating content and answers from scratch places a heavy computational burden on the system, driving up energy consumption. In fact, recent research by Dr. Luccioni and her colleagues found that Generative AI systems may use around 33 times more energy than machines running task-specific software.

While the energy-intensive nature of Generative AI is concerning, it is the data centers that bear the brunt of this heightened demand. These colossal facilities, hidden from the view of most people, support the computations behind Generative AI and many other digital services. The International Energy Agency (IEA) projects a significant surge in their electricity consumption, estimating that it could more than double from 460 terawatt-hours (TWh) in 2022 to as much as 1,000 TWh annually by 2026. To put this into perspective, the projected demand is roughly equivalent to the total electricity consumption of Japan, a country of 125 million people.
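The IEA figures above imply a steep growth curve. As a back-of-envelope check (a sketch based only on the two cited numbers, not an IEA calculation), the implied compound annual growth rate works out as follows:

```python
# Back-of-envelope: growth implied by the IEA figures cited above
# (460 TWh in 2022 rising to a potential 1,000 TWh in 2026).
start_twh, end_twh = 460.0, 1000.0
years = 2026 - 2022

multiple = end_twh / start_twh                    # overall increase
cagr = (end_twh / start_twh) ** (1 / years) - 1   # implied annual growth rate

print(f"Overall increase: {multiple:.2f}x")
print(f"Implied compound annual growth: {cagr:.1%}")
```

On these figures, demand would grow by more than a fifth every year over the four-year window.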

The Global Impact: Ireland and the UK

Several countries, particularly Ireland and the UK, are already grappling with the substantial energy requirements of data centers. In Ireland, nearly a fifth of the country's electricity is consumed by data centers, and this figure is expected to rise significantly in the coming years. The situation is no different in the UK, where the boss of National Grid has predicted a six-fold rise in data center electricity demand within a decade, largely driven by the expansion of AI-related activities.

US Utility Firms and the Energy Strain

Across the Atlantic, utility firms in the US are facing a similar predicament. The surge in data center demand coincides with a manufacturing renaissance, further straining local energy infrastructure. Lawmakers in some states are reconsidering tax breaks previously extended to data center developers because of the pressure these facilities put on the energy grid.

The Hardware Evolution: The Role of Supercomputer Chips

As the hardware landscape evolves, companies are seeking more energy-efficient ways to power high-end workloads such as Generative AI. Nvidia's recently launched Grace Blackwell superchips are one promising prospect: they are designed specifically to drive advanced workloads including generative AI, and Nvidia projects substantial energy savings compared with previous hardware generations. So while the energy demands of data centers are poised to rise, evolving hardware presents an opportunity for notable efficiency gains.

The Future Outlook: Energy Efficiency and Market Viability

Despite the anticipated rise in data center energy demands, the industry has steadily improved energy efficiency over time. Concerns remain, however, about the significant waste heat data centers generate, and efforts are under way in Europe to find productive uses for this excess heat. Amid these advances, the economic viability of Generative AI applications remains a critical consideration: if the technology fails to deliver cost-effective benefits, widespread adoption may be held back in favor of entrenched conventional methods.

Energy Ratings for AI: Navigating the Options

Looking ahead, energy ratings for AI systems could give stakeholders the information they need to make decisions based on energy efficiency. Dr. Luccioni is actively working on a project to establish such ratings, aiming to give users a clearer distinction between energy-intensive models and lighter, more efficient alternatives. The initiative seeks to empower individuals and organizations to choose energy-efficient AI models, contributing to a more sustainable, energy-conscious future.


Copyright ©2025 All rights reserved | PrimeAi News