INTELBRIEF
July 23, 2024
The Energy Politics of Artificial Intelligence as Great Power Competition Intensifies
Bottom Line Up Front
- AI is predicted to be a cornerstone of military competitiveness as great power competition intensifies, rendering the scramble for energy resources not only a private industry endeavor but also a vital national security issue.
- Generative AI has emerged as one of the most energy-intensive technologies on the planet, drastically driving up the electricity consumption of data centers and chips.
- As the U.S. is the second largest electricity consumer in the world behind China, demand for electricity will only grow as the two countries continue to compete, particularly in the technology and military sectors.
- Countries like China and Russia have access to a myriad of resources, such as critical minerals and advanced manufacturing capacity, and both have announced plans to ramp up the use of AI in security and defense.
The proliferation of generative artificial intelligence (AI) use across the private and public sectors has ushered in a new era in environmental and energy politics, marked by additional strain on energy resources as companies scramble to develop and deploy advanced AI models. The computing resources necessary to develop, train, and deploy generative AI models are linked to significant energy consumption and intense water use. AI is predicted to be a cornerstone of military innovation, and in the current paradigm of great power competition, the scramble for electricity is thus not only a private industry endeavor related to profit, innovation, and tech leadership but a significant national security concern.
Generative AI has emerged as one of the most energy-intensive technologies on the planet, drastically driving up the electricity consumption of data centers and chips. For instance, a search on ChatGPT may consume 25 times more energy than a Google search. Within the lifecycle of an AI model, two phases have the most environmental impact. First is the training phase, in which the machine learning (ML) model learns from training data to make predictions and/or decisions. According to Mosharaf Chowdhury, an associate professor of electrical engineering and computer science at the University of Michigan, one round of training the model GPT-3 could consume 1,287 MWh, enough to supply an average U.S. household for 120 years. Second is the inference phase, in which the trained model is used to respond to queries. The World Economic Forum states that the inference phase of an AI model accounts for 80 percent of the environmental footprint, with the training phase accounting for the remaining 20 percent.
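The household comparison above can be sanity-checked with simple arithmetic. This sketch assumes an average U.S. household consumption of roughly 10.7 MWh per year (a widely cited EIA figure, not stated in the brief):

```python
# Back-of-the-envelope check of the GPT-3 training-energy comparison.
TRAINING_MWH = 1287            # one round of GPT-3 training (per Chowdhury)
HOUSEHOLD_MWH_PER_YEAR = 10.7  # assumed average U.S. household usage (EIA estimate)

years_supplied = TRAINING_MWH / HOUSEHOLD_MWH_PER_YEAR
print(f"~{years_supplied:.0f} years")  # roughly 120 years, matching the brief
```

The result lands at about 120 years, consistent with the figure cited in the text.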
Scholars Petr Spelda and Vit Stritecky have investigated the environmental costs of the human-AI nexus from various angles and flag that computational resources required for ML experiments have been doubling every 3.4 months since 2012, with some experiments now consuming hundreds of petaflop/s-days. They are also concerned about "gratuitous generalization capability," where ML models are optimized for accuracy beyond what is necessary for the task, resulting in inefficient and environmentally harmful operations.
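To illustrate how steep a 3.4-month doubling time is, the implied annual growth factor can be computed directly (a worked example, not a figure from the brief):

```python
# Implied annual growth in ML compute if resources double every 3.4 months,
# the trend Spelda and Stritecky flag.
DOUBLING_MONTHS = 3.4

annual_factor = 2 ** (12 / DOUBLING_MONTHS)
print(f"~{annual_factor:.1f}x per year")  # roughly 11-12x growth each year
```

A doubling every 3.4 months compounds to roughly an elevenfold increase in compute demand every year, which underscores why the associated energy footprint is growing so quickly.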
The International Energy Agency (IEA) has estimated that by 2026, the combined global electricity consumption of data centers, cryptocurrency, and AI will double, reaching roughly the amount of electricity used by the entire country of Japan. In the U.S., Rene Haas, chief executive of the chip-design company Arm, told the Wall Street Journal that “by the end of the decade, AI data centers could consume as much as 20% to 25% of U.S. power requirements,” up from 4% or less today. As the U.S. is the second largest electricity consumer in the world behind China, demand for electricity will only grow as the two countries continue to compete, particularly in the technology and military sectors. Other top electricity-consuming countries include Russia, India, Japan, Brazil, and South Korea.
As of 2022, the Department of Defense (DoD) was the single largest electricity consumer in the U.S., accounting for 76 percent of the federal government’s total energy consumption. AI has a variety of important uses within the defense enterprise, such as predictive analytics, autonomous vehicles, cybersecurity, and intelligence operations. The National Security Institute at George Mason University states that within the scope of defense and national security, AI “significantly enhances operational efficiency, decision-making accuracy, and strategic advantage, allowing the Department of Defense (DoD) to process vast amounts of data, identify threats, and optimize resource deployment.”
As the U.S. military continues integrating AI into its everyday operations, electricity consumption across the DoD is projected to grow exponentially. In response to the strain AI places on electrical grids and a growing recognition of the technology’s importance, the U.S. Department of Energy (DOE) announced last week the Frontiers in AI for Science, Security, and Technology (FASST) initiative, a roadmap designed to “help harness AI for the public good,” including in national security. According to Axios, the “DOE aims to build energy-efficient AI supercomputers, which could help address the challenge of advancing this technology without causing energy consumption to skyrocket, which Energy Secretary Jennifer Granholm has identified as a top concern.”
The U.S. electrical grid is extremely antiquated, with much of the infrastructure built in the 1960s and 1970s. Although parts of the system have been upgraded, the aging infrastructure overall is struggling to meet existing electricity demand, and AI puts even more pressure on it. Thus, the need for a modernized grid powered by efficient and clean energy is more urgent than ever. As countries like China and Russia have access to a myriad of resources, such as critical minerals and advanced manufacturing capacity, and plan to ramp up the use of AI in their militaries, the ability to power these systems is now a matter of national security.