By Alan Stafford, SAP Insights
In January 2025, US President Donald Trump announced a $500 billion joint venture to build new data centers to power AI services. Last summer, the Biden administration launched an AI Data Center Task Force to address AI infrastructure needs. Clearly, the federal government recognizes the rapid growth of AI technology, no matter who is in charge.
What is also clear is that these data centers need enormous amounts of power. In 2022, SAP estimated that 200 terawatt-hours per year were needed to power all the data centers around the world. The World Economic Forum states that the computational power needed to sustain the rise of AI is doubling roughly every 100 days. Data centers already consumed about 4% of total US electricity in 2023, and the Electric Power Research Institute (EPRI) projects that figure will reach 9% by 2030.
Many tech companies operate large, power-hungry data centers and are building new ones at a rapid pace to support the AI push. Google has more than 100 data centers worldwide, with millions of servers running around the clock. Facebook's parent company Meta has announced plans to spend $60 to $65 billion in capital expenditures on data centers. Microsoft has likewise said it will invest $80 billion in AI-capable data centers in 2025.
Google developed a specialized chip, the Tensor Processing Unit (TPU), to accelerate its AI services. TPUs differ from the central processing units (CPUs) found in most computers, and from the graphics processing units (GPUs) on graphics cards, which are increasingly used for workloads beyond graphics, including cryptocurrency mining.
When cryptocurrency mining became popular after Bitcoin's 2009 debut, one criticism was that it consumed a huge amount of power. AI likewise requires a huge amount of computing power, and a huge amount of electricity as a result. By some estimates, an AI-assisted internet search requires 30 times the energy of a simple text-based search, and researchers predict data center energy consumption will continue to climb.
I need more power, Scotty
This rapidly growing demand is straining the national electric grid, so government agencies, electric utilities, and tech companies are all looking for new or expanded power sources.
The US Department of Energy predicts that electricity demand will increase by 15% to 20% over the next decade. Conventional power sources will certainly be part of the mix to meet that demand, but other types of energy sources are also being developed or expanded. Nuclear power, long controversial, has been gaining support among Americans over the past few years.
Some nuclear power plants are remaining open past their originally scheduled retirement dates to meet power demand. California, which once had four nuclear power plants, was planning to close its last one this year, but the closure has been delayed until at least 2030. Three Mile Island, the Pennsylvania plant that suffered an accident in 1979, is set to be restarted; Microsoft is backing the project by committing to purchase the plant's entire output, ostensibly to power its data centers and AI services.
Google and Amazon are also investing in nuclear power by backing small modular reactors (SMRs). SMRs are smaller than conventional nuclear plants, so they can be built at or near a tech company's data center.
Other companies are pursuing the holy grail of energy production: nuclear fusion, in which atomic nuclei are fused together, releasing energy. Commonwealth Fusion Systems has announced that it is building the world's first grid-scale fusion power plant in Virginia, fittingly the state with the highest data center power consumption. The company's investors include a who's who of technology companies and executives.
Until then, tech companies have been focusing on renewable energy. Solar does not yet account for a large share of US energy production, but it is growing rapidly, driven by the same companies (Amazon, Google, Microsoft). In September 2024, startup ECL announced that it will build a hydrogen-powered AI data center in Texas.
AI and sustainability
AI is certainly driving up energy consumption, and electricity production is growing in response. But some companies are also working to improve AI's efficiency and sustainability.
According to SAP, its data centers run on 100% renewable electricity. This includes self-generated sources, such as solar panels, as well as the purchase of renewable Energy Attribute Certificates (EACs).
Using smaller, less powerful AI models can save a considerable amount of power, and not every AI task requires the largest model. SAP says it optimizes energy consumption and cost efficiency by streamlining the training of its own machine learning models and matching each application to the most suitable LLM. Other companies use similar strategies.
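The idea of matching each task to the smallest adequate model can be sketched as a simple router. This is a hypothetical illustration, not SAP's actual implementation: the model names and the complexity heuristic are invented for demonstration, and real routers typically use trained classifiers rather than word counts.

```python
# Hypothetical "right-sizing" router: send each request to the smallest
# model likely to handle it, reserving the large (more energy-hungry)
# model for genuinely complex prompts. All names/thresholds are illustrative.

def estimate_complexity(prompt: str) -> float:
    """Crude proxy for task difficulty: longer, question-dense prompts
    score higher. A production router would use a trained classifier."""
    words = prompt.split()
    questions = prompt.count("?")
    return len(words) / 50.0 + 0.5 * questions

def pick_model(prompt: str) -> str:
    """Return the name of the cheapest model tier likely to suffice."""
    score = estimate_complexity(prompt)
    if score < 0.5:
        return "small-3b"     # cheapest: short lookups, classification
    elif score < 1.5:
        return "medium-13b"   # mid-tier: summaries, simple Q&A
    return "large-70b"        # reserved for complex multi-step reasoning

print(pick_model("Translate 'hello'"))  # trivial prompt -> smallest tier
```

Because the small model runs on far less compute per query, routing even a modest share of traffic away from the largest model reduces total energy use.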
The World Economic Forum notes that capping power usage during both the training and inference stages can reduce AI energy consumption by 12% to 15%, while making the associated tasks take only about 3% longer to finish.
China's AI company DeepSeek released its DeepSeek-R1 model in late January, claiming far lower AI processing costs thanks to more efficient use of resources.
Another strategy is to stop training AI models that are identified as low performers once they reach a specified checkpoint. Doing so can reduce the energy used by as much as 80%.
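The early-stopping strategy above can be sketched in a few lines. This is a toy example with made-up score curves and thresholds; the savings it shows depend entirely on those invented numbers, not on the article's 80% figure.

```python
# Illustrative early stopping: candidates scoring below a threshold at a
# checkpoint are pruned, so no further energy is spent training them.

def train_with_early_stopping(candidates, checkpoint_steps, total_steps,
                              threshold):
    """candidates: dict of name -> score curve (callable of step count).
    Returns (finished, stopped_early, total_steps_spent)."""
    finished, stopped, steps_spent = [], [], 0
    for name, score_at in candidates.items():
        steps_spent += checkpoint_steps        # train to the checkpoint
        if score_at(checkpoint_steps) < threshold:
            stopped.append(name)               # prune the low performer
            continue
        steps_spent += total_steps - checkpoint_steps  # train to completion
        finished.append(name)
    return finished, stopped, steps_spent

# Toy score curves: only "a" improves fast enough to pass the checkpoint.
candidates = {
    "a": lambda step: 0.01 * step,   # reaches 0.5 at step 50 -> kept
    "b": lambda step: 0.002 * step,  # only 0.1 at step 50 -> pruned
    "c": lambda step: 0.003 * step,  # only 0.15 at step 50 -> pruned
}
finished, stopped, steps = train_with_early_stopping(
    candidates, checkpoint_steps=50, total_steps=1000, threshold=0.4)
print(finished, stopped, steps)  # energy roughly scales with steps run
```

Here the three candidates consume 1,100 steps instead of the 3,000 a full run of all three would take, since compute (and thus energy) scales with steps actually executed.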
The processing chips used for AI are also a focus for many companies. SAP says it tries to use the most efficient chips available for AI processing. Google says its latest Tensor chips are 67% more efficient than the previous generation.
To learn more about the role AI can play in achieving sustainability goals, read "Why AI must overcome fragmentation to optimize AI for all benefits."