
Power grids groan as demand for AI rises


Sasha Luccioni, a researcher at machine learning startup Hugging Face, says there is a significant problem with generative AI: it is energy-hungry.

According to her, "the model is incredibly inefficient from a computational standpoint because every time you query it, the entire thing gets activated."

Consider the large language models (LLMs) that underpin many generative AI systems. Because they have been trained on vast stores of written text, they can produce text in response to almost any query.
Dr. Luccioni says, "When you use generative AI, it's basically making up answers and generating content from scratch." That means a significant amount of work for the computer.
According to a recent study by Dr. Luccioni and colleagues, a generative AI system may use around 33 times more energy than a machine running task-specific software. The study has been peer reviewed but has not yet been published in a journal.
None of that energy is being used by your personal computer or your mobile device, though. The computations we increasingly rely on happen in massive data centers that are hidden from most people's view.
"The cloud," says Dr. Luccioni. "You don't think about these massive metal boxes that heat up and use so much energy."


Global data centers are consuming an increasing amount of electricity. They used 460 terawatt hours in 2022, and the International Energy Agency (IEA) projects that figure could more than double within four years. By 2026, data centers may be consuming 1,000 terawatt hours annually. "This demand is roughly equivalent to the electricity consumption of Japan," says the IEA. Japan has a population of 125 million.

Data centers store huge amounts of information, from your emails to Hollywood films, so that it can be retrieved from anywhere in the world. Those anonymous buildings power not only AI but cryptocurrency as well. They keep life as we know it running.

Some countries, however, are all too aware of how energy-hungry these facilities are. Dublin currently has a moratorium on the construction of new data centers. Data centers already consume nearly a fifth of Ireland's electricity, a share expected to rise significantly over the next few years, even as Irish households cut back on their own consumption.

In a speech in March, the chief executive of National Grid predicted that electricity demand from UK data centers will rise six-fold within ten years, driven largely by the growth of AI. Even so, National Grid expects the energy required to electrify transport and heating to be far larger overall.

US utility companies are starting to feel the pinch, according to consultant Chris Seiple of Wood Mackenzie.

He says, "They're being hit with data center demands at the exact same time that domestic manufacturing is experiencing a renaissance thanks to government policy." According to US sources, lawmakers in several states are now reconsidering tax benefits granted to data center developers due to the extreme load these facilities are placing on regional energy infrastructure.

According to Mr. Seiple, there is a "land grab" for data center locations close to renewable energy hubs or power plants: "Iowa is a hotbed of data center development, there's a lot of wind generation there."

These days, some data centers can afford to move to more remote locations because latency matters less for increasingly popular generative AI systems. Latency is the delay, usually measured in milliseconds, between a data center sending information out and the user receiving it. In the past, data centers handling emergency communications or financial trading algorithms, for example, were sited in or near major population centers to keep response times as short as possible.

The energy needs of data centers will certainly rise in the coming years, but by how much remains uncertain, Mr. Seiple stresses.

The hardware underlying generative AI is always changing, which contributes to some of the uncertainty.


Tony Grayson, general manager of Compass Quantum, a data center company, mentions Nvidia's newly released Grace Blackwell supercomputer chips, which are named for a mathematician and computer scientist and are intended to power advanced applications like computer-aided drug design, quantum computing, and generative artificial intelligence.

Nvidia says that a business could train AI systems several times larger than the biggest models now in use in 90 days using 8,000 of its previous-generation chips. Doing so would require a 15 megawatt electricity supply.

But according to Nvidia, 2,000 Grace Blackwell chips could do the same task in the same amount of time, and they would only require a four megawatt supply.

(Run for those 90 days, that still adds up to about 8.6 gigawatt hours of electricity.)
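
As a rough sanity check on those figures, energy use is just power multiplied by time. The short Python sketch below assumes the hardware draws the stated power continuously for the full 90-day training run:

# Back-of-the-envelope check of Nvidia's figures: energy = power x time,
# assuming a constant draw over a 90-day training run.
HOURS = 90 * 24  # hours in the 90-day run

def energy_gwh(power_mw):
    # MW x hours = MWh; divide by 1,000 to convert to GWh
    return power_mw * HOURS / 1000

print(energy_gwh(15))  # previous generation, 15 MW -> about 32.4 GWh
print(energy_gwh(4))   # Grace Blackwell, 4 MW -> about 8.6 GWh

On those assumptions the newer chips use roughly a quarter of the energy for the same job, which is where the claimed savings come from.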

According to Mr. Grayson, "the performance is increasing so much that your overall energy savings are big." However, he acknowledges that the need for power is influencing the locations of data centers: "People are going to where cheap power is at."


Dr. Luccioni points out that producing the newest computer chips requires a substantial amount of energy and resources. 
