Sasha Luccioni, a researcher at the machine learning startup Hugging Face, says there is a significant issue with generative AI: it is energy-hungry.
According to her, "the model is incredibly inefficient from a computational standpoint because every time you query it, the entire thing gets activated."
Consider the large language models (LLMs) that underpin many generative AI systems. Because they have been trained on vast bodies of text, they can produce a response to almost any query.
"When you use generative AI, it's basically making up answers and generating content from scratch," says Dr. Luccioni. That means a great deal of work for the computer.
According to a recent study by Dr. Luccioni and colleagues, a generative AI system can consume around 33 times more energy than a machine running task-specific software. The study has been peer-reviewed but has not yet been published in a journal.
All this energy isn't consumed by your personal computer or your mobile device, though. The computations we increasingly rely on take place in massive data centers that are hidden from most people's view.
"It's the cloud," says Dr. Luccioni. "You don't think about these massive metal boxes that require a lot of energy and heat up."
Global data centers are consuming an increasing amount of electricity. They used 460 terawatt hours in 2022, and the International Energy Agency (IEA) projects that this figure will roughly double in just four years. By 2026, data centers could be consuming a total of 1,000 terawatt hours annually. According to the IEA, "this demand is roughly equivalent to the electricity consumption of Japan," a country of 125 million people.
Data centers store huge amounts of information, from your emails to Hollywood films, ready to be retrieved from anywhere in the world. Those anonymous structures power not only AI but also cryptocurrency. They support life as we know it.
US utility companies are starting to feel the pinch, according to consultant Chris Seiple of Wood Mackenzie.
According to Mr. Seiple, there is a "land grab" for data center locations close to renewable energy hubs or power plants: "Iowa is a hotbed of data center development, there's a lot of wind generation there."
The hardware underlying generative AI is always changing, which contributes to some of the uncertainty.
Tony Grayson, general manager of data center company Compass Quantum, points to Nvidia's newly announced Grace Blackwell supercomputer chips, named after computer scientist Grace Hopper and mathematician David Blackwell, which are designed to power advanced applications including generative AI, quantum computing, and computer-aided drug design.
Nvidia says that training the largest AI models previously required 8,000 of its earlier-generation chips and a 15-megawatt power supply. But according to the company, 2,000 Grace Blackwell chips could do the same job in the same amount of time, and they would only require a four-megawatt supply.
According to Mr. Grayson, "the performance is increasing so much that your overall energy savings are big." However, he acknowledges that the need for power is influencing the locations of data centers: "People are going to where cheap power is at."
Dr. Luccioni points out that producing the newest computer chips requires a substantial amount of energy and resources.