Experts Question the Electricity Consumption of Data Centres and AI Activities Worldwide

August 5, 2019

Gary Dickerson, chief executive of Applied Materials, made a significant prediction at a recent conference in San Francisco, questioning the electricity consumption of data centers and AI projects. Dickerson said that data centers and their AI workloads could consume a tenth of the world's electricity by the year 2025. Applied Materials is a major supplier to the semiconductor industry.

At present, the data centers that have mushroomed around the world consume a little less than 2 percent of global electricity, counting all kinds of workloads handled by their servers. The company estimates that AI-centric servers are responsible for just 0.1 percent of global electricity consumption as of now.

Dickerson is not alone; other experts and professionals are sounding the same alarm. Anders Andrae, a researcher at Huawei, also projects that data centers could end up soaking up a tenth of the world's electricity by 2025, though his estimate covers all sorts of technology uses and workloads, not just AI.

Jonathan Koomey, a special advisor to the senior scientist at the Rocky Mountain Institute, takes a more optimistic view. He predicts that data centers' energy usage will remain relatively flat in the coming years despite the upsurge in AI activity.

These widely diverging predictions raise a question: is AI's impact on future energy consumption a genuine threat? The disagreement among experts highlights the uncertainty around AI's role in the future of large-scale computing and the ultimate implications for energy demand.

Looking at the bigger picture, AI is undoubtedly power-hungry, as training and running AI algorithms involves processing voluminous amounts of data. A study by OpenAI found that the computing power used to train the largest AI models is already doubling roughly every 3.4 months.
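To make that doubling rate concrete, here is a minimal sketch of the exponential growth it implies; the function name is illustrative, and the 3.4-month doubling period is the only figure taken from the reported study.

```python
# Sketch: compute growth implied by a fixed doubling period.
# The 3.4-month doubling time is the figure reported by OpenAI;
# everything else here is an illustrative calculation.

def growth_factor(months: float, doubling_period: float = 3.4) -> float:
    """Multiplicative growth in compute over `months`,
    assuming compute doubles every `doubling_period` months."""
    return 2.0 ** (months / doubling_period)

# At this pace, a single year means roughly an 11x increase:
print(f"{growth_factor(12):.1f}x per year")
```

At that rate, compute demand grows by more than a factor of ten every year, which is why even large efficiency gains in hardware can be outpaced.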

Moreover, the forecast presented by Applied Materials depicts a worst-case scenario, one that assumes no innovative thinking in hardware and software. Sundeep Bajikar, head of corporate strategy and market intelligence at Applied Materials, said the forecast conjectures a shift over time in the mix of information (audio, video, and text) used to train AI algorithms, with visual data requiring more energy because it is more computationally intensive.

Additionally, Bajikar said that trends such as 5G technology and autonomous vehicles highlight the urgent need for new approaches to materials and manufacturing for the AI generation.

A considerable number of researchers worry that the environment could suffer greatly while quenching AI's thirst for energy. A research team at the University of Massachusetts, Amherst recently published a study showing that training a large AI model can generate around five times the entire lifetime emissions of the average American car.
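The "five cars" comparison can be checked with simple arithmetic. The figures below are the ones commonly reported from the UMass Amherst study (roughly 626,000 lbs of CO2 for training one large model with architecture search, versus roughly 126,000 lbs for a car's lifetime including fuel); treat both as illustrative rather than exact.

```python
# Hedged arithmetic sketch of the UMass Amherst comparison.
# Both figures are approximate, as commonly reported from the study.
MODEL_TRAINING_LBS_CO2 = 626_000   # one large model, incl. architecture search
CAR_LIFETIME_LBS_CO2 = 126_000     # average American car, fuel included

ratio = MODEL_TRAINING_LBS_CO2 / CAR_LIFETIME_LBS_CO2
print(f"training emits ~{ratio:.1f}x a car's lifetime CO2")
```

The ratio comes out close to five, matching the headline claim.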


AI Itself Can Help Cut Power Consumption

The pessimistic predictions overlook several important developments that could curb AI's power consumption, such as the rise of hyperscale data centers operated by companies like Facebook and Amazon. These facilities use vast arrays of stripped-down servers designed for specific tasks, and those machines are more energy-efficient than the servers in traditional data centers.

New varieties of microchips should also help. The Applied Materials forecast, by contrast, assumes that AI workloads will continue to run on existing hardware whose efficiency improves only gradually over the next couple of years.

Interestingly, AI itself could provide the biggest check on AI's power consumption. Google is already employing technology developed by DeepMind to cool its data centers more efficiently. The technology has helped the company reduce its cooling bill by 40 percent, and the AI-powered cooling systems now run in those centers on their own.
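To see why a 40 percent cut in cooling energy matters at the facility level, here is a small sketch in terms of PUE (power usage effectiveness: total facility energy divided by IT energy). The 40 percent reduction is the figure from the article; the starting PUE and the cooling share of overhead are purely illustrative assumptions, not figures from Google or DeepMind.

```python
# Hedged sketch: effect of cutting cooling energy on facility PUE.
# Only the 40% reduction comes from the article; the PUE and
# cooling-share values below are illustrative assumptions.

def new_pue(pue: float, cooling_share: float, reduction: float) -> float:
    """PUE after reducing cooling energy.

    pue           -- total facility energy / IT energy
    cooling_share -- fraction of the overhead (pue - 1) spent on cooling
    reduction     -- fractional cut in cooling energy (0.4 = 40%)
    """
    overhead = pue - 1.0
    cooling = overhead * cooling_share
    return 1.0 + (overhead - cooling * reduction)

# Example: a facility at PUE 1.5 where cooling is 80% of overhead.
# A 40% cooling cut brings the PUE down to about 1.34:
print(round(new_pue(1.5, 0.8, 0.4), 2))
```

Even under these rough assumptions, the cooling cut shaves roughly a tenth off total facility energy, which is why data center operators consider such AI-driven optimizations significant.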

It is hard to say yet whether the optimistic or the pessimistic predictions will prove correct, but despite all the conjecture, companies clearly have room to address the issue by optimizing every aspect of their data center and AI operations.