

Artificial intelligence is now a powerful weapon against climate change, enhancing climate forecasts and improving energy efficiency. However, a significant and controversial issue is the rising carbon footprint of energy-intensive AI systems as their adoption increases across public institutions and private companies.
In the latest episode of the Analytics Insight Podcast, host Priya Dialani speaks with Vineet Mittal, Senior Vice President at Ziroh Labs, about why sustainability must become central to AI development.
Vineet argues that AI itself is not the environmental threat. “The real risk lies in how inefficiently AI is being deployed today,” he explains, pointing to excessive dependence on energy-hungry GPU infrastructure.
While AI promises efficiency and scale, modern large language models depend on massive computational power. These workloads, executed mainly on GPUs, consume enormous amounts of electricity, placing pressure on already strained energy systems, particularly in countries like India.
According to Ziroh Labs’ Senior VP, the path to green AI starts with infrastructure choices. Ziroh Labs has focused on enabling AI workloads to run efficiently on CPUs that already power most enterprise applications. CPUs consume significantly less energy, reducing the need for new, power-intensive hardware.
The conversation highlights the rise of open-source AI models that have significantly reduced the cost and power consumption of training. These gains challenge the belief that advanced AI requires GPU clusters costing billions of dollars, while making environmentally friendly AI more accessible.
Looking ahead, Vineet stresses that sustainability is also about inclusion. Energy-efficient, open AI systems allow countries like India to develop culturally relevant models at lower cost. “How we build AI matters as much as what it delivers,” he concludes. “Responsible AI can scale innovation without scaling environmental damage.”