How CloudKeeper's LensGPT is Redefining Cloud Cost Intelligence: Exclusive with Sanjeev Mittal

"We Are Moving from Dashboards to Conversations": CloudKeeper CPTO on the Future of FinOps

Cloud spending is at an inflection point. Artificial intelligence workloads are now central to enterprise strategy, and organizations are grappling with costs that are more dynamic, unpredictable, and harder to govern than ever before. Traditional tools built for simpler cloud environments are struggling to keep pace, and the gap between innovation and financial control is widening. Into this gap steps a new generation of AI-powered FinOps platforms that promise to transform how enterprises understand and manage their cloud spend.

In an exclusive interview with Sanjeev Mittal, CPTO of CloudKeeper, Analytics Insight explores how agentic AI, natural language interfaces, and real-time intelligence are rewriting the rules of cloud cost governance. Here are excerpts from the conversation:

Q

Cloud spending has grown significantly with the rise of AI workloads. What are the biggest challenges enterprises face today in managing cloud costs effectively?

A

AI has fundamentally changed how the cloud is consumed. The entire architecture of cloud environments is now being rethought to support AI workloads, which are far more dynamic and resource-intensive.

One of the biggest challenges is visibility. Costs are no longer predictable and often spike based on usage patterns, model interactions, and data movement. Most teams still lack real-time insight into what is driving these costs.

There is also a gap between innovation and control. Teams are moving fast with AI, but cost governance is still catching up. This makes it difficult for organizations to scale AI confidently without overspending.

Q

Traditional cloud cost management relies heavily on dashboards and manual analysis. What limitations do these approaches create for modern enterprises?

A

Dashboards were designed for a simpler cloud environment. They provide visibility, but they still depend heavily on manual effort and user expertise.

Teams often spend hours pulling reports, applying filters, and trying to interpret data across multiple tools. Developing meaningful insights requires database knowledge and technical skills that not everyone on the team has. And by the time a pattern is identified, the cost has already been incurred. It is fundamentally a reactive process.

Another limitation is accessibility. Not every stakeholder is comfortable navigating dashboards — especially finance or leadership teams who often need ad-hoc analysis and real-time answers, not another tool to learn. This creates silos where cost data exists but doesn't reach the people making spending decisions.

What modern enterprises need is a more intuitive and proactive way to understand and act on cloud cost data.

Q

CloudKeeper recently launched LensGPT. Could you explain how an agentic AI approach changes the way organizations interact with cloud financial data?

A

With LensGPT, we are moving from dashboards to conversations. Instead of navigating filters and reports, users ask questions in plain language — "Why did my S3 cost spike last Tuesday?" or "What's my cost per EKS cluster deployment?" — and get contextual, data-backed answers in seconds.

The agentic approach means the system does more than just respond. It understands the context, analyzes patterns, and guides the user toward the next step.

Instead of spending time figuring out what happened and what to do next, teams can directly ask and get both the insight and the recommended action.

We are also launching LensGPT as an MCP server, which means it connects directly into tools teams already use — Claude, ChatGPT, Cursor, or internal AI assistants. Cost intelligence meets users where they work, not in another dashboard they have to learn. This makes FinOps faster, more accessible, and much more aligned with how teams actually work today.

Q

How does LensGPT use natural-language queries and AI reasoning to help teams identify cost drivers and optimization opportunities in real time?

A

LensGPT sits on top of our Lens Analytics Engine, which ingests billing data, usage metrics, tags, and custom business taxonomies across AWS, GCP, and Azure. When a user asks a question, LensGPT translates that natural-language query into structured lookups across these datasets — no SQL knowledge, no dashboard navigation required.
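To make the idea concrete, here is a hypothetical illustration (not LensGPT's internals, which are not public): a natural-language question is parsed into a structured lookup over billing data, so the user never writes SQL or navigates dashboard filters. The service list and time-window rules are assumptions for the sketch.

```python
import re

# Hypothetical sketch: turn a plain-language cost question into a
# structured query. Service names and window rules are illustrative only.

def parse_cost_question(question):
    """Extract a service and a time window from a simple cost question."""
    services = ["S3", "EC2", "EKS", "Lambda"]
    service = next(
        (s for s in services if re.search(rf"\b{s}\b", question, re.I)), None
    )
    if "last week" in question.lower():
        window = "7d"
    elif "yesterday" in question.lower():
        window = "1d"
    else:
        window = "30d"
    return {"metric": "cost", "service": service, "window": window}

print(parse_cost_question("Why did my S3 cost spike last week?"))
# {'metric': 'cost', 'service': 'S3', 'window': '7d'}
```

A production system would of course use an LLM rather than regexes for this translation step; the point is only that the output is a structured query the analytics engine can execute.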

The reasoning layer is what sets it apart. A question like "Why did my costs spike last week?" is not a simple lookup. LensGPT decomposes it into multiple sub-queries — checking service-level changes, regional shifts, configuration modifications, and usage anomalies — then synthesizes a single, explained answer. It attributes the spike to specific resources, tags, or events, not just a service total.
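The decomposition-and-synthesis pattern described above can be sketched as follows. This is an assumption-laden toy, not CloudKeeper's algorithm: it compares two periods along a few dimensions (service, region, tag), collects per-item cost deltas as sub-query results, and synthesizes them into a ranked attribution.

```python
from dataclasses import dataclass

# Toy sketch of agentic decomposition: a "why did costs spike?" question
# fans out into per-dimension sub-queries whose results are merged into
# one ranked, attributed answer. Data shapes here are illustrative only.

@dataclass
class Finding:
    dimension: str   # e.g. "service", "region", "tag"
    item: str
    delta_usd: float

def decompose_and_attribute(costs_prev, costs_curr):
    """Compare two periods along several dimensions; rank the cost drivers."""
    findings = []
    for dimension in ("service", "region", "tag"):
        prev = costs_prev.get(dimension, {})
        curr = costs_curr.get(dimension, {})
        for item in set(prev) | set(curr):
            delta = curr.get(item, 0.0) - prev.get(item, 0.0)
            if delta > 0:
                findings.append(Finding(dimension, item, delta))
    # Synthesis step: surface the largest contributors first
    return sorted(findings, key=lambda f: f.delta_usd, reverse=True)

prev = {"service": {"S3": 120.0, "EC2": 300.0}}
curr = {"service": {"S3": 480.0, "EC2": 310.0}}
top = decompose_and_attribute(prev, curr)
print(top[0].item, top[0].delta_usd)  # S3 360.0
```

The real system reasons over far richer signals (configuration changes, usage anomalies, events), but the shape is the same: many narrow sub-queries, one explained answer.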

On optimization, LensGPT draws from our recommendation engine, which today covers 44 recommendation types across AWS and GCP. When it identifies a cost driver, it can immediately surface the relevant optimization.

Because Lens processes data at hourly granularity, these insights reflect what is happening now, not what happened last month. Teams can catch anomalies within hours rather than discovering them in a monthly review.
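Why hourly granularity matters can be shown with a minimal sketch, again an illustration rather than CloudKeeper's actual detection logic: even a simple rolling z-score over hourly costs flags a spike within hours, whereas monthly aggregates would hide it until the bill arrives.

```python
import statistics

# Illustrative anomaly check (not CloudKeeper's algorithm): flag any hour
# whose cost deviates strongly from the mean of the preceding window.

def flag_anomalies(hourly_costs, window=24, threshold=3.0):
    """Return indices of hours that spike versus the prior window."""
    anomalies = []
    for i in range(window, len(hourly_costs)):
        baseline = hourly_costs[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.pstdev(baseline)
        if stdev > 0 and (hourly_costs[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

# 48 quiet hours around $10, then a sudden $95 hour
series = [10.0 + (i % 3) * 0.5 for i in range(48)] + [95.0]
print(flag_anomalies(series))  # [48]
```

With monthly data, that $95 hour would be invisible inside a month-level total; with hourly data it is caught the same day.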

Q

In your view, how will AI-powered FinOps platforms reshape cloud cost governance for enterprises over the next few years?

A

AI-powered FinOps platforms will shift cost governance from reactive to continuous. Instead of periodic reviews, organizations will have real-time monitoring and intelligent recommendations built into their workflows.

We will also see FinOps become more accessible across teams. With AI simplifying how data is consumed, more stakeholders will be able to participate in cost-related decisions.

Another key shift will be speed. Decisions that used to take hours or days will happen in minutes, supported by real-time insights.

Overall, governance will become more embedded, proactive, and aligned with how modern cloud environments operate.

Q

With organizations increasingly prioritizing cost efficiency in cloud adoption, what trends do you expect to see in the evolution of FinOps and cloud optimization tools?

A

The biggest shift we see is FinOps moving from a reporting function to an intelligence layer that is embedded directly into engineering and business workflows. Today, cost insights sit in dashboards that a small team reviews. Tomorrow, they will be delivered through the same AI tools teams already use.

The second trend is the emergence of FinOps for AI as a distinct discipline. AI workloads account for 8-10% of cloud spend today, but that share is projected to exceed 40%. GPU utilization averages just 20-50% across enterprises, which means enormous waste is being created at scale. Organizations will need dedicated tooling for AI cost visibility, GPU right-sizing, LLM cost optimization, and commitment management for AI infrastructure.

We also expect FinOps to shift left in the lifecycle. Cost awareness will become part of architecture and deployment decisions, not something reviewed after the bill arrives. At CloudKeeper, we are already building in this direction — integrating optimization recommendations into DevOps workflows through ITSM integrations, infrastructure-as-code tools, and MCP-based AI assistants.

Ultimately, the tools that win will be the ones that close the loop — from visibility to insight to action — without requiring a human to manually connect the dots at every step.
