

The race to operationalize artificial intelligence has shifted from model-centric experimentation to infrastructure that can sustain continuous, measurable impact. As U.S. private AI investment climbed to $109.1 billion in 2024, the industry made a decisive move away from standalone prototypes and toward enterprise platforms capable of governing models at production scale. This transformation has elevated leaders who understand that reliability and system design—not isolated breakthroughs—now determine competitive advantage. Among them is Sayantan Ghosh, Senior Engineering Manager at LinkedIn and co-inventor of the U.S. patent “Correction of user input,” cited by Google, Oracle, and Dell.
Ghosh describes this shift with clarity: “The infrastructure behind AI is now strategic, not supportive. Training environments once served as laboratories for innovation, but today the real differentiator is the machinery that keeps intelligence consistent and dependable in the wild. When the foundation is strong, every model benefits; when it’s weak, every model fails in its own unpredictable way.” His early work at Uber illustrates the power of treating infrastructure as a growth engine. By scaling the model and metadata services that powered Michelangelo, he helped democratize ML usage across the company and accelerate operations during a period of rapid global expansion.
The migration from pilot projects to enterprise-scale platforms is reshaping operational expectations. AI now runs inside high-volume production environments, and the underlying demands have intensified accordingly.
Ghosh sees inference as the moment where abstract intelligence becomes accountable to actual business outcomes. “Inference is where AI meets reality. Training is an idealized environment; serving is lived experience. Every mismatch between the two—whether in features, definitions, or assumptions—slowly erodes business value.” This awareness shaped his work at eBay, where his Bayesian optimization approaches helped improve advertising yield and produced measurable revenue gains at multimillion-dollar scale. The achievement reinforced a belief that optimization succeeds only when experimentation infrastructure is governed, consistent, and inexpensive to use.
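The kind of Bayesian optimization loop described above can be sketched in miniature. The snippet below tunes a single hypothetical "bid multiplier" against a toy yield curve using a Gaussian-process surrogate and an expected-improvement acquisition; the objective, kernel length scale, and iteration budget are illustrative assumptions, not details of Ghosh's actual systems at eBay.

```python
# Minimal 1-D Bayesian optimization sketch: tune a hypothetical bid
# multiplier to maximize a noisy-free toy advertising-yield curve.
import numpy as np
from math import erf

def yield_curve(x):
    # Illustrative stand-in for observed ad yield; peaks at x = 0.65.
    return -(x - 0.65) ** 2 + 1.0

def rbf(a, b, length=0.15):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xs, noise=1e-4):
    # GP posterior mean and standard deviation at query points Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = np.diag(rbf(Xs, Xs) - Ks.T @ Kinv @ Ks)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    # EI acquisition: expected gain over the best observation so far.
    z = (mu - best) / sigma
    cdf = 0.5 * (1 + np.vectorize(erf)(z / np.sqrt(2)))
    pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)
    return (mu - best) * cdf + sigma * pdf

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 3)           # a few initial random probes
y = yield_curve(X)
grid = np.linspace(0, 1, 201)

for _ in range(10):                # sequential optimization rounds
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X = np.append(X, x_next)
    y = np.append(y, yield_curve(x_next))

print(X[np.argmax(y)])             # best multiplier found, near the 0.65 optimum
```

The point of the sketch is Ghosh's larger one: the loop is only trustworthy if the "yield" it observes is measured through governed, consistent experimentation infrastructure.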
These lessons continue to guide his philosophy: enterprise AI matures only when the supporting systems are mature. Without shared definitions, controlled rollout mechanisms, and reliable serving layers, even the best models deliver uneven or unstable outcomes. Ghosh argues that platformization will continue accelerating as organizations realize that predictable AI performance requires predictable AI plumbing.
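The controlled rollout mechanisms mentioned above are often built on deterministic hash-based bucketing, so each user's variant assignment stays stable as exposure expands. The sketch below illustrates that common pattern; the feature name, user IDs, and percentages are hypothetical, not taken from any system described in this article.

```python
# Sketch of a controlled-rollout gate using stable hash bucketing.
import hashlib

def in_rollout(user_id: str, feature: str, percent: float) -> bool:
    # Hash the (feature, user) pair so assignment is deterministic and
    # independent across features; map it to a bucket in [0, 10000).
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 10000
    return bucket < percent * 100   # e.g. percent=5.0 exposes ~5% of users

# Exposure can then be widened in steps (1% -> 5% -> 25% -> 100%)
# without reshuffling which users see the feature.
exposed = sum(in_rollout(f"user{i}", "new_ranker", 5.0) for i in range(100_000))
print(exposed)  # roughly 5% of 100,000 simulated users
```

Because the bucket is derived from the user ID rather than a random draw, ramping the percentage up only adds users; no one flips back and forth between variants mid-experiment.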
The rapid adoption of generative AI—used by 65% of organizations in at least one business function by mid-2024—has pushed experimentation frameworks to the forefront. The faster companies learn in production, the faster they compete. With global digital ad spend projected to reach $678.7 billion in 2025, optimization is now tied directly to how efficiently an organization can evaluate, validate, and ship AI-driven changes.
Ghosh, an invited speaker at Data Con LA 2025 for his talk “Data Debt: The Hidden Malaise Impacting Your Business,” cautions that acceleration without discipline can become costly. “You can automate learning and accelerate deployment, but you cannot outrun data debt. Weak foundations create the illusion of progress while quietly multiplying blind spots. Advanced machine learning architectures and data scale don’t correct these problems; they amplify them.” His work at Meta on consumer request queue optimization, built on multi-modal machine learning architectures, puts that principle into practice: productionizing image and video features for queue optimization at billion-plus data scale reflects a commitment to addressing foundational data issues rather than working around them.
Beyond his direct engineering work, Ghosh contributes to the field through academic and professional service. His critical insights as a reviewer for top international journals such as ACM Transactions on Knowledge Discovery from Data have helped advance the frontiers of knowledge discovery and uphold scientific rigor, ensuring that impactful, cutting-edge research reaches the community.
Looking forward, Ghosh believes that AI will reward organizations that treat infrastructure as a strategic asset rather than an implementation detail. He argues that the businesses capable of connecting training, inference, experimentation, and governance in one cohesive pipeline will outperform those relying on fragmented solutions. “Breakthrough AI doesn’t happen in silos. The true innovators will be the companies that build end-to-end infrastructure, where data, models, and insights move seamlessly across the entire development lifecycle.”