

NVIDIA stock rose after the company announced an expanded multiyear partnership with Meta Platforms that centers on large-scale AI infrastructure.
The agreement commits Meta to deploy millions of NVIDIA GPUs, CPUs, and networking hardware across its global data centers. The companies framed the move as a long-term effort to support AI training and inference at an industrial scale.
NVIDIA shares closed at $184.97, up 1.20%, and continued to climb following the announcement. Investors focused on the scale of Meta’s commitment, which reinforces NVIDIA’s role as a primary supplier of advanced AI hardware. The companies did not disclose the financial value of the agreement.
Meta plans to deploy NVIDIA’s Blackwell GPUs and prepare for the upcoming Rubin generation as part of new hyperscale AI clusters. These systems will support both model training and inference workloads that power Meta’s recommendation engines, personalization systems, and AI services.
NVIDIA CEO Jensen Huang said no company deploys AI at Meta’s scale and emphasized the deep integration between research and infrastructure.
The partnership extends beyond hardware purchases. Engineering teams from both companies will coordinate software and system design to optimize performance and efficiency.
NVIDIA said the architecture will span Meta’s on-premises data centers and NVIDIA Cloud Partner deployments, which allows Meta to expand compute capacity when demand spikes. That hybrid structure aims to simplify operations while maintaining high utilization of GPU resources.
Meta will expand the rollout of Grace CPUs in production environments. NVIDIA described the deployment as its first large-scale Grace CPU-only implementation, supported by joint optimization work to improve performance per watt.
The companies said this efficiency focus aligns with Meta’s long-term infrastructure strategy, which prioritizes energy-aware computing.
Meta and NVIDIA also confirmed plans to collaborate on the upcoming Vera CPU platform, with potential large-scale adoption in 2027. These CPU-only systems would handle workloads that traditionally run on servers powered by established chip vendors.
Analysts note that broader adoption of NVIDIA CPUs could reshape competition in the data center processor market, where legacy suppliers have held strong positions for decades.
Meta will integrate NVIDIA’s Spectrum-X Ethernet networking platform across its infrastructure footprint. NVIDIA said the networking system delivers predictable low-latency performance and improves throughput for AI-scale data movement.
Efficient networking remains essential for GPU clusters, where data transfer speed directly affects training performance.
The partnership also extends into privacy-focused computing. Meta adopted NVIDIA Confidential Computing for private AI processing within WhatsApp. The technology allows encrypted data handling during GPU workloads while preserving confidentiality.
NVIDIA and Meta said they will expand confidential compute capabilities to additional services as AI features grow across Meta’s ecosystem.
Meta CEO Mark Zuckerberg said the company will build leading-edge clusters designed to deliver advanced AI capabilities to users worldwide. The announcement arrives during a period of investor scrutiny around AI spending, yet analysts continue to view NVIDIA’s GPUs as foundational infrastructure for large-scale AI systems.
The partnership signals sustained demand for high-performance chips as major technology companies invest in next-generation computing.