0G Labs Achieves Breakthrough in Decentralized AI Training With 100 Billion+ Parameters

Singapore, Republic of Singapore, July 22nd, 2025, Chainwire

0G Labs has released a research paper in collaboration with China Mobile demonstrating that large AI models can be trained effectively on decentralized clusters. The paper introduces DiLoCoX, a framework designed to train large language models (LLMs) exceeding 100 billion parameters in decentralized environments with limited network bandwidth, marking a breakthrough in decentralized model training.

The paper, “DiLoCoX: A Low-Communication Large-Scale Training Framework for Decentralized Cluster,” shows how large models can be trained over slow networks that lack the high-bandwidth interconnects of centralized clusters. By chaining together a number of complementary techniques, 0G has been able to overcome the shortfalls of incumbent decentralized training approaches.

0G’s researchers demonstrate that DiLoCoX can pre-train a 107B foundation model over a 1 Gbps network, the first time a model of this size has been pre-trained in such a setting. The framework achieves a 357x speedup in distributed training over AllReduce while keeping degradation in model convergence negligible. It is believed to be the first decentralized training framework successfully applied to models with over 100 billion parameters.

In a field where centralized data centers dominate AI training thanks to their high-speed connectivity, DiLoCoX breaks new ground by enabling efficient, verifiable training on slower networks. The framework combines Pipeline Parallelism, a Dual Optimizer Policy, One-Step-Delay Overlap of Communication and Local Training, and an Adaptive Gradient Compression Scheme; the one-step-delay idea is sketched below. Together, these techniques not only scale model sizes dramatically but also deliver significant speed improvements while preserving model convergence with minimal degradation.
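To make the one-step-delay idea concrete, here is a minimal, self-contained sketch of low-communication training with delayed synchronization. It is an illustration only, not 0G's implementation: the function names (local_train, simulated_all_reduce, outer_loop), the quadratic toy objective, and all hyperparameters are assumptions, and a real deployment would overlap an actual network all-reduce with GPU compute rather than a thread-pool simulation.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor


def local_train(start_params, inner_steps, inner_lr, grad_fn):
    """Plain SGD for inner_steps, starting from a copy of start_params."""
    p = start_params.copy()
    for _ in range(inner_steps):
        p -= inner_lr * grad_fn(p)
    return p


def simulated_all_reduce(pseudo_grads):
    """Stand-in for a bandwidth-limited all-reduce: average across workers."""
    return np.mean(pseudo_grads, axis=0)


def outer_loop(n_workers, rounds, inner_steps, inner_lr, outer_lr, grad_fn, dim):
    rng = np.random.default_rng(0)
    global_params = rng.normal(size=dim)
    delayed_update = np.zeros(dim)  # averaged pseudo-gradient from round t-1
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        for _ in range(rounds):
            start = global_params.copy()
            # Launch this round's local training on every worker...
            futures = [pool.submit(local_train, start, inner_steps,
                                   inner_lr, grad_fn) for _ in range(n_workers)]
            # ...while the previous round's communication result is applied,
            # so the (slow) synchronization overlaps with local compute. This
            # one-step delay is what hides network latency.
            global_params = global_params - outer_lr * delayed_update
            # Pseudo-gradient = where a worker started minus where it ended.
            pseudo = [start - f.result() for f in futures]
            delayed_update = simulated_all_reduce(pseudo)
    return global_params


if __name__ == "__main__":
    target = np.ones(8)
    grad = lambda p: 2.0 * (p - target)  # gradient of ||p - target||^2
    final = outer_loop(n_workers=4, rounds=50, inner_steps=10,
                       inner_lr=0.05, outer_lr=0.7, grad_fn=grad, dim=8)
    print("distance to optimum:", np.linalg.norm(final - target))
```

In the full framework, this delayed update would be paired with the other components the paper names, such as adaptive gradient compression, so that the data exchanged each round is reduced in volume as well as hidden behind local compute.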

In experiments detailed in the paper, 0G pre-trained a 107 billion parameter model, a scale roughly 10x larger than INTELLECT-1, the recently released model from Prime Intellect, demonstrating its potential to democratize access to advanced AI infrastructure. The work addresses key challenges in decentralized AI, including bandwidth constraints and the need for verifiable processing, setting the stage for more powerful yet decentralized AI development.

“DiLoCoX is both a proof of concept and a statement of intent,” said Michael Heinrich, CEO of 0G Labs. “By making it possible to train enormous models in truly decentralized settings, we're not just pushing technical boundaries, but are unlocking a future where AI serves as a public good. This is about building an open ecosystem where anyone can contribute to and benefit from intelligent systems.”

The release of the paper supports 0G's commitment to advancing verifiable, democratized AI. As the foundation for high-performance infrastructure that unifies storage, compute, and data availability, 0G is on a mission to empower developers and researchers to create the next wave of AI-native applications.

The research paper can be read in full here.

About 0G

0G is the first decentralized AI protocol (AIP). A modular, infinitely scalable layer 1, 0G enables decentralized AI applications to bring about a truly democratized future of intelligence. Designed for AI execution at scale, 0G unifies decentralized storage, compute, and data availability (DA) to support the next generation of AI-native applications. With high-performance infrastructure, verifiable AI processing, and a permissionless agent ecosystem, 0G is building the foundation for an open, unstoppable AI economy.

Learn more: https://0g.ai/

Contact

CMO

Ada Heinrich

0G Labs

ada@0g.ai

This is a paid press release published via CyberNewswire, a PR newswire syndication platform for cybersecurity companies.
