Iota

$7.24

Token details
Verified: Yes
Market Cap: 30.70M
Price Change (24h): -0.17%
Price Change (7d): -2.25%
What is Iota?
IOTA (Incentivized Orchestrated Training Architecture) is a decentralized framework for pre-training large language models, operating as Subnet 9 (SN9) on the Bittensor network. Built by Macrocosmos AI, IOTA transforms a network of globally distributed, heterogeneous machines into a single cooperating unit that trains AI models together, rather than having individual miners compete with separate models. The architecture uses data- and pipeline-parallelism to split model training across participants, meaning no single miner needs to fit an entire model on their machine.
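The pipeline-parallel idea above can be sketched in a few lines: the model's layers are partitioned across miners, and each stage streams its output activations to the next, so no single machine holds the full model. The layer functions and shapes below are purely illustrative assumptions, not IOTA's actual implementation.

```python
def make_layer(scale):
    """A toy 'layer' that just scales its input elementwise."""
    return lambda x: [v * scale for v in x]

# Three pipeline stages, each held by a different miner; no single
# machine needs all three layers in memory at once.
miners = [make_layer(2.0), make_layer(0.5), make_layer(3.0)]

def pipeline_forward(x, stages):
    """Stream activations stage to stage, as an orchestrator would."""
    for stage in stages:
        x = stage(x)  # activation handed off to the next miner
    return x

print(pipeline_forward([1.0, 2.0], miners))  # [3.0, 6.0]
```

In the real system each stage is a block of transformer layers and the hand-off travels over the network, but the control flow is the same: only activations, not full models, cross machine boundaries.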
IOTA is currently training a 1.5 billion parameter Llama-inspired model split across 3 layers, with plans to scale to 15B, 50B, and 100B parameter models. The system was featured in Forbes ("Swarm Intelligence Is Reshaping How AI Gets Trained") and has an academic paper available on ArXiv. A live dashboard at iota.macrocosmos.ai provides real-time visibility into network activity, active miners, layers in training, and validator metrics.
One of the most distinctive features of IOTA is Train at Home (TAH), a downloadable application that allows anyone with consumer-grade hardware (starting with macOS devices like MacBooks and Mac Minis) to contribute compute to distributed model training and earn rewards. No technical knowledge of machine learning or Bittensor is required. Users simply download the app, connect their wallet, and start training. Train at Home runs on the same underlying system as the main IOTA subnet but is specifically designed to lower the barrier to entry for non-technical participants.
Macrocosmos AI also operates two other Bittensor subnets: Apex (SN1), a decentralized agentic inference engine, and Data Universe (SN13), a platform for structured social data collection from sources like X (Twitter) and Reddit.
How Does Iota Work?
An orchestrator distributes model layers across miners and streams activations between them. All network communication is mediated through the orchestrator, and a shared S3 bucket stores activations and layer weights. Miners compete to process as many activations as possible during the training stage, and periodically upload their local weights. Weight merging across miners is handled using a variant of Butterfly All-Reduce, a communication-efficient aggregation method. The system employs 128x activation compression to reduce bandwidth requirements, making it feasible to train over commodity internet connections.
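Butterfly all-reduce merges weights with only log2(n) pairwise exchanges per node. The following is a minimal single-process sketch of the averaging pattern under the assumption of a power-of-two node count and one scalar "weight" per node; IOTA uses a variant of this idea over the network, not this code.

```python
def butterfly_allreduce_mean(values):
    """Average one scalar per node via butterfly-pattern exchanges.

    Each of log2(n) stages pairs node i with node i XOR 2**s, so every
    node ends holding the global mean after exchanging with only
    log2(n) partners instead of all n-1.
    """
    vals = list(values)
    n = len(vals)
    assert n & (n - 1) == 0, "node count must be a power of two"
    stage = 1
    while stage < n:
        nxt = vals[:]
        for i in range(n):
            partner = i ^ stage  # XOR picks this stage's partner
            nxt[i] = (vals[i] + vals[partner]) / 2
        vals = nxt
        stage *= 2
    return vals

print(butterfly_allreduce_mean([1.0, 2.0, 3.0, 4.0]))  # [2.5, 2.5, 2.5, 2.5]
```

With 4 nodes this takes 2 exchange rounds instead of each node talking to all 3 peers, which is why the pattern is communication-efficient at scale.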
Validators spot-check miners to verify that work was performed correctly. Miners are rewarded based on the volume and quality of their compute contributions to the shared training process. This cooperative approach differs fundamentally from the previous version of SN9, where each miner trained their own isolated model and competed in a winner-takes-all leaderboard. IOTA instead pools all participants into a single training run, enabling the network to tackle models beyond the capability of any individual miner.
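Spot-checking can be sketched as recomputing a sample of a miner's claimed outputs and comparing within a tolerance. The function names, sampling scheme, and tolerance below are assumptions for illustration, not IOTA's validator API.

```python
def spot_check(layer_fn, inputs, claimed_outputs, tol=1e-6):
    """Recompute outputs for sampled inputs and verify the miner's claim."""
    for x, claimed in zip(inputs, claimed_outputs):
        expected = layer_fn(x)
        if abs(expected - claimed) > tol:
            return False  # mismatch: the claimed work was not done correctly
    return True

double = lambda x: 2 * x  # stand-in for the layer a miner claims to run
print(spot_check(double, [1.0, 2.0], [2.0, 4.0]))  # True
print(spot_check(double, [1.0, 2.0], [2.0, 5.0]))  # False
```

Because checks are sampled rather than exhaustive, honest work is cheap to verify while faked activations are caught probabilistically.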

Who's Behind It?
Macrocosmos
IOTA is built by Macrocosmos AI. Key team members include Steffen Cruz, former CTO of the Opentensor Foundation and a core developer of Bittensor's Subnet 1, who holds a PhD in subatomic physics from the University of British Columbia. Will Squires leads product and business development. The project is fully open-source, with code maintained under the macrocosm-os GitHub organization.