Gradients

Token details
Price: $6.98
Verified: Yes
Market Cap: 31.01M
Price Change (24h): -0.42%
Price Change (7d): -0.16%
What is Gradients?
Gradients is a decentralized AutoML platform for AI model fine-tuning, operating as Subnet 56 (SN56) on the Bittensor network. The platform makes AI training accessible to anyone without requiring machine learning expertise. Through a simple user interface or API, users can upload a dataset, select a model, and have multiple Bittensor miners compete independently to find the optimal fine-tuning configuration, producing the best-performing version of the model. Gradients supports Instruct, DPO (preference tuning), GRPO (reward-function-driven optimization), and image model fine-tuning.
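To make that flow concrete, here is a minimal sketch of the upload-dataset, pick-model, start-job sequence against an HTTP API. The base URL, endpoint paths, and field names below are illustrative assumptions, not Gradients' documented API.

import requests

BASE_URL = "https://api.gradients.example"  # placeholder host, an assumption
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}

# 1. Upload a training dataset.
with open("train.jsonl", "rb") as f:
    dataset = requests.post(f"{BASE_URL}/datasets", headers=HEADERS,
                            files={"file": f}).json()

# 2. Create a job: choose a base model and a task type
#    (instruct, dpo, grpo, or image).
job = requests.post(f"{BASE_URL}/jobs", headers=HEADERS, json={
    "dataset_id": dataset["id"],
    "base_model": "meta-llama/Llama-3.1-8B",
    "task": "instruct",
}).json()

# 3. Miners compete independently; poll until the winning model is ready.
status = requests.get(f"{BASE_URL}/jobs/{job['id']}", headers=HEADERS).json()
print(status["state"])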
The platform positions itself as a cost-competitive alternative to centralized AutoML services. Training smaller models costs around $100, 20B-parameter models around $250, and 70B-parameter models around $500, compared to over $10,000 on Google Cloud Vertex AI for comparable workloads. In 180 controlled experiments, Gradients achieved an 82.8% win rate against HuggingFace AutoTrain and 100% win rates against TogetherAI, Databricks, and Google Cloud, performing strongest on RAG and translation tasks. The platform has attracted over 3,000 paying users and reports a cumulative 118 trillion parameters trained on 2 billion rows of data.
Development and operations are supported by Rayon Labs, a globally distributed team that contributes to the subnet without owning it; Rayon Labs also contributes to Chutes (SN64). Users can monitor training progress through Weights & Biases integration and deploy finished models directly to Hugging Face.
How Gradients Works
The current version of Gradients operates through a tournament system. Miners submit open-source training scripts that are executed by validators on dedicated infrastructure. Each tournament lasts 4 to 7 days, with new tournaments starting 72 hours after the previous one ends. Validators provide fixed compute, run all submitted scripts, and compare results head-to-head. The top-performing miners receive exponentially higher weight and emissions. Winning AutoML scripts are released publicly to the gradients-opensource GitHub organization, building an open library of training techniques.
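As a rough illustration of "exponentially higher weight," here is a minimal sketch of a geometric reward scheme over tournament rankings; the decay factor and normalization are assumptions, not the subnet's actual incentive code.

def tournament_weights(ranked_miners, decay=0.5):
    # Miners ordered best-first; weight halves with each rank (assumed decay).
    raw = [decay ** rank for rank in range(len(ranked_miners))]
    total = sum(raw)
    return {miner: w / total for miner, w in zip(ranked_miners, raw)}

# The winner takes most of the weight:
# tournament_weights(["a", "b", "c"]) -> {"a": 0.571..., "b": 0.286..., "c": 0.143...}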
Rather than using a single predetermined strategy to fine-tune a model (the approach taken by centralized platforms like Google Cloud AutoML or HuggingFace AutoTrain), Gradients pools multiple miners who each work independently to discover the best fine-tuning configuration for a given dataset and task. This competitive approach means the platform consistently finds configurations that a single automated pipeline would miss. Miners are evaluated on loss scores measured on held-out test data they never access during training, ensuring genuine generalization rather than overfitting.
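The evaluation rule itself is simple to state in code. Below is a hedged sketch of scoring each miner's fine-tuned model on a hidden test set, where the model objects and loss function are stand-ins rather than Gradients' real validator logic.

def pick_winner(models, test_set, loss_fn):
    # models: {miner_id: fine-tuned model}; test_set was never shown to miners.
    scores = {
        miner: sum(loss_fn(m, ex) for ex in test_set) / len(test_set)
        for miner, m in models.items()
    }
    return min(scores, key=scores.get), scores  # lowest mean loss wins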

Who's Behind It
Rayon Labs
Gradients was founded by Chris, known in the Bittensor community as Wandering Weights. Development and operations are supported by Rayon Labs. The project is fully open-source, with code maintained under the gradients-ai GitHub organization.