
Your Simple Guide to SOMA (SN114)

Published May 12, 2026

Modern AI agents are running into a wall. Every reasoning step adds context, tools, and memory, and token costs scale linearly with complexity. A single agentic workflow can consume tens of thousands of input tokens before producing a useful answer. Worse still, the Lost in the Middle research (Liu et al., TACL 2024) has shown that LLMs use long contexts unreliably: accuracy can drop by more than 30% when relevant information sits in the middle of the prompt, and overall performance degrades as context grows. SOMA (SN114) addresses this problem directly. Built on Bittensor by Dendrite, the subnet runs decentralized competitions to produce production-grade MCP (Model Context Protocol) servers for AI agents. Its first competition series targets context compression, with winning solutions deployed as ready-to-use MCP servers that any agent can integrate.

What is SOMA (SN114)?

SOMA operates as Subnet 114 (SN114) on Bittensor, built and operated by Dendrite. The project takes its name from neurobiology, where the soma is the cell body of a neuron that integrates incoming signals before triggering action. SOMA applies the same principle to AI agents through MCP servers that aggregate inputs, manage state, and coordinate execution across decentralized systems.

MCP (Model Context Protocol) standardizes how AI agents connect to external tools, data sources, and services. Just as USB-C standardizes how devices connect to peripherals, MCP standardizes how agents connect to capabilities. Without a shared protocol, every integration requires custom glue code, every tool ships with its own quirks, and agent builders spend more time wiring than building. SOMA’s MCP servers turn fragmented agent outputs into coherent, reliable behavior, available as plug-and-play services for any AI agent or builder.
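To make the "shared protocol" idea concrete, here is a minimal sketch of the kind of tool-call exchange MCP standardizes. The message shapes are simplified JSON-RPC-style stand-ins for illustration, and the `compress_context` tool is a made-up example, not part of any real server.

```python
import json

# Illustrative sketch: an agent first discovers what tools a server offers,
# then calls one by name. Method names and fields are simplified here,
# not the full MCP wire format.

TOOLS = {
    "compress_context": lambda args: {"compressed": args["text"][:100]},
}

def handle(request: str) -> str:
    """Dispatch a JSON-RPC-style request to a registered tool."""
    msg = json.loads(request)
    if msg["method"] == "tools/list":
        result = {"tools": sorted(TOOLS)}
    elif msg["method"] == "tools/call":
        tool = TOOLS[msg["params"]["name"]]
        result = tool(msg["params"]["arguments"])
    else:
        return json.dumps({"id": msg["id"], "error": "unknown method"})
    return json.dumps({"id": msg["id"], "result": result})

# Discovery, then invocation:
listing = handle(json.dumps({"id": 1, "method": "tools/list"}))
call = handle(json.dumps({
    "id": 2, "method": "tools/call",
    "params": {"name": "compress_context",
               "arguments": {"text": "long agent context " * 20}},
}))
```

Because every server speaks the same discovery-then-call shape, an agent needs one client implementation instead of one connector per tool, which is exactly the glue-code duplication the paragraph above describes.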

The subnet runs on a competition-based model. For each problem that can meaningfully improve agent performance and can be solved using MCP servers, SOMA organizes a competition. Independent miners build the algorithms, and the winning solutions get deployed as production MCP servers on the SOMA platform. The result is a continuously evolving marketplace of standardized AI capabilities that any agent across the Bittensor ecosystem can integrate.

How does SOMA (SN114) work?

SOMA delivers AI capabilities through a defined competition cycle. Each cycle targets a specific problem. Miners submit their solutions to the platform, automated screening filters out invalid entries, and qualified submissions enter a live evaluation phase where validators score each solution on performance criteria defined for that competition. The top-ranked miners earn the incentive allocation for the cycle, and the winning solutions become production MCP servers available to the wider ecosystem.
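The cycle above can be sketched as a small screening-and-ranking pipeline. Everything here is hypothetical, the data model and mean-score ranking are illustrative stand-ins, not SOMA's actual evaluation code.

```python
from dataclasses import dataclass

# Hypothetical sketch of one competition cycle: filter out entries that
# fail automated screening, then rank survivors by validator scores.

@dataclass
class Submission:
    miner: str
    valid: bool          # passed automated screening
    scores: list[float]  # per-validator scores for this competition

def rank_cycle(submissions: list[Submission]) -> list[str]:
    """Return qualified miners ordered best-first by mean validator score."""
    qualified = [s for s in submissions if s.valid]
    qualified.sort(key=lambda s: sum(s.scores) / len(s.scores), reverse=True)
    return [s.miner for s in qualified]

cycle = [
    Submission("miner-a", True, [0.81, 0.79]),
    Submission("miner-b", False, [0.99, 0.98]),  # rejected at screening
    Submission("miner-c", True, [0.90, 0.88]),
]
winners = rank_cycle(cycle)  # miner-c outranks miner-a; miner-b is excluded
```

The key property the sketch captures is that screening and scoring are separate gates: a high-scoring but invalid submission never reaches the ranking stage.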

The first competition series focuses on Context Compression and launched on March 12, 2026. This problem represents one of the fundamental bottlenecks in modern agent architectures. As agentic workflows accumulate conversation history, tool outputs, and retrieved context across multiple reasoning steps, the prompt fed into the model balloons. Token costs scale linearly with this growth, and on top of that, models use long context less reliably than short context.

In Context Compression, miners compete to reduce the number of tokens fed into a model’s prompt while preserving the information needed for accurate responses. In production, the most effective strategies combine multiple approaches to achieve maximum compression while maintaining or even improving output accuracy, since cleaner context helps models reason better by removing noise.
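As a deliberately naive illustration of the task, the baseline below drops duplicate lines and keeps only lines that share vocabulary with the query. Real competition entries would be far more sophisticated; this only shows the shape of the problem, shrink the context, keep what the answer needs.

```python
# Naive compression baseline for illustration only: deduplicate exact
# repeated lines (e.g. repeated tool outputs) and keep lines that
# overlap the query's vocabulary.

def compress(context: str, query: str) -> str:
    query_words = set(query.lower().split())
    seen, kept = set(), []
    for line in context.splitlines():
        norm = line.strip().lower()
        if not norm or norm in seen:
            continue  # skip blank lines and exact duplicates
        seen.add(norm)
        if query_words & set(norm.split()):
            kept.append(line.strip())  # keep query-relevant lines
    return "\n".join(kept)

context = (
    "The deploy failed with error 503.\n"
    "Unrelated chatter about lunch plans.\n"
    "The deploy failed with error 503.\n"   # duplicate tool output
    "Retrying the deploy fixed the error.\n"
)
short = compress(context, "why did the deploy fail")
```

Even this crude heuristic demonstrates the dual benefit described above: the prompt gets shorter, and the off-topic noise that degrades long-context reasoning is gone.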

A key design choice separates model improvement from production deployment. Competitions run in an open, decentralized environment where any miner can submit solutions and the best algorithms continuously replace older ones. Production MCP servers, however, run on managed infrastructure that delivers enterprise-grade reliability and predictable performance. SOMA can scale horizontally to adapt to growing demand, with capacity expanding to maintain consistent performance under increasing workloads.

Who is behind SOMA (SN114)?

SOMA is developed and operated by Dendrite, a technology company founded in 2022 that entered the Bittensor ecosystem in its earliest months. Today, Dendrite operates as one of the primary infrastructure architects in the network, with activity spanning high-performance mining operations, proprietary subnets, and end-user products.

The Dendrite team includes more than 50 engineers and mathematicians working across multiple layers of the ecosystem. Beyond SOMA, Dendrite also operates SimplyTao, a platform that lets users buy and trade Bittensor subnet alpha tokens with traditional payment methods, removing the friction that has historically kept newcomers out of the network. This combination of infrastructure operation and end-user product development gives Dendrite a clear view of what builders, miners, and users need from a Bittensor subnet.

SOMA maintains active community channels on Discord and on X (@SomaSubnet). Documentation and miner setup guides are available at the project’s GitHub repository under the DendriteHQ organization. The team publishes regular updates on competition cycles, technical research, and ecosystem developments through these channels.

Why is SOMA (SN114) valuable?

SOMA addresses a structural gap in how AI agents access external capabilities. Today, every agent builder solves the same integration problems independently, writing custom connectors for databases, APIs, search engines, memory layers, and tools. This duplication wastes engineering effort, produces inconsistent results, and slows down the entire field. MCP solves the protocol problem by providing a shared standard, and SOMA solves the supply problem by building a competitive marketplace for high-quality MCP servers.

The competition model produces better outcomes than centralized R&D for a simple reason. When dozens of independent miners attack the same problem with different approaches, the search space gets explored far more thoroughly than any single team could manage, surfacing solutions that an individual team might miss.

Agent builders get ready-to-use MCP servers that plug into existing workflows, removing the burden of researching, building, and maintaining custom integration code. Miners earn economic incentives that reward technical excellence over raw compute. Other Bittensor subnets can consume SOMA infrastructure directly, accelerating development across the network.

The first competition’s focus on context compression also delivers immediate, measurable value. Reducing token consumption translates directly into lower inference costs, faster response times, and improved model accuracy on long-context tasks. Any agent builder integrating a SOMA context compression MCP server can ship more capable agents at lower cost from day one.
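A back-of-envelope calculation shows how compression flows straight into the cost line. The prices, token counts, and compression ratio below are made-up assumptions for illustration, not real model pricing or measured SOMA results.

```python
# Assumed figures, chosen only to illustrate the arithmetic:
PRICE_PER_1K_INPUT_TOKENS = 0.003   # assumed $/1K input tokens
TOKENS_PER_CALL = 40_000            # assumed uncompressed agent prompt
COMPRESSION_RATIO = 0.4             # assumed: prompt shrinks to 40% of original
CALLS_PER_DAY = 10_000

def daily_input_cost(tokens_per_call: int) -> float:
    """Input-token spend per day at the assumed price and call volume."""
    return CALLS_PER_DAY * tokens_per_call / 1000 * PRICE_PER_1K_INPUT_TOKENS

before = daily_input_cost(TOKENS_PER_CALL)
after = daily_input_cost(int(TOKENS_PER_CALL * COMPRESSION_RATIO))
savings = before - after  # a 60% cut in input-token spend
```

Under these assumptions the input bill drops from $1,200 to $480 per day; because the saving is a fixed fraction of input spend, it scales with call volume.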

The future of SOMA (SN114)

SOMA’s roadmap centers on expanding the catalog of production MCP servers through additional competition series. After Context Compression, the team plans to organize competitions across other foundational problems in agent architectures, with each successful cycle adding a new production MCP server to the platform. The competition format makes the roadmap inherently flexible, since the team can target whichever bottleneck the ecosystem needs solved next.

Beyond the platform itself, SOMA integrates with the broader ecosystem of open MCP-compatible tools. Production servers built through SOMA competitions can be consumed by open-source agent frameworks, including projects like OpenClaw, as well as by proprietary applications. This positions SOMA as supplier infrastructure for the wider agent economy.

SOMARIZER, built on the SOMA subnet, makes context compression directly accessible to developers through a hosted API. Early access is currently open to teams that want to test the technology. Additional products will follow as new competition series produce their winning solutions.

The combination of decentralized competition for model improvement, managed deployment for production reliability, and horizontal scaling for capacity gives SOMA (SN114) a structural advantage over centralized AI infrastructure providers. Centralized providers improve only when their internal R&D teams ship. SOMA improves continuously through open competition, with new winners able to replace older solutions in any cycle. As more competitions launch and more MCP servers reach production, SOMA aims to become the default integration layer for AI agents across the Bittensor ecosystem.

FAQ:

What is SOMA (SN114)?

SOMA is a decentralized AI solutions subnet on the Bittensor network, operating as Subnet 114 (SN114) and built by Dendrite. It delivers practical AI capabilities through MCP (Model Context Protocol) infrastructure, a standardized integration layer that connects AI agents to external tools, data sources, and services across decentralized systems.

How does SOMA work?

SOMA organizes competition cycles that target specific problems in agent performance. Miners submit solutions, which go through automated screening and live validator evaluation scored on defined performance criteria. The top-ranked solution gets deployed as a production MCP server on the SOMA platform, available for integration by any AI agent or builder in the ecosystem.

What is the first SOMA competition about?

The first competition series focuses on Context Compression and launched on March 12, 2026. Miners compete to reduce the number of tokens fed into a model’s prompt while preserving the information needed for accurate responses. Winning strategies achieve significant token reduction while maintaining or improving output accuracy.

What makes SOMA different from other AI infrastructure?

SOMA is the first Bittensor subnet dedicated entirely to MCP infrastructure, creating a continuously evolving marketplace of standardized AI capabilities. Model improvement happens through open decentralized competition, while production deployment runs on managed infrastructure that delivers enterprise-grade reliability.

Who built SOMA?

SOMA is developed and operated by Dendrite, a technology company founded in 2022 with more than 50 engineers and mathematicians on the team. Dendrite also operates SimplyTao, the platform for buying and trading Bittensor subnet alpha tokens.

How can I buy SOMA (SN114) tokens?

SOMA (SN114) tokens can be purchased on the SimplyTao platform with multiple payment methods, including credit and debit cards, Revolut, Google Pay, Apple Pay, crypto, and TAO.

Sources:
https://thesoma.ai
https://github.com/DendriteHQ/SOMA
https://dendrite.holdings
