DECENTRALISED AI
OpenAI, Anthropic, and Google are accumulating unprecedented concentrations of AI capability. The response from crypto: token-incentivised networks, decentralised compute, and open model marketplaces. Here is what is real, what is narrative, and what is actually winning.
GPU marketplaces. Anyone's idle hardware can serve AI workloads. Genuinely cheaper than AWS. Token = payment rail and coordination mechanism. The most legitimate category.
Incentive networks (e.g. Bittensor). Token rewards for producing the best AI outputs. Validators evaluate quality. Subnets specialise by task. The most innovative architecture. Also the furthest from frontier capability.
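The mechanism in this category can be sketched in a few lines: miners submit outputs, validators score them, and the reward pool is split in proportion to stake-weighted consensus scores. The following is a minimal illustrative sketch, not any real network's consensus algorithm; all names and numbers are invented.

```python
# Illustrative sketch of validator-weighted reward distribution.
# Not any real network's consensus algorithm; all names are invented.

def distribute_rewards(scores, stakes, pool):
    """scores: {validator: {miner: score in [0, 1]}}
    stakes: {validator: stake weight}
    pool:   total reward for this epoch."""
    total_stake = sum(stakes.values())
    # Stake-weighted consensus score per miner.
    consensus = {}
    for validator, miner_scores in scores.items():
        w = stakes[validator] / total_stake
        for miner, s in miner_scores.items():
            consensus[miner] = consensus.get(miner, 0.0) + w * s
    # Rewards proportional to consensus score: persistent low scorers
    # earn little and eventually drop out (the "Darwinian" pressure).
    total = sum(consensus.values())
    return {m: pool * s / total for m, s in consensus.items()}

scores = {
    "val_a": {"miner_1": 0.9, "miner_2": 0.2},
    "val_b": {"miner_1": 0.8, "miner_2": 0.1},
}
stakes = {"val_a": 60.0, "val_b": 40.0}
rewards = distribute_rewards(scores, stakes, 100.0)
```

Note the key property: validators, not a central party, decide who earns, and a validator's influence scales with its stake.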
Open weights. No token. No network. Just model weights that anyone can download and run. Arguably the most powerful decentralisation of AI capability ever achieved. And it's free.
io.net (IO)
Decentralised GPU cloud. Aggregates idle GPUs from data centres, mining farms, and consumers into on-demand clusters.
Claims deployment of a 10,000-GPU cluster in under 10 seconds. Uses Solana for payments. Supports Kubernetes workloads. 70% cheaper than cloud providers in the project's own benchmarks.
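The aggregation step, pooling heterogeneous idle nodes into one logical cluster, can be sketched as a greedy selection over advertised nodes. This is a toy model, not io.net's actual scheduler; all fields and logic are invented.

```python
# Toy sketch of assembling an on-demand cluster from idle GPU nodes.
# Not io.net's actual scheduler; node fields and logic are invented.

def assemble_cluster(nodes, gpus_needed):
    """Greedily pick the cheapest nodes until the GPU target is met.
    nodes: list of dicts with 'id', 'gpus', 'price_per_gpu_hour'."""
    selected, total = [], 0
    for node in sorted(nodes, key=lambda n: n["price_per_gpu_hour"]):
        if total >= gpus_needed:
            break
        selected.append(node["id"])
        total += node["gpus"]
    if total < gpus_needed:
        raise ValueError("not enough idle capacity")
    return selected, total

# Data-centre racks, a former mining farm, and a consumer box.
nodes = [
    {"id": "dc-1",   "gpus": 64, "price_per_gpu_hour": 1.10},
    {"id": "farm-7", "gpus": 32, "price_per_gpu_hour": 0.45},
    {"id": "home-3", "gpus": 1,  "price_per_gpu_hour": 0.30},
]
cluster, gpus = assemble_cluster(nodes, 40)
```

The point of the sketch: the marketplace's job is matching and pricing, not computation. The token pays the selected node operators.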
Real usage: Wondera scaled AI music creation to 200K users on io.net and cut training costs by 75%. Multiple AI startups use it as an AWS alternative.
Not about model capability — provides raw compute. Same GPU = same performance regardless of who owns it.
GPU hardware verification is hard — fake/low-spec nodes are a known attack vector. Sustained utilisation rates not disclosed.
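One standard mitigation for this attack vector is spot-check challenges: a verifier sends a node a computation with a known answer and checks both the result and how long the node took, flagging nodes that answer wrongly or too slowly for their claimed hardware. A minimal illustrative sketch, not io.net's actual verification protocol; the challenge and thresholds are invented.

```python
# Illustrative spot-check for hardware verification: challenge a node
# with a computation whose answer the verifier already knows, then
# compare the response time against the claimed hardware's expected
# speed. Not any real network's protocol; thresholds are invented.
import time

def spot_check(node_compute, challenge, expected, max_seconds):
    start = time.monotonic()
    answer = node_compute(challenge)
    elapsed = time.monotonic() - start
    if answer != expected:
        return "fail: wrong answer"               # fake node
    if elapsed > max_seconds:
        return "fail: too slow for claimed spec"  # low-spec node
    return "pass"

# An honest node computing a toy challenge (sum of squares).
honest = lambda n: sum(i * i for i in range(n))
result = spot_check(honest, 10_000, sum(i * i for i in range(10_000)), 1.0)
```

Timing-based checks are imperfect (network jitter, clever precomputation), which is part of why verification remains an open problem for these networks.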
Meta Did More to Decentralise AI With One GitHub Repo Than All Crypto Networks Combined
Llama 4 Scout (109B) and Maverick (400B) are free. DeepSeek R1 671B is MIT licensed. Qwen 3 235B runs on a Mac Studio. These models match or approach frontier capability and anyone on Earth can download and run them. That is decentralisation of AI capability. Token-based networks are about economic decentralisation — who captures value. They solve a different problem. Both matter, but they are not the same thing.
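A back-of-envelope check makes the "runs on a Mac Studio" claim concrete: at 4-bit quantisation, weight memory is roughly parameters × 0.5 bytes. This is a rule of thumb only; it ignores the KV cache, activations, and runtime overhead.

```python
# Rough memory estimate for running a quantised model locally.
# Rule of thumb only: ignores KV cache, activations, and overhead.

def quantised_size_gb(params_billion, bits=4):
    """Approximate weight footprint in GB at the given bit width."""
    return params_billion * 1e9 * bits / 8 / 1e9

size = quantised_size_gb(235)  # Qwen 3 235B at 4-bit: ~117.5 GB
```

Roughly 118 GB of weights fits comfortably in a high-memory Mac Studio's unified memory, which is what makes genuinely local frontier-class inference possible.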
Real price advantage (70–83% cheaper than AWS), real usage, real customers. The token is a payment/coordination mechanism. The underlying service is genuine.
The incentive architecture for collective intelligence is genuinely novel. 128 subnets with Darwinian selection is a real innovation. But the capability gap vs frontier labs remains significant. Watch, don't write off.
Meta released Llama 4 and did more to decentralise AI capability than all crypto tokens combined. Anyone on Earth can run a frontier-class model. No token. No fees. That's the actual distribution of power.
Adding a token to a centralised AI product does not decentralise it. "Owning" a governance token is not the same as owning model weights. Many projects conflate the two. Buyer beware.
Who Controls AI? And Can Anything Actually Change That?
77% of consumers in a 2025 Harris Poll said decentralised AI is more beneficial than Big Tech-controlled systems. OpenAI, Anthropic, and Google hold unprecedented concentrations of capability, capital, and data. The Dario/Pentagon case showed that a single CEO can set AI red lines for the US military. That is real power concentration.
Decentralised compute (Akash, io.net) genuinely lowers the cost barrier and removes single-provider dependency. Bittensor's incentive architecture is a real experiment in collective intelligence. But none of these systems currently produce AI at the capability level of GPT-4o or Claude Sonnet. The gap is the thing to watch.
Llama 4 Maverick runs on a Mac Studio. DeepSeek R1 is MIT licensed and matches o1 on reasoning. The open source movement has done more to democratise frontier-level AI capability than any token network. The question is whether open source can stay within 6–12 months of frontier indefinitely.