Ethereum is approaching a critical turning point as two major upgrades — PeerDAS and zkEVMs — transition from research into live implementation, co-founder Vitalik Buterin said.
In a post on X, Buterin argued that the combination could move Ethereum toward “a fundamentally new and more powerful kind of decentralized network” by tackling a longstanding blockchain limitation: achieving decentralization and consensus without sacrificing bandwidth and throughput.
Buterin illustrated the challenge with two internet-era models. Peer-to-peer networks like BitTorrent move large amounts of data in a decentralized way but offer no consensus. Bitcoin, by contrast, provides strong decentralization and consensus but remains low-bandwidth, because every node repeats the same computations instead of sharing the workload. Ethereum’s next stage, he said, aims to combine all three properties: decentralization, consensus, and high bandwidth.
One leg of that vision is already live. PeerDAS, a form of data availability sampling, has been deployed on Ethereum’s mainnet, allowing nodes to verify that transaction data exists without downloading the entire dataset. The system is a stepping stone toward full data availability sampling, which is essential for Ethereum’s long-term sharding-based scaling strategy. By letting light clients check shard data through small random samples, PeerDAS improves scalability while preserving security and decentralization.
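The intuition behind sampling-based verification can be sketched with a toy probability model (the chunk counts and the `detection_probability` helper below are illustrative assumptions, not actual Ethereum parameters): because published data is erasure-coded, a block becomes unrecoverable only if a large fraction of its chunks is withheld, and even a handful of random sample queries will detect withholding on that scale with overwhelming probability.

```python
def detection_probability(total_chunks: int, withheld: int, samples: int) -> float:
    """Probability that at least one of `samples` uniformly random chunk
    queries (drawn without replacement) lands on a withheld chunk.

    Toy model: a publisher claims to have made `total_chunks` chunks
    available but is secretly withholding `withheld` of them.
    """
    available = total_chunks - withheld
    p_all_miss = 1.0
    for i in range(samples):
        # Chance the i-th query hits an available chunk, given all
        # earlier queries also hit available chunks.
        p_all_miss *= (available - i) / (total_chunks - i)
    return 1.0 - p_all_miss

# With erasure coding, an attacker must withhold roughly half of the
# extended data to make a block unrecoverable; 16 samples out of a
# hypothetical 128 chunks catch that with near-certainty.
p = detection_probability(total_chunks=128, withheld=64, samples=16)
print(f"detection probability: {p:.6f}")
```

The design point this illustrates is why sampling scales: the number of queries a light client needs depends on the target confidence, not on the size of the block, so data capacity can grow without every node downloading everything.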
The second upgrade, zkEVMs, has reached production-quality performance, Buterin said. The remaining work focuses on safety, robustness, and proving the technology at scale rather than raw speed. Limited zkEVM nodes could appear as early as 2026.
Looking further ahead, Buterin highlighted distributed block building as a long-term goal, where block assembly is spread across multiple participants rather than handled by a single entity. This approach could reduce censorship risks and improve geographic fairness.
Buterin framed Ethereum’s scaling roadmap as a shift toward distributing verification work across the network rather than requiring every node to process all data, signaling a new era of higher throughput without compromising decentralization.