Top Ledger, a prominent entity in the Solana ecosystem, has received a grant from The Graph Foundation. This grant will enable Top Ledger to promote the adoption and use of subgraphs and Substreams, which are designed to improve the speed and reliability of data access for developers building on the Solana blockchain.
✨ Leading by example, @ledger_top demonstrates the power of subgraphs & Substreams in the @Solana ecosystem! With a new grant from The Graph Foundation, they’ll help support greater adoption & use of subgraphs & Substreams – boosting the speed & reliability of data for devs
We are pleased to announce that Top Ledger has received a grant from The Graph Foundation! This collaboration will further our commitment to developing superior data solutions within the @solana ecosystem, focusing on the creation and implementation of @graphprotocol subgraphs
The Graph Eyes JSON-RPC Integration for Complete Data Stack

**The Graph is exploring a major infrastructure expansion** by integrating JSON-RPC as a native service into its protocol.

**What's changing:**

- JSON-RPC would join The Graph's existing indexed data services
- The integration would leverage The Graph's payment and security frameworks
- Aims to create a unified, full-stack data layer for onchain applications

**Why it matters:** Every blockchain application relies on JSON-RPC for basic operations like reading state, checking balances, and broadcasting transactions. By adding this alongside its indexing capabilities, The Graph could become a one-stop solution for all blockchain data needs. The move reflects growing demand for comprehensive infrastructure that goes beyond single-purpose tools.

[Read the full technical roadmap](https://thegraph.com/blog/technical-roadmap/)
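To make the JSON-RPC layer concrete, here is a minimal sketch of how a standard Ethereum JSON-RPC 2.0 request body is assembled for a balance check. The `eth_getBalance` method is part of the standard Ethereum JSON-RPC API; the zero address used here is purely illustrative, and no endpoint URL is assumed.

```python
import json

def make_rpc_request(method: str, params: list, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 request body as a JSON string."""
    return json.dumps({
        "jsonrpc": "2.0",   # protocol version, required by the spec
        "method": method,   # e.g. "eth_getBalance", "eth_sendRawTransaction"
        "params": params,
        "id": request_id,   # lets the client match responses to requests
    })

# Read an account balance at the latest block (address is a placeholder).
body = make_rpc_request(
    "eth_getBalance",
    ["0x0000000000000000000000000000000000000000", "latest"],
)
print(body)
```

The same envelope shape covers reading state, checking balances, and broadcasting transactions; only `method` and `params` change, which is why a unified data layer can route all of these through one service.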
The Graph Launches Agent0 Subgraphs for ERC-8004 Discovery
**The Graph has deployed Agent0 Subgraphs** across Base, BNB Chain, Ethereum, Monad, and Polygon to simplify ERC-8004 agent discovery.

**Key features:**

- Single GraphQL query replaces manual block scanning and IPFS file fetching
- Unified schema across all five chains
- Already processed over 1M queries
- Indexes agent identities, capabilities, reputation, and validation data

**Why it matters:** AI agents will generate thousands of queries per minute for blockchain data. The Graph's existing infrastructure of 15,000+ indexed datasets eliminates the need for developers to build custom indexers. Developers can now connect AI agents to structured onchain data (covering DeFi protocols, NFTs, governance, and more) without fighting RPC limits or parsing raw hex data.

[Explore Agent0 Subgraphs](http://thegraph.com/explorer?search=agent0) | [Documentation](http://thegraph.com/docs/subgraphs/erc-8004)
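A "single GraphQL query" replacing block scanning might look like the sketch below. The entity and field names (`agents`, `id`, `capabilities`, `reputation`) are assumptions for illustration only; the actual Agent0 subgraph schema should be checked in the linked documentation before use. No network call is made here, just the request body a client would POST.

```python
import json

# Hypothetical query against an Agent0 subgraph: fetch a page of agents
# with their indexed metadata in one round trip.
QUERY = """
{
  agents(first: 5) {
    id
    capabilities
    reputation
  }
}
"""

def build_graphql_body(query: str) -> str:
    # GraphQL-over-HTTP wraps the query string in a small JSON envelope.
    return json.dumps({"query": query})

body = build_graphql_body(QUERY)
print(body)
```

The point of the example is the shape of the workflow: one declarative query to an indexed endpoint, instead of scanning blocks and fetching IPFS files per agent.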
🏗️ Infrastructure Solved—Now What?
**The infrastructure debate is over.** The Graph's latest case study with Amberdata highlights a critical shift in web3 development thinking.

**Key insight:** Building your own blockchain data infrastructure isn't a competitive advantage; it's a distraction. The Graph has been working since 2018 to make blockchain data accessible and composable by default.

**The real question:** What will you build with the time saved by not reinventing the wheel? The case study demonstrates that competitive advantage comes from what you build *on top* of infrastructure, not the infrastructure itself.
🚀 Amberdata Achieves 72,000% Faster Reprocessing with Substreams
Amberdata has switched to Substreams, achieving a 72,000% improvement in reprocessing performance. The technology eliminates the need for redundant infrastructure while maintaining access to the same historical data.

**Key benefits:**

- Dramatic reduction in reprocessing time
- No additional infrastructure required
- Frees up engineering resources for product development

The shift highlights how choosing the right data processing tools can eliminate pipeline maintenance overhead, allowing development teams to focus on building features rather than managing infrastructure.
Amberdata Achieves 72,000% Performance Boost After Switching to Substreams
Amberdata migrated to Substreams and saw their reprocessing performance increase by 72,000%.

**Key Impact:**

- The real cost of inefficient infrastructure isn't financial; it's the engineering hours diverted from product development
- Teams shipping faster aren't necessarily building better pipelines; they're avoiding the need to build them at all
- Same historical data, no redundant infrastructure required

This dramatic improvement highlights how choosing the right data infrastructure can free up development resources for actual product work.