Upcoming Webinar on AI, Crypto Data Analytics, and The Graph's New SQL Data Service

🔍 Crypto Data Unveiled

By The Graph
Apr 15, 2024, 7:18 PM

The Graph, a web3 protocol for organizing and accessing blockchain data, has announced an upcoming webinar titled 'The Graph Builders Office Hours' scheduled for tomorrow at 12 PM EST. During the webinar, Sam Green from Semiotic Labs will discuss crypto data analytics and provide a demonstration. Additionally, Green will offer the first public view of The Graph's new SQL data service, which is expected to be a significant development for the protocol.

Sources

Curious about AI + @graphprotocol? Tomorrow at 12pm EST on The Graph Builders Office Hours, @0xsamgreen of @semioticlabs will discuss crypto data analytics, demo agentc.xyz, and give the first public view of The Graph’s new SQL data service. discord.com/events/4380386…

Sam Green
@0xsamgreen

Here's how I analyze crypto data these days: I use a combination of agentc.xyz, which sources its data from @graphprotocol, and GPT. Soon, all of this will be possible natively within The Graph.

Read more about The Graph

The Graph Eyes JSON-RPC Integration for Complete Data Stack


**The Graph is exploring a major infrastructure expansion** by integrating JSON-RPC as a native service into its protocol.

**What's changing:**
- JSON-RPC would join The Graph's existing indexed data services
- The integration would leverage The Graph's payment and security frameworks
- Aims to create a unified, full-stack data layer for onchain applications

**Why it matters:** Every blockchain application relies on JSON-RPC for basic operations like reading state, checking balances, and broadcasting transactions. By adding this alongside its indexing capabilities, The Graph could become a one-stop solution for all blockchain data needs. The move reflects growing demand for comprehensive infrastructure that goes beyond single-purpose tools.

[Read the full technical roadmap](https://thegraph.com/blog/technical-roadmap/)
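For context, the JSON-RPC requests the story refers to all share one simple shape defined by the JSON-RPC 2.0 spec. A minimal sketch in Python of building an `eth_getBalance` request body (the gateway URL such a body would be POSTed to is not specified in the post, so it is omitted here; the zero address is just a placeholder):

```python
import json

def make_rpc_request(method: str, params: list, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 request body, the wire format Ethereum nodes accept."""
    return json.dumps({
        "jsonrpc": "2.0",
        "method": method,
        "params": params,
        "id": request_id,
    })

# eth_getBalance: read an account's balance at the latest block.
body = make_rpc_request(
    "eth_getBalance",
    ["0x0000000000000000000000000000000000000000", "latest"],
)
```

Serving this same request format natively, alongside indexed queries, is what would make the protocol a full-stack data layer.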

The Graph Launches Agent0 Subgraphs for ERC-8004 Discovery

**The Graph has deployed Agent0 Subgraphs** across Base, BNB Chain, Ethereum, Monad, and Polygon to simplify ERC-8004 agent discovery.

**Key features:**
- Single GraphQL query replaces manual block scanning and IPFS file fetching
- Unified schema across all five chains
- Already processed over 1M queries
- Indexes agent identities, capabilities, reputation, and validation data

**Why it matters:** AI agents will generate thousands of queries per minute for blockchain data. The Graph's existing infrastructure of 15,000+ indexed datasets eliminates the need for developers to build custom indexers. Developers can now connect AI agents to structured onchain data, covering DeFi protocols, NFTs, governance, and more, without fighting RPC limits or parsing raw hex data.

[Explore Agent0 Subgraphs](http://thegraph.com/explorer?search=agent0) | [Documentation](http://thegraph.com/docs/subgraphs/erc-8004)
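The "single GraphQL query" replacing block scanning can be sketched as follows. Note that the field names (`agents`, `capabilities`, `reputation`) are illustrative assumptions, since the actual Agent0 schema is not shown in the announcement; the snippet only builds the JSON body a Graph gateway expects, without sending it:

```python
import json

# Hypothetical fields -- the real Agent0 Subgraph schema may differ.
AGENT_QUERY = """
{
  agents(first: 5) {
    id
    capabilities
    reputation
  }
}
"""

def graphql_payload(query: str) -> str:
    """Wrap a GraphQL query string in the JSON body used for POST requests."""
    return json.dumps({"query": query})

payload = graphql_payload(AGENT_QUERY)
```

One request like this stands in for what would otherwise be scanning blocks for ERC-8004 events and fetching metadata files from IPFS by hand.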

🏗️ Infrastructure Solved—Now What?

**The infrastructure debate is over.** The Graph's latest case study with Amberdata highlights a critical shift in web3 development thinking.

**Key insight:** Building your own blockchain data infrastructure isn't a competitive advantage; it's a distraction. The Graph has been working since 2018 to make blockchain data accessible and composable by default.

**The real question:** What will you build with the time saved by not reinventing the wheel? The case study demonstrates that competitive advantage comes from what you build *on top* of infrastructure, not the infrastructure itself.

🚀 Amberdata Achieves 72,000% Faster Reprocessing with Substreams

Amberdata has switched to Substreams, achieving a 72,000% improvement in reprocessing performance. The technology eliminates the need for redundant infrastructure while maintaining access to the same historical data.

**Key benefits:**
- Dramatic reduction in reprocessing time
- No additional infrastructure required
- Frees up engineering resources for product development

The shift highlights how choosing the right data processing tools can eliminate pipeline maintenance overhead, allowing development teams to focus on building features rather than managing infrastructure.
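To put the headline number in perspective: read strictly, "72,000% faster" means the new pipeline runs at about 721 times the original speed (colloquially often rounded to "720x"). The 30-day baseline below is an illustrative assumption, not a figure from the case study:

```python
def speedup_factor(percent_faster: float) -> float:
    """A job that is N% faster runs at (1 + N/100) times the original speed."""
    return 1 + percent_faster / 100

factor = speedup_factor(72_000)  # 721.0

old_hours = 720                  # e.g. a hypothetical 30-day historical reprocess
new_hours = old_hours / factor   # roughly one hour at the improved rate
```

At that scale, full historical reprocessing drops from a multi-week project to something a team can rerun casually, which is where the "engineering hours freed up" claim comes from.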

Amberdata Achieves 72,000% Performance Boost After Switching to Substreams

Amberdata migrated to Substreams and saw its reprocessing performance increase by 72,000%.

**Key Impact:**
- The real cost of inefficient infrastructure isn't financial; it's the engineering hours diverted from product development
- Teams shipping faster aren't necessarily building better pipelines; they're avoiding the need to build them at all
- Same historical data, no redundant infrastructure required

This dramatic improvement highlights how choosing the right data infrastructure can free up development resources for actual product work.