
The inaugural Datapalooza hosted by The Graph Foundation is happening at @EFDevconnect in Istanbul on Nov. 13th. The event will feature expert-led talks and workshops on various topics including R&D, tooling, dapps, analytics, and LLMs. Speakers include Adam Fuller, Vyas Krishnan, Danning Sui, Willian Mitsuda, Hope Yen, Ansgar Grunseid, Vitalik Buterin, and more. Workshops will cover topics such as combining subgraphs, developing substreams, running an indexer, and using Chainlink Functions. The event can be attended in person or livestreamed. Don't miss it!
Datapalooza is next Monday! Explore the future of web3 data, dapps, analytics, & LLMs during a day of expert-led talks & workshops! 📣 New speaker update: @sui414 from Flashbots will present on Exploring Ethereum Mempool, Orderflow & MEV!
Datapalooza is only 13 days away! 📣 New speaker update: @andrej_dev from @chainlink! Andrej will give a workshop on using Chainlink Functions to implement a DeFi strategy by connecting The Graph & Uniswap Router! Stay tuned for info on additional speakers & presentations.
📢 The Graph Foundation is hosting the inaugural Datapalooza, a day focused on web3 data innovations, at @EFDevconnect in Istanbul on Nov. 13th! The event will cover R&D and tooling in The Graph ecosystem and across web3, focusing on use cases for dapps, analytics, tooling, & LLMs.
📣 New speaker update: @Allquantor07 from DataOS will present "AI is ready - Web 2.0 is not: AI-Generated Apps on top of Open Data Lakes"! Datapalooza is Monday! Join a day of talks & workshops by builders & web3 data experts during @EFDevconnect. More announcements coming!
🇹🇷 Are you attending Datapalooza? Be sure to join Graph ecosystem Dev Rels, Business Teams, Dev Success, & other ecosystem members to talk about data services on The Graph! Register for Sunrise Office Hours! There will be an exclusive POAP for those who meet with ecosystem members.
Datapalooza is almost here! The Graph Foundation is hosting the inaugural Datapalooza at @EFDevconnect in Istanbul on Nov. 13th! This will be a day dedicated to web3 data innovations, discussing R&D, tooling, dapps, analytics, & more! Here’s the event agenda ⬇️ Can’t join in person? Watch via livestream!
Datapalooza is only 12 days away! 📣 New speaker update: @SeveSisneros from @semioticlabs! Seve will present on preserving historical data after EIP-4444, a proposal to reduce storage & bandwidth requirements for Ethereum nodes, making syncing easier & faster. Stay tuned for more speaker announcements!
📣 New speaker update: @wmitsuda from @otterscan! The Graph’s inaugural Datapalooza is almost here! This will be a day of expert-led talks & workshops on R&D, tooling for dapps, data analytics, & LLMs! Join in-person in Turkey or watch via livestream - more details coming soon!
The Graph Network Enables AI Agents to Query Blockchain Data in Plain English
The Graph Network has made every indexed blockchain dataset accessible to AI agents through natural language queries.

**Key capabilities now live:**

- AI assistants can query any indexed protocol, wallet, or onchain activity in plain English
- No custom integration code needed for each AI model
- The existing data layer connects directly to the agent layer

This development builds on The Graph's five-year mission of making blockchain data queryable without custom infrastructure.
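Under the hood, an agent still resolves a plain-English request into a structured query against an indexed dataset. The sketch below shows one way that translation step might look, assuming a hypothetical subgraph schema with a `transfers` entity; the field names and wallet address are illustrative, not The Graph's actual agent interface.

```python
# Illustrative sketch: turning an agent's intent ("show me this wallet's
# recent transfers") into a GraphQL query for a subgraph endpoint.
# The subgraph schema (transfers entity and its fields) is hypothetical.

def build_transfer_query(wallet: str, limit: int = 5) -> str:
    """Build a GraphQL query for a wallet's most recent outgoing transfers."""
    return f'''{{
  transfers(
    first: {limit},
    orderBy: timestamp,
    orderDirection: desc,
    where: {{ from: "{wallet.lower()}" }}
  ) {{ id to value timestamp }}
}}'''

# An agent would POST this query to the subgraph's GraphQL endpoint and
# then summarize the JSON response back to the user in natural language.
query = build_transfer_query("0xABC123")  # placeholder wallet address
```

The point is that the agent layer composes queries the data layer already understands, so no per-model integration code is needed.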
Fynd: Open Source Router Launches with 5-Minute Setup
**Fynd**, a new open-source routing engine, has launched with a promise of rapid deployment. Built on **Tycho** and powered by **The Graph Substreams**, the platform enables users to:

- Spin up configurable routers in under 5 minutes
- Leverage Substreams technology for fast, scalable data processing
- Access open-source infrastructure for custom routing solutions

The tool targets developers seeking quick deployment of routing infrastructure without extensive setup time. By utilizing The Graph's Substreams, Fynd offers real-time data indexing capabilities at scale.
Tempo Integrates The Graph's Subgraphs for Stablecoin Payment Indexing
**Tempo**, a blockchain built for stablecoin payments, now supports **The Graph's Subgraphs** for data indexing. Developers can now:

- Track stablecoin transfers in real-time
- Reconcile balances and monitor settlement flows
- Build compliance dashboards using GraphQL APIs
- Access payment data without custom backend infrastructure

The integration brings The Graph's proven indexing technology—already deployed across 70+ networks—to Tempo's payments-focused chain. Both Tempo's Moderato testnet and Mainnet are now indexable.

**Use cases include:**

- Payment tracking dashboards
- Cross-border compliance monitoring
- Automated payroll verification
- Settlement analytics for tokenized deposits

This provides fintech teams building on Tempo with production-ready data infrastructure for stablecoin payment applications.
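Balance reconciliation, one of the use cases above, reduces to summing signed transfer amounts per account. Here is a minimal sketch, assuming transfer records shaped like a subgraph's GraphQL response; the field names and amounts are hypothetical.

```python
# Hedged sketch of balance reconciliation from subgraph data: given
# transfer records as they might come back from a Tempo subgraph
# (field names are assumptions), compute the net balance change per address.
from collections import defaultdict

def reconcile(transfers):
    """Net stablecoin balance change per address from transfer records."""
    net = defaultdict(int)
    for t in transfers:
        amount = int(t["value"])  # smallest unit, e.g. 6 decimals for USDC
        net[t["from"]] -= amount
        net[t["to"]] += amount
    return dict(net)

sample = [
    {"from": "0xa", "to": "0xb", "value": "1000000"},
    {"from": "0xb", "to": "0xc", "value": "250000"},
]
balances = reconcile(sample)
# → {'0xa': -1000000, '0xb': 750000, '0xc': 250000}
```

A compliance dashboard would run the same aggregation over a queried time window and flag any mismatch against expected settlement totals.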
The Graph Showcases Enterprise Blockchain Data Stack at Digital Asset Summit
At the Digital Asset Summit in New York, **The Graph** addressed a critical institutional challenge: transforming raw blockchain data into audit-ready, production-grade formats. The protocol presented its three-tier solution:

- **Subgraphs** - Indexed APIs for structured data access
- **Substreams** - High-performance data pipelines
- **Amp** - Enterprise-grade SQL access across multiple chains

Amp represents The Graph's latest evolution, offering institutions verifiable, compliant, real-time blockchain data with multichain access, composable SQL queries, and flexible deployment options (hosted or on-premises).
Spritz App Leverages The Graph's Token API for Decentralized Cross-Chain Data Access

**Spritz app integrates The Graph's Token API** to access real-time token data across multiple blockchains. Key features:

- **Reliable cross-chain data access** without centralized dependencies
- **Eliminates single points of failure** through decentralized infrastructure
- Provides consistent token information across different networks

The Graph's Token API recently added:

- No-code AI agent capabilities
- Batch queries for multiple wallets and contracts
- New DEX endpoints for swaps, pools, and liquidity data

This integration allows Spritz to maintain uptime and data accuracy while avoiding reliance on centralized data providers.
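Batch queries let a client cover many wallets and networks in a single request instead of one call per wallet. The sketch below assembles such a payload; the payload shape is an assumption for illustration, not the Token API's documented request format.

```python
# Hedged sketch of a batch request covering several wallets and
# networks, in the spirit of the Token API's batch-query feature.
# The payload shape is an assumption, not the API's actual schema.
import json

def build_batch_request(wallets, networks):
    """Assemble one batch payload spanning multiple wallets and networks."""
    return {
        "queries": [
            {"wallet": w, "network": n}
            for w in wallets
            for n in networks
        ]
    }

payload = build_batch_request(["0xaaa", "0xbbb"], ["mainnet", "arbitrum-one"])
body = json.dumps(payload)  # one POST instead of four per-wallet calls
```

Collapsing N wallet lookups into one round trip is what lets an app like Spritz keep latency and failure surface low across chains.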