NEAR offers three main solutions for accessing and monitoring on-chain data: Data APIs, the BigQuery Public Dataset, and NEAR Lake. Each is designed to fit different needs and use cases, and they can be combined to build a complete data infrastructure for your application.
Data APIs
Members of the NEAR community have built a set of APIs for accessing and monitoring on-chain data. These APIs are designed to be easy to use and can be reached from any application through a simple HTTP call.
Access on-chain data through community-built APIs
- User assets: Easily track all the assets that a user or a contract holds
- Monitor transactions: Get all the transactions of a user, a contract or a specific token
- Track on-chain events: Get all the events emitted by a contract, or a specific event type
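As a minimal sketch of how such an API is consumed, the snippet below queries account details from a community provider like NearBlocks. The base URL and endpoint path are assumptions modeled on its public v1 API, so verify them against the provider's reference before relying on them.

```python
# Minimal sketch of calling a community Data API over plain HTTP.
# The NearBlocks base URL and endpoint path are assumptions -- check
# the provider's API reference before use.
import json
import urllib.request

NEARBLOCKS_API = "https://api.nearblocks.io/v1"  # assumed base URL

def account_url(account_id: str) -> str:
    """Build the URL for an account lookup."""
    return f"{NEARBLOCKS_API}/account/{account_id}"

def fetch_account(account_id: str) -> dict:
    """Fetch account details (balance, storage usage, ...) as JSON."""
    with urllib.request.urlopen(account_url(account_id)) as resp:
        return json.loads(resp.read())

# Usage (performs a live HTTP request):
#   info = fetch_account("example.near")
```

Because the call is a plain HTTP GET returning JSON, the same pattern works from any language or frontend without NEAR-specific tooling.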
BigQuery Public Dataset
A large dataset with on-chain data publicly available on Google Cloud Platform. Obtain near real-time blockchain data using simple SQL queries. All the data, zero setup.
Query blockchain data using SQL on Google Cloud Platform
- Instant insights: Historic on-chain data queried at scale. No need to run your own infrastructure.
- Cost-effective: Eliminate the need to store and process bulk NEAR Protocol data. Query as little or as much data as you like.
- As easy as SQL: No prior experience with blockchain technology is required. Just bring a general knowledge of SQL to unlock insights.
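To make the "as easy as SQL" point concrete, the sketch below holds a query you could paste into the BigQuery console or run through the `bq` CLI. The dataset and column names (`bigquery-public-data.crypto_near_mainnet_us.blocks`, etc.) are assumptions to confirm in the BigQuery UI before running.

```python
# Sketch of a SQL query against NEAR's public BigQuery dataset.
# Dataset and column names are assumptions -- confirm them in the
# BigQuery console before running.
LATEST_BLOCKS_QUERY = """
SELECT block_height, block_timestamp_utc, author_account_id
FROM `bigquery-public-data.crypto_near_mainnet_us.blocks`
ORDER BY block_height DESC
LIMIT 10
"""

# To run it without any client library, use the bq CLI:
#   bq query --nouse_legacy_sql "$(cat latest_blocks.sql)"
```

You pay only for the bytes the query scans, which is what makes ad-hoc historical analysis cost-effective compared with hosting the full chain yourself.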
NEAR Lake
A solution that watches over the NEAR network and stores all its events for your easy access.
Build custom indexers with NEAR Lake Framework
- Cost-efficient solution: Build self-hosted indexers in Rust, JavaScript, Python, Go and other languages
- Streamlined data management: Use NEAR Lake Framework to stream blocks to your server directly from NEAR Lake
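As an illustration of the self-hosted model, NEAR Lake Framework streams blocks from a public S3 bucket into a handler you write. The bucket/region values below follow NEAR Lake's documented mainnet defaults, but treat them and the handler shape as assumptions to verify against the framework's own examples.

```python
# Sketch of a NEAR Lake Framework style indexer configuration.
# Bucket name and region follow NEAR Lake's documented mainnet values,
# but treat them (and the handler shape) as assumptions to verify.

def mainnet_lake_config(start_block_height: int) -> dict:
    """Configuration for streaming mainnet blocks from NEAR Lake."""
    return {
        "s3_bucket_name": "near-lake-data-mainnet",
        "s3_region_name": "eu-central-1",
        "start_block_height": start_block_height,
    }

async def handle_streamer_message(message) -> None:
    """Your indexing logic: called once per block with its shards."""
    print("indexed block", message.block.header.height)

# With the near-lake-framework package (API assumed), you would wire
# the config and handler together roughly like:
#   config = LakeConfig(**mainnet_lake_config(100_000_000))
#   stream(config, handle_streamer_message)
```

The handler is ordinary application code, which is why indexers can be written in Rust, JavaScript, Python, Go, or any language with an S3 client.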
Choosing the Right Solution
When to use Data APIs
Use Data APIs when you need quick access to specific on-chain data without setting up infrastructure. Ideal for:
- Querying account balances and assets
- Tracking specific transactions
- Building user-facing applications that need real-time data
- Prototyping and development
When to use BigQuery
Use BigQuery when you need to analyze large amounts of historical data. Perfect for:
- Data analysis and research
- Business intelligence and reporting
- Historical trend analysis
- Complex queries across multiple blocks
When to use NEAR Lake
Use NEAR Lake when you need full control over data processing. Best for:
- Custom indexing requirements
- Real-time event processing
- Building your own data pipelines
- Applications requiring specific data transformations