BigQuery support is planned for a future release. This page documents the intended capabilities and setup process.
Purpose
Use this MCP to:
- Run SQL queries against BigQuery datasets
- Analyze product metrics and usage data
- Generate reports from production data
- Join data across multiple tables
- Export query results to Google Sheets
Setup
Set up Google Cloud project
- Go to Google Cloud Console
- Create a new project or select an existing one
- Enable the BigQuery API
Create service account
- Go to IAM & Admin → Service Accounts
- Create a new service account
- Grant BigQuery User and BigQuery Data Viewer roles
- Create and download a JSON key file
Available tools
list_datasets
List all datasets in your BigQuery project.
list_tables
List tables in a specific dataset.
query
Run a SQL query against BigQuery.
get_table_schema
Get the schema of a specific table.
Usage examples
Analyze user metrics
Generate usage report
Investigate data issues
Common workflows
Product metrics dashboard
- Query BigQuery for key metrics (DAU, MAU, retention)
- Export to Google Sheets
- Create charts and visualizations
- Share dashboard in #product channel
User cohort analysis
- Query user signup dates
- Analyze retention by cohort
- Generate insights
- Create summary doc
Performance monitoring
- Query performance logs
- Calculate p50, p95, p99 latencies
- Compare to historical data
- Alert if thresholds exceeded
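The percentile step in the workflow above can be sketched in plain Python. Here `latencies_ms` is a hypothetical sample; in practice the values would come back as rows from a BigQuery query:

```python
import statistics

def latency_percentiles(latencies_ms):
    """Compute p50/p95/p99 from a list of latency samples (milliseconds)."""
    if not latencies_ms:
        raise ValueError("no samples")
    # statistics.quantiles with n=100 returns the 99 cut points p1..p99.
    cuts = statistics.quantiles(latencies_ms, n=100)
    return {"p50": cuts[49], "p95": cuts[94], "p99": cuts[98]}

# Example with a hypothetical sample of query latencies:
sample = [12, 14, 15, 18, 22, 35, 40, 120, 250, 900]
print(latency_percentiles(sample))
```

The same aggregation can also be pushed into the query itself with BigQuery's `APPROX_QUANTILES`, which avoids transferring raw rows.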
Query examples
Daily active users
Feature adoption
Error rate
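As hedged sketches, the three metrics named above might be computed with queries like these. All table and column names (`my-project.analytics.*`, `user_id`, `event_date`, `status`, and so on) are placeholders to be replaced with your own schema:

```python
# Hypothetical BigQuery SQL for the three example metrics.
# Every table and column name here is an assumption, not a real schema.
QUERIES = {
    "daily_active_users": """
        SELECT event_date, COUNT(DISTINCT user_id) AS dau
        FROM `my-project.analytics.events`
        WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
        GROUP BY event_date
        ORDER BY event_date
    """,
    "feature_adoption": """
        SELECT feature_name, COUNT(DISTINCT user_id) AS users
        FROM `my-project.analytics.feature_events`
        WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
        GROUP BY feature_name
        ORDER BY users DESC
    """,
    "error_rate": """
        SELECT event_date,
               COUNTIF(status = 'error') / COUNT(*) AS error_rate
        FROM `my-project.analytics.requests`
        GROUP BY event_date
        ORDER BY event_date
    """,
}
```

Note that each query uses the fully qualified `project.dataset.table` form and filters by date where the table is time-partitioned, matching the best practices below.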
Best practices
Use query limits for exploration
When exploring data, add `LIMIT 100` to your queries to avoid processing large amounts of data unnecessarily.
Cache results in Sheets
For frequently accessed metrics, query BigQuery once and store results in Google Sheets. Then read from Sheets for faster access.
Partition tables by date
When querying time-series data, always filter by date to reduce query costs and improve performance.
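The date-filter pattern can be sketched as a small query builder. The table name and `DATE` partition column below are assumptions for illustration:

```python
from datetime import date

def partitioned_query(table, date_column, start, end):
    """Render a query that restricts a date-partitioned table to one range,
    so BigQuery only scans the matching partitions."""
    return (
        f"SELECT * FROM `{table}` "
        f"WHERE {date_column} BETWEEN '{start.isoformat()}' AND '{end.isoformat()}'"
    )

# Hypothetical table and partition column:
sql = partitioned_query("my-project.analytics.events", "event_date",
                        date(2024, 1, 1), date(2024, 1, 31))
print(sql)
```

Because the filter is on the partition column itself, BigQuery can prune partitions outside the range instead of scanning the whole table.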
Monitor query costs
BigQuery charges based on data processed. Ask Claudio to estimate query costs for large datasets before running.
Use fully qualified table names
Always specify project, dataset, and table:
`project.dataset.table` to avoid ambiguity.
Integration with other MCPs
BigQuery → Sheets → Slack
BigQuery → ClickUp
BigQuery → Docs
Cost management
BigQuery pricing is based on:
- Storage: Amount of data stored
- Queries: Amount of data processed per query
- Streaming: Rows streamed for real-time inserts
To reduce costs:
- Use partitioned tables and filter by partition
- Select only the columns you need
- Use `LIMIT` for exploration
- Cache results in Sheets for repeated access
- Enable query result caching in BigQuery
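Since billing is per byte scanned, a rough cost estimate is a one-line calculation. The per-TiB rate below is an assumption; check current Google Cloud pricing for your region and billing model:

```python
def estimated_query_cost_usd(bytes_processed, price_per_tib=6.25):
    """Rough on-demand cost estimate: BigQuery bills by bytes scanned.
    price_per_tib is an assumed rate -- verify against current pricing."""
    tib = bytes_processed / 2**40
    return tib * price_per_tib

# A query scanning 500 GiB at the assumed rate:
print(round(estimated_query_cost_usd(500 * 2**30), 2))
```

The bytes-processed figure itself can be obtained before running a query via BigQuery's dry-run mode.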
Troubleshooting
Authentication failed
Verify:
- The service account JSON key path is correct
- The service account has BigQuery permissions
- The `GOOGLE_APPLICATION_CREDENTIALS` env var is set
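A minimal stdlib check for the first and third points might look like this (the function name is just an illustration):

```python
import os

def check_credentials():
    """Return (ok, message) describing the GOOGLE_APPLICATION_CREDENTIALS setup."""
    path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if not path:
        return False, "GOOGLE_APPLICATION_CREDENTIALS is not set"
    if not os.path.isfile(path):
        return False, f"key file not found: {path}"
    return True, f"using key file: {path}"

print(check_credentials())
```

Service-account permissions cannot be checked locally; verify those in IAM & Admin in the Google Cloud Console.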
Permission denied on dataset
Make sure your service account has:
- BigQuery Data Viewer role (to read data)
- BigQuery User role (to run queries)
Query timeout
For long-running queries:
- Add filters to reduce data processed
- Use partitioned tables
- Increase the query timeout setting
Quota exceeded
If you hit rate limits:
- Wait a few minutes before retrying
- Request quota increase in Google Cloud Console
- Batch multiple queries instead of running them sequentially
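A common pattern for the rate-limit case is retry with exponential backoff. A minimal sketch, where the flaky callable is a stand-in for whatever issues the query:

```python
import time

def with_backoff(fn, max_attempts=5, base_delay=1.0):
    """Call fn, retrying on exception with a delay that doubles each attempt."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # give up after the final attempt
            time.sleep(base_delay * 2**attempt)

# Example with a stand-in that fails twice, then succeeds:
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("rateLimitExceeded")
    return "ok"

print(with_backoff(flaky, base_delay=0.01))
```

In production you would typically catch only quota/rate-limit errors rather than every exception, and cap the total delay.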