What are Sink Pipes?
Sink pipes execute a query and publish the results to an external system:

- Kafka sinks - Publish records to Kafka topics
- S3 sinks - Write files to S3 buckets
Use sink pipes when:

- You need to send data to downstream systems
- You’re building event-driven architectures
- You want to archive data to object storage
- You need to integrate with external services
Kafka Sink Pipes
Publish query results to Kafka topics:

Basic Kafka Sink
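A minimal sink `.pipe` file might look like this. This is a sketch: the node, connection, and topic names are illustrative, and the `EXPORT_*` settings follow Tinybird's sink datafile syntax, so verify them against the current reference.

```
NODE kafka_sink_node
SQL >
    SELECT event_id, event_type, payload
    FROM events

TYPE sink
EXPORT_SERVICE kafka
EXPORT_CONNECTION_NAME my_kafka_connection
EXPORT_KAFKA_TOPIC events_out
```

Each execution runs the query and publishes the resulting rows as records to the configured topic.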
Scheduled Kafka Export
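Adding an `EXPORT_SCHEDULE` with a cron expression makes the sink run periodically. A sketch with illustrative names; the five-minute window in the query is assumed to match the schedule so each run exports only new data:

```
NODE kafka_sink_node
SQL >
    SELECT event_id, event_type, payload
    FROM events
    WHERE timestamp > now() - INTERVAL 5 MINUTE

TYPE sink
EXPORT_SERVICE kafka
EXPORT_CONNECTION_NAME my_kafka_connection
EXPORT_KAFKA_TOPIC events_out
EXPORT_SCHEDULE */5 * * * *
```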
Run on a schedule to continuously export data.

S3 Sink Pipes
Export query results to S3 buckets as files:

Basic S3 Sink
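A minimal S3 sink might look like the following sketch. The bucket, connection, and node names are illustrative, and the `EXPORT_*` settings follow Tinybird's sink datafile syntax, so check them against the current reference:

```
NODE s3_sink_node
SQL >
    SELECT * FROM events

TYPE sink
EXPORT_SERVICE s3_iamrole
EXPORT_CONNECTION_NAME my_s3_connection
EXPORT_BUCKET_URI s3://my-bucket/exports
EXPORT_FILE_TEMPLATE events_export
EXPORT_FORMAT csv
```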
S3 Sink Options
Export Strategies
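One common strategy is to vary the file path per run so each execution writes a new object instead of overwriting the previous one. In this sketch the `{date}` placeholder in `EXPORT_FILE_TEMPLATE` is an assumption to verify against the current docs:

```
TYPE sink
EXPORT_SERVICE s3_iamrole
EXPORT_CONNECTION_NAME my_s3_connection
EXPORT_BUCKET_URI s3://my-bucket/exports
EXPORT_FILE_TEMPLATE daily/{date}/events
EXPORT_FORMAT ndjson
```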
Control how files are written.

File Formats
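The file format is selected with `EXPORT_FORMAT`. CSV, NDJSON, and Parquet are the usual choices; treat the exact list as dependent on your Tinybird version:

```
EXPORT_FORMAT csv
EXPORT_FORMAT ndjson
EXPORT_FORMAT parquet
```

A sink uses exactly one of these lines.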
Supported export formats:

Compression
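Compression is enabled alongside the format. A sketch assuming the `EXPORT_COMPRESSION` setting with gzip, as in Tinybird's sink datafile syntax:

```
EXPORT_FORMAT csv
EXPORT_COMPRESSION gz
```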
Compress exported files:

Complete S3 Example
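A daily export of yesterday's data, gzip-compressed, could be sketched as follows (names and schedule are illustrative; settings follow Tinybird's sink datafile syntax):

```
NODE daily_export
SQL >
    SELECT *
    FROM events
    WHERE toDate(timestamp) = yesterday()

TYPE sink
EXPORT_SERVICE s3_iamrole
EXPORT_CONNECTION_NAME my_s3_connection
EXPORT_BUCKET_URI s3://my-bucket/daily
EXPORT_FILE_TEMPLATE events_daily
EXPORT_FORMAT csv
EXPORT_COMPRESSION gz
EXPORT_SCHEDULE 0 2 * * *
```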
Daily export with compression:

Sink Pipe Schedules
Define when sink pipes run:

Special Schedules
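Besides cron expressions, the special `@on-demand` value makes the sink run only when triggered manually, which is useful during development and testing:

```
EXPORT_SCHEDULE @on-demand
```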
Cron Expressions
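Schedules use standard five-field cron syntax (minute, hour, day of month, month, day of week). For example:

```
*/5 * * * *    every 5 minutes
0 * * * *      every hour, on the hour
0 2 * * *      daily at 02:00
0 0 * * 1      weekly, Monday at midnight
```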
Setting Up Connections
Kafka Connection
Kafka sinks support the following authentication methods:

- SASL_SSL with PLAIN mechanism (most common)
- SASL_SSL with SCRAM-SHA-256 or SCRAM-SHA-512
- Custom authentication via Kafka connection settings
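A Kafka connection can be declared in a `.connection` datafile. This sketch assumes Tinybird's Kafka connection settings and the `tb_secret` helper for credentials; verify the setting names against the current reference:

```
TYPE kafka
KAFKA_BOOTSTRAP_SERVERS broker-1.example.com:9092
KAFKA_SECURITY_PROTOCOL SASL_SSL
KAFKA_SASL_MECHANISM PLAIN
KAFKA_KEY {{ tb_secret("KAFKA_KEY") }}
KAFKA_SECRET {{ tb_secret("KAFKA_SECRET") }}
```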
S3 Connection
S3 sinks require:

- IAM role with appropriate S3 permissions
- Trust relationship allowing Tinybird to assume the role
- PutObject permission for exports
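An S3 connection then references the IAM role. In this sketch the setting names `S3_REGION` and `S3_ARN` are assumptions to verify against the current docs; the account ID and role name are placeholders:

```
TYPE s3
S3_REGION us-east-1
S3_ARN arn:aws:iam::111111111111:role/tinybird-sink-role
```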
Complete Examples
Real-time Event Streaming
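Streaming enriched events to Kafka every minute could be sketched as follows (table, column, and connection names are illustrative; the one-minute query window is assumed to match the schedule):

```
NODE stream_enriched_events
SQL >
    SELECT e.event_id, e.event_type, u.plan
    FROM events AS e
    JOIN users AS u ON u.user_id = e.user_id
    WHERE e.timestamp > now() - INTERVAL 1 MINUTE

TYPE sink
EXPORT_SERVICE kafka
EXPORT_CONNECTION_NAME my_kafka_connection
EXPORT_KAFKA_TOPIC enriched_events
EXPORT_SCHEDULE * * * * *
```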
Stream events to Kafka for downstream processing.

Data Lake Export
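A nightly, date-partitioned Parquet export to a data lake could look like this sketch. The `{date}` placeholder in the file template is an assumption to verify; other names are illustrative:

```
NODE data_lake_export
SQL >
    SELECT *
    FROM events
    WHERE toDate(timestamp) = yesterday()

TYPE sink
EXPORT_SERVICE s3_iamrole
EXPORT_CONNECTION_NAME my_s3_connection
EXPORT_BUCKET_URI s3://my-data-lake/events
EXPORT_FILE_TEMPLATE dt={date}/events
EXPORT_FORMAT parquet
EXPORT_SCHEDULE 0 3 * * *
```

Partitioning paths by date keeps the lake queryable by external engines that prune on the `dt=` prefix.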
Export partitioned data to S3 for analytics.

Best Practices
Use appropriate schedules
Choose schedules based on your data freshness requirements. Use @on-demand for testing.
Use compression for large exports
Enable gzip compression for S3 exports to reduce storage costs and transfer time.
Monitor sink pipe execution
Check the Tinybird dashboard regularly to ensure sink pipes are running successfully.
Next Steps
Copy Pipes
Learn about internal data snapshots
Type-Safe Client
Query and ingest data with the Tinybird client