Apache Flink supports creating Iceberg tables directly in Flink SQL by specifying the 'connector'='iceberg' table option.
Overview
In Flink, the SQL CREATE TABLE test (..) WITH ('connector'='iceberg', ...) will create a Flink table in the current Flink catalog (which uses GenericInMemoryCatalog by default). This table only maps to the underlying Iceberg table; the Iceberg table is not maintained directly in the current Flink catalog.
The Flink Iceberg connector allows setting the catalog properties through table properties. See Catalog Configuration for details.
Hive Catalog Example
Before executing the following SQL, please make sure you’ve configured the Flink SQL client correctly according to the Flink overview.
Basic Table Creation
The following SQL will create a Flink table in the current Flink catalog, which maps to the Iceberg table default_database.flink_table managed in a Hive catalog:
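A sketch of such a statement; the metastore URI and warehouse path are placeholders to replace with your own values:

```sql
CREATE TABLE flink_table (
    id   BIGINT,
    data STRING
) WITH (
    'connector'='iceberg',
    'catalog-name'='hive_prod',
    'uri'='thrift://localhost:9083',
    'warehouse'='hdfs://nn:8020/warehouse/path'
);
```

Because no catalog-database or catalog-table is given, the table maps to default_database.flink_table by default.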
Mapping to Different Table
If you want to create a Flink table mapping to a different Iceberg table managed in a Hive catalog (such as hive_db.hive_iceberg_table in Hive), you can create the Flink table as follows:
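A sketch of such a statement, where catalog-database and catalog-table override the default mapping (metastore URI and warehouse path are placeholders):

```sql
CREATE TABLE flink_table (
    id   BIGINT,
    data STRING
) WITH (
    'connector'='iceberg',
    'catalog-name'='hive_prod',
    'catalog-database'='hive_db',
    'catalog-table'='hive_iceberg_table',
    'uri'='thrift://localhost:9083',
    'warehouse'='hdfs://nn:8020/warehouse/path'
);
```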
The underlying catalog database (hive_db in the above example) will be created automatically if it does not exist when writing records into the Flink table.
Hadoop Catalog Example
The following SQL will create a Flink table in the current Flink catalog, which maps to the Iceberg table default_database.flink_table managed in a Hadoop catalog:
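A sketch of such a statement, assuming an HDFS warehouse path (a placeholder to replace):

```sql
CREATE TABLE flink_table (
    id   BIGINT,
    data STRING
) WITH (
    'connector'='iceberg',
    'catalog-name'='hadoop_prod',
    'catalog-type'='hadoop',
    'warehouse'='hdfs://nn:8020/path/to/warehouse'
);
```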
REST Catalog Example
The following SQL will create a Flink table in the current Flink catalog, which maps to the Iceberg table default_database.flink_table managed in a REST catalog:
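A sketch of such a statement; the uri is a placeholder for your REST catalog endpoint:

```sql
CREATE TABLE flink_table (
    id   BIGINT,
    data STRING
) WITH (
    'connector'='iceberg',
    'catalog-name'='rest_prod',
    'catalog-type'='rest',
    'uri'='https://localhost/'
);
```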
Custom Catalog Example
The following SQL will create a Flink table in the current Flink catalog, which maps to the Iceberg table default_database.flink_table managed in a custom catalog of type com.my.custom.CatalogImpl:
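A sketch of such a statement; note that catalog-impl takes the place of catalog-type, and any additional table properties are passed through to the custom catalog (the extra property below is illustrative):

```sql
CREATE TABLE flink_table (
    id   BIGINT,
    data STRING
) WITH (
    'connector'='iceberg',
    'catalog-name'='custom_prod',
    'catalog-impl'='com.my.custom.CatalogImpl',
    -- additional properties are forwarded to the custom catalog implementation
    'my-additional-catalog-config'='my-value'
);
```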
Complete Example
Here’s a complete example using the Hive catalog:
Next Steps
DDL Operations
Learn about DDL commands for Iceberg tables
Configuration
Configure Flink catalog properties