Exporting Data
Export Formats
Zequel supports three export formats:

- CSV
- JSON
- SQL
Comma-Separated Values — universally compatible spreadsheet format.
- Configurable delimiters: comma, semicolon, tab, pipe
- Optional headers row
- NULL handling: export as empty string or “NULL”
- Perfect for Excel, Google Sheets, data analysis tools
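As an illustration (the table and rows are hypothetical), a CSV export with a comma delimiter, headers enabled, and NULL-as-empty might look like:

```csv
id,name,email
1,Ada,ada@example.com
2,Grace,
```

Row 2's email is NULL and is exported as a blank cell; with the literal-"NULL" option it would be written as `2,Grace,NULL` instead.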
Export Query Results
Export the results of any query.

Query result exports are limited to the rows returned by your query. Use LIMIT or filters to control the dataset size.

Export Table Data
Export all rows from a table.

Full Table Export
Click the export button in the status bar. This exports all rows, not just the current page.
Export Current Page
Export only the rows visible in the current Data Grid page:
Useful for exporting a filtered subset or the first N rows.
CSV Export Options
- Delimiter: Choose comma, semicolon, tab, or pipe
- Include Headers: Add column names as the first row
- NULL as Empty: Export NULL values as empty strings (blank cells) or the literal text “NULL”
JSON Export Options
- Pretty Print: Format JSON with indentation for readability
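For example (row data hypothetical), two exported rows with Pretty Print enabled:

```json
[
  { "id": 1, "name": "Ada" },
  { "id": 2, "name": null }
]
```

Without Pretty Print, the same array is emitted compactly on a single line, which produces smaller files.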
SQL Export Options
- Include Schema: Qualify table names with schema (e.g., public.users)
- Create Table: Add a CREATE TABLE statement before INSERT statements
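With both options enabled, the exported file might look like the following sketch (table and column definitions are illustrative; the exact DDL depends on your database):

```sql
CREATE TABLE public.users (
  id integer,
  name text
);

INSERT INTO public.users (id, name) VALUES (1, 'Ada');
INSERT INTO public.users (id, name) VALUES (2, 'Grace');
```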
Importing Data
Import Formats
Zequel supports importing from:

- CSV: Comma-separated values with optional headers
- JSON: Array of objects (one object per row)
SQL imports are not yet supported. Use the Query Editor to execute SQL files.
Import CSV or JSON
Import data into an existing table.

Configure Options
- File has headers: Toggle if the first row contains column names
- Delimiter (CSV only): Select comma, semicolon, tab, or pipe
- Clear table before import: Truncate the table before inserting new rows
Review Mapping
Each source column is auto-matched to a table column (by name). Adjust mappings or skip columns as needed.
Column Mapping
Zequel automatically maps columns by name (case-insensitive):

- Auto-Matched: Columns with matching names are pre-selected
- Skip Columns: Set a column’s target to “Skip column” to ignore it
- Type Conversion: Zequel attempts to convert source values to the target column’s data type
Source columns: user_id, full_name, email_address
Table columns: id, name, email

Mapping:
- user_id → id
- full_name → name
- email_address → email
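The case-insensitive auto-matching described above can be sketched as follows. This is a hypothetical illustration, not Zequel's actual code; columns with no name match (like user_id above) stay unmapped until you adjust them by hand.

```python
def auto_map(source_cols, table_cols):
    """Map each source column to a table column by case-insensitive name,
    or to None ("Skip column") when no name matches."""
    lookup = {col.lower(): col for col in table_cols}
    return {src: lookup.get(src.lower()) for src in source_cols}

mapping = auto_map(["user_id", "Name", "email"], ["id", "name", "email"])
# "Name" matches "name" despite the case difference; "user_id" has no
# automatic match and would need to be remapped (or skipped) manually.
print(mapping)  # {'user_id': None, 'Name': 'name', 'email': 'email'}
```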
Import Options
File Has Headers
Toggle this option if your CSV/JSON file includes column names:

- Enabled: First row is treated as headers and used for column mapping
- Disabled: All rows are treated as data; columns are named Column1, Column2, etc.
Delimiter (CSV Only)
Select the character that separates columns:

- Comma (,): Standard CSV format
- Semicolon (;): Common in European locales
- Tab: TSV (tab-separated values)
- Pipe (|): Alternative delimiter
Clear Table Before Import
Enable this option to truncate the table before importing:

- All existing rows are deleted
- Auto-increment counters are reset (database-dependent)
- New data is inserted into an empty table
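In SQL terms, clearing the table roughly corresponds to a TRUNCATE; whether counters reset depends on the database (statements below are illustrative, assuming a table named users):

```sql
-- PostgreSQL: also reset the sequences behind auto-increment columns
TRUNCATE TABLE users RESTART IDENTITY;

-- MySQL: TRUNCATE resets AUTO_INCREMENT on its own
TRUNCATE TABLE users;
```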
Import Errors
If rows fail to import:

- Type Mismatches: Source value cannot be converted to target column type (e.g., text → integer)
- Constraint Violations: Primary key duplicates, unique constraint violations, NOT NULL violations
- Foreign Key Errors: Referenced rows do not exist
Batch Import Performance
Zequel inserts rows in batches of 100 for performance:

- Small Files (under 1,000 rows): Import completes in seconds
- Medium Files (1,000-10,000 rows): ~10-30 seconds
- Large Files (over 10,000 rows): Several minutes; progress is shown
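The batching behavior above can be sketched like this (a minimal illustration assuming the stated batch size of 100; names are not Zequel's API):

```python
def batches(rows, size=100):
    """Yield successive chunks of up to `size` rows."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

rows = [{"id": i} for i in range(250)]
chunks = list(batches(rows))
print([len(c) for c in chunks])  # [100, 100, 50]

# Each chunk would then be sent as one batched statement, e.g. with a
# DB-API driver: cursor.executemany("INSERT INTO t (id) VALUES (?)", params)
```

Batching amortizes round-trip and statement overhead, which is why small files finish in seconds while very large files are better served by database-native bulk loaders.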
Importing millions of rows may take a long time. Consider using database-native tools (e.g., COPY for PostgreSQL, LOAD DATA for MySQL) for bulk imports.

Paste Rows from Clipboard
Quickly import data from Excel or Google Sheets.

Paste in Zequel
Open the target table in Zequel. Right-click in the Data Grid → Paste Rows (or press Cmd+V).

Column Matching
Zequel matches clipboard headers to table columns (case-insensitive) and inserts rows.
- Tab-separated (TSV): Default copy format from Excel/Sheets
- Comma-separated (CSV): If copied from a CSV file
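Parsing pasted spreadsheet data can be sketched as below: Excel and Sheets put tab-separated text on the clipboard with the first row as headers. The function name is an assumption for illustration, not Zequel's code.

```python
import csv
import io

def parse_clipboard(text):
    """Parse tab-separated clipboard text into a list of row dicts,
    using the first row as headers."""
    reader = csv.DictReader(io.StringIO(text), delimiter="\t")
    return list(reader)

pasted = "name\temail\nAda\tada@example.com\nGrace\tgrace@example.com"
rows = parse_clipboard(pasted)
print(rows[0])  # {'name': 'Ada', 'email': 'ada@example.com'}
```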
Paste Rows is ideal for small datasets (up to 100 rows). For larger imports, use the Import CSV/JSON feature.
Use Cases
Migrating Data Between Databases
Export a table as SQL from one database and execute the SQL in another database to replicate the data.
Seeding Test Data
Export production data as CSV, anonymize it, and import into a staging or development database.
Sharing Query Results
Export query results as CSV or JSON and share the file with teammates or analysis tools.
Importing External Data
Import customer lists, analytics exports, or third-party datasets from CSV/JSON files into your database.
Backup and Restore
Export critical tables as SQL for quick snapshots. Import the SQL to restore data.
Best Practices
Validate Before Import
Review the data preview and column mappings before clicking Import. Mismatched types can cause errors.
Use Transactions for Large Imports
For databases supporting manual transactions, enable Manual Commit mode before importing to allow rollback on errors.
Export with Headers
Always enable Include headers for CSV exports. It makes re-importing and analyzing the data much easier.
Test Imports on Small Datasets
Before importing 100,000 rows, test with the first 10 rows to ensure column mappings and types are correct.
Backup Before Clear Table
If using Clear table before import, export the existing data first as a backup in case you need to roll back.
Keyboard Shortcuts
| Shortcut | Action |
|---|---|
| Cmd+V | Paste Rows from Clipboard |
| Cmd+E | Export Current View |
Troubleshooting
Import Fails with Type Errors
- Check the data preview for mismatched types (e.g., text in a numeric column)
- Adjust column mappings to skip problematic columns
- Clean the source file to match the target table’s schema
CSV Delimiter Not Detected
- Manually select the correct delimiter in the import dialog
- Ensure the CSV file uses consistent delimiters throughout
Export File Too Large
- Apply filters or a LIMIT clause to reduce the dataset before exporting
- Use pagination to export in smaller chunks
JSON Import Shows Errors
- Validate the JSON file syntax (use a JSON validator tool)
- Ensure the file is an array of objects: [{...}, {...}]
- Check that object keys match table column names
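The checks above can be run before importing; here is a hypothetical sketch (function and message texts are illustrative, not Zequel's behavior):

```python
import json

def check_json_import(text, table_cols):
    """Return a list of problems found in a JSON import payload:
    invalid syntax, wrong top-level shape, or unknown column keys."""
    try:
        data = json.loads(text)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    if not isinstance(data, list) or not all(isinstance(r, dict) for r in data):
        return ["expected an array of objects: [{...}, {...}]"]
    known = {col.lower() for col in table_cols}  # case-insensitive match
    problems = []
    for i, row in enumerate(data):
        for key in row:
            if key.lower() not in known:
                problems.append(f"row {i}: unknown column {key!r}")
    return problems

print(check_json_import('[{"id": 1}, {"Id": 2, "extra": true}]', ["id"]))
# ["row 1: unknown column 'extra'"]
```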