The data command provides powerful tools for viewing, querying, and exporting your database data. Browse tables with pagination, run interactive SQL queries, and export data to CSV or JSON formats.

Subcommands

data browse

Browse table data with interactive pagination (50 rows per page).
queryly data browse <connection> <table>
connection
string
required
Name of the saved database connection
table
string
required
Name of the table to browse
Example:
$ queryly data browse MyApp users
Browsing Table: users (1,234 rows, 25 pages)

┌─────┬──────────────┬──────────────────────┬─────────────────────┬───────────┐
│ id  │ name         │ email                │ created_at          │ is_active │
├─────┼──────────────┼──────────────────────┼─────────────────────┼───────────┤
│ 1   │ Alice Smith  │ [email protected]    │ 2024-01-15 10:30:00 │ 1         │
│ 2   │ Bob Jones    │ [email protected]    │ 2024-01-16 14:20:00 │ 1         │
│ 3   │ Carol Lee    │ [email protected]    │ 2024-01-17 09:15:00 │ 0         │
│ ... │ ...          │ ...                  │ ...                 │ ...       │
│ 50  │ Emma Wilson  │ [email protected]    │ 2024-02-10 16:45:00 │ 1         │
└─────┴──────────────┴──────────────────────┴─────────────────────┴───────────┘

Showing 50 of 1,234 row(s) | Query time: 12.34ms
──────────────────────────────────────────────────────────────
 Page 1 of 25    Rows 1–50 of 1,234
──────────────────────────────────────────────────────────────
Commands:  [N]ext  •  [P]rev  •  [G]o to page  •  [Q]uit
──────────────────────────────────────────────────────────────

Command>
Navigation Commands:
Command      Description
n            Go to next page
p            Go to previous page
g or go      Go to specific page number (prompts for page)
h or help    Show help menu
q            Quit and return to terminal
Interactive Navigation:
# Go to next page
Command> n
# Now viewing page 2...

# Go to previous page
Command> p
# Back to page 1...

# Jump to specific page
Command> g
Enter page number (1-25): 10
# Now viewing page 10...

# Quit browsing
Command> q
Exited browse mode.
Data Display:
  • Shows up to 50 rows per page
  • NULL values displayed as NULL
  • Long text values truncated to 50 characters with ...
  • Automatic column width adjustment
  • Row numbers and totals in footer
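The display rules above can be sketched in Python. The `format_cell` helper is illustrative, not queryly's actual code, and whether the ellipsis counts toward the 50-character limit is an assumption here:

```python
MAX_WIDTH = 50  # display limit per cell

def format_cell(value):
    """Render a value for table display: NULLs shown literally,
    long text truncated with a trailing ellipsis."""
    if value is None:
        return "NULL"
    text = str(value)
    if len(text) > MAX_WIDTH:
        return text[:MAX_WIDTH] + "..."
    return text

print(format_cell(None))      # NULL
print(format_cell("short"))   # short
print(format_cell("x" * 80))  # 50 x's followed by "..."
```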
Browse mode clears the screen for each page to provide a clean viewing experience. Arrow keys aren't supported; type commands instead.
Performance: The first page loads instantly. Jumping to later pages requires offsetting rows, which may be slower on very large tables.
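The pagination behavior described here can be sketched with Python's sqlite3 module. The `fetch_page` helper and the `users` table are illustrative, not part of queryly:

```python
import sqlite3

PAGE_SIZE = 50  # rows per page, matching browse mode

def fetch_page(conn, table, page):
    """Fetch one page of rows using LIMIT/OFFSET pagination."""
    offset = (page - 1) * PAGE_SIZE  # rows the database must skip first
    cur = conn.execute(
        f"SELECT * FROM {table} LIMIT ? OFFSET ?", (PAGE_SIZE, offset)
    )
    return cur.fetchall()

# Demo against an in-memory database with 120 rows
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO users (name) VALUES (?)", [(f"user{i}",) for i in range(120)]
)
page2 = fetch_page(conn, "users", 2)
print(len(page2), page2[0][0])  # 50 rows; first id on page 2 is 51
```

The skipped rows are why deep pages are slower: OFFSET 49950 still scans past 49,950 rows before returning any.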

data query

Interactive SQL query mode with real-time execution and formatted results.
queryly data query <connection>
connection
string
required
Name of the saved database connection
Example Session:
$ queryly data query MyApp
Query Mode - MyApp
Enter SQL queries. Type 'exit' to quit.

SQL> SELECT COUNT(*) FROM users
┌────────────┐
│ COUNT(*)   │
├────────────┤
│ 1234       │
└────────────┘
Showing 1 of 1 row(s) | Query time: 2.34ms

SQL> SELECT name, email FROM users WHERE is_active = 1 LIMIT 5
┌──────────────┬──────────────────────┐
│ name         │ email                │
├──────────────┼──────────────────────┤
│ Alice Smith  │ [email protected]    │
│ Bob Jones    │ [email protected]    │
│ David Kim    │ [email protected]    │
│ Emma Wilson  │ [email protected]    │
│ Frank Chen   │ [email protected]    │
└──────────────┴──────────────────────┘
Showing 5 of 5 row(s) | Query time: 1.23ms

SQL> SELECT 
     category_id, 
     COUNT(*) as product_count,
     AVG(price) as avg_price
     FROM products 
     GROUP BY category_id
┌─────────────┬───────────────┬───────────┐
│ category_id │ product_count │ avg_price │
├─────────────┼───────────────┼───────────┤
│ 1           │ 45            │ 29.99     │
│ 2           │ 32            │ 54.99     │
│ 3           │ 18            │ 19.99     │
└─────────────┴───────────────┴───────────┘
Showing 3 of 3 row(s) | Query time: 8.45ms

SQL> exit
Exited query mode.
Query Features:
Run any SELECT query to retrieve data:
-- Simple select
SELECT * FROM users

-- With WHERE clause
SELECT name, email FROM users WHERE is_active = 1

-- With JOINs
SELECT u.name, o.total 
FROM users u 
JOIN orders o ON u.id = o.user_id

-- With aggregations
SELECT COUNT(*), AVG(total) FROM orders

-- With GROUP BY
SELECT status, COUNT(*) 
FROM orders 
GROUP BY status
Results are formatted in a clean table with:
  • Column headers
  • Aligned data
  • Row count
  • Execution time in milliseconds
Error Handling:
SQL> SELECT * FROM nonexistent_table
✗ Error: no such table: nonexistent_table

SQL> SELECT invalid syntax
✗ Error: near "syntax": syntax error
Errors are displayed in red and don't exit query mode; you can immediately try again.
Exiting:
SQL> exit
Exited query mode.
Type exit at the SQL prompt to return to your terminal.

data export

Export entire tables to CSV or JSON format.
queryly data export <connection> <table> <format>
connection
string
required
Name of the saved database connection
table
string
required
Name of the table to export
format
enum
default:"csv"
Export format:
  • csv - Comma-separated values
  • json - JSON array of objects
Examples:
$ queryly data export MyApp users csv
Exporting users...
✓ Exported to users_20240315_143022.csv (1,234 rows)
Output Files: Files are saved in your current directory with automatic timestamps:
<table>_YYYYMMDD_HHMMSS.<format>
Examples:
  • users_20240315_143022.csv
  • orders_20240315_143045.json
  • products_20240315_143101.csv
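The timestamp pattern maps directly onto a `strftime` format string. A minimal sketch, with `export_filename` as a hypothetical helper rather than queryly's own code:

```python
from datetime import datetime

def export_filename(table, fmt):
    """Build a timestamped export filename: <table>_YYYYMMDD_HHMMSS.<format>."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    return f"{table}_{stamp}.{fmt}"

print(export_filename("users", "csv"))  # e.g. users_20240315_143022.csv
```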
CSV Format:
id,name,email,created_at,is_active
1,Alice Smith,[email protected],2024-01-15 10:30:00,1
2,Bob Jones,[email protected],2024-01-16 14:20:00,1
3,Carol Lee,[email protected],2024-01-17 09:15:00,0
CSV Features:
  • Header row with column names
  • Proper escaping of commas, quotes, and newlines
  • NULL values exported as empty strings
  • Text values with special characters wrapped in quotes
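The escaping rules listed above match what Python's csv module produces. A sketch with illustrative rows, mapping NULLs to empty strings by hand:

```python
import csv
import io

rows = [
    {"id": 1, "name": 'Alice "Al" Smith', "note": "likes, commas"},
    {"id": 2, "name": "Bob Jones", "note": None},  # NULL -> empty string
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "name", "note"])
writer.writeheader()  # header row with column names
for row in rows:
    # Replace NULLs with empty strings, as the exporter does
    writer.writerow({k: ("" if v is None else v) for k, v in row.items()})

print(buf.getvalue())
```

Fields containing commas or quotes come out wrapped in quotes, with embedded quotes doubled; plain values are left unquoted.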
JSON Format:
[
  {
    "id": 1,
    "name": "Alice Smith",
    "email": "[email protected]",
    "created_at": "2024-01-15 10:30:00",
    "is_active": 1
  },
  {
    "id": 2,
    "name": "Bob Jones",
    "email": "[email protected]",
    "created_at": "2024-01-16 14:20:00",
    "is_active": 1
  },
  {
    "id": 3,
    "name": "Carol Lee",
    "email": "[email protected]",
    "created_at": "2024-01-17 09:15:00",
    "is_active": 0
  }
]
JSON Features:
  • Array of objects (one per row)
  • Indented for readability
  • NULL values exported as null
  • Numbers preserved as numeric types
  • Dates exported as strings
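The JSON shape above is what Python's json module produces from a list of row dicts. A sketch with illustrative rows:

```python
import json

rows = [
    {"id": 1, "name": "Alice Smith", "last_login": None},  # NULL -> null
    {"id": 2, "name": "Bob Jones", "last_login": "2024-01-16 14:20:00"},
]

# Indented array of objects; numbers stay numeric, dates stay strings
output = json.dumps(rows, indent=2)
print(output)

parsed = json.loads(output)
print(parsed[0]["last_login"] is None)  # True: null round-trips to None
```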
Large Tables: Exporting very large tables (millions of rows) can take time and produce large files. Monitor disk space and consider filtering data with a custom query first.
Custom Exports: For filtered exports, use data query to run a SELECT query and manually save results, or pipe query output to a file using your shell.
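One way to do a filtered export outside queryly is to run the SELECT yourself and write only the matching rows; a sketch using Python's sqlite3 and csv modules (the table, filter, and filename are illustrative):

```python
import csv
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "completed", 10.0), (2, "pending", 5.0), (3, "completed", 7.5)],
)

# Filter in SQL, then write only the rows that matched
cur = conn.execute("SELECT id, total FROM orders WHERE status = 'completed'")
with open("completed_orders.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cur.description])  # header row
    writer.writerows(cur)

print(open("completed_orders.csv").read())
```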

Advanced Usage

Filtering Before Export

To export only specific data, use query mode:
$ queryly data query MyApp
SQL> SELECT * FROM orders WHERE status = 'completed' AND created_at > '2024-01-01'
# Copy results manually or use shell redirection:

# Better: Use custom SQL in your application

Analyzing Query Performance

Use query mode to test performance:
SQL> SELECT COUNT(*) FROM large_table
# Query time: 1250.45ms

SQL> SELECT COUNT(*) FROM large_table WHERE indexed_column = 'value'
# Query time: 5.23ms  (much faster with index!)
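The effect of an index can also be confirmed without timing, via SQLite's EXPLAIN QUERY PLAN. A sketch using Python's sqlite3 module (table and index names are illustrative, and the exact plan wording varies by SQLite version):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE large_table (id INTEGER, indexed_column TEXT)")

def plan(sql):
    """Return SQLite's plan summary (the 'detail' column) for a statement."""
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

query = "SELECT COUNT(*) FROM large_table WHERE indexed_column = 'value'"
before = plan(query)  # typically a full-table SCAN

conn.execute("CREATE INDEX idx_col ON large_table (indexed_column)")
after = plan(query)   # now a SEARCH using idx_col

print(before)
print(after)
```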

Exporting Multiple Tables

# Export all main tables
queryly data export MyDB users csv
queryly data export MyDB orders csv
queryly data export MyDB products csv

Database-Specific SQL

-- SQLite-specific features
SELECT * FROM sqlite_master WHERE type='table'

-- Date functions
SELECT date('now'), datetime('now', '+1 day')

-- AUTOINCREMENT
SELECT last_insert_rowid()
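These SQLite features can be tried directly with Python's sqlite3 module; the `logs` table here is illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE logs (id INTEGER PRIMARY KEY AUTOINCREMENT, msg TEXT)"
)

# sqlite_master lists every table in the database (AUTOINCREMENT also
# creates an internal sqlite_sequence table)
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'"
).fetchall()
print(tables)

# last_insert_rowid() returns the rowid of the most recent INSERT
conn.execute("INSERT INTO logs (msg) VALUES ('hello')")
rowid = conn.execute("SELECT last_insert_rowid()").fetchone()[0]
print(rowid)  # 1
```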

Error Messages

Error                         Meaning                                    Solution
Connection 'name' not found   No saved connection with that name         Use connect list to verify the connection exists
No data found                 Query returned no rows or was DDL          Normal for empty results or CREATE/UPDATE/DELETE
Query failed: <error>         SQL syntax error or invalid table/column   Check your SQL syntax and table/column names
Unsupported format: <format>  Invalid export format specified            Use csv or json only
Already on first/last page    Can't navigate beyond page boundaries      You're at the start or end of the data

Performance Tips

Browse Performance: Uses LIMIT and OFFSET for pagination. First page is instant, but jumping to page 1000 of a large table requires skipping many rows.
Query Performance:
  • Use WHERE clauses to filter data
  • Add LIMIT to large queries for faster results
  • Create indexes on frequently queried columns
  • Monitor execution times shown after each query
Export Performance: Exporting large tables (millions of rows) loads all data into memory before writing. For huge exports, consider:
  • Exporting in chunks with filtered queries
  • Using database-specific export tools
  • Scheduling exports during off-peak hours
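Chunked export can be sketched with rowid-keyed paging, which avoids both holding the full table in memory and the deep-OFFSET cost; `iter_rows` and the `big` table are illustrative:

```python
import sqlite3

CHUNK = 1000  # rows fetched per round trip

def iter_rows(conn, table):
    """Stream a table in chunks so the full result never sits in memory.
    Keyed on rowid, so each chunk seeks directly instead of skipping rows."""
    last = 0
    while True:
        rows = conn.execute(
            f"SELECT rowid, * FROM {table} WHERE rowid > ? "
            f"ORDER BY rowid LIMIT ?", (last, CHUNK)
        ).fetchall()
        if not rows:
            return
        for row in rows:
            yield row[1:]  # drop the rowid paging key
        last = rows[-1][0]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big (v INTEGER)")
conn.executemany("INSERT INTO big VALUES (?)", [(i,) for i in range(2500)])
print(sum(1 for _ in iter_rows(conn, "big")))  # 2500
```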

Next Steps

Schema Exploration

Use schema info to understand table structure before querying

Connection Management

Manage multiple database connections for different environments
