
Default Command

Run an agent with optional configuration.
rowboatx [options]

Options

--agent (string, default: "copilot")
The agent to run. Agents are defined in ~/.rowboat/agents/. Example:
rowboatx --agent copilot
rowboatx --agent data-processor
--run_id (string)
Continue an existing run. Run IDs are stored in ~/.rowboat/runs/. Example:
rowboatx --run_id a1b2c3d4-5678-90ef-ghij-klmnopqrstuv
--input (string)
Initial input message to send to the agent. Useful for non-interactive execution. Example:
rowboatx --input "Analyze the sales data in data.csv"
--no-interactive (boolean, default: false)
Run in non-interactive mode. The agent will not prompt for user input and will exit after processing. Example:
rowboatx --agent analyzer --input "Process data" --no-interactive

Examples

# Start default copilot agent
rowboatx

# Run specific agent interactively
rowboatx --agent my-assistant

ui

Launch the interactive terminal-based dashboard for managing runs and agents.
rowboatx ui [options]

Options

--server-url (string)
Rowboat server base URL. Connects the UI to a remote or local server instance. Example:
rowboatx ui --server-url http://localhost:3000
rowboatx ui --server-url https://my-rowboat-server.com

Features

The TUI (Terminal User Interface) provides:
  • Visual run management and monitoring
  • Agent execution controls
  • Real-time event streaming
  • Interactive permission approvals
  • Run history and logs

Example

# Launch local UI
rowboatx ui

# Connect to remote server
rowboatx ui --server-url https://api.rowboat.example.com

import

Import example workflows or custom workflow definitions.
rowboatx import --example <name>
rowboatx import --file <path>

Options

--example (string)
Name of a built-in example workflow to import. Example:
rowboatx import --example data-analysis
rowboatx import --example github-integration

--file (string)
Path to a custom workflow JSON file to import. Example:
rowboatx import --file ./my-workflow.json
rowboatx import --file ~/Downloads/shared-workflow.json

Workflow Structure

Workflow JSON format:
{
  "id": "my-workflow",
  "entryAgent": "main-agent",
  "agents": [
    {
      "name": "main-agent",
      "instructions": "You are a helpful assistant.",
      "tools": {}
    }
  ],
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/workspace"]
    }
  },
  "instructions": "Optional post-install instructions for the user"
}
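Before importing a hand-written file, it can help to sanity-check that it matches the shape shown above. The following is a hypothetical helper (not part of the rowboatx CLI); the required field names are taken from the example JSON:

```python
def check_workflow(wf: dict) -> list[str]:
    """Return a list of problems with a workflow dict; empty means it looks sane.

    Hypothetical helper based only on the JSON shape shown above; it is not
    part of the rowboatx CLI.
    """
    problems = []
    # Fields present in the documented example that an import plausibly needs.
    for key in ("id", "entryAgent", "agents"):
        if key not in wf:
            problems.append(f"missing '{key}'")
    # entryAgent should name one of the defined agents.
    names = {a.get("name") for a in wf.get("agents", [])}
    if "entryAgent" in wf and wf["entryAgent"] not in names:
        problems.append("entryAgent does not match any defined agent")
    return problems
```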

Import Behavior

1. Agent Import: agents are written to ~/.rowboat/agents/<agent-name>.json.
2. MCP Server Merge: MCP servers are merged into ~/.rowboat/config/mcp.json; existing servers with the same name are skipped.
3. Post-Install Instructions: if the workflow includes instructions, they are displayed after import.
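The merge step can be sketched in a few lines of Python. This is an illustrative sketch of the documented behavior (merge into mcp.json, skip name collisions), not the CLI's actual implementation:

```python
import json
from pathlib import Path

def merge_mcp_servers(workflow: dict, config_path: Path) -> list[str]:
    """Merge a workflow's mcpServers into mcp.json and return the names added.

    Illustrative sketch of the documented import behavior; the real CLI's
    internals may differ.
    """
    if config_path.exists():
        config = json.loads(config_path.read_text())
    else:
        config = {"mcpServers": {}}
    existing = config.setdefault("mcpServers", {})
    added = []
    for name, server in workflow.get("mcpServers", {}).items():
        if name in existing:
            continue  # existing servers with the same name are skipped
        existing[name] = server
        added.append(name)
    config_path.write_text(json.dumps(config, indent=2))
    return added
```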

Examples

rowboatx import --example data-processor
# Output:
# ✓ Imported workflow 'data-processor'
#   Agents: processor, validator
#   Primary: processor
#   MCP servers added: filesystem
#
# Run: rowboatx --agent processor

list-examples

List all available built-in example workflows.
rowboatx list-examples

Output

data-analysis
github-integration
code-reviewer
content-generator

Example

# List examples, then import one
rowboatx list-examples
rowboatx import --example code-reviewer

export

Export a workflow with all dependencies to stdout. Useful for sharing or backing up workflows.
rowboatx export --agent <name>

Options

--agent (string, required)
Entry agent name to export. The export includes the agent and all of its dependencies (sub-agents, MCP servers). Example:
rowboatx export --agent my-agent
rowboatx export --agent copilot > copilot-backup.json

Export Behavior

1. Dependency Discovery: recursively discovers all agents referenced by the entry agent through tool dependencies.
2. MCP Server Collection: collects all MCP servers used by the agent and its dependencies.
3. JSON Output: writes a complete workflow definition to stdout in JSON format.
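The discovery step amounts to a graph walk over agent definitions, following tools of type "agent" (the Agent Definition format is shown under Configuration Files). A hypothetical sketch, not the CLI's own code:

```python
def discover_agents(entry: str, load_agent) -> set[str]:
    """Recursively collect the entry agent plus every sub-agent it references.

    `load_agent` maps an agent name to its parsed definition dict; tools with
    "type": "agent" reference sub-agents by name. Illustrative sketch only.
    """
    seen = set()
    stack = [entry]
    while stack:
        name = stack.pop()
        if name in seen:
            continue  # already visited; also breaks cycles between agents
        seen.add(name)
        for tool in load_agent(name).get("tools", {}).values():
            if tool.get("type") == "agent":
                stack.append(tool["name"])
    return seen
```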

Examples

rowboatx export --agent my-workflow > workflow-backup.json

model-config

Configure LLM provider and model settings interactively.
rowboatx model-config

Interactive Wizard

1. View Current Configuration: displays the current provider and model, if configured.
Currently using:
- provider: my-openai (openai)
- model: gpt-5.1

2. Select Provider Flavor: choose from the available provider types.
Select a provider type:
  1. rowboat [free]
  2. openai
  3. aigateway
  4. anthropic
  5. google
  6. ollama
  7. openai-compatible
  8. openrouter
Enter number or name:

3. Configure Provider: for flavors with existing providers, you can reuse one or add a new one.
Found existing providers for openai:
  1. use existing: my-openai
  2. add new
Enter number or name/alias [2]:

4. Enter Details: provide provider-specific configuration.
Enter a name/alias for this provider [openai]: production-openai
Enter baseURL for openai [https://api.openai.com/v1]:
Enter API key for openai (leave blank to pick from environment variable OPENAI_API_KEY):
Specify model for openai [gpt-5.1]: gpt-5.1

5. Confirmation: the configuration is saved and confirmed.
Currently using:
- provider: production-openai (openai)
- model: gpt-5.1

The configuration is written to ~/.rowboat/config/models.json; you can also edit this file manually.

Provider Defaults

Provider            Default Base URL                                    Default Model                API Key Env Var
openai              https://api.openai.com/v1                           gpt-5.1                      OPENAI_API_KEY
anthropic           https://api.anthropic.com/v1                        claude-sonnet-4-5            ANTHROPIC_API_KEY
google              https://generativelanguage.googleapis.com/v1beta    gemini-2.5-pro               GOOGLE_GENERATIVE_AI_API_KEY
ollama              http://localhost:11434                              llama3.1                     -
openai-compatible   http://localhost:8080/v1                            openai/gpt-5.1               -
openrouter          https://openrouter.ai/api/v1                        openrouter/auto              -
aigateway           https://ai-gateway.vercel.sh/v1/ai                  gpt-5.1                      AI_GATEWAY_API_KEY
rowboat [free]      -                                                   google/gemini-3-pro-preview  -

Examples

rowboatx model-config
# Select: 2 (openai)
# Name: my-openai
# Base URL: [Enter] (use default)
# API Key: sk-...
# Model: gpt-5.1

API Server

Rowboat CLI includes a built-in HTTP API server for programmatic access.

Starting the Server

npm run server
# or from source:
cd apps/cli && npm run server
Default port: 3000 (configure via PORT environment variable)
PORT=8080 npm run server

API Endpoints

POST /runs/:runId/messages/new

Create a new message in a run. Request:
{
  "message": "What is the status of task #123?"
}
Response:
{
  "messageId": "msg_abc123"
}

POST /runs/:runId/permissions/authorize

Authorize a tool permission request. Request:
{
  "toolCallId": "call_xyz789",
  "response": "approve"
}
Response:
{
  "success": true
}

POST /runs/:runId/human-input-requests/:requestId/reply

Reply to a human input request. Request:
{
  "requestId": "req_def456",
  "response": "Proceed with deployment"
}
Response:
{
  "success": true
}

POST /runs/:runId/stop

Stop a running agent. Response:
{
  "success": true
}

GET /stream

Subscribe to run events via Server-Sent Events (SSE). Response: Stream of events
data: {"type":"message","runId":"run_123",...}
event: message
id: 0

data: {"type":"tool-call","runId":"run_123",...}
event: message
id: 1
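A client can consume this stream by splitting on blank lines and collecting the `data:`, `event:`, and `id:` fields of each event. A minimal stdlib-only parser sketch (the JSON payload shape is the one shown above; real clients should also handle partial reads and reconnects):

```python
import json

def parse_sse(raw: str):
    """Yield (event_id, event_type, data_dict) tuples from raw SSE text.

    Minimal sketch: events are separated by blank lines, and each field is a
    'name: value' line. Multi-line data fields and comments are not handled.
    """
    for block in raw.split("\n\n"):
        fields = {}
        for line in block.strip().splitlines():
            name, _, value = line.partition(":")
            fields[name] = value.strip()
        if "data" in fields:
            yield fields.get("id"), fields.get("event"), json.loads(fields["data"])
```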

GET /openapi.json

Retrieve OpenAPI specification for the API. Response: OpenAPI 3.0 JSON schema

API Examples

curl -X POST http://localhost:3000/runs/run_abc123/messages/new \
  -H "Content-Type: application/json" \
  -d '{"message": "Continue processing"}'
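The same call can be made from Python with only the standard library. This sketch builds the request separately from sending it (endpoint path and payload as documented above; sending requires a running server):

```python
import json
import urllib.request

def build_message_request(base_url: str, run_id: str, message: str) -> urllib.request.Request:
    """Build the POST /runs/:runId/messages/new request without sending it.

    Stdlib-only client sketch; pass the result to urllib.request.urlopen()
    once a server is actually listening.
    """
    body = json.dumps({"message": message}).encode()
    return urllib.request.Request(
        f"{base_url}/runs/{run_id}/messages/new",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To send (requires a running server):
# req = build_message_request("http://localhost:3000", "run_abc123", "Continue processing")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["messageId"])
```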

Environment Variables

PORT (number, default: 3000)
Server port for the API server.
PORT=8080 npm run server

OPENAI_API_KEY (string)
OpenAI API key. Used when not explicitly configured in models.json.
export OPENAI_API_KEY="sk-..."

ANTHROPIC_API_KEY (string)
Anthropic API key.
export ANTHROPIC_API_KEY="sk-ant-..."

GOOGLE_GENERATIVE_AI_API_KEY (string)
Google AI API key.
export GOOGLE_GENERATIVE_AI_API_KEY="AIza..."

AI_GATEWAY_API_KEY (string)
Vercel AI Gateway API key.
export AI_GATEWAY_API_KEY="key_..."

Configuration Files

models.json

Location: ~/.rowboat/config/models.json
{
  "providers": {
    "my-openai": {
      "flavor": "openai",
      "apiKey": "sk-...",
      "baseURL": "https://api.openai.com/v1",
      "headers": {}
    },
    "local-ollama": {
      "flavor": "ollama",
      "baseURL": "http://localhost:11434",
      "headers": {
        "Authorization": "Bearer optional-key"
      }
    }
  },
  "defaults": {
    "provider": "my-openai",
    "model": "gpt-5.1"
  }
}

mcp.json

Location: ~/.rowboat/config/mcp.json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/workspace"],
      "env": {}
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_..."
      }
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        "BRAVE_API_KEY": "BSA..."
      }
    }
  }
}

Agent Definition

Location: ~/.rowboat/agents/<name>.json
{
  "name": "my-agent",
  "instructions": "You are a helpful coding assistant.",
  "tools": {
    "filesystem": {
      "type": "mcp",
      "mcpServerName": "filesystem",
      "toolNames": ["read_file", "write_file", "list_directory"]
    },
    "sub-agent": {
      "type": "agent",
      "name": "code-reviewer"
    }
  }
}

Tips & Best Practices

When running agents in CI/CD pipelines or scripts, always use --no-interactive to prevent hanging on prompts:
rowboatx --agent test-runner --input "Run tests" --no-interactive
Use export to create portable workflow definitions:
rowboatx export --agent my-workflow > workflow.json
# Share workflow.json with team
Create separate agents for different tasks:
  • copilot - General assistance
  • code-reviewer - Code review tasks
  • data-processor - Data ETL workflows
Keep ~/.rowboat/agents/ and ~/.rowboat/config/ in version control (excluding API keys):
cd ~/.rowboat
git init
echo "config/models.json" >> .gitignore
git add agents/ config/mcp.json
git commit -m "Add agent configurations"
Avoid hardcoding API keys. Use environment variables:
# .env file
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...

# Load and run
source .env
rowboatx
