NemoClaw is alpha software. Interfaces, APIs, and behavior may change without notice as the project evolves. NemoClaw currently requires a fresh installation of OpenClaw.
Prerequisites
Verify that your environment meets the hardware and software requirements before you begin.
Hardware
| Resource | Minimum | Recommended |
|---|---|---|
| CPU | 4 vCPU | 4+ vCPU |
| RAM | 8 GB | 16 GB |
| Disk | 20 GB free | 40 GB free |
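You can check the minimums above from a shell. This sketch assumes a standard Linux environment (the `nproc`, `free`, and `df` utilities) and checks the root filesystem; adjust the mount point if you plan to install elsewhere.

```shell
# Report CPU, memory, and free disk space against the documented minimums.
echo "CPU cores: $(nproc)"                                # minimum 4
free -g | awk '/^Mem:/ {print "RAM (GB): " $2}'           # minimum 8
df -BG --output=avail / | tail -1 | awk '{print "Free disk on /: " $1}'  # minimum 20G
```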
Software
| Dependency | Version requirement |
|---|---|
| Linux | Ubuntu 22.04 LTS or later |
| Node.js | 20 or later (22 recommended) |
| npm | 10 or later |
| Docker | Installed and running |
| OpenShell | Installed |
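A quick way to confirm the software dependencies are present is to probe each binary and the Docker daemon. This is a sketch using standard shell built-ins; it only reports what is installed and does not enforce the version requirements from the table.

```shell
# Print the version of each dependency from the table, or flag it as missing.
for cmd in node npm docker; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: $("$cmd" --version 2>/dev/null | head -1)"
  else
    echo "$cmd: NOT FOUND"
  fi
done
# `docker` being installed is not enough; the daemon must also be running.
docker info >/dev/null 2>&1 && echo "docker daemon: running" || echo "docker daemon: not running"
```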
Install and onboard
Run the installer
Download and execute the NemoClaw installer script. The script installs Node.js via nvm if it is not already present, installs the nemoclaw CLI, and then launches the interactive onboard wizard.
If Node.js was installed via nvm, the installer prints instructions to reload your shell profile before nemoclaw is on your PATH. Follow those instructions or open a new terminal before continuing.
Complete the onboard wizard
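The steps above look roughly like the following. The download URL is a placeholder, not the official installer location; substitute the URL from the NemoClaw release page.

```shell
# Placeholder URL -- replace with the official NemoClaw installer location.
curl -fsSL https://example.com/nemoclaw/install.sh -o install.sh
less install.sh      # review the script before executing it
bash install.sh      # installs Node.js (via nvm if needed) and the nemoclaw CLI
# If the installer set up nvm, reload your shell profile so nemoclaw is on PATH:
source ~/.bashrc
```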
After installation, the onboard wizard starts automatically. It configures the inference endpoint, API credential, and model for your sandbox.
Step 1 — Select your inference endpoint. Select NVIDIA Build to get started immediately using free credits from build.nvidia.com.
Step 2 — Enter your NVIDIA API key. Get your API key from build.nvidia.com/settings/api-keys.
Step 3 — Select a model.
Step 4 — Review and confirm.
Connect to the sandbox
Open a shell in the sandbox
Run the following command from your host to open an interactive shell session inside the sandbox. You will see the connection banner and be dropped into the sandbox shell.
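A minimal sketch of the connection step; the subcommand name `shell` is an assumption, so check `nemoclaw --help` for the exact command on your install.

```shell
# Subcommand name assumed; confirm with `nemoclaw --help`.
nemoclaw shell
```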
Send a test message using the OpenClaw TUI
The OpenClaw TUI opens an interactive chat interface; launch it from inside the sandbox. Type a message and press Enter to send it to the agent. The TUI also displays network egress requests in a side panel — any attempt to reach an unlisted host will appear there for your approval.
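Launching the TUI might look like this; the binary name `openclaw` is an assumption based on the project name, so verify it against your installation.

```shell
# From inside the sandbox shell (binary name assumed; check your install):
openclaw
```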
Send a test message using the OpenClaw CLI
Use the OpenClaw CLI to send a single message and print the agent’s response without entering the TUI. The following flags are supported:
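A one-shot invocation could look like the sketch below. The flags come from the table in this section; the binary name `openclaw` and the message and session values are assumptions for illustration.

```shell
# Binary name assumed to be `openclaw`; flags are documented in this section.
openclaw --local -m "Hello, agent" --session-id smoke-test
```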
| Flag | Description | Default |
|---|---|---|
| --agent | The agent configuration to use. | — |
| --local | Run against the local sandbox rather than a remote host. | false |
| -m | The message to send to the agent. | — |
| --session-id | Session identifier for conversation continuity. | — |
If the agent replies, your sandbox is working correctly. Type exit to return to your host shell.
Non-interactive onboarding
You can skip the interactive prompts by passing all required flags to nemoclaw onboard directly. This is useful for automated provisioning or CI environments.
| Flag | Description |
|---|---|
--api-key | API key for endpoints that require one. Skips the interactive key prompt. |
--endpoint | Endpoint type: build, ncp, nim-local, vllm, ollama, custom. Default: interactive prompt. |
--ncp-partner | NCP partner name. Required when --endpoint ncp. |
--endpoint-url | Endpoint URL. Required for ncp, nim-local, ollama, and custom. |
--model | Model ID to use. Skips the interactive model selection prompt. |
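Putting the flags together, a non-interactive run could look like the following sketch. The flags are the ones documented above; the environment variable name and the model ID are placeholders.

```shell
# Non-interactive onboarding; $NVIDIA_API_KEY and the model ID are placeholders.
nemoclaw onboard \
  --endpoint build \
  --api-key "$NVIDIA_API_KEY" \
  --model example-model-id
```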
The nim-local, vllm, ollama, and custom endpoint types are experimental. Use --endpoint build or --endpoint ncp for production setups.
Next steps
How It Works
Understand the plugin, blueprint, and sandbox lifecycle before customizing your setup.
Switch inference providers
Switch to a different Nemotron model or configure an NCP endpoint.
Approve network requests
Review and approve agent egress requests surfaced in the OpenShell TUI.
Customize network policy
Pre-approve trusted domains to avoid manual approval at runtime.
Deploy to a remote GPU
Deploy your sandbox to a remote GPU instance for always-on operation.
Monitor sandbox activity
Track agent behavior, network egress, and inference calls through the OpenShell TUI.