The bru run command executes API requests from your Bruno collection, allowing you to test APIs across different environments, automate testing, and integrate with CI/CD workflows.
Synopsis
Arguments
One or more paths to request files (.bru) or folders to execute. If not specified, all requests in the current collection are run recursively.

Examples:
request.bru - run a single request
folder - run all requests in a folder
request.bru folder - run a request and all requests in a folder
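The three forms above map to invocations like the following (the file and folder names are placeholders):

```bash
# Run a single request file
bru run request.bru

# Run every request in a folder
bru run folder

# Mix files and folders in one run
bru run request.bru folder
```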
Options
Execution Control
Indicates a recursive run. When enabled, runs all requests in subdirectories.
Stop execution after a failure of a request, test, or assertion.
Only run requests that have a test or active assertion. Requests without tests, pre-request tests, post-response tests, or active assertions are skipped.
Delay between each request in milliseconds. Useful for rate limiting or throttling requests.
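Combining the execution-control options above might look like this; the flag spellings (-r, --bail, --tests-only, --delay) match current bru releases but are not shown on this page, so verify them against bru run --help:

```bash
# Recurse into subfolders, stop at the first failure,
# skip requests without tests, and wait 500 ms between requests
bru run -r --bail --tests-only --delay 500
```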
Environment
Specify the environment to run with. This should match an environment name in your collection's environments folder. Example: --env local

Path to an environment file (.bru, .json, or .yml), absolute or relative. This allows loading environment variables from a custom file. Example: --env-file env.bru

Global environment name (requires the collection to be in a workspace). Global environments are shared across multiple collections in a workspace. Example: --global-env production

Path to the workspace directory. Auto-detected if not provided when using --global-env.

Overwrite a single environment variable. Can be used multiple times to override multiple variables. Format:
name=value. Example: --env-var secret=xxx --env-var apiKey=abc123

Output & Reporting
Path to write file results to. Works in combination with --format. Example: --output results.json

Format of the file results. Available formats:
json - JSON format (default)
junit - JUnit XML format
html - HTML report
Example: --format junit

Path to write JSON file results to. Allows outputting multiple report formats simultaneously. Example: --reporter-json results.json

Path to write JUnit XML file results to. Example: --reporter-junit results.xml

Path to write HTML file results to. Example: --reporter-html results.html

Reporter Filtering
Omit all headers from the reporter output. Useful for reducing file size or removing sensitive header information.
Skip specific headers from the reporter output. Provide header names to exclude. Example: --reporter-skip-headers "Authorization" "Cookie"

Omit request body from the reporter output.
Omit response body from the reporter output.
Omit both request and response bodies from the reporter output. Shorthand for enabling both --reporter-skip-request-body and --reporter-skip-response-body.

Security & SSL
Allow insecure server connections. Disables SSL certificate verification.
CA certificate to verify the peer against. By default, this certificate is used in addition to the default truststore. Example: --cacert myCustomCA.pem

When this flag is set, the specified custom CA certificate (--cacert) is used exclusively and the default truststore is ignored. Only evaluated in combination with --cacert. Example: --cacert myCustomCA.pem --ignore-truststore

Path to the client certificate config file (JSON format) used for securing the connection in the request. The JSON file should have the following structure:
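The structure itself is not reproduced here; the following is a plausible sketch with per-domain certificate entries. The field names are assumptions and should be checked against the Bruno documentation:

```json
{
  "enabled": true,
  "certs": [
    {
      "domain": "api.example.com",
      "certFilePath": "certs/client.crt",
      "keyFilePath": "certs/client.key",
      "passphrase": "optional-passphrase"
    }
  ]
}
```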
JavaScript Sandbox
JavaScript sandbox to use for executing scripts. Available sandboxes:
safe - QuickJS runtime (default; more secure but limited)
developer - Node.js VM runtime (more features but less isolated)
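For example, scripts that need full Node.js capabilities can be run in the developer sandbox; the flag spelling --sandbox matches current bru releases but is not shown on this page, so treat it as an assumption:

```bash
# Use the Node.js VM runtime instead of the default QuickJS sandbox
bru run --sandbox developer
```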
Network
Disable automatically saving and sending cookies with requests.
Disable all proxy settings (both collection-defined and system proxies).
Tag Filtering
Tags to include in the run, as a comma-separated list. Only requests with at least one of these tags will be executed. Example: --tags hello,world

Tags to exclude from the run, as a comma-separated list. Requests with any of these tags will be skipped. Example: --exclude-tags skip,wip

Debugging
Allow verbose output for debugging purposes. Provides detailed information about request execution.
Exit Codes
The bru run command returns the following exit status codes:
Execution successful - all requests, tests, and assertions passed
One or more assertions, tests, or requests failed during execution
The specified output directory does not exist
The request chain caused an endless loop (more than 10,000 jumps detected)
Command was called outside of a collection root directory
The specified file or path was not found
The specified environment was not found
Environment override not presented as string or object
Environment override format incorrect (should be name=value)

Invalid output format requested (must be json, junit, or html)
The specified file has an invalid format or cannot be parsed
The specified workspace was not found or workspace.yml is missing
Global environment requires the collection to be in a workspace
The specified global environment was not found in the workspace
A generic error occurred during execution
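Because failures surface through the exit status, a CI step only needs to propagate it. A minimal sketch (the environment name ci is a placeholder):

```bash
#!/bin/sh
# Run the collection; bru exits 0 only when all requests, tests, and assertions pass.
bru run --env ci --output results.json --format json
code=$?
if [ "$code" -ne 0 ]; then
  echo "bru run failed with exit code $code" >&2
  exit "$code"
fi
```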
Examples
Basic Usage
Run a single request:

Environment Usage
Run with a specific environment:

Output & Reporting
Save results to JSON:

Reporter Filtering
Omit all headers from output:

Security & SSL
Use custom CA certificate (in addition to default truststore):

Filtering & Control
Run only requests with tests:

Network Options
Add delay between requests:

Debugging
Run with verbose output:

Run Summary Output
After execution, bru run displays a summary with the following information:
- Requests: Total, passed, failed, errors, and skipped
- Pre-Request Tests: Total, passed, and failed (if any)
- Post-Response Tests: Total, passed, and failed (if any)
- Tests: Total, passed, and failed
- Assertions: Total, passed, and failed
- Total Time: Cumulative response time for all requests
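The prompts in the Examples section above can be sketched as concrete commands. Flags whose spellings are not shown elsewhere on this page (--tests-only, --delay, --verbose, and the skip-all-headers reporter flag) are assumptions based on current bru releases; verify them against bru run --help:

```bash
# Basic usage: run a single request
bru run request.bru

# Environment usage: run with a specific environment
bru run request.bru --env local

# Output & reporting: save results to JSON
bru run request.bru --output results.json --format json

# Reporter filtering: omit all headers from output (flag name assumed)
bru run request.bru --reporter-skip-all-headers

# Security & SSL: use a custom CA certificate alongside the default truststore
bru run request.bru --cacert myCustomCA.pem

# Filtering & control: run only requests with tests
bru run folder --tests-only

# Network options: add a delay between requests (in milliseconds)
bru run folder --delay 1000

# Debugging: run with verbose output
bru run request.bru --verbose
```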