# Configuration Variables
All configuration is done by editing the variables at the top of `main.py` (lines 14-23). There is no separate configuration file.
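The block might look like the following sketch. Only `baseurl` and `llm` are named elsewhere in this document, so every other variable name and default value shown here is an illustrative assumption, not the actual contents of `main.py`:

```python
# Hypothetical sketch of the configuration block at the top of main.py.
# Only `baseurl` and `llm` are documented names; the rest are placeholders.
baseurl = "http://127.0.0.1:1234/v1"  # API endpoint (LM Studio default)
llm = ""                     # model id; empty = use the model loaded in LM Studio
reasoning_effort = "medium"  # "low" | "medium" | "high" (placeholder name)
tries = 5                    # attempts per benchmark (placeholder name and value)
timeout = 120                # seconds to wait per response (placeholder value)
max_tokens = 512             # output-token cap per response (placeholder name)
```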
## API Configuration
### Base URL

The API endpoint URL. The default is configured for LM Studio running locally.

Examples:

- LM Studio: `"http://127.0.0.1:1234/v1"`
- OpenAI-compatible API: `"https://api.example.com/v1"`
### Model Selection

The model identifier to use for benchmarks.

Special behavior:

- Leave empty (`""`) to automatically use the currently loaded model in LM Studio
- Set to a specific model name for other OpenAI-compatible APIs
### Reasoning Settings

Controls the reasoning effort level for models that support reasoning.

Valid values:

- `"low"`: Minimal reasoning, faster responses
- `"medium"`: Balanced reasoning depth
- `"high"`: Maximum reasoning effort, slower but more thoughtful

The chosen value is sent as the `reasoning.effort` parameter and recorded in the result file header.
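The effort value ultimately travels in the request body. A hedged sketch of assembling such a payload, using the nested `reasoning.effort` shape named above (some OpenAI-compatible servers instead expect a flat `reasoning_effort` key; check your provider's docs):

```python
def build_request(model: str, prompt: str, effort: str, max_tokens: int = 512) -> dict:
    """Assemble a chat-completion payload carrying the reasoning effort
    as a nested reasoning.effort field. Illustrative helper, not from main.py."""
    if effort not in ("low", "medium", "high"):
        raise ValueError(f"invalid reasoning effort: {effort!r}")
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "reasoning": {"effort": effort},
    }
```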
## Benchmark Parameters
### Number of Tries

The number of test attempts to run for each benchmark. Higher values provide more statistical confidence but take longer to complete.
### Timeout

The maximum time to wait for a response, in seconds. This prevents the benchmark from hanging if the model enters a "death spiral" and generates thousands of tokens.
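One way such a timeout could be enforced, sketched with the standard library; the endpoint path and the decision to report failures as dict entries are assumptions for illustration:

```python
import json
from urllib.request import Request, urlopen

def post_with_timeout(url: str, payload: dict, timeout_s: float) -> dict:
    """POST a JSON payload, giving up after `timeout_s` seconds so a
    runaway generation cannot hang the whole benchmark run."""
    req = Request(url, data=json.dumps(payload).encode(),
                  headers={"Content-Type": "application/json"})
    try:
        with urlopen(req, timeout=timeout_s) as resp:
            return json.load(resp)
    except OSError as exc:  # covers timeouts, refused connections, URL errors
        # Record the failure as one failed attempt rather than crashing.
        return {"error": str(exc)}
```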
### Maximum Tokens

The maximum number of output tokens the model can generate per response.

Recommendations:

- Default (512): Suitable for most models
- Higher values: May be needed for reasoning models that produce longer traces
- Lower values: Can prevent verbose or runaway responses
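When tuning this cap, it helps to know whether a response was actually cut off. Assuming the standard OpenAI response shape, where `finish_reason` is `"length"` on truncation and `"stop"` on a natural finish, a check could look like:

```python
def was_truncated(response: dict) -> bool:
    """True if the model hit the output-token cap (finish_reason 'length')
    rather than finishing naturally ('stop'). Illustrative helper."""
    return response["choices"][0]["finish_reason"] == "length"
```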
## Environment Setup

### LM Studio Configuration

If using LM Studio:

- Start LM Studio and load your desired model
- Enable the local API server (usually runs on `http://127.0.0.1:1234`)
- Leave `llm = ""` in the configuration
- The currently loaded model will be automatically selected
### OpenAI-Compatible API Setup

For other OpenAI-compatible APIs:

- Set `baseurl` to your API endpoint
- Set `llm` to the specific model identifier
- Ensure your API supports the extended reasoning format if using reasoning features
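If your provider requires a key, one approach is to read it from an environment variable and attach a bearer token to each request. This is a hedged sketch: the variable name `OPENAI_API_KEY` and the helper itself are assumptions, not code from `main.py`:

```python
import os

def auth_headers() -> dict:
    """Build request headers, taking the API key from the environment.
    The env-var name OPENAI_API_KEY is an assumption; adjust per provider."""
    headers = {"Content-Type": "application/json"}
    key = os.environ.get("OPENAI_API_KEY")
    if key:
        headers["Authorization"] = f"Bearer {key}"
    return headers
```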
You may need to set API keys via environment variables or modify the OpenAI client initialization in the code, depending on your API provider's requirements.

## Directory Configuration
Directory where log files are stored. Created automatically if it doesn’t exist.
Directory where result JSON files are stored. Created automatically if it doesn’t exist.
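The auto-creation described above can be done with `os.makedirs`; the directory names below are placeholders, since the document does not give the actual variable names:

```python
import os

# Placeholder names; the document does not specify the real variables.
logs_dir = "logs"
results_dir = "results"

for d in (logs_dir, results_dir):
    os.makedirs(d, exist_ok=True)  # create if missing, no error if present
```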