
Overview

The agent system uses environment variables for configuration, API credentials, and runtime settings. Variables can be set in a .env file, passed via command line, or configured in your deployment platform.
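The `.env` format is plain `KEY=value` lines. In practice the project loads it with python-dotenv's `load_dotenv()` (see Troubleshooting below); the stdlib-only sketch here just illustrates how such a file maps onto environment variables — `load_env_file` is a hypothetical helper, not part of the agent codebase.

```python
# Stdlib-only sketch of how a .env file is read into os.environ.
# The real project uses python-dotenv; this only illustrates the
# KEY=value format used by the example files below.
import os

def load_env_file(path: str) -> dict:
    """Parse KEY=value lines, skipping blanks and # comments."""
    values = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    os.environ.update(values)
    return values
```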

Required Variables

At a minimum, these variables are required to run any agent:

BET_FROM_PRIVATE_KEY (string, required)
Private key for the Ethereum wallet used to place bets and interact with prediction markets.
Format: 64-character hexadecimal string (with or without 0x prefix)
Never commit this to version control. Keep it secure and never share it.

OPENAI_API_KEY (string, required)
OpenAI API key for GPT model access.
Format: sk-...
Required for most agents that use language models for prediction and reasoning.
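A quick way to catch a malformed BET_FROM_PRIVATE_KEY before the agent starts is to check the format described above (64 hex characters, with or without the 0x prefix). This is a hypothetical helper for illustration, not part of the agent codebase:

```python
# Sketch: validate the BET_FROM_PRIVATE_KEY format described above
# (64 hexadecimal characters, optional 0x prefix). Hypothetical helper.
import re

_KEY_RE = re.compile(r"^(0x)?[0-9a-fA-F]{64}$")

def is_valid_private_key(value: str) -> bool:
    return bool(_KEY_RE.fullmatch(value))
```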

AI Model API Keys

Depending on which agent you run, you may need additional AI service API keys:
OPENAI_API_KEY (string)
OpenAI API key for GPT-4, GPT-4o, GPT-3.5, and o1 models.
Used by:
  • prophet_gpt4o
  • prophet_gpt4
  • prophet_gpt4omini
  • prophet_o1
  • prophet_o1mini
  • prophet_o3mini
  • microchain
  • think_thoroughly
  • Most other agents

ANTHROPIC_API_KEY (string)
Anthropic API key for Claude models.
Used by:
  • prophet_claude3_opus
  • prophet_claude35_haiku
  • prophet_claude35_sonnet

GOOGLE_API_KEY (string)
Google API key for Gemini models.
Used by:
  • prophet_gemini20flash

DEEPSEEK_API_KEY (string)
DeepSeek API key for DeepSeek models.
Used by:
  • prophet_deepseekchat
  • prophet_deepseekr1

Research and Information API Keys

Agents use various APIs for research and information gathering:
SERPER_API_KEY (string)
Serper API key for Google Search integration.
Purpose: Enables agents to search the web for information about markets
Get it: serper.dev
Used by: Research-intensive agents like think_thoroughly, gptr_agent

TAVILY_API_KEY (string)
Tavily API key for AI-powered search.
Purpose: Advanced search and information retrieval
Get it: tavily.com
Used by: Research agents, think_thoroughly_prophet

GRAPH_API_KEY (string)
The Graph API key for querying blockchain data.
Purpose: Access to indexed blockchain data and market information
Used by: Agents that need historical market data

Social Media Integration

FARCASTER_PRIVATE_KEY (string)
Private key for the Farcaster account.
Purpose: Post predictions and insights to Farcaster
Used by: social_media agent

TWITTER_API_KEY (string)
Twitter API key (Consumer Key).
Used by: social_media agent

TWITTER_API_KEY_SECRET (string)
Twitter API key secret (Consumer Secret).
Used by: social_media agent

TWITTER_ACCESS_TOKEN (string)
Twitter access token for authenticated requests.
Used by: social_media agent

TWITTER_ACCESS_TOKEN_SECRET (string)
Twitter access token secret.
Used by: social_media agent

TWITTER_BEARER_TOKEN (string)
Twitter bearer token for API v2.
Used by: social_media agent

Vector Database and Embeddings

PINECONE_API_KEY (string)
Pinecone API key for vector database operations.
Purpose: Store and retrieve embeddings for semantic search
Used by: olas_embedding_oa, agents using embedding-based retrieval

Database Configuration

SQLALCHEMY_DB_URL (string)
Database connection URL for SQLAlchemy.
Format: postgresql://user:password@host:port/database
Purpose: Persistent storage for agent state, market data, and analytics
Example: postgresql://agent:password@localhost:5432/prediction_agent
Used by: Agents that require persistent state, NFT treasury game agents
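A stdlib-only sketch of sanity-checking the URL format above before an agent starts (SQLAlchemy's own `make_url()` does this more thoroughly; `check_db_url` is a hypothetical helper):

```python
# Sketch: sanity-check an SQLALCHEMY_DB_URL against the format above.
# Stdlib only; SQLAlchemy's make_url() is the more thorough option.
from urllib.parse import urlsplit

def check_db_url(url: str) -> None:
    parts = urlsplit(url)
    if parts.scheme != "postgresql":
        raise ValueError(f"unexpected scheme: {parts.scheme!r}")
    if not parts.password:
        raise ValueError("missing password")            # a common mistake
    if (parts.port or 5432) != 5432:
        print(f"note: non-default port {parts.port}")   # default is 5432
    if not parts.path.lstrip("/"):
        raise ValueError("missing database name")

check_db_url("postgresql://agent:password@localhost:5432/prediction_agent")
```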

Observability and Monitoring

LANGFUSE_SECRET_KEY (string)
Langfuse secret key for authentication.
Purpose: Track LLM calls, costs, and performance

LANGFUSE_PUBLIC_KEY (string)
Langfuse public key for authentication.
Purpose: Track LLM calls, costs, and performance

LANGFUSE_HOST (string)
Langfuse host URL.
Default: https://cloud.langfuse.com
Purpose: Endpoint for Langfuse telemetry data

LANGFUSE_DEPLOYMENT_VERSION (string)
Deployment version tag for Langfuse traces.
Default: none (set in Dockerfile)
Purpose: Track which version of code generated specific traces
Note: Automatically set to the git SHA during CI/CD builds

System Configuration

PYTHONPATH (string)
Python module search path.
Default: /app (in Docker)
Purpose: Ensure Python can find project modules

PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION (string)
Protocol Buffers implementation to use.
Default: python
Purpose: Use the pure-Python implementation of Protocol Buffers

TRANSFORMERS_NO_ADVISORY_WARNINGS (string)
Disable transformers library warnings.
Default: 1
Purpose: Suppress warnings about PyTorch (transformers is only used for tokenization)

Application Configuration

free_for_everyone (string)
Enable no-auth mode for Streamlit apps.
Default: 1
Purpose: Opens Streamlit apps without authentication requirements
Used by: Interactive Streamlit applications

runnable_agent_name (string)
Name of the agent to run (used in the Docker CMD).
Format: One of the RunnableAgent enum values
Examples: prophet_gpt4o, microchain, social_media
Purpose: Specify which agent to run when the container starts

market_type (string)
Target market type (used in the Docker CMD).
Options: omen, manifold, polymarket, metaculus
Purpose: Specify which prediction market platform to interact with
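A sketch of how a container entrypoint might read these two settings at startup. The variable names and the market_type options come from this page; the defaults and the `read_runtime_config` helper are illustrative assumptions, not the actual entrypoint code:

```python
# Sketch: read runnable_agent_name and market_type at container start.
# Defaults and validation set are assumptions based on the options above.
import os

VALID_MARKETS = {"omen", "manifold", "polymarket", "metaculus"}

def read_runtime_config() -> tuple:
    agent = os.environ.get("runnable_agent_name", "prophet_gpt4o")
    market = os.environ.get("market_type", "omen")
    if market not in VALID_MARKETS:
        raise ValueError(f"unknown market_type: {market!r}")
    return agent, market
```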

Example Configuration Files

Minimal Configuration

Minimum required for basic agent operation:
.env
BET_FROM_PRIVATE_KEY=your_private_key_here
OPENAI_API_KEY=sk-your_openai_key_here

Standard Configuration

Recommended for most agents:
.env
# Required
BET_FROM_PRIVATE_KEY=your_private_key_here
OPENAI_API_KEY=sk-your_openai_key_here

# Research APIs
SERPER_API_KEY=your_serper_key_here
TAVILY_API_KEY=your_tavily_key_here

# Monitoring
LANGFUSE_SECRET_KEY=sk-lf-your_secret_here
LANGFUSE_PUBLIC_KEY=pk-lf-your_public_here
LANGFUSE_HOST=https://cloud.langfuse.com

# Optional
GRAPH_API_KEY=your_graph_key_here

Full Configuration

Complete configuration with all services:
.env
# === Core Credentials ===
BET_FROM_PRIVATE_KEY=your_private_key_here

# === AI Model APIs ===
OPENAI_API_KEY=sk-your_openai_key_here
ANTHROPIC_API_KEY=sk-ant-your_anthropic_key_here
GOOGLE_API_KEY=your_google_key_here
DEEPSEEK_API_KEY=your_deepseek_key_here

# === Research APIs ===
SERPER_API_KEY=your_serper_key_here
TAVILY_API_KEY=tvly-your_tavily_key_here
GRAPH_API_KEY=your_graph_key_here

# === Social Media ===
FARCASTER_PRIVATE_KEY=your_farcaster_key_here
TWITTER_API_KEY=your_twitter_key_here
TWITTER_API_KEY_SECRET=your_twitter_secret_here
TWITTER_ACCESS_TOKEN=your_twitter_token_here
TWITTER_ACCESS_TOKEN_SECRET=your_twitter_token_secret_here
TWITTER_BEARER_TOKEN=your_twitter_bearer_here

# === Vector Database ===
PINECONE_API_KEY=your_pinecone_key_here

# === Database ===
SQLALCHEMY_DB_URL=postgresql://user:password@host:5432/prediction_agent

# === Monitoring ===
LANGFUSE_SECRET_KEY=sk-lf-your_secret_here
LANGFUSE_PUBLIC_KEY=pk-lf-your_public_here
LANGFUSE_HOST=https://cloud.langfuse.com

# === Application ===
free_for_everyone=1

Docker Configuration

Using .env File

docker run --env-file .env ghcr.io/gnosis/prediction-market-agent:main

Using Environment Variables

docker run \
  -e BET_FROM_PRIVATE_KEY="your_key" \
  -e OPENAI_API_KEY="your_key" \
  -e runnable_agent_name="prophet_gpt4o" \
  -e market_type="omen" \
  ghcr.io/gnosis/prediction-market-agent:main

Docker Compose

docker-compose.yml
version: '3.8'

services:
  agent:
    image: ghcr.io/gnosis/prediction-market-agent:main
    environment:
      - runnable_agent_name=prophet_gpt4o
      - market_type=omen
    env_file:
      - .env
    restart: unless-stopped

Kubernetes Configuration

Using Secrets

apiVersion: v1
kind: Secret
metadata:
  name: agent-secrets
  namespace: agents
type: Opaque
stringData:
  BET_FROM_PRIVATE_KEY: "your_private_key"
  OPENAI_API_KEY: "sk-your_openai_key"
  SERPER_API_KEY: "your_serper_key"
  TAVILY_API_KEY: "your_tavily_key"
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: pma-agent
  namespace: agents
spec:
  template:
    spec:
      containers:
      - name: agent
        image: ghcr.io/gnosis/prediction-market-agent:main
        env:
        - name: runnable_agent_name
          value: "prophet_gpt4o"
        - name: market_type
          value: "omen"
        - name: BET_FROM_PRIVATE_KEY
          valueFrom:
            secretKeyRef:
              name: agent-secrets
              key: BET_FROM_PRIVATE_KEY
        - name: OPENAI_API_KEY
          valueFrom:
            secretKeyRef:
              name: agent-secrets
              key: OPENAI_API_KEY

Using ConfigMap

apiVersion: v1
kind: ConfigMap
metadata:
  name: agent-config
  namespace: agents
data:
  market_type: "omen"
  free_for_everyone: "1"
  LANGFUSE_HOST: "https://cloud.langfuse.com"
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: pma-agent
spec:
  template:
    spec:
      containers:
      - name: agent
        envFrom:
        - configMapRef:
            name: agent-config
        - secretRef:
            name: agent-secrets

Agent-Specific Requirements

Prophet agents
Required:
  • BET_FROM_PRIVATE_KEY
  • OPENAI_API_KEY (or alternative AI API key)
Optional:
  • SERPER_API_KEY - For web research
  • TAVILY_API_KEY - For enhanced research
  • LANGFUSE_* - For monitoring

Microchain agent
Required:
  • BET_FROM_PRIVATE_KEY
  • OPENAI_API_KEY
Optional:
  • SERPER_API_KEY - For tool-based research
  • LANGFUSE_* - For monitoring

Social media agent
Required:
  • BET_FROM_PRIVATE_KEY
  • OPENAI_API_KEY
  • FARCASTER_PRIVATE_KEY
  • TWITTER_API_KEY
  • TWITTER_API_KEY_SECRET
  • TWITTER_ACCESS_TOKEN
  • TWITTER_ACCESS_TOKEN_SECRET
  • TWITTER_BEARER_TOKEN

Research agents (think_thoroughly, gptr_agent)
Required:
  • BET_FROM_PRIVATE_KEY
  • OPENAI_API_KEY
  • SERPER_API_KEY
  • TAVILY_API_KEY
Optional:
  • GRAPH_API_KEY - For blockchain data
  • LANGFUSE_* - For monitoring

NFT treasury game agents
Required:
  • BET_FROM_PRIVATE_KEY
  • OPENAI_API_KEY
  • SQLALCHEMY_DB_URL
Purpose: Database stores game state and agent interactions

Embedding agents (olas_embedding_oa)
Required:
  • BET_FROM_PRIVATE_KEY
  • OPENAI_API_KEY
  • PINECONE_API_KEY
Purpose: Vector database for semantic search and retrieval

Agents without AI models
Required:
  • BET_FROM_PRIVATE_KEY
Note: These agents don't use AI models or external APIs

Security Best Practices

Never Commit Secrets

Keep .env files in .gitignore. Never commit API keys or private keys to version control.

Use Secret Management

Use proper secret management tools like Google Secret Manager, AWS Secrets Manager, or Kubernetes Secrets.

Rotate Keys Regularly

Rotate API keys and private keys periodically, especially after team member changes.

Limit Permissions

Use API keys with minimal required permissions. Create separate keys for different environments.

Troubleshooting

When you run an agent, it will check for required variables and tell you if any are missing:
Error: Missing required environment variable: OPENAI_API_KEY
Add the missing variable to your .env file or environment.
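The fail-fast check described above can be sketched like this; the actual agent code may differ, but the effect is the same error message. `require_env` is a hypothetical helper:

```python
# Sketch of a fail-fast check for required environment variables.
# Hypothetical helper; the agent's own check may be implemented differently.
import os

def require_env(*names: str) -> None:
    missing = [n for n in names if not os.environ.get(n)]
    if missing:
        raise SystemExit(
            "Error: Missing required environment variable: " + ", ".join(missing)
        )
```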
If you see authentication errors, verify your API keys:
# Test OpenAI key
curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
Verify your SQLALCHEMY_DB_URL format:
# Correct format
postgresql://user:password@host:5432/database

# Common mistakes:
# - Missing password
# - Wrong port (default is 5432)
# - Database doesn't exist
Ensure your .env file is in the correct location:
# Should be in project root
ls -la .env

# Test loading
python -c "from dotenv import load_dotenv; load_dotenv(); import os; print(os.getenv('OPENAI_API_KEY'))"

Next Steps

Local Deployment

Run agents locally with environment variables

Docker Deployment

Configure environment for Docker containers

Cloud Deployment

Set up secrets and config in Kubernetes

Agent Reference

Learn about specific agent requirements
