
Overview

Flower Engine reads its configuration from a config.yaml file in the project root. This file stores your API keys, database paths, and model settings, and it is gitignored so your credentials never end up in version control.
Never commit config.yaml to version control. Always use config.yaml.example as a template.

Initial Setup

1. Copy the example config:

   cp config.yaml.example config.yaml

2. Edit config.yaml with your API keys. Open config.yaml in your text editor and replace the placeholder values with real credentials.

3. Verify the configuration. Start the engine to ensure your config loads correctly:

   python -m engine.main

Configuration File Structure

Database Settings

# Database Storage Path (Relative to execution directory)
database_path: "./chroma_db"
The database_path specifies where the ChromaDB vector database stores world lore and memory embeddings. This is separate from the SQLite database (engine.db).
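Because the path is resolved relative to the directory you launch the engine from, starting the engine from a different directory can silently create a second, empty database. A minimal sketch of how a relative path behaves (resolve_db_path is a hypothetical helper, not part of the engine):

```python
from pathlib import Path

def resolve_db_path(configured: str, launch_dir: str) -> str:
    """Resolve database_path against the directory the engine
    was launched from, as a relative path behaves."""
    p = Path(configured)
    return str(p if p.is_absolute() else Path(launch_dir) / p)

resolve_db_path("./chroma_db", "/home/user/flower-engine")
# -> "/home/user/flower-engine/chroma_db"
```

If you want the database in a fixed location regardless of where you launch from, use an absolute path for database_path.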

LLM Provider API Keys

OpenRouter provides access to multiple models through a single API:
openai_base_url: "https://openrouter.ai/api/v1"
openai_api_key: "sk-or-v1-YOUR_OPENROUTER_KEY_HERE"

Get an OpenRouter API Key

Sign up at OpenRouter to access GPT-4, Claude, Gemini, and more through one API.
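Because OpenRouter speaks the OpenAI wire format, pointing openai_base_url at it is all an OpenAI-compatible client needs. The sketch below (build_chat_request is illustrative, not engine code) shows the pieces of a chat-completion request built from these two config values:

```python
import json

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str) -> dict:
    """Assemble the URL, headers, and JSON body of an
    OpenAI-compatible chat completion call."""
    return {
        "url": f"{base_url.rstrip('/')}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }
```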

DeepSeek

For DeepSeek Chat and DeepSeek Reasoner models:
deepseek_api_key: "sk-YOUR_DEEPSEEK_KEY_HERE"

Google Gemini

For direct Gemini API access:
gemini_api_key: "AIzaSyYOUR_GEMINI_KEY_HERE"
You can obtain a free Gemini API key from Google AI Studio.

Model Configuration

# Default model used when starting a new session
default_model: "google/gemini-2.0-pro-exp-02-05:free"

# Models available in the TUI model selector
supported_models:
  - "google/gemini-2.0-pro-exp-02-05:free"
  - "openai/gpt-4o-mini"
  - "anthropic/claude-3-haiku"
  - "deepseek-chat"
  - "deepseek-reasoner"
The supported_models list is sent to the TUI on connection. The engine also fetches additional models from the OpenRouter and Groq APIs at startup and merges them into the list offered to the TUI.
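Combining the configured list with models fetched at startup can be sketched as follows (merge_model_lists is a hypothetical helper; the engine's actual merge logic may differ):

```python
def merge_model_lists(configured: list, fetched: list) -> list:
    """Merge supported_models from config.yaml with models fetched
    from provider APIs, de-duplicating while preserving order so
    config-listed models stay first in the TUI selector."""
    seen = set()
    merged = []
    for model in list(configured) + list(fetched):
        if model not in seen:
            seen.add(model)
            merged.append(model)
    return merged

merge_model_lists(["deepseek-chat", "openai/gpt-4o-mini"],
                  ["openai/gpt-4o-mini", "deepseek-reasoner"])
# -> ["deepseek-chat", "openai/gpt-4o-mini", "deepseek-reasoner"]
```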

Complete Example

config.yaml
# Engine Configuration
# Copy this file to config.yaml and fill in your real keys.
# config.yaml is gitignored and should NEVER be committed.

# Database Storage Path (Relative to execution directory)
database_path: "./chroma_db"

# LLM Providers - replace with your actual API keys
openai_base_url: "https://openrouter.ai/api/v1"
openai_api_key: "sk-or-v1-REPLACE_WITH_YOUR_OPENROUTER_KEY"

deepseek_api_key: "sk-REPLACE_WITH_YOUR_DEEPSEEK_KEY"

gemini_api_key: "AIzaSy..."

# Supported Models List (The TUI will receive these on Handshake)
default_model: "google/gemini-2.0-pro-exp-02-05:free"
supported_models:
  - "google/gemini-2.0-pro-exp-02-05:free"
  - "openai/gpt-4o-mini"
  - "anthropic/claude-3-haiku"
  - "deepseek-chat"
  - "deepseek-reasoner"

Configuration Loading

The engine loads configuration in this order:
  1. Read config.yaml using YAML parser
  2. Fall back to environment variables if keys are missing
  3. Use default values if neither config nor env vars exist
From engine/config.py:22:
OPENAI_API_KEY = CONFIG.get("openai_api_key", os.getenv("OPENAI_API_KEY", "dummy_key_if_local"))
You can use environment variables instead of config.yaml if preferred. Set OPENAI_API_KEY, DEEPSEEK_API_KEY, GEMINI_API_KEY, etc.
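The three-level fallback above can be sketched as a single helper (get_setting is illustrative; as the engine/config.py line shows, the real code inlines this pattern per key):

```python
import os

def get_setting(config: dict, key: str, env_var: str, default: str) -> str:
    """Precedence: config.yaml value -> environment variable -> built-in default."""
    return config.get(key, os.getenv(env_var, default))

get_setting({"deepseek_api_key": "sk-abc"}, "deepseek_api_key",
            "DEEPSEEK_API_KEY", "dummy_key_if_local")  # -> "sk-abc"
```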

Troubleshooting

Config file not found

If you see “Failed to load config.yaml”, ensure:
  • The file exists in the project root (same directory as engine/)
  • The file is named exactly config.yaml (not config.yml)
  • The file has valid YAML syntax
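The first two checks above can be automated with a small helper (find_config is hypothetical, assuming the project layout described here):

```python
from pathlib import Path
from typing import Optional

def find_config(project_root: str) -> Optional[Path]:
    """Look for config.yaml in the project root; flag the common
    mistake of naming the file config.yml instead."""
    root = Path(project_root)
    if (root / "config.yaml").is_file():
        return root / "config.yaml"
    if (root / "config.yml").is_file():
        raise FileNotFoundError(
            "found config.yml - the engine expects config.yaml")
    return None
```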

API key errors

If you get authentication errors:
  • Verify your API keys are correct and active
  • Check for extra spaces or quotes around keys
  • Ensure keys start with the correct prefix (sk-or-v1- for OpenRouter, sk- for DeepSeek)
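A quick way to catch prefix mistakes before a failed request (check_key_prefixes and its prefix table are assumptions based on the prefixes listed above, not engine code):

```python
# Expected key prefixes per provider, as documented above.
KEY_PREFIXES = {
    "openai_api_key": "sk-or-v1-",  # OpenRouter
    "deepseek_api_key": "sk-",
    "gemini_api_key": "AIzaSy",
}

def check_key_prefixes(config: dict) -> list:
    """Return a warning for each key that is present but does not
    start with its provider's expected prefix (a common
    copy/paste mistake)."""
    problems = []
    for name, prefix in KEY_PREFIXES.items():
        value = config.get(name, "")
        if value and not value.strip().strip('"').startswith(prefix):
            problems.append(f"{name} should start with {prefix!r}")
    return problems
```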

Model not available

If a model fails to load:
  • Verify the model ID matches the provider’s format
  • Check that you have the correct API key for that provider
  • Review engine startup logs for model fetch errors

Next Steps

Creating Worlds

Define worlds with lore, scenes, and starting messages

Creating Characters

Create characters with unique personas
