
Prerequisites

Before starting, ensure you have the following installed:
  • Python 3.13+ (required for modern async features)
  • PostgreSQL via Supabase (managed database)
  • Redis 5.0+ (for Celery task queue)
  • Git (for version control)

Initial Setup

1. Clone the Repository

git clone <repository-url>
cd pythonserver

2. Create Virtual Environment

Python 3.13 is required for async features and performance optimizations.
python3.13 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
Always activate the virtual environment before running any commands. Verify with:
which python  # Should point to venv/bin/python

3. Install Dependencies

pip install -r requirements.txt
This installs:
  • FastAPI 0.109+ (web framework)
  • SQLAlchemy 2.0 (ORM)
  • Celery 5.3+ (task queue)
  • Playwright 1.56+ (web scraping)
  • Anthropic Claude SDK
  • Logfire 4.14+ (observability)
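
As an illustration, a requirements.txt consistent with the versions above might look like the fragment below (the exact pins are hypothetical; use the file shipped with the repository):

```text
fastapi>=0.109
sqlalchemy>=2.0
celery>=5.3
playwright>=1.56
anthropic
logfire>=4.14
```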

4. Install Playwright Browsers

Playwright requires Chromium for headless web scraping (~300MB download).
playwright install chromium

5. Configure Environment Variables

Copy the example environment file and configure your credentials:
cp .env.example .env
Edit .env with your values:
# Database (Supabase Transaction Pooler)
DB_USER=postgres.<project-ref>
DB_PASSWORD=your-password
DB_HOST=aws-1-<region>.pooler.supabase.com
DB_PORT=6543
DB_NAME=postgres

# Supabase
SUPABASE_URL=https://xxx.supabase.co
SUPABASE_SERVICE_ROLE_KEY=eyJhbG...

# APIs
ANTHROPIC_API_KEY=sk-ant-...
EXA_API_KEY=your-exa-key

# Redis (Celery)
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_DB=0

# Observability
LOGFIRE_TOKEN=your-token

# Server
ENVIRONMENT=development
DEBUG=True
ALLOWED_ORIGINS=http://localhost:3000
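
To show how the database values above fit together, here is a minimal sketch of assembling a SQLAlchemy connection URL from them (the helper name `build_db_url` and the `psycopg` driver choice are assumptions, not the project's actual code):

```python
import os

def build_db_url() -> str:
    """Assemble a SQLAlchemy URL from the .env values documented above.

    Falls back to common defaults when a variable is unset.
    """
    user = os.environ.get("DB_USER", "postgres")
    password = os.environ.get("DB_PASSWORD", "")
    host = os.environ.get("DB_HOST", "localhost")
    port = os.environ.get("DB_PORT", "6543")
    name = os.environ.get("DB_NAME", "postgres")
    return f"postgresql+psycopg://{user}:{password}@{host}:{port}/{name}"
```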

6. Run Database Migrations

Apply all database migrations to set up tables:
alembic upgrade head
Verify migration status:
alembic current

7. Verify Installation

Test that all imports work correctly:
python --version  # Should be 3.13+
playwright --version
python -c "import anthropic, celery, fastapi; print('✅ All imports successful')"
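
The one-liner import check above can be generalized into a small helper that reports which dependencies are missing instead of failing on the first one (a sketch; `missing_modules` is an illustrative name, not part of the codebase):

```python
import importlib.util

def missing_modules(names):
    """Return the subset of module names that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Example: check the project's core dependencies
REQUIRED = ["fastapi", "sqlalchemy", "celery", "anthropic", "playwright"]
```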

Running Services

# Terminal 1: Start Redis
make redis-start

# Terminal 2: Start API + Celery Worker
make serve
Access Points:
  • API: http://localhost:8000
  • Interactive API docs (Swagger UI): http://localhost:8000/docs
  • Flower dashboard (if running): http://localhost:5555

Manual Start (For Debugging)

Run services individually for better control:
# Terminal 1: Redis
make redis-start
# Or manually: redis-server

# Terminal 2: FastAPI (with hot reload)
uvicorn main:app --reload --host 0.0.0.0 --port 8000

# Terminal 3: Celery worker
celery -A celery_config.celery_app worker \
  --loglevel=info \
  --queues=email_default \
  --concurrency=1

# Terminal 4: Flower monitoring (optional)
celery -A celery_config.celery_app flower
Celery runs with concurrency=1 for sequential processing. This prevents API rate limits when generating multiple emails.

Stopping Services

make stop-all

# Or manually:
pkill -f uvicorn
pkill -f celery
redis-cli shutdown

IDE Configuration

VS Code Setup

Create .vscode/settings.json:
{
  "python.defaultInterpreterPath": "${workspaceFolder}/venv/bin/python",
  "[python]": {
    "editor.defaultFormatter": "ms-python.black-formatter",
    "editor.formatOnSave": true,
    "editor.codeActionsOnSave": {
      "source.organizeImports": "explicit"
    }
  },
  "black-formatter.args": ["--line-length=100"],
  "flake8.args": ["--max-line-length=100"],
  "python.testing.pytestEnabled": true,
  "python.testing.pytestArgs": [
    ".",
    "-v"
  ]
}
Install the Black Formatter (ms-python.black-formatter) and Flake8 (ms-python.flake8) extensions; the legacy python.linting.* and python.formatting.* settings have been removed from the VS Code Python extension.

PyCharm Setup

  1. Configure Interpreter: Settings → Project → Python Interpreter → Add Interpreter → Existing Environment → venv/bin/python
  2. Enable pytest: Settings → Tools → Python Integrated Tools → Testing → Default test runner: pytest
  3. Code Style: Settings → Editor → Code Style → Python → Set line length to 100

Pre-commit Hooks

While not strictly enforced, you can set up pre-commit hooks for code quality:
# Install pre-commit
pip install pre-commit

# Install git hooks
pre-commit install
Create .pre-commit-config.yaml:
repos:
  - repo: https://github.com/psf/black
    rev: 23.3.0
    hooks:
      - id: black
        args: [--line-length=100]

  - repo: https://github.com/pycqa/flake8
    rev: 6.0.0
    hooks:
      - id: flake8
        args: [--max-line-length=100, --ignore=E203,W503]

  - repo: https://github.com/pycqa/isort
    rev: 5.12.0
    hooks:
      - id: isort
        args: [--profile=black]

Environment Variables Reference

| Variable | Required | Default | Description |
|---|---|---|---|
| ENVIRONMENT | No | development | Runtime environment |
| DEBUG | No | False | Enable debug mode |
| DB_USER | Yes | - | Supabase database user |
| DB_PASSWORD | Yes | - | Database password |
| DB_HOST | Yes | - | Database host (pooler URL) |
| DB_PORT | No | 6543 | Transaction pooler port |
| DB_NAME | Yes | - | Database name |
| SUPABASE_URL | Yes | - | Supabase project URL |
| SUPABASE_SERVICE_ROLE_KEY | Yes | - | Service role key |
| ANTHROPIC_API_KEY | Yes | - | Claude API key |
| EXA_API_KEY | Yes | - | Exa search API key |
| FIREWORKS_API_KEY | No | - | Fireworks AI key (optional) |
| REDIS_HOST | No | localhost | Redis host |
| REDIS_PORT | No | 6379 | Redis port |
| REDIS_DB | No | 0 | Redis database number |
| LOGFIRE_TOKEN | No | - | Logfire observability token |
| TEMPLATE_PARSER_MODEL | No | fireworks:kimi-k2p5 | Model for template parsing |
| EMAIL_COMPOSER_MODEL | No | fireworks:kimi-k2p5 | Model for email generation |
| ALLOWED_ORIGINS | No | http://localhost:3000 | CORS origins (comma-separated) |
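
Because missing required variables otherwise surface only at runtime, a startup check along these lines can fail fast (a sketch; `REQUIRED_VARS` and `check_required` are illustrative names, not the project's actual code):

```python
import os

# Variables marked "Yes" in the table above
REQUIRED_VARS = [
    "DB_USER", "DB_PASSWORD", "DB_HOST", "DB_NAME",
    "SUPABASE_URL", "SUPABASE_SERVICE_ROLE_KEY",
    "ANTHROPIC_API_KEY", "EXA_API_KEY",
]

def check_required(env=os.environ):
    """Return the required variables that are unset or empty."""
    return [v for v in REQUIRED_VARS if not env.get(v)]
```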

Hot-Swappable LLM Models

Scribe supports switching between different LLM providers without code changes:
# Use Anthropic Claude
TEMPLATE_PARSER_MODEL=anthropic:claude-haiku-4-5
EMAIL_COMPOSER_MODEL=anthropic:claude-sonnet-4-5

# Use Fireworks AI (faster, cheaper)
TEMPLATE_PARSER_MODEL=fireworks:accounts/fireworks/models/kimi-k2p5
EMAIL_COMPOSER_MODEL=fireworks:accounts/fireworks/models/kimi-k2p5
Cost Optimization: Use lightweight models (Haiku, Kimi) for template parsing and powerful models (Sonnet) for final email composition.
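
The `provider:model` strings above are straightforward to split; a sketch of how such a spec might be parsed (the helper name is hypothetical, not the project's actual implementation):

```python
def parse_model_spec(spec: str) -> tuple[str, str]:
    """Split a 'provider:model' spec into its two parts.

    The model id may itself contain slashes (Fireworks account paths),
    so only the first colon is treated as the separator.
    """
    provider, sep, model = spec.partition(":")
    if not sep or not provider or not model:
        raise ValueError(f"Expected 'provider:model', got {spec!r}")
    return provider, model
```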

Common Issues

redis.exceptions.ConnectionError: Error connecting to Redis
Solution: Start the Redis server, then confirm it responds:
make redis-start
redis-cli ping  # Should respond "PONG"
playwright._impl._api_types.Error: Executable doesn't exist
Solution: Install Chromium
playwright install chromium
ModuleNotFoundError: No module named 'pipeline'
Solution: Ensure you’re in the project root and venv is activated
cd /path/to/pythonserver
source venv/bin/activate
which python  # Should be venv/bin/python
sqlalchemy.exc.OperationalError: timeout expired
Solution: Check Supabase status and connection string
  • Verify DB_HOST uses .pooler.supabase.com
  • Ensure DB_PORT=6543 for transaction pooler
  • Check firewall/network settings
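
For the Redis and database timeouts above, a quick TCP reachability probe can rule out network problems before digging into credentials (a standard-library-only sketch; `can_connect` is an illustrative name):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example probes:
# can_connect("localhost", 6379)                        # Redis
# can_connect("aws-1-<region>.pooler.supabase.com", 6543)  # Supabase pooler
```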

Next Steps

  • Testing Guide: learn how to write and run tests
  • Debugging: debug pipeline steps and Celery tasks
  • Project Structure: understand the codebase organization
  • API Reference: explore the REST API endpoints
