Introduction
The SeanceAI API lets you build applications for AI-powered conversations with historical figures. The API provides endpoints for:

- Retrieving information about available historical figures
- Having one-on-one conversations with specific figures
- Creating “dinner party” conversations with multiple figures
- Getting contextual follow-up question suggestions
Base URL
Authentication
The SeanceAI API uses OpenRouter for AI model access. You need to configure the `OPENROUTER_API_KEY` environment variable on the server side. No client-side authentication is required for API requests.
Environment Variables
- `OPENROUTER_API_KEY`: Your OpenRouter API key for accessing AI models
Request Format
All POST endpoints expect JSON payloads with `Content-Type: application/json`.
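For example, a minimal request body might look like the following (the field names here are illustrative; each endpoint's page documents its actual schema):

```json
{
  "message": "Tell me about the telegraph.",
  "model": "google/gemma-3-12b-it:free"
}
```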
Response Format
All endpoints return JSON responses. Successful responses have a 200 status code, while errors return appropriate 4xx or 5xx codes.

Success Response
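A hedged sketch of a success payload (the field names and values are hypothetical; each endpoint defines its own payload):

```json
{
  "response": "Good evening. What would you like to discuss?",
  "figure": "ada-lovelace"
}
```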
Error Response
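A hedged sketch, assuming errors are returned as a single `error` field (hypothetical name):

```json
{
  "error": "Figure not found"
}
```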
Error Codes
- `400`: Missing or invalid request parameters
- `404`: Requested resource (figure, endpoint) not found
- `500`: Server error, often due to AI model issues or rate limiting
Rate Limiting
The API implements automatic retry logic with exponential backoff and fallback models when rate limits are encountered. If all models are rate-limited, you’ll receive an error message.

Available Models
The API supports multiple AI models organized by capability tier:

Swift Tier (Free)
- google/gemma-3-12b-it:free (default)
- google/gemma-3-27b-it:free
- google/gemma-3-4b-it:free
- meta-llama/llama-3.3-70b-instruct:free
- meta-llama/llama-3.1-405b-instruct:free
Balanced Tier
- openai/gpt-4o-mini
- anthropic/claude-3.5-haiku
- deepseek/deepseek-chat
Advanced Tier
- anthropic/claude-sonnet-4
- openai/gpt-4o
- google/gemini-2.5-pro-preview
- anthropic/claude-opus-4
Query the `/api/models` endpoint to list the available models, or specify a model in the request body using the `model` parameter.
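For example, a request body that overrides the default model (other fields omitted; they depend on the endpoint):

```json
{
  "model": "anthropic/claude-3.5-haiku"
}
```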
GET /api/models
Returns the list of available AI models and the default model.

Response
- Array of model objects, each containing:
  - `id` (string): Model identifier for use in API requests
  - `name` (string): Human-readable model name
  - `tier` (string): Model tier, one of “swift”, “balanced”, or “advanced”
- The default model ID used when no model is specified
Example Response
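A sketch of a plausible response, assuming hypothetical top-level keys `models` and `default_model` (the display names shown are also illustrative):

```json
{
  "models": [
    { "id": "google/gemma-3-12b-it:free", "name": "Gemma 3 12B", "tier": "swift" },
    { "id": "openai/gpt-4o-mini", "name": "GPT-4o Mini", "tier": "balanced" },
    { "id": "anthropic/claude-sonnet-4", "name": "Claude Sonnet 4", "tier": "advanced" }
  ],
  "default_model": "google/gemma-3-12b-it:free"
}
```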
GET /api/health
Health check endpoint for monitoring application status. Useful for uptime monitoring services and deployment health checks.

Response
- Application health status: “healthy” or “degraded”
- Whether the `OPENROUTER_API_KEY` environment variable is configured
- Length of the configured API key (0 if not set)
- Warning message if status is “degraded” (only present when the API key is not configured)
Example Response (Healthy)
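A sketch of a healthy response, assuming hypothetical field names (`status`, `api_key_configured`, `api_key_length`) matching the field descriptions above; the key length shown is illustrative:

```json
{
  "status": "healthy",
  "api_key_configured": true,
  "api_key_length": 73
}
```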
Example Response (Degraded)
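A sketch of a degraded response, again with hypothetical field names; per the field descriptions above, the warning appears only when the API key is not configured:

```json
{
  "status": "degraded",
  "api_key_configured": false,
  "api_key_length": 0,
  "warning": "OPENROUTER_API_KEY environment variable is not configured"
}
```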
Use this endpoint for Railway uptime monitoring to keep your free-tier app awake. See the Deployment Guide for details.
Next Steps
- Figures: Learn about retrieving historical figure data
- Chat: Start conversations with historical figures
- Dinner Party: Host conversations with multiple figures
- Suggestions: Generate contextual follow-up questions