Overview

The Interview Simulator supports multiple AI providers through a common AIProvider protocol. This enables flexible provider selection and automatic failover.

AIProvider Protocol

All providers implement the AIProvider protocol, which defines the interface for text generation.
from typing import Protocol

class AIProvider(Protocol):
    def generate_text(self, prompt: str) -> str:
        ...
Parameters:
  • prompt (str, required): The prompt text to send to the AI model
Returns:
  • str: Generated text response from the AI model
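Because AIProvider is a structural Protocol, any class with a matching generate_text method satisfies it — no inheritance needed. A minimal sketch (EchoProvider is a hypothetical example class, not part of the codebase; runtime_checkable is added here only to allow isinstance checks):

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class AIProvider(Protocol):
    def generate_text(self, prompt: str) -> str:
        ...

class EchoProvider:
    """Hypothetical provider that simply echoes the prompt back."""
    def generate_text(self, prompt: str) -> str:
        return f"echo: {prompt}"

# EchoProvider never subclasses AIProvider, yet it satisfies the protocol.
provider: AIProvider = EchoProvider()
```

Static type checkers accept EchoProvider wherever an AIProvider is expected, which is what lets GeminiProvider and OpenRouterProvider coexist behind one interface.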

Provider Implementations

GeminiProvider

Google's Gemini AI provider implementation with built-in retry logic.

Initialization:
from client.gemini_provider import GeminiProvider

provider = GeminiProvider(
    api_key="your-gemini-api-key",
    model_name="gemini-2.5-flash"
)
Parameters:
  • api_key (str, required): Google Gemini API key. Raises ValueError if empty.
  • model_name (str, default "gemini-2.5-flash"): The Gemini model to use for text generation
Methods:

generate_text()

response = provider.generate_text(
    prompt="Generate an interview question about Python"
)
Parameters:
  • prompt (str, required): The prompt to send to Gemini
Returns:
  • str: Generated text from the Gemini model
Features:
  • Retry Logic: 3 attempts with exponential backoff (2-10 seconds)
  • Retries on: ConnectionError, TimeoutError
  • Error Handling: Raises RuntimeError for blocked/empty responses
  • Location: client/gemini_provider.py:23
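The retry behaviour above can be sketched as a plain loop. This is an illustration, not the actual code in client/gemini_provider.py (which may use a retry library); the function name and parameters are assumptions, with the delay made configurable for testing:

```python
import time

def generate_with_retry(call, prompt: str, attempts: int = 3,
                        base_delay: float = 2.0, max_delay: float = 10.0) -> str:
    """Retry call(prompt) on transient errors with exponential backoff.

    Delays double each attempt (2s, 4s, ...) and are capped at max_delay,
    matching the 2-10 second window described above.
    """
    for attempt in range(1, attempts + 1):
        try:
            return call(prompt)
        except (ConnectionError, TimeoutError):
            if attempt == attempts:
                raise  # out of retries: surface the last error
            time.sleep(min(base_delay * 2 ** (attempt - 1), max_delay))
```

Only ConnectionError and TimeoutError trigger a retry; other exceptions (such as the RuntimeError raised for blocked responses) propagate immediately.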
Example:
from client.gemini_provider import GeminiProvider
import os

provider = GeminiProvider(
    api_key=os.getenv("GEMINI_API_KEY"),
    model_name="gemini-2.5-flash"
)

try:
    response = provider.generate_text("What is Python?")
    print(response)
except RuntimeError as e:
    print(f"Generation failed: {e}")

ProviderManager

The ProviderManager implements a circuit breaker pattern for managing multiple providers with automatic failover.
from client.ai_provider_manager import ProviderManager
from client.gemini_provider import GeminiProvider
from client.openrouter_provider import OpenRouterProvider

providers = [
    GeminiProvider(api_key="gemini-key"),
    OpenRouterProvider(api_key="openrouter-key")
]

manager = ProviderManager(providers=providers)
Parameters:
  • providers (list[AIProvider], required): List of AI provider instances to manage

generate_text()

Generates text using the first available provider, with automatic failover.
response = manager.generate_text(
    prompt="Generate an interview question"
)
Parameters:
  • prompt (str, required): The prompt to send to AI providers
Returns:
  • str: Generated text from the first successful provider
Failover Logic:
  1. Iterates through providers in order
  2. Skips providers that are in “open” circuit state
  3. On failure, increments fail count for that provider
  4. After 3 failures, opens the circuit for 120 seconds
  5. Raises AIServiceError if all providers fail
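The five steps above can be sketched as a simplified circuit breaker. This is not the actual ProviderManager from client/ai_provider_manager.py: the class name is hypothetical, AIServiceError is stood in by a local exception, and resetting the failure count on success is an assumption:

```python
import time

class AIServiceError(Exception):
    """Stand-in for app.exceptions.AIServiceError."""

class SimpleProviderManager:
    FAIL_THRESHOLD = 3     # failures before a circuit opens
    OPEN_SECONDS = 120.0   # how long an open circuit stays open

    def __init__(self, providers):
        self.providers = providers
        self.fail_count = {id(p): 0 for p in providers}
        self.open_until = {id(p): 0.0 for p in providers}

    def generate_text(self, prompt: str) -> str:
        now = time.monotonic()
        for provider in self.providers:        # 1. iterate in order
            key = id(provider)
            if now < self.open_until[key]:     # 2. skip open circuits
                continue
            try:
                text = provider.generate_text(prompt)
                self.fail_count[key] = 0       # assumed: success resets count
                return text
            except Exception:
                self.fail_count[key] += 1      # 3. count the failure
                if self.fail_count[key] >= self.FAIL_THRESHOLD:
                    # 4. open the circuit for 120 seconds
                    self.open_until[key] = now + self.OPEN_SECONDS
        raise AIServiceError("All providers failed")  # 5
```

A caller sees a single generate_text call succeed as long as at least one provider with a closed circuit can answer.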
Example with Multiple Providers:
from client.ai_provider_manager import ProviderManager
from client.gemini_provider import GeminiProvider
from client.openrouter_provider import OpenRouterProvider
from app.exceptions import AIServiceError
import os

# Initialize providers
gemini = GeminiProvider(
    api_key=os.getenv("GEMINI_API_KEY")
)

openrouter = OpenRouterProvider(
    api_key=os.getenv("OPENROUTER_API_KEY")
)

# Create manager with fallback chain
manager = ProviderManager(providers=[gemini, openrouter])

try:
    # Will try Gemini first, fallback to OpenRouter if it fails
    response = manager.generate_text("What is Python?")
    print(response)
except AIServiceError as e:
    print(f"All providers failed: {e}")
Circuit Breaker States:
  • Closed (Normal): Provider is available for requests
  • Open (Failed): Provider has failed 3+ times and is unavailable for 120 seconds
  • Half-Open (Recovery): After 120 seconds, the provider becomes available again
Internal Attributes:
  • fail_count: Dictionary tracking failure count per provider
  • open_until: Dictionary tracking when each provider’s circuit can close
  • Location: client/ai_provider_manager.py:7

Best Practices

Provider Selection

  • Use Gemini for faster responses and lower costs with the Flash model
  • Use OpenRouter for access to diverse models including free options
  • Configure multiple providers for high availability

Error Handling

from app.exceptions import AIServiceError

try:
    response = manager.generate_text(prompt)
except AIServiceError as e:
    # Handle all providers failing
    logger.error(f"AI generation failed: {e}")
    # Fallback logic here

# Note: ValueError for an empty API key is raised when a provider is
# constructed, not by generate_text(), so validate keys at startup.

Configuration from Environment

import os
from client.gemini_provider import GeminiProvider
from client.openrouter_provider import OpenRouterProvider
from client.ai_provider_manager import ProviderManager

providers = []

# Add Gemini if configured
if gemini_key := os.getenv("GEMINI_API_KEY"):
    providers.append(GeminiProvider(api_key=gemini_key))

# Add OpenRouter if configured
if openrouter_key := os.getenv("OPENROUTER_API_KEY"):
    providers.append(OpenRouterProvider(api_key=openrouter_key))

if not providers:
    raise RuntimeError("No AI providers configured")

manager = ProviderManager(providers=providers)
