## Prerequisites

Before installing Interview Simulator, ensure you have:

- **Python 3.11 or higher**: check your version with `python --version`
- **pip**: Python package manager (included with Python)
- **Git**: for cloning the repository
- **AI provider API key**: get a free key from Google Gemini or OpenRouter

For Docker deployment, you'll also need Docker and Docker Compose installed.
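As a quick sanity check, the Python requirement can also be verified programmatically. This is a minimal sketch using only the standard library; the helper name is illustrative, not part of the project:

```python
import sys

def meets_minimum(minimum=(3, 11)):
    """Return True if the running interpreter satisfies the minimum version."""
    return sys.version_info[:2] >= minimum
```

If this returns `False`, install a newer Python before continuing.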
## Local setup

### Clone repository

Get the source code from GitHub:

```bash
git clone https://github.com/DanielPopoola/interview-simulator.git
cd interview-simulator
```
### Virtual environment

Create an isolated Python environment to avoid dependency conflicts:

```bash
python -m venv .venv
source .venv/bin/activate
```

You should see a `(.venv)` prefix in your terminal prompt when the environment is active. On Windows, activate with `.venv\Scripts\activate` instead.
### Install dependencies

Install all required packages from `requirements.txt`:

```bash
pip install -r requirements.txt
```

Key dependencies include:

```text
flask==3.1.2             # Web framework
flask-sqlalchemy==3.1.1  # ORM integration
google-genai==1.49.0     # Gemini AI provider
pdfplumber==0.11.8       # PDF text extraction
python-docx==1.2.0       # Word document parsing
python-dotenv==1.2.1     # Environment variable management
tenacity==9.1.2          # Retry logic for API calls
gunicorn==23.0.0         # Production WSGI server
hypercorn==0.18.0        # ASGI server
uvicorn==0.38.0          # Alternative ASGI server
pytest==9.0.0            # Testing framework
pytest-cov==7.0.0        # Code coverage
ruff==0.14.4             # Linter and formatter
```
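To illustrate how the document-parsing dependencies divide the work, here is a hypothetical dispatch by file extension. The real logic lives in `utils/document_parser.py`; the mapping and function name below are illustrative only:

```python
from pathlib import Path

# Illustrative mapping of upload type to parsing library.
PARSERS = {
    ".pdf": "pdfplumber",
    ".docx": "python-docx",
    ".txt": "built-in text reader",
}

def parser_for(filename):
    """Pick the parsing library for an uploaded CV by its extension."""
    suffix = Path(filename).suffix.lower()
    if suffix not in PARSERS:
        raise ValueError(f"Unsupported file type: {suffix}")
    return PARSERS[suffix]
```

Lower-casing the suffix means `cv.PDF` and `cv.pdf` are treated the same.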
### Environment configuration

Copy the example environment file:

```bash
cp .env.example .env
```

Edit `.env` with your configuration:

```bash
# AI Provider Configuration
GEMINI_API_KEY=your_gemini_api_key_here
OPENROUTER_API_KEY=your_openrouter_api_key_here
ACTIVE_PROVIDERS=openrouter,gemini

# Flask Configuration
SECRET_KEY=your_random_secret_key_change_in_production
FLASK_ENV=development

# Database Configuration
DATABASE_URL=sqlite:///instance/app.db

# File Upload Settings
UPLOAD_FOLDER=uploads
MAX_CONTENT_LENGTH=16777216  # 16MB in bytes
```
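For reference, a minimal sketch of how these variables might be read at startup. The real mapping lives in `app/config.py`; the function name and defaults shown here are illustrative:

```python
import os

def load_settings(env=os.environ):
    """Read .env values; python-dotenv loads them into os.environ,
    after which plain env lookups pick them up."""
    return {
        "SECRET_KEY": env.get("SECRET_KEY", "dev-only-change-me"),
        "DATABASE_URL": env.get("DATABASE_URL", "sqlite:///instance/app.db"),
        # Comma-separated string -> ordered list of provider names.
        "ACTIVE_PROVIDERS": [
            p.strip() for p in env.get("ACTIVE_PROVIDERS", "openrouter,gemini").split(",")
        ],
        # 16 * 1024 * 1024 bytes = 16777216 = 16 MiB.
        "MAX_CONTENT_LENGTH": int(env.get("MAX_CONTENT_LENGTH", str(16 * 1024 * 1024))),
    }
```

Passing the environment as a parameter keeps the sketch testable without touching the real `os.environ`.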
### Configuration reference

| Variable | Type | Default | Description |
|---|---|---|---|
| `OPENROUTER_API_KEY` | string | (none) | Optional OpenRouter API key for the alternative AI provider |
| `ACTIVE_PROVIDERS` | string | `"openrouter,gemini"` | Comma-separated list of active AI providers. The app will try providers in order until one succeeds. |
| `SECRET_KEY` | string | (none) | Flask session secret key. Generate a random string for production: `python -c "import secrets; print(secrets.token_hex(32))"` |
| `DATABASE_URL` | string | `"sqlite:///dev.db"` | Database connection string. Uses SQLite by default. For production, consider PostgreSQL: `postgresql://user:password@localhost/interview_simulator` |
| `UPLOAD_FOLDER` | string | (none) | Directory for storing uploaded CV files |
| `MAX_CONTENT_LENGTH` | integer | `"16777216"` | Maximum upload file size in bytes (default 16MB) |

> **Warning:** Never commit your `.env` file to version control. The `.gitignore` file excludes it by default.
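The provider-fallback behavior described for `ACTIVE_PROVIDERS` can be sketched as follows. This is a simplified illustration; the real logic lives in `client/ai_provider_manager.py`, and the names here are assumptions:

```python
class ProviderError(Exception):
    """Raised when a provider call fails (illustrative)."""

def generate_with_fallback(providers, prompt):
    """Try each (name, callable) provider in order; return the first
    successful response, or raise with every failure collected."""
    failures = []
    for name, provider in providers:
        try:
            return provider(prompt)
        except ProviderError as exc:
            failures.append(f"{name}: {exc}")
    raise ProviderError("; ".join(failures) or "no providers configured")
```

Because the list preserves order, putting your preferred provider first in `ACTIVE_PROVIDERS` makes it the default and the others a safety net.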
### Database initialization

The database is automatically created when you first run the application:

```python
# From app/__init__.py
def create_app(config_object=None):
    app = Flask(__name__)
    # ... configuration ...
    db.init_app(app)
    with app.app_context():
        db.create_all()  # Creates tables if they don't exist
    return app
```

The SQLite database is stored at `instance/app.db` by default.
### Run the application

Start the Flask development server:

```bash
flask run
```

Or run the WSGI entry point directly (this assumes `wsgi.py` guards a development `app.run()` call):

```bash
python wsgi.py
```

The application will be available at http://127.0.0.1:5000.

> **Warning:** The development server is not suitable for production. See the Docker deployment section for production setup.
## Docker deployment

Docker provides a containerized environment for consistent deployment across environments.

### Prerequisites

Install Docker and Docker Compose by following the official installation guides for your platform.
### Dockerfile

The application includes a production-ready Dockerfile:

```dockerfile
FROM python:3.14-slim

WORKDIR /app

# Install dependencies
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy application code
COPY . .

# Expose port
EXPOSE 8000

# Run with Gunicorn
CMD ["gunicorn", "wsgi:app", "--bind", "0.0.0.0:8000", "--workers", "4"]
```

Key configuration:

- **python:3.14-slim**: minimal base image
- **Gunicorn**: production WSGI server with 4 workers
- **Port 8000**: exposed for external access
### Docker Compose

The `docker-compose.yml` orchestrates the application:

```yaml
services:
  web:
    build: .
    ports:
      - "8000:8000"
    env_file:
      - .env
    volumes:
      - ./instance:/app/instance
```

Volume mounting ensures database persistence across container restarts.
### Build and run

Build the Docker image and start the container:

```bash
# Build and start in detached mode
docker-compose up --build -d

# View logs
docker-compose logs -f

# Stop the container
docker-compose down
```

The application will be available at http://localhost:8000.
### Production considerations

For production deployment:

1. **Use PostgreSQL**: replace SQLite with PostgreSQL for better concurrency:

   ```yaml
   services:
     web:
       # ... existing config ...
       environment:
         - DATABASE_URL=postgresql://user:password@db:5432/interview_simulator
       depends_on:
         - db
     db:
       image: postgres:16
       environment:
         POSTGRES_USER: user
         POSTGRES_PASSWORD: password
         POSTGRES_DB: interview_simulator
       volumes:
         - postgres_data:/var/lib/postgresql/data

   volumes:
     postgres_data:
   ```

2. **Secure secrets**: use Docker secrets or environment-specific `.env` files.

3. **Scale workers**: adjust Gunicorn workers based on CPU cores:

   ```dockerfile
   CMD ["gunicorn", "wsgi:app", "--bind", "0.0.0.0:8000", "--workers", "4", "--threads", "2"]
   ```

4. **Add a reverse proxy**: use Nginx for SSL termination and static file serving.

5. **Health checks**: add a Docker health check for monitoring:

   ```yaml
   healthcheck:
     test: ["CMD", "curl", "-f", "http://localhost:8000/"]
     interval: 30s
     timeout: 10s
     retries: 3
   ```
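The worker-scaling advice above is usually based on the (2 × cores + 1) heuristic from the Gunicorn documentation; treat the result as a baseline to tune under load, not a rule. A quick way to compute it:

```python
import multiprocessing

def recommended_workers(cores=None):
    """Gunicorn's suggested starting point: 2 * CPU cores + 1."""
    cores = cores if cores is not None else multiprocessing.cpu_count()
    return 2 * cores + 1
```

On a 2-core container this yields 5 workers; on 4 cores, 9.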
## Verify installation

Run the test suite to ensure everything is configured correctly:

```bash
# Run all tests
pytest

# Run with coverage report
pytest --cov=app --cov-report=term-missing

# Run specific test file
pytest tests/test_interview_service.py -v
```
Expected output:

```text
========================= test session starts =========================
platform linux -- Python 3.11.0, pytest-9.0.0
collected 42 items

tests/test_document_parser.py ........
tests/test_document_service.py ......
tests/test_feedback_service.py .....
tests/test_interview_service.py .........
tests/test_session_service.py .......

========================= 42 passed in 2.34s =========================
```
## Project structure

Understanding the codebase layout:

```text
interview-simulator/
├── app/                          # Main application package
│   ├── __init__.py               # Flask app factory
│   ├── config.py                 # Configuration management
│   ├── models.py                 # SQLAlchemy database models
│   ├── exceptions.py             # Custom exception classes
│   ├── extensions.py             # Flask extensions setup
│   ├── services/                 # Business logic layer
│   │   ├── session_service.py    # Session management
│   │   ├── interview_service.py  # Interview orchestration
│   │   ├── feedback_service.py   # Feedback generation
│   │   └── document_service.py   # Document handling
│   ├── repositories/             # Data access layer
│   │   ├── session_repository.py
│   │   ├── message_repository.py
│   │   ├── feedback_repository.py
│   │   └── file_repository.py
│   └── routes/                   # HTTP request handlers
│       ├── session_routes.py
│       ├── interview_routes.py
│       ├── feedback_routes.py
│       ├── document_routes.py
│       └── errors.py
│
├── client/                       # AI provider abstraction
│   ├── ai_client.py              # Main AI client orchestrator
│   ├── ai_provider.py            # Provider protocol definition
│   ├── ai_provider_manager.py    # Provider selection logic
│   ├── gemini_provider.py        # Google Gemini implementation
│   └── openrouter_provider.py    # OpenRouter implementation
│
├── utils/                        # Utility modules
│   ├── document_parser.py        # PDF/DOCX/TXT parsing
│   └── prompt_templates.py       # AI prompt templates
│
├── templates/                    # Jinja2 HTML templates
│   ├── index.html                # Homepage
│   ├── upload.html               # Document upload page
│   ├── interview.html            # Interview interface
│   ├── feedback.html             # Results display
│   └── fragments/                # HTMX partials
│
├── static/                       # Static assets
│   ├── css/
│   │   └── main.css
│   └── js/
│
├── tests/                        # Pytest test suite
│   ├── test_session_service.py
│   ├── test_interview_service.py
│   ├── test_feedback_service.py
│   ├── test_document_service.py
│   └── test_document_parser.py
│
├── instance/                     # Instance-specific files
│   └── app.db                    # SQLite database
│
├── uploads/                      # Uploaded CV files
│
├── .env                          # Environment variables (not in git)
├── .env.example                  # Example environment config
├── requirements.txt              # Python dependencies
├── wsgi.py                       # WSGI entry point
├── Dockerfile                    # Docker build instructions
└── docker-compose.yml            # Docker orchestration
```
## Troubleshooting

### Port already in use

If port 5000 or 8000 is already in use:

```bash
# Find the process using the port
lsof -i :5000

# Kill the process or use a different port
flask run --port 5001
```
### Module import errors

If you see `ModuleNotFoundError`:

- Ensure the virtual environment is activated (look for the `(.venv)` prefix)
- Reinstall dependencies: `pip install -r requirements.txt`
- Check your Python version: `python --version` (must be 3.11+)
### Database locked errors

SQLite locks the database during writes. If you encounter locking issues:

- Reduce concurrent requests in development
- For production, migrate to PostgreSQL
- Check that only one Flask process is running
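Another development-side mitigation, assuming direct `sqlite3` access, is to raise SQLite's busy timeout so a writer waits for the lock instead of failing immediately (the path and value below are illustrative; with SQLAlchemy the same `timeout` can be passed via `connect_args`):

```python
import sqlite3

def connect_with_timeout(path="instance/app.db", timeout_seconds=30.0):
    """Open a SQLite connection that waits up to `timeout_seconds`
    for a lock before raising 'database is locked'."""
    return sqlite3.connect(path, timeout=timeout_seconds)
```

This does not remove the single-writer limitation; it only makes transient contention survivable.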
### Docker build failures

If the Docker build fails:

```bash
# Clear Docker cache and rebuild
docker-compose down -v
docker system prune -a
docker-compose up --build
```
## Next steps

- **Quick start**: run your first mock interview
- **Architecture**: learn about the layered design
- **Configuration**: advanced configuration options
- **API reference**: explore the REST endpoints