## Pre-Deployment Checklist

Before deploying FinAI to production, ensure you’ve completed the following:
### Environment Variables

Verify all required environment variables are set with production values:

- `SECRET_KEY` - a strong, randomly generated key
- `GEMINI_API_KEY` - a valid Google AI API key
- `MAIL_USERNAME` / `MAIL_PASSWORD` - production email credentials
- `DATABASE_URL` - production database connection string
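A small startup check can make the list above enforceable. This is a sketch (the `check_required_env` helper is an illustration, not part of the project) that reports any missing or empty variables:

```python
import os

REQUIRED_VARS = ("SECRET_KEY", "GEMINI_API_KEY", "MAIL_USERNAME",
                 "MAIL_PASSWORD", "DATABASE_URL")

def check_required_env(environ=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not environ.get(name)]

# Example with only SECRET_KEY set:
print(check_required_env({"SECRET_KEY": "x"}))
# -> ['GEMINI_API_KEY', 'MAIL_USERNAME', 'MAIL_PASSWORD', 'DATABASE_URL']
```

Calling this at application startup and aborting when the list is non-empty fails fast instead of failing on the first AI or email request.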
### Debug Mode

Ensure debug mode is disabled in production:

```python
if __name__ == "__main__":
    app.run(debug=False)  # CRITICAL: must be False in production
```
### Security Audit

Review all security configurations and update any default credentials.
### Database Migration

Migrate from SQLite to a production-grade database (PostgreSQL or MySQL).
## Security Best Practices

### Secret Key Management

Never use the default fallback secret key in production. The current default, 'khoa-mac-dinh-khong-an-toan' (“default-key-not-safe”), is insecure.
Generate a cryptographically secure secret key:

```python
import secrets

# Generate a 256-bit (32-byte) secret key
secret_key = secrets.token_hex(32)
print(secret_key)
```
Current configuration in `config.py`:

```python
SECRET_KEY = os.environ.get('SECRET_KEY') or 'khoa-mac-dinh-khong-an-toan'
```
Production recommendation:

```python
SECRET_KEY = os.environ.get('SECRET_KEY')
if not SECRET_KEY:
    raise ValueError("SECRET_KEY environment variable must be set in production")
```
### Environment Variable Protection

Keep environment files and local databases out of version control via `.gitignore`:

```gitignore
# Environment variables
.env
.env.local
.env.production

# Database
instance/
*.db
*.sqlite

# Python
__pycache__/
*.pyc
venv/
.venv/
```
### API Key Security

The Gemini API key is loaded securely from environment variables:

```python
class ExpenseAI:
    def __init__(self):
        api_key = os.environ.get('GEMINI_API_KEY')
        if not api_key:
            raise ValueError("GEMINI_API_KEY must be set")
        self.client = genai.Client(api_key=api_key)
```
Consider implementing API key rotation policies and rate limiting in production.
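Rate limiting can be sketched as a small sliding-window limiter wrapped around the AI client. This is purely illustrative (the `RateLimiter` class and its defaults are assumptions, not part of the project):

```python
import time
from collections import deque

class RateLimiter:
    """Allow at most `max_calls` within a sliding `period` (seconds)."""

    def __init__(self, max_calls=30, period=60.0):
        self.max_calls = max_calls
        self.period = period
        self.calls = deque()  # timestamps of recent allowed calls

    def allow(self, now=None):
        now = time.monotonic() if now is None else now
        # Drop timestamps that have aged out of the window
        while self.calls and now - self.calls[0] >= self.period:
            self.calls.popleft()
        if len(self.calls) < self.max_calls:
            self.calls.append(now)
            return True
        return False
```

Before each Gemini request, a route could check `limiter.allow()` and return a "try again shortly" response when it denies, keeping one user from exhausting the API quota.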
## Production Database Setup

### Migrating from SQLite to PostgreSQL

SQLite is suitable for development but not recommended for production. Migrate to PostgreSQL:

#### Install the PostgreSQL Adapter

```shell
pip install psycopg2-binary
```

Add `psycopg2-binary` to `requirements.txt` as well.
#### Configure the Database URL

Add to your production `.env`:

```
DATABASE_URL=postgresql://finai_user:secure_password@localhost:5432/finai_production
```
#### Update the Config

Modify `config.py` to prioritize the production database:

```python
SQLALCHEMY_DATABASE_URI = os.environ.get('DATABASE_URL') or \
    'sqlite:///' + os.path.join(BASE_DIR, 'instance', 'quanlychitieu.db')
```
#### Initialize the Production Database

Run `python` and create the tables:

```python
>>> from app import db, app
>>> with app.app_context():
...     db.create_all()
```
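Moving existing rows out of the old SQLite file can be scripted with the standard library alone. A sketch (the table and column names are illustrative; the placeholders follow psycopg2's `%s` style) that reads rows and yields parameterized INSERTs to replay against PostgreSQL:

```python
import sqlite3

def export_rows(sqlite_path, table):
    """Yield (insert_sql, row_tuple) pairs for every row of `table`."""
    con = sqlite3.connect(sqlite_path)
    try:
        cur = con.execute(f"SELECT * FROM {table}")
        cols = [d[0] for d in cur.description]
        placeholders = ", ".join(["%s"] * len(cols))  # psycopg2-style params
        sql = f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})"
        for row in cur:
            yield sql, tuple(row)
    finally:
        con.close()
```

On the PostgreSQL side the pairs can be fed to `cursor.execute(sql, row)` inside a transaction; for large tables, a bulk tool such as `pgloader` may be a better fit.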
### Database Connection Pooling

For production workloads, enable connection pooling:

```python
class Config:
    SQLALCHEMY_DATABASE_URI = os.environ.get('DATABASE_URL')
    SQLALCHEMY_TRACK_MODIFICATIONS = False

    # Production connection pool settings
    SQLALCHEMY_ENGINE_OPTIONS = {
        'pool_size': 10,
        'pool_recycle': 3600,
        'pool_pre_ping': True,
        'max_overflow': 20
    }
```
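Keep in mind that each Gunicorn worker process opens its own pool, so the worst-case connection count is workers × (pool_size + max_overflow). A quick sanity check (the helper function is illustrative):

```python
def max_db_connections(workers, pool_size, max_overflow):
    """Worst-case simultaneous connections across all worker processes."""
    return workers * (pool_size + max_overflow)

# With 4 workers and the pool settings above:
print(max_db_connections(4, 10, 20))  # 120
```

120 exceeds PostgreSQL's default `max_connections` of 100, so either raise that server setting or shrink the per-worker pool.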
## WSGI Server Configuration

Flask’s built-in server (`app.run()`) is not suitable for production. Use a production WSGI server such as Gunicorn.

### Gunicorn Setup

#### Create a WSGI Entry Point

Create `wsgi.py` in the root directory:

```python
from app import app

if __name__ == "__main__":
    app.run()
```
#### Run with Gunicorn

```shell
gunicorn --bind 0.0.0.0:8000 --workers 4 --timeout 120 wsgi:app
```

Configuration explanation:

- `--workers 4` - use 4 worker processes (adjust based on CPU cores)
- `--timeout 120` - 120-second timeout to accommodate slow AI API calls
- `--bind 0.0.0.0:8000` - listen on all interfaces, port 8000
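Gunicorn's documented rule of thumb for sizing is (2 × CPU cores) + 1 workers. A quick way to compute it for the target machine:

```python
import os

def suggested_workers(cpu_cores=None):
    """Gunicorn's rule of thumb: (2 * cores) + 1 worker processes."""
    cores = cpu_cores if cpu_cores is not None else (os.cpu_count() or 1)
    return 2 * cores + 1

print(suggested_workers(2))  # a 2-core VM -> 5 workers
```

Treat the result as a starting point; workload characteristics (here, long AI API waits) may justify more or fewer workers.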
### Systemd Service Configuration

Create a systemd service for automatic startup in `/etc/systemd/system/finai.service`:

```ini
[Unit]
Description=FinAI Expense Manager
After=network.target

[Service]
User=finai
Group=www-data
WorkingDirectory=/opt/finai
Environment="PATH=/opt/finai/venv/bin"
ExecStart=/opt/finai/venv/bin/gunicorn --workers 4 --bind unix:finai.sock wsgi:app

[Install]
WantedBy=multi-user.target
```
Enable and start the service:

```shell
sudo systemctl enable finai
sudo systemctl start finai
sudo systemctl status finai
```
## Reverse Proxy with Nginx

Configure Nginx as a reverse proxy in `/etc/nginx/sites-available/finai`:

```nginx
server {
    listen 80;
    server_name finai.yourdomain.com;

    location / {
        proxy_pass http://unix:/opt/finai/finai.sock;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # Extended timeouts for AI operations
        proxy_read_timeout 120s;
        proxy_connect_timeout 120s;
    }

    location /static {
        alias /opt/finai/app/static;
        expires 30d;
    }
}
```
Enable the site:

```shell
sudo ln -s /etc/nginx/sites-available/finai /etc/nginx/sites-enabled/
sudo nginx -t
sudo systemctl reload nginx
```
## SSL/TLS Configuration

Secure your deployment with HTTPS using Let’s Encrypt:

```shell
sudo apt-get install certbot python3-certbot-nginx
sudo certbot --nginx -d finai.yourdomain.com
```

Certbot will automatically update your Nginx configuration for HTTPS.
## AI Response Caching

Implement caching for repeated AI queries:

```python
from functools import lru_cache

class ExpenseAI:
    @lru_cache(maxsize=128)
    def predict(self, text, user_categories_tuple):
        # Arguments must be hashable, so categories are passed as a tuple
        user_categories = list(user_categories_tuple)
        # ... existing prediction logic
```

Note that `lru_cache` on an instance method keeps a reference to `self` in the cache; this is fine for a single long-lived `ExpenseAI` instance but can leak memory if many instances are created.
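A self-contained demonstration of the pattern (with a stand-in for the real Gemini call; `predict_cached` and the category logic are illustrative only):

```python
from functools import lru_cache

CALLS = {"count": 0}  # counts how often the "expensive" path runs

@lru_cache(maxsize=128)
def predict_cached(text, user_categories_tuple):
    CALLS["count"] += 1                       # stands in for a real API call
    user_categories = list(user_categories_tuple)
    return user_categories[0] if user_categories else "Other"

categories = ("Food", "Transport")            # tuple, because lists aren't hashable
predict_cached("coffee 3.50", categories)     # cache miss: runs the body
predict_cached("coffee 3.50", categories)     # cache hit: body not re-run
print(CALLS["count"])  # 1
```

The cache key is the full argument tuple, so identical text with the same categories never re-invokes the API, while any change in either argument does.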
## Database Query Optimization

Add indexes for frequently queried fields:

```python
from app import db

class Transaction(db.Model):
    __tablename__ = 'transactions'

    id = db.Column(db.Integer, primary_key=True)
    user_id = db.Column(db.Integer, db.ForeignKey('users.id'), index=True)
    category = db.Column(db.String(50), index=True)
    date = db.Column(db.DateTime, index=True)
    # ... other fields
```
## Static File Caching

Configure browser caching in Nginx:

```nginx
location ~* \.(jpg|jpeg|png|gif|ico|css|js)$ {
    expires 365d;
    add_header Cache-Control "public, immutable";
}
```
## Monitoring and Logging

### Application Logging

Configure production logging:

```python
import logging
from logging.handlers import RotatingFileHandler

class Config:
    # ... existing config

    @staticmethod
    def init_app(app):
        if not app.debug:
            file_handler = RotatingFileHandler(
                'logs/finai.log',
                maxBytes=10240000,
                backupCount=10
            )
            file_handler.setFormatter(logging.Formatter(
                '%(asctime)s %(levelname)s: %(message)s [in %(pathname)s:%(lineno)d]'
            ))
            file_handler.setLevel(logging.INFO)
            app.logger.addHandler(file_handler)
            app.logger.setLevel(logging.INFO)
            app.logger.info('FinAI startup')
```
### Error Tracking

Integrate with an error tracking service (Sentry example):

```python
import os

import sentry_sdk
from sentry_sdk.integrations.flask import FlaskIntegration

sentry_sdk.init(
    dsn=os.environ.get('SENTRY_DSN'),
    integrations=[FlaskIntegration()],
    traces_sample_rate=1.0  # consider a lower rate in high-traffic production
)
```
## Backup Strategy

### Database Backups

Automate PostgreSQL backups:

```bash
#!/bin/bash
BACKUP_DIR="/backups/finai"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)

pg_dump -U finai_user finai_production > "$BACKUP_DIR/finai_$TIMESTAMP.sql"

# Keep only the last 7 days of backups
find $BACKUP_DIR -name "finai_*.sql" -mtime +7 -delete
```

Cron job for daily backups:

```shell
0 2 * * * /opt/finai/backup.sh
```
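If you prefer the retention step in Python (easier to unit-test than the `find` one-liner), a sketch with the same 7-day window (the `prune_backups` helper is an assumption, not shipped with the project):

```python
import os
import time

def prune_backups(backup_dir, keep_days=7, now=None):
    """Delete finai_*.sql files older than keep_days; return deleted names."""
    now = time.time() if now is None else now
    cutoff = now - keep_days * 86400
    deleted = []
    for name in sorted(os.listdir(backup_dir)):
        if name.startswith("finai_") and name.endswith(".sql"):
            path = os.path.join(backup_dir, name)
            if os.path.getmtime(path) < cutoff:
                os.remove(path)
                deleted.append(name)
    return deleted
```

Whichever variant you use, also test restores periodically: a backup that has never been restored is not a backup strategy.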
## Known Limitations

Review these production limitations before deployment:

- **Fixed AI categories**: the AI currently maps to predefined categories; user-defined dynamic categories require code modifications.
- **AI API latency**: response times of 2-5 seconds, depending on Google Gemini API availability.
- **Mobile responsiveness**: the dashboard is optimized for desktop browsers.
- **Email provider**: the current configuration supports Gmail only; other providers require `config.py` modifications.
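Until dynamic categories are supported, one workaround for the fixed-category limitation is fuzzy-matching the AI's predefined output onto the user's own category names. This is purely illustrative and not part of the project:

```python
import difflib

def map_to_user_category(ai_category, user_categories, fallback="Other"):
    """Map a predefined AI category onto the closest user-defined one."""
    matches = difflib.get_close_matches(
        ai_category, user_categories, n=1, cutoff=0.4
    )
    return matches[0] if matches else fallback

print(map_to_user_category("Food", ["Food & Drink", "Travel", "Bills"]))
# -> Food & Drink
```

The `cutoff` threshold controls how aggressive the matching is; anything below it falls back to a catch-all category rather than guessing.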
## Server Setup Reference

Condensed setup and deployment commands for an Ubuntu server:

```shell
# Install system dependencies
sudo apt update
sudo apt install python3.11 python3.11-venv nginx postgresql

# Deploy the application
cd /opt
sudo git clone https://github.com/Montero52/finai-expense-manager.git finai
cd finai
sudo chown -R ubuntu:ubuntu /opt/finai
python3.11 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
pip install gunicorn psycopg2-binary
```
## Post-Deployment Verification

After deployment, verify all systems:

- **Health check**: test the application endpoint with `curl -I https://finai.yourdomain.com`
- **AI functionality**: test transaction categorization and chatbot responses
- **Email system**: trigger a password-reset email to verify the SMTP configuration
- **Database connection**: verify that transactions persist correctly
- **SSL certificate**: confirm HTTPS is working with a valid certificate
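The health check is easy to automate with the standard library. A sketch (`check_health` and the URL are placeholders; swap in your real endpoint):

```python
import urllib.request

def check_health(url, timeout=10):
    """Return (ok, status) for a GET of `url`; ok is True for 2xx responses."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 300, resp.status
    except Exception:
        # DNS failure, refused connection, TLS error, timeout, or HTTP error
        return False, None

# Example (placeholder URL):
# ok, status = check_health("https://finai.yourdomain.com")
```

Run from cron or a monitoring host, this gives an early warning when the service, proxy, or certificate breaks.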
## Support and Maintenance

This project is under active development. For issues or questions, check the GitHub repository, where updates and new features are published.