Overview
The bot.py module contains the main Telegram bot implementation for MilesONerd AI. It handles user interactions, processes commands, and routes messages to the appropriate AI models through the AI handler.
Functions
start()
Sends a welcome message when the /start command is issued.
Telegram Update object containing the incoming update data
Context object for the handler callback
This function doesn’t return a value
- Greets the user with their name using HTML formatting
- Introduces MilesONerd AI capabilities
- Directs users to the /help command for more information
bot.py:21-28
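Because /start greets the user by name using HTML formatting, the user-supplied name should be escaped before interpolation. A minimal sketch of the greeting builder, assuming an illustrative helper name and wording (the actual message text in bot.py may differ):

```python
import html

def build_welcome(first_name: str) -> str:
    """Build the HTML-formatted /start greeting.

    Escapes the user-supplied name so characters such as < or &
    cannot break Telegram's HTML parse mode.
    """
    safe_name = html.escape(first_name)
    return (
        f"Hello, <b>{safe_name}</b>! I am MilesONerd AI.\n"
        "Send me a message, or use /help to see what I can do."
    )
```

In a python-telegram-bot handler, a string like this would typically be sent with reply_html() or with parse_mode="HTML".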
help_command()
Sends a help message when the /help command is issued.
Telegram Update object containing the incoming update data
Context object for the handler callback
This function doesn’t return a value
- Displays all available commands (/start, /help, /about)
- Explains message handling capabilities:
  - Short questions: quick responses using a lightweight model
  - Long messages: summarization and detailed response
  - Summarization requests: using ‘summarize’ or ‘tldr’ keywords
  - Chat-related queries: optimized conversation handling
  - Regular messages: comprehensive AI-powered responses
bot.py:30-47
about_command()
Sends information about the bot when the /about command is issued.
Telegram Update object containing the incoming update data
Context object for the handler callback
This function doesn’t return a value
- Provides information about MilesONerd AI features:
  - Advanced language understanding
  - Internet search integration
  - Continuous learning from interactions
  - Multiple AI models for different tasks
- Credits the technology stack (python-telegram-bot and Hugging Face models)
bot.py:49-62
handle_message()
Handles user messages using appropriate AI models based on content.
Telegram Update object containing the incoming message
Context object for the handler callback
This function doesn’t return a value
- Long messages (>100 words)
  - Uses BART for summarization
  - Uses Llama for response generation based on the summary
  - Max length: 200 tokens
- Summarization requests (keywords: ‘summarize’, ‘summary’, ‘tldr’)
  - Uses the BART summarization model
- Chat/conversation queries (keywords: ‘chat’, ‘conversation’, ‘talk’)
  - Uses the Llama model
  - Max length: 200 tokens
- Short queries (fewer than 10 words)
  - Uses Llama for quick responses
  - Max length: 100 tokens
- Default messages
  - Uses Llama for general responses
  - Max length: 150 tokens
- Logs errors using the logger
- Sends a user-friendly error message on failure
- Shows typing indicator while processing
bot.py:64-114
initialize()
Initializes AI models and other components.
True if initialization successful, False otherwise
- Calls ai_handler.initialize_models() to load AI models
- Logs initialization progress and results
- Returns initialization status
- Catches and logs any exceptions during initialization
- Returns False on failure
bot.py:116-128
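The pattern described above (call the loader, log progress, catch exceptions, return a boolean) might look like the following sketch; _AIHandlerStub is a stand-in for the real ai_handler module:

```python
import logging

logger = logging.getLogger(__name__)

class _AIHandlerStub:
    """Stand-in for the real ai_handler module."""
    def initialize_models(self) -> bool:
        # The real implementation would load the BART and Llama models
        return True

ai_handler = _AIHandlerStub()

def initialize() -> bool:
    """Load AI models; return True on success, False on any failure."""
    try:
        logger.info("Initializing AI models...")
        ok = ai_handler.initialize_models()
        logger.info("Model initialization %s", "succeeded" if ok else "failed")
        return ok
    except Exception:
        logger.exception("Model initialization failed")
        return False
```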
main()
Starts the bot.
This function doesn’t return a value
- Environment Setup
  - Retrieves TELEGRAM_BOT_TOKEN from environment variables
  - Exits if the token is not found
- Application Creation
  - Creates the Telegram Application with the bot token
  - Sets up an event loop for async operations
- Model Initialization
  - Runs initialize() to load AI models
  - Exits if initialization fails
- Handler Registration
  - Registers command handlers: /start, /help, /about
  - Registers a message handler for text messages (excluding commands)
- Bot Execution
  - Starts polling for updates
  - Allows all update types
  - Logs a startup message
- Validates presence of the bot token
- Ensures models are initialized before starting
- Properly closes the event loop on exit
- Handles KeyboardInterrupt for graceful shutdown
- Re-raises exceptions for proper error handling
bot.py:130-174
Dependencies
Environment Variables
TELEGRAM_BOT_TOKEN (required): Telegram bot authentication token
