Requirements
Before installing MilesONerd AI Telegram Bot, ensure you have the following prerequisites:
- Python 3.8 or higher
- pip - Python package manager (usually comes with Python)
- Telegram Bot Token - Obtain from @BotFather on Telegram
- Sufficient disk space - AI models require several GB of storage
- GPU (optional) - NVIDIA GPU with CUDA support for faster inference
The bot uses Llama 3.1-Nemotron (70B) and BART models, which are resource-intensive. Ensure you have adequate RAM (16GB+ recommended) and storage space.
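You can verify the Python and pip prerequisites from a shell before starting (the disk-space check assumes a Unix-like system):

```shell
# Quick prerequisite check before installing
python3 --version   # should report Python 3.8 or newer
pip3 --version      # confirms pip is available
df -h .             # free disk space (models need several GB)
```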
Installation Steps
Install Python Dependencies
Install all required Python packages using pip:
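Presumably this is the standard pip invocation against the project's requirements file, run from the repository root:

```shell
# From the repository root of the bot
pip install -r requirements.txt
```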
The requirements.txt includes essential packages like python-telegram-bot, transformers, torch, and model dependencies.

Configure Environment Variables
Copy the example environment file and configure your settings:

Edit the .env file with your credentials (see Configuration for details).

Key Dependencies
The bot relies on the core packages pinned in requirements.txt.
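A minimal requirements.txt covering the packages named earlier might look like the following; the list is illustrative and omits any version pins or extra dependencies the project actually uses:

```
python-telegram-bot
transformers
torch
```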
GPU Support
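Automatic detection of this kind usually reduces to a single PyTorch check. A minimal sketch (an illustration, not the bot's actual code) that also degrades gracefully when torch is not installed:

```python
def pick_device() -> str:
    """Return "cuda" when a CUDA-capable GPU is usable, otherwise "cpu"."""
    try:
        import torch  # heavy dependency; degrade gracefully if absent
        if torch.cuda.is_available():
            return "cuda"
    except ImportError:
        pass
    return "cpu"

print(f"Running inference on: {pick_device()}")
```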
If you have an NVIDIA GPU with CUDA support, the bot will automatically detect and use it for faster inference. CUDA-related packages are included in requirements.txt.

Troubleshooting
ImportError: No module named 'telegram'
This means the python-telegram-bot package wasn’t installed correctly. Run:
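Reinstalling the package usually fixes this; reinstalling everything from the requirements file also works:

```shell
pip install --upgrade python-telegram-bot
# or reinstall all dependencies:
pip install -r requirements.txt
```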
CUDA out of memory errors
The models are large and may exceed available GPU memory. The bot automatically falls back to CPU if GPU memory is insufficient. You can also set smaller batch sizes or use model quantization.
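The fallback logic can be sketched as a policy that picks model-loading options based on available GPU memory. The kwarg names (`device_map`, `load_in_8bit`) are real transformers `from_pretrained` options, but the thresholds and the helper itself are illustrative assumptions, not the bot's actual code:

```python
def model_load_kwargs(gpu_free_gb: float, model_size_gb: float) -> dict:
    """Illustrative policy: full GPU load, 8-bit quantization, or CPU fallback."""
    if gpu_free_gb >= model_size_gb:
        # Model fits in VRAM as-is
        return {"device_map": "auto"}
    if gpu_free_gb >= model_size_gb / 2:
        # 8-bit quantization roughly halves the memory footprint
        return {"device_map": "auto", "load_in_8bit": True}
    # Not enough VRAM even when quantized: run on CPU
    return {"device_map": "cpu"}

print(model_load_kwargs(gpu_free_gb=24, model_size_gb=40))
```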
Models downloading slowly
The first run downloads large model files from Hugging Face Hub. This is normal and only happens once. Models are cached locally for future use.
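By default, Hugging Face stores downloaded models under ~/.cache/huggingface (the location can be changed with the HF_HOME environment variable). To see how much space the cache is using:

```shell
# Default cache location; override with the HF_HOME environment variable
du -sh ~/.cache/huggingface 2>/dev/null || echo "no Hugging Face cache yet"
```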
Next Steps
After installation, proceed to:
- Configuration - Set up your bot token and model settings
- Deployment - Learn how to deploy your bot
