This guide covers detailed installation steps for different types of applications in the Awesome LLM Apps repository.
System Requirements
Python Version: 3.8 or higher (recommended: 3.10+)
Memory: 4GB RAM minimum (recommended: 8GB+ for local models)
Storage: 2GB free space minimum (additional space needed for local models)
Internet: required for cloud APIs (optional for local-only setups)
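The thresholds above can also be checked programmatically before you install anything. A minimal stdlib sketch (the constants mirror the requirements table; the helper names are illustrative):

```python
import shutil
import sys

MIN_PYTHON = (3, 8)           # minimum supported interpreter
MIN_FREE_BYTES = 2 * 1024**3  # 2GB free space

def meets_python_requirement(version_info=sys.version_info, minimum=MIN_PYTHON):
    """Return True if the interpreter version satisfies the minimum."""
    return tuple(version_info[:2]) >= minimum

def free_disk_bytes(path="."):
    """Free bytes on the filesystem containing `path`."""
    return shutil.disk_usage(path).free

if __name__ == "__main__":
    print("Python OK:", meets_python_requirement())
    print("Disk OK:", free_disk_bytes() >= MIN_FREE_BYTES)
```

Run it once before installing to catch an unsupported Python version early.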
Installation Methods
Quick Install
Virtual Environment
Conda Environment
For most users, this is the fastest way to get started:
# Clone the repository
git clone https://github.com/Shubhamsaboo/awesome-llm-apps.git
cd awesome-llm-apps
# Navigate to your chosen project
cd starter_ai_agents/ai_travel_agent
# Install dependencies
pip install -r requirements.txt
# Run the application
streamlit run travel_agent.py
Recommended for development and avoiding package conflicts:
# Clone the repository
git clone https://github.com/Shubhamsaboo/awesome-llm-apps.git
cd awesome-llm-apps
# Create virtual environment
python -m venv venv
# Activate virtual environment
source venv/bin/activate # On Windows: venv\Scripts\activate
# Navigate to your project
cd starter_ai_agents/ai_travel_agent
# Install dependencies
pip install -r requirements.txt
# Run the application
streamlit run travel_agent.py
Always activate your virtual environment before running applications:
source venv/bin/activate # Linux/macOS
venv\Scripts\activate # Windows
If you use Anaconda or Miniconda:
# Clone the repository
git clone https://github.com/Shubhamsaboo/awesome-llm-apps.git
cd awesome-llm-apps
# Create conda environment
conda create -n llm-apps python=3.10
conda activate llm-apps
# Navigate to your project
cd starter_ai_agents/ai_travel_agent
# Install dependencies
pip install -r requirements.txt
# Run the application
streamlit run travel_agent.py
Project-Specific Installation
Starter AI Agents
Basic agents with minimal dependencies. Perfect for beginners.
Navigate to Agent Directory
cd starter_ai_agents/<agent-name>
Popular starter agents:
ai_travel_agent - Travel planning agent
ai_data_analysis_agent - Data analysis with AI
openai_research_agent - Multi-agent research system
ai_music_generator_agent - Music generation
Install Dependencies
Most starter agents use the same core libraries:
pip install -r requirements.txt
Common dependencies:
streamlit
agno>=2.2.10
openai
python-dotenv
Configure API Keys
Set environment variables:
export OPENAI_API_KEY='sk-...'
export SERPAPI_KEY='your-serpapi-key' # If using search
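Inside an agent, these variables are typically read back with `os.environ`. A small fail-fast sketch (the `require_env` helper is illustrative, not part of the repository):

```python
import os

def require_env(name):
    """Fetch a required environment variable or fail with a clear message."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# Example: the key exported above
# openai_key = require_env("OPENAI_API_KEY")
```

Failing at startup with a named variable is easier to debug than an authentication error deep inside an API call.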
Run the Application
streamlit run <agent-name>.py
RAG (Retrieval Augmented Generation) Applications
RAG applications require additional dependencies for vector databases and embeddings.
Navigate to RAG Project
cd rag_tutorials/<project-name>
Examples:
ai_blog_search - Agentic RAG with LangGraph
corrective_rag - CRAG implementation
local_rag_agent - Local RAG with open-source models
Install Dependencies
RAG applications typically include:
pip install -r requirements.txt
Common RAG dependencies:
langchain
langgraph
langchain-community
langchain-google-genai
langchain-qdrant
langchain-text-splitters
tiktoken
beautifulsoup4
python-dotenv
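The text splitters in the list above break documents into overlapping chunks before embedding. A simplified stdlib sketch of the core idea (real splitters such as those in langchain-text-splitters also respect sentence and token boundaries):

```python
def chunk_text(text, chunk_size=200, overlap=50):
    """Split text into fixed-size chunks, each overlapping its neighbour.

    The overlap keeps context that straddles a chunk boundary retrievable
    from either side.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks
```

Each chunk is then embedded and stored in the vector database described below.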
Set Up Vector Database
Some RAG apps use Qdrant or ChromaDB.
For Qdrant (cloud):
export QDRANT_URL='your-qdrant-url'
export QDRANT_API_KEY='your-qdrant-key'
For ChromaDB (local):
# No additional setup needed - runs locally
MCP (Model Context Protocol) Agents
MCP agents require Docker and additional setup for protocol servers.
Install Docker
MCP agents use Docker containers for protocol servers.
macOS/Windows: download Docker Desktop from docker.com (Windows 10/11 requires WSL2).
# Ubuntu/Debian
sudo apt-get update
sudo apt-get install docker.io
sudo systemctl start docker
# Verify installation
docker --version
docker ps
Make sure Docker is running before starting MCP agents.
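If you want to check Docker from Python before launching an agent, a small sketch (the `docker_available` helper is illustrative; `docker ps` exits non-zero when the daemon is not running):

```python
import shutil
import subprocess

def docker_available():
    """Return True if the docker CLI is on PATH and the daemon responds."""
    if shutil.which("docker") is None:
        return False
    try:
        # `docker ps` fails when the daemon is not running
        result = subprocess.run(["docker", "ps"], capture_output=True, timeout=10)
        return result.returncode == 0
    except (OSError, subprocess.TimeoutExpired):
        return False
```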
Navigate to MCP Agent
cd mcp_ai_agents/<agent-name>
Available MCP agents:
github_mcp_agent - GitHub repository analysis
browser_mcp_agent - Browser automation
notion_mcp_agent - Notion integration
Install Python Dependencies
pip install -r requirements.txt
MCP-specific dependencies:
openai
streamlit
mcp
anthropic
python-dotenv
Configure API Keys
MCP agents typically need multiple API keys. For the GitHub MCP Agent:
export OPENAI_API_KEY='your-openai-key'
export GITHUB_TOKEN='your-github-token'
Run the Agent
streamlit run github_agent.py
The agent will automatically manage Docker containers for MCP servers.
Advanced Multi-Agent Teams
Multi-agent systems may have additional dependencies for coordination and orchestration.
Navigate to Agent Team
cd advanced_ai_agents/multi_agent_apps/agent_teams/<team-name>
Examples:
ai_finance_agent_team - Financial analysis team
ai_legal_agent_team - Legal research team
multimodal_coding_agent_team - Code generation team
Install Framework Dependencies
Different teams use different frameworks: AG2/AutoGen, CrewAI, or LangGraph. For AG2/AutoGen:
pip install ag2
pip install pyautogen
Install All Dependencies
pip install -r requirements.txt
Run the Team
streamlit run <team-app>.py
Voice AI Agents
Voice agents require audio processing libraries.
System Audio Libraries
Install system-level audio dependencies:
# Ubuntu/Debian
sudo apt-get install portaudio19-dev python3-pyaudio
On macOS and Windows, the required audio libraries are typically bundled with the Python packages.
Install Python Packages
cd voice_ai_agents/<agent-name>
pip install -r requirements.txt
Common voice dependencies:
openai
streamlit
sounddevice
soundfile
numpy
Run Voice Agent
streamlit run voice_agent.py
Local Model Setup (Ollama)
Many applications support local models via Ollama for privacy and cost savings.
Install Ollama
Download and install from ollama.ai:
# Linux/macOS
curl -fsSL https://ollama.ai/install.sh | sh
On Windows, download the installer from ollama.ai.
Pull Your Model
Download models you want to use:
ollama pull llama3.1
ollama pull mistral
ollama pull qwen2.5
Use smaller models for faster responses:
ollama pull llama3.1:8b # 8B parameter model
ollama pull gemma2:2b # 2B parameter model
Start Ollama Service
ollama serve
Verify it's running:
curl http://localhost:11434/api/tags
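You can do the same check from Python. The sketch below assumes Ollama's documented `/api/tags` response shape, a JSON object with a `models` list of `{"name": ...}` entries; `fetch_models` only works while the server is running:

```python
import json
from urllib.request import urlopen

def installed_models(payload):
    """Extract model names from an Ollama /api/tags JSON payload."""
    return [m.get("name", "") for m in payload.get("models", [])]

def fetch_models(base_url="http://localhost:11434"):
    """Query a running Ollama server for its installed models."""
    with urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return installed_models(json.load(resp))
```

An empty list means the server is up but no models have been pulled yet.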
Run Local Applications
Applications with local support typically have separate files:
streamlit run local_<app-name>.py
Example:
cd starter_ai_agents/ai_travel_agent
streamlit run local_travel_agent.py
Environment Variables
Manage your API keys securely using environment variables.
Using .env Files
Many projects include .env.example files:
# Copy the example file
cp .env.example .env
# Edit with your keys
nano .env
Example .env file:
OPENAI_API_KEY=sk-...
GOOGLE_API_KEY=AI...
ANTHROPIC_API_KEY=sk-ant-...
SERPAPI_KEY=...
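python-dotenv loads this file for you via `load_dotenv()`; the simplified stdlib sketch below shows what the format boils down to (a naive parser for illustration only, without python-dotenv's quoting and interpolation rules):

```python
def parse_env_lines(lines):
    """Parse KEY=VALUE lines (the .env format above) into a dict.

    Blank lines and `#` comments are ignored; surrounding quotes
    are stripped from values.
    """
    env = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if "=" not in line:
            continue  # not a KEY=VALUE pair
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip("'\"")
    return env
```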
System Environment Variables
Add to ~/.bashrc or ~/.zshrc:
export OPENAI_API_KEY='sk-...'
export GOOGLE_API_KEY='AI...'
Reload your shell:
source ~/.bashrc
On Windows, Command Prompt:
setx OPENAI_API_KEY "sk-..."
PowerShell:
[Environment]::SetEnvironmentVariable("OPENAI_API_KEY", "sk-...", "User")
Security Best Practices:
Never commit .env files to Git
Use different API keys for development and production
Rotate your API keys regularly
Add .env to your .gitignore file
Framework-Specific Setup
OpenAI Agents SDK
pip install openai-agents
pip install openai
pip install streamlit
pip install pydantic
pip install python-dotenv
Required environment variable:
export OPENAI_API_KEY='sk-...'
Google ADK (Agent Development Kit)
pip install google-adk
pip install google-generativeai
Configuration:
# For Google AI Studio
export GOOGLE_GENAI_USE_VERTEXAI=False
export GOOGLE_API_KEY='AI...'
# For Vertex AI
export GOOGLE_GENAI_USE_VERTEXAI=True
LangChain & LangGraph
pip install langchain
pip install langgraph
pip install langchain-community
pip install langchain-openai
pip install langchain-google-genai
pip install langchain-anthropic
CrewAI
pip install crewai
pip install crewai-tools
Troubleshooting
Common Installation Issues
ModuleNotFoundError: No module named 'X'
Solution:
# Upgrade pip
pip install --upgrade pip
# Reinstall requirements
pip install -r requirements.txt --force-reinstall
Permission Denied on macOS/Linux
Solution:
# Don't use sudo with pip
# Instead, use --user flag
pip install --user -r requirements.txt
# Or use a virtual environment
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
Docker connection refused
Solution:
# Ensure Docker is running
docker ps
# On macOS, start Docker Desktop
# On Linux
sudo systemctl start docker
Streamlit command not found
Solution:
# Ensure streamlit is installed
pip install streamlit
# If still not working, run via python
python -m streamlit run app.py
API rate limit errors
Solution:
Check your API usage and billing
Use smaller models or reduce request frequency
Consider using local models via Ollama
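Besides the steps above, transient rate-limit errors are usually handled by retrying with exponential backoff. A minimal stdlib sketch (the helper name and delay schedule are illustrative; client libraries such as openai also offer built-in retries):

```python
import time

def retry_with_backoff(fn, max_attempts=5, base_delay=1.0, sleep=time.sleep):
    """Call `fn`, retrying with exponentially growing delays on failure."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of retries, surface the error
            sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
```

In practice you would catch only the rate-limit exception of your client library rather than bare `Exception`.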
Verification
Verify your installation:
# Check Python version
python --version
# Check pip
pip --version
# Check installed packages
pip list | grep streamlit
pip list | grep openai
# Test Streamlit
streamlit hello
# Check Docker (if needed)
docker --version
docker ps
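The package checks above can also be scripted with `importlib.metadata` from the standard library (Python 3.8+); the `missing_packages` helper is an illustrative sketch:

```python
from importlib import metadata

def missing_packages(names):
    """Return the subset of `names` not installed in this environment."""
    missing = []
    for name in names:
        try:
            metadata.version(name)
        except metadata.PackageNotFoundError:
            missing.append(name)
    return missing

# Example: check the core dependencies used throughout this guide
# print(missing_packages(["streamlit", "openai"]))
```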
Updating Dependencies
Keep your installations up to date:
# Update all packages
pip install -r requirements.txt --upgrade
# Update specific package
pip install --upgrade streamlit
# Update Ollama
curl -fsSL https://ollama.ai/install.sh | sh
Next Steps
Quick Start: run your first AI agent in minutes
AI Agents: explore starter and advanced agents
RAG Tutorials: build retrieval augmented generation apps
Configuration: advanced configuration options
Additional Resources
For project-specific installation instructions, always refer to the README.md file in each project directory.