
Overview

The Telegram integration provides an autonomous, context-aware bot that can participate in groups and direct messages. It features intelligent action generation, message history analysis, sticker support, and photo interpretation.

Features

  • Autonomous Operation: Periodically checks for unread messages and responds proactively
  • Multi-Chat Support: Handle multiple groups and DMs simultaneously with isolated contexts
  • Sticker Understanding: Interpret and send context-appropriate stickers
  • Photo Analysis: Vision model integration for image interpretation
  • Action System: LLM-driven action selection (send, read, wait, sleep)
  • Message History: Embedding-based semantic search with pgvector
  • Interruption Handling: Adaptive processing with message prioritization

Prerequisites

  • Node.js 18 or higher
  • PostgreSQL with pgvector extension
  • Telegram bot token from BotFather
  • LLM API access (OpenAI, OpenRouter, etc.)
  • Embedding model (Ollama with nomic-embed-text recommended)
  • Docker and Docker Compose (for database)

Setup

1. Create Telegram Bot

  1. Open Telegram and search for @BotFather
  2. Send /newbot command
  3. Follow prompts to choose a name and username
  4. Copy the bot token (format: 1234567890:ABCdefGHIjklMNOpqrsTUVwxyz)
  5. Configure bot settings:
    • /setprivacy - Disable (to receive all group messages)
    • /setjoingroups - Enable

2. Start Ollama (for Embeddings)

Install and start Ollama:
# Start Ollama service
ollama serve

# Pull embedding model
ollama pull nomic-embed-text
You can use other embedding providers, but Ollama with nomic-embed-text is recommended for cost-efficiency.

3. Configure Environment

Navigate to the Telegram bot service:
cd services/telegram-bot
cp .env .env.local
Edit .env.local:
# Database
DATABASE_URL=postgres://postgres:123456@localhost:5433/postgres

# Telegram
TELEGRAM_BOT_TOKEN='1234567890:ABCdefGHIjklMNOpqrsTUVwxyz'

# LLM Configuration
LLM_API_BASE_URL='https://openrouter.ai/api/v1/'
LLM_API_KEY='sk-or-v1-...'
LLM_MODEL='deepseek/deepseek-chat-v3-0324:free'
LLM_RESPONSE_LANGUAGE='English'

# Vision Model (for photos)
LLM_VISION_API_BASE_URL='https://openrouter.ai/api/v1/'
LLM_VISION_API_KEY='sk-or-v1-...'
LLM_VISION_MODEL='openai/gpt-4o'

# Embedding Model
EMBEDDING_API_BASE_URL='http://localhost:11434/v1/'
EMBEDDING_API_KEY=''
EMBEDDING_MODEL='nomic-embed-text'
EMBEDDING_DIMENSION='768'

# Admin User IDs (comma-separated Telegram user IDs)
ADMIN_USER_IDS='123456789,987654321'

4. Start Database

From the service directory:
docker compose up -d
This starts:
  • PostgreSQL with pgvector
  • Grafana (observability)
  • Prometheus (metrics)
  • Tempo (tracing)
  • OpenTelemetry Collector

5. Install Dependencies

From project root:
pnpm install
pnpm run build:packages

6. Initialize Database

Run migrations:
pnpm run -F @proj-airi/telegram-bot db:push

7. Start the Bot

pnpm run -F @proj-airi/telegram-bot start
You should see:
[Bot] bot initialized - bot_username: YourBotName

Usage

Adding to Groups

  1. Go to your Telegram group
  2. Click Group Info → Add Members
  3. Search for your bot username
  4. Add the bot
  5. Grant admin rights (recommended for full functionality)

Direct Messages

Send a message to your bot; it will respond automatically.

Adding Sticker Packs

Only admins can add sticker packs:
  1. Send a sticker from the pack you want to add
  2. Reply to that sticker with /add_sticker_pack
  3. The bot will process and remember all stickers in the pack
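The reply check behind this command can be sketched as a small pure function. This is an illustrative reconstruction, not the source implementation: the Telegram Bot API exposes the replied-to message as reply_to_message and the sticker's set as sticker.set_name, and the command is only valid when both are present.

```typescript
// Subset of the Telegram message shape needed for this check (illustrative).
interface IncomingMessage {
  reply_to_message?: { sticker?: { set_name?: string } }
}

// /add_sticker_pack must be sent as a reply to a sticker; return the
// sticker set name, or undefined when the reply is missing or invalid.
function stickerSetFromReply(msg: IncomingMessage): string | undefined {
  return msg.reply_to_message?.sticker?.set_name
}
```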

Code Examples

Bot Initialization

From services/telegram-bot/src/bots/telegram/index.ts:462:
import { Bot } from 'grammy'

const telegramBot = new Bot(env.TELEGRAM_BOT_TOKEN!)

// Error handling
telegramBot.errorHandler = async err => 
  log.withError(err).log('Error occurred')

// Initialize bot info
await telegramBot.init()
log.withField('bot_username', telegramBot.botInfo.username)
  .log('bot initialized')

// Start polling
telegramBot.start({ drop_pending_updates: true })

Message Handling

From services/telegram-bot/src/bots/telegram/index.ts:525:
telegramBot.on('message:text', async (ctx) => {
  const messageId = `${ctx.message.chat.id}-${ctx.message.message_id}`
  
  if (!botCtx.processedIds.has(messageId)) {
    botCtx.processedIds.add(messageId)
    botCtx.messageQueue.push({
      message: ctx.message,
      status: 'ready',
    })
  }

  const chatCtx = ensureChatContext(botCtx, ctx.message.chat.id.toString())
  await onMessageArrival(botCtx, chatCtx)
})

Action System

AIRI uses an LLM-driven action system. The available actions are listed below.

read_unread_messages:
{
  action: 'read_unread_messages',
  chatId: '-1001234567890'
}
send_message:
{
  action: 'send_message',
  chatId: '-1001234567890',
  content: 'Hello everyone!'
}
send_sticker:
{
  action: 'send_sticker',
  chatId: '-1001234567890',
  fileId: 'CAACAgIAAxkBAAID...' // From list_stickers
}
list_chats:
{
  action: 'list_chats'
}
continue/break/sleep:
{
  action: 'continue' // Continue to next tick
}
{
  action: 'break' // Stop and clear context
}
{
  action: 'sleep' // Wait 30 seconds
}
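The action objects above form a discriminated union on the action field, which makes dispatch a simple switch. A minimal TypeScript sketch, with the type and function names chosen for illustration rather than taken from the source:

```typescript
// Hypothetical discriminated union over the actions listed above.
type Action =
  | { action: 'read_unread_messages', chatId: string }
  | { action: 'send_message', chatId: string, content: string }
  | { action: 'send_sticker', chatId: string, fileId: string }
  | { action: 'list_chats' }
  | { action: 'continue' }
  | { action: 'break' }
  | { action: 'sleep' }

// Sketch of a dispatcher decision: break and sleep end the current
// processing loop; everything else keeps it running.
function shouldContinueLoop(action: Action): boolean {
  switch (action.action) {
    case 'break':
    case 'sleep':
      return false
    default:
      return true
  }
}
```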

Photo Interpretation

From services/telegram-bot/src/bots/telegram/index.ts:511:
telegramBot.on('message:photo', async (ctx) => {
  const messageId = `${ctx.message.chat.id}-${ctx.message.message_id}`
  
  if (!botCtx.processedIds.has(messageId)) {
    botCtx.processedIds.add(messageId)
    botCtx.messageQueue.push({
      message: ctx.message,
      status: 'pending', // Will be interpreted before processing
    })
  }

  const chatCtx = ensureChatContext(botCtx, ctx.message.chat.id.toString())
  await onMessageArrival(botCtx, chatCtx)
})
Photos are automatically interpreted using the vision model before being added to context.

Periodic Tick System

From services/telegram-bot/src/bots/telegram/index.ts:339:
function loopPeriodic(botCtx: BotContext) {
  setTimeout(async () => {
    try {
      // Check all active chats
      await loopIterationPeriodicForExistingChat(botCtx)
      
      // Check for new opportunities
      await loopIterationPeriodicWithNoChats(botCtx)
    } catch (err) {
      botCtx.logger.withError(err).log('error in main loop')
    } finally {
      loopPeriodic(botCtx)
    }
  }, 60 * 1000) // Every 60 seconds
}

Architecture

Message Queue

Messages are queued and processed sequentially per chat:
  1. Message arrives → Added to queue
  2. If sticker/photo → Interpret first (status: pending)
  3. Mark as ready → Process with LLM
  4. Generate action → Dispatch action
  5. Repeat until the action returns break or continue
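The pending-to-ready lifecycle above can be sketched with two small helpers (names hypothetical, assuming the status field shown in the handlers earlier):

```typescript
type QueuedMessage = { text: string, status: 'pending' | 'ready' }

// Sticker/photo messages enter the queue as 'pending' and become
// 'ready' once interpreted; text messages enter as 'ready' directly.
function markInterpreted(msg: QueuedMessage): QueuedMessage {
  return msg.status === 'pending' ? { ...msg, status: 'ready' } : msg
}

// Only 'ready' messages are handed to the LLM for processing.
function takeReady(queue: QueuedMessage[]): QueuedMessage[] {
  return queue.filter(m => m.status === 'ready')
}
```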

Context Management

Each chat maintains isolated context:
interface ChatContext {
  chatId: string
  messages: Message[]        // Recent conversation
  actions: ActionHistory[]   // Recent actions taken
  currentAbortController?: AbortController
}
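The ensureChatContext helper used in the handlers earlier can be sketched as a lazy lookup into a per-chat map. This is an illustrative reconstruction under the interface above (with the message and action entry types simplified), not the source implementation:

```typescript
interface ChatContext {
  chatId: string
  messages: unknown[] // recent conversation (simplified element type)
  actions: unknown[] // recent actions taken (simplified element type)
  currentAbortController?: AbortController
}

const chatContexts = new Map<string, ChatContext>()

// Return the existing context for a chat, creating an empty one on
// first use so every chat gets its own isolated state.
function ensureChatContext(chatId: string): ChatContext {
  let ctx = chatContexts.get(chatId)
  if (!ctx) {
    ctx = { chatId, messages: [], actions: [] }
    chatContexts.set(chatId, ctx)
  }
  return ctx
}
```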

Embedding Storage

All messages are stored with embeddings for semantic search:
CREATE TABLE chat_messages (
  id SERIAL PRIMARY KEY,
  chat_id TEXT NOT NULL,
  message_id INTEGER NOT NULL,
  content TEXT,
  embedding vector(768),
  created_at TIMESTAMP DEFAULT NOW()
);
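Semantic search against this table orders rows by vector distance. A sketch of the pieces involved, assuming pgvector's cosine distance operator (<=>); the helper name is illustrative, and since running the query needs a live database, only the vector-literal formatting is exercised here:

```typescript
// Format a JS number array as a pgvector literal, e.g. '[0.1,0.2,0.3]',
// suitable for passing as a query parameter cast to ::vector.
function toVectorLiteral(embedding: number[]): string {
  return `[${embedding.join(',')}]`
}

// Cosine-distance search over chat_messages: smaller distance = more similar.
const similarMessagesSql = `
  SELECT content
  FROM chat_messages
  WHERE chat_id = $1
  ORDER BY embedding <=> $2::vector
  LIMIT 5
`
```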

Configuration

Observability

The Telegram bot includes full OpenTelemetry instrumentation. Traces, metrics, and logs are exported through the OpenTelemetry Collector to the Tempo, Prometheus, and Grafana services started in step 4.

Admin Commands

Set ADMIN_USER_IDS in .env.local to enable admin-only commands:
  • /add_sticker_pack - Import a sticker set
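Admin gating can be sketched by parsing the comma-separated ADMIN_USER_IDS value into a set and checking the sender's numeric ID. The function names here are illustrative, not taken from the source:

```typescript
// Parse a comma-separated ADMIN_USER_IDS value into a set of numeric IDs,
// tolerating stray whitespace and empty entries.
function parseAdminIds(raw: string): Set<number> {
  return new Set(
    raw.split(',')
      .map(s => Number.parseInt(s.trim(), 10))
      .filter(n => Number.isFinite(n)),
  )
}

const adminIds = parseAdminIds('123456789,987654321')

// Guard used before executing admin-only commands.
function isAdmin(userId: number): boolean {
  return adminIds.has(userId)
}
```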

Message Limits

  • Context window: 20 messages (auto-trimmed)
  • Action history: 50 actions (auto-trimmed)
  • Unread queue: 100 messages per chat
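The limits above imply simple tail-trimming of each history: keep only the newest entries once the cap is exceeded. A minimal sketch (helper name illustrative):

```typescript
// Keep only the most recent `limit` entries of a history array,
// dropping the oldest entries first.
function trimHistory<T>(history: T[], limit: number): T[] {
  return history.length > limit ? history.slice(-limit) : history
}
```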

Troubleshooting

Bot not responding

  1. Check logs for connection errors
  2. Verify TELEGRAM_BOT_TOKEN is correct
  3. Ensure bot has message access in groups (privacy mode disabled)

Database connection failed

  1. Check PostgreSQL is running: docker compose ps
  2. Verify DATABASE_URL connection string
  3. Run migrations: pnpm run -F @proj-airi/telegram-bot db:push

Embedding errors

  1. Check Ollama is running: ollama list
  2. Verify model is pulled: ollama pull nomic-embed-text
  3. Test endpoint: curl http://localhost:11434/api/embeddings

Vision not working

  1. Verify LLM_VISION_MODEL supports image input
  2. Check API key has vision model access
  3. Monitor rate limits

Next Steps

  • Discord Integration: Add AIRI to Discord
  • Minecraft Integration: Run AIRI in Minecraft
