## Overview

The Minecraft integration is an intelligent bot built on a four-layer cognitive architecture inspired by cognitive science. AIRI can understand natural-language commands, explore the world autonomously, gather resources, craft items, and interact with players.

## Features
- Cognitive Architecture: Perception → Reflex → Conscious → Action layers
- Natural Language Understanding: Chat with the bot using plain English
- Autonomous Gameplay: Gather resources, craft items, build structures
- World Awareness: Navigate terrain, avoid obstacles, track entities
- Combat System: Fight hostile mobs and defend itself
- Inventory Management: Smart item collection and crafting
- Debug Dashboard: Web-based real-time monitoring and MCP integration
- Mineflayer Viewer: Watch the bot’s POV in your browser
## Prerequisites
- Node.js 18 or higher
- Minecraft Java Edition server (1.20 recommended)
- OpenAI API key (or compatible LLM provider)
- AIRI server runtime (optional, for integration)
## Setup

### 1. Prepare Minecraft Server
You need a Minecraft server the bot can connect to. Options:

- Local server: Download from minecraft.net
- Aternos: Free hosting at aternos.org
- Existing server: Use any Java Edition server
The bot supports Minecraft Java Edition versions 1.12 - 1.20. Version 1.20 is recommended.
### 2. Configure Environment
Navigate to the Minecraft service and create a `.env.local` file with your settings:
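The exact variable set comes from the service's own config; the names below match those referenced in the Troubleshooting section, while the defaults and `OPENAI_MODEL` value are assumptions:

```shell
# Minecraft server connection
BOT_HOSTNAME='localhost'
BOT_PORT='25565'
BOT_VERSION='1.20'
BOT_AUTH='offline'          # use 'microsoft' for online-mode servers

# LLM provider
OPENAI_API_KEY='sk-...'
OPENAI_MODEL='gpt-4o-mini'  # model name is an assumption; use any accessible model
```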
### 3. Install Dependencies
From the project root:

### 4. Start the Bot
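Concretely, the install and start steps might look like this; the package manager and script name are assumptions, so check the repository's `package.json` for the actual commands:

```shell
# from the project root: install workspace dependencies
pnpm install

# start the Minecraft service (script name is an assumption)
cd services/minecraft
pnpm run dev
```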
## Usage

### In-Game Commands
Talk to the bot in Minecraft chat using plain English.

### Web Viewer
Open http://localhost:3007 to see the bot’s first-person view in real time.

### Debug Dashboard
Open http://localhost:3008 for:

- Real-time bot status
- Cognitive layer monitoring
- Tool execution interface
- MCP server integration
## Cognitive Architecture
The bot operates on four distinct layers.

### Layer A: Perception

Location: `src/cognitive/perception/`
Collects and processes raw Mineflayer events into normalized signals:
- Chat messages
- Health/hunger changes
- Entity spawns
- Block interactions
- Combat events
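The perception layer described above can be sketched as an event-to-signal adapter. This is illustrative, not AIRI's actual code: the event names (`chat`, `health`, `entitySpawn`) are real Mineflayer events, but the `Signal` shape and `attachPerception` are assumptions, and a plain `EventEmitter` stands in for the bot.

```typescript
import { EventEmitter } from 'node:events'

// Normalized signal emitted by the perception layer (illustrative shape).
interface Signal {
  type: 'chat' | 'health' | 'entity'
  data: Record<string, unknown>
  timestamp: number
}

// Subscribe to raw bot events and forward them as normalized signals.
function attachPerception(bot: EventEmitter, onSignal: (s: Signal) => void): void {
  bot.on('chat', (username: string, message: string) =>
    onSignal({ type: 'chat', data: { username, message }, timestamp: Date.now() }))
  bot.on('health', () =>
    onSignal({ type: 'health', data: {}, timestamp: Date.now() }))
  bot.on('entitySpawn', (entity: { id: number }) =>
    onSignal({ type: 'entity', data: { id: entity.id }, timestamp: Date.now() }))
}

// Usage: signals accumulate as events fire.
const bot = new EventEmitter()
const signals: Signal[] = []
attachPerception(bot, s => signals.push(s))
bot.emit('chat', 'steve', 'hello')
console.log(signals[0].type) // 'chat'
```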
### Layer B: Reflex

Location: `src/cognitive/reflex/`
Handles immediate, instinctive reactions:
- Auto-eat when hungry
- Dodge incoming attacks
- React to sudden damage
- Idle behaviors (looking around)
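A reflex rule maps state directly to a reaction with no LLM call involved. The function name, thresholds, and reaction set below are assumptions for illustration, not AIRI's implementation:

```typescript
// Bot vitals as reported by Mineflayer (health and food both range 0-20).
interface BotState {
  health: number
  food: number
}

type Reaction = 'flee' | 'eat' | 'idle'

// Pick the highest-priority instinctive reaction; thresholds are assumptions.
function reflex(state: BotState): Reaction {
  if (state.health <= 6) return 'flee' // sudden damage: retreat first
  if (state.food <= 14) return 'eat'   // auto-eat when hungry
  return 'idle'                        // otherwise look around, wander
}

console.log(reflex({ health: 20, food: 10 })) // 'eat'
console.log(reflex({ health: 4, food: 20 }))  // 'flee'
```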
### Layer C: Conscious

Location: `src/cognitive/conscious/`
LLM-powered reasoning and planning:
- Interprets user commands
- Plans multi-step tasks
- Manages goal hierarchies
- Handles conversation
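A hedged sketch of how this layer might turn a chat command into a multi-step plan via an LLM. The prompt wording, the JSON step schema, and the skill names are assumptions, not AIRI's real prompt or format:

```typescript
// One step of a plan the LLM is asked to produce (illustrative schema).
interface PlanStep {
  skill: string
  args: Record<string, unknown>
}

// Build a planning prompt from the user's command and current inventory.
function buildPlanPrompt(command: string, inventory: string[]): string {
  return [
    'You control a Minecraft bot. Reply with a JSON array of steps,',
    'each shaped {"skill": string, "args": object}.',
    `Inventory: ${inventory.join(', ') || 'empty'}`,
    `Command: ${command}`,
  ].join('\n')
}

// Parse the LLM's (JSON-formatted) reply into executable steps.
function parsePlan(llmReply: string): PlanStep[] {
  return JSON.parse(llmReply) as PlanStep[]
}

const steps = parsePlan('[{"skill":"gatherBlock","args":{"name":"oak_log","count":3}}]')
console.log(steps[0].skill) // 'gatherBlock'
```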
### Layer D: Action

Location: `src/cognitive/action/`
Executes concrete tasks in the world:
- Movement and pathfinding
- Block breaking/placing
- Item crafting
- Entity interaction
- Inventory management
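The action layer ultimately dispatches plan steps to concrete skill implementations. The registry pattern below is a sketch; the skill names, signatures, and return values are assumptions:

```typescript
// A skill takes named arguments and reports what it did (illustrative).
type SkillFn = (args: Record<string, unknown>) => string

// Hypothetical skill registry; real skills would call Mineflayer APIs.
const skills: Record<string, SkillFn> = {
  goTo: args => `moving to ${args.x},${args.y},${args.z}`,
  digBlock: args => `digging ${args.name}`,
  craftItem: args => `crafting ${args.count}x ${args.name}`,
}

// Look up and run the skill a plan step names, failing on unknown skills.
function execute(step: { skill: string; args: Record<string, unknown> }): string {
  const fn = skills[step.skill]
  if (!fn) throw new Error(`unknown skill: ${step.skill}`)
  return fn(step.args)
}

console.log(execute({ skill: 'digBlock', args: { name: 'oak_log' } })) // 'digging oak_log'
```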
## Code Examples

### Bot Initialization
From `services/minecraft/src/main.ts:22`:
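The file itself is not reproduced here; as a stand-in, Mineflayer bots are created with its standard `createBot` options. The env-var fallbacks and `BOT_USERNAME` are assumptions, while `BOT_HOSTNAME`, `BOT_PORT`, `BOT_AUTH`, and `BOT_VERSION` are the variables documented in this page's Troubleshooting section:

```typescript
// Illustrative options object for mineflayer's createBot(); not the
// actual contents of services/minecraft/src/main.ts.
const botOptions = {
  host: process.env.BOT_HOSTNAME ?? 'localhost',
  port: Number(process.env.BOT_PORT ?? '25565'),
  username: process.env.BOT_USERNAME ?? 'airi', // BOT_USERNAME is an assumption
  auth: (process.env.BOT_AUTH ?? 'offline') as 'offline' | 'microsoft',
  version: process.env.BOT_VERSION ?? '1.20',
}

// With the mineflayer package installed:
//   import { createBot } from 'mineflayer'
//   const bot = createBot(botOptions)
//   bot.once('spawn', () => bot.chat('hello'))
console.log(typeof botOptions.port) // 'number'
```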
### Configuration Loading

From `services/minecraft/src/composables/config.ts:62`:
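That file is not reproduced either; a minimal sketch of env-based config loading, assuming the variables documented above (the `BotConfig` shape and defaults are illustrative, not AIRI's actual composable):

```typescript
// Typed view of the service configuration (illustrative shape).
interface BotConfig {
  hostname: string
  port: number
  version: string
  openaiApiKey: string
  openaiModel: string
}

// Read documented env vars into a typed object, with assumed defaults.
function loadConfig(env: Record<string, string | undefined> = process.env): BotConfig {
  return {
    hostname: env.BOT_HOSTNAME ?? 'localhost',
    port: Number(env.BOT_PORT ?? '25565'),
    version: env.BOT_VERSION ?? '1.20',
    openaiApiKey: env.OPENAI_API_KEY ?? '',
    openaiModel: env.OPENAI_MODEL ?? 'gpt-4o-mini', // default model is an assumption
  }
}

const config = loadConfig({ BOT_HOSTNAME: 'play.example.org', BOT_PORT: '25566' })
console.log(config.hostname) // 'play.example.org'
```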
## Skills System

The bot has modular skills in `src/skills/`:
- Movement (`src/skills/movement.ts`)
- Blocks (`src/skills/blocks.ts`)
- Crafting (`src/skills/crafting.ts`)
- Combat (`src/skills/combat.ts`)
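As a flavor of what a skill module contains, here is a small pure helper in the spirit of the crafting skill; `canCraft` and the item-count shape are assumptions for illustration, not the actual `src/skills/crafting.ts` API:

```typescript
// Item name → quantity, e.g. { oak_planks: 4 }.
type ItemCounts = Record<string, number>

// True if the inventory covers every ingredient the recipe requires.
function canCraft(recipe: ItemCounts, inventory: ItemCounts): boolean {
  return Object.entries(recipe).every(([item, n]) => (inventory[item] ?? 0) >= n)
}

console.log(canCraft({ oak_planks: 4 }, { oak_planks: 6 }))           // true
console.log(canCraft({ stick: 2, oak_planks: 3 }, { oak_planks: 3 })) // false
```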
## Plugin System

Extend bot capabilities with plugins.

## Advanced Features
### Query DSL

Query the world from the conscious layer.

### MCP Integration
The debug server exposes Model Context Protocol tools.

## Troubleshooting
### Bot can’t connect to server
- Verify server is running and accessible
- Check `BOT_HOSTNAME` and `BOT_PORT`
- For online servers, use `BOT_AUTH='microsoft'`
- Ensure the Minecraft version matches: `BOT_VERSION='1.20'`
### Bot is stuck or unresponsive
- Check debug dashboard for task status
- Send the `stop` command in chat
- Restart the bot service
- Check logs for errors
### LLM errors
- Verify `OPENAI_API_KEY` is valid
- Check API rate limits
- Ensure `OPENAI_MODEL` is accessible
- Try a different model endpoint
### Pathfinding fails
- The bot may be in difficult terrain
- Increase the search radius
- Clear obstacles near the bot
- Use `/tp` to relocate the bot
## Performance Tips

- Use `o1-mini` for complex reasoning tasks
- Set shorter tick intervals for faster reactions
- Reduce perception event frequency for lower CPU usage
- Enable the viewer only when debugging
## Next Steps

- Factorio Integration: automate Factorio gameplay
- API Reference: full Minecraft API documentation
