What are Extensions?
Extensions are modular components that integrate with Jan’s core APIs. They are packaged as .tgz files and can be installed through the Jan interface. Each extension extends the BaseExtension class and implements specific interfaces based on its functionality.
Extension Types
Jan supports several extension types:
Inference
Implement custom inference engines for running AI models
Conversational
Manage threads, messages, and conversation persistence
Assistant
Create and manage AI assistants with custom behavior
Engine
Build AI engines for model execution (local or remote)
RAG
Implement retrieval-augmented generation capabilities
Vector DB
Add vector database backends for embeddings
Extension Architecture
Every extension in Jan follows a common structure.
Core Capabilities
Event System
Extensions can listen to and emit events throughout Jan.
Settings Management
Extensions can register and manage their own settings.
Model Registration
Extensions can register models with Jan.
Extension Lifecycle
- Installation: Extensions are installed as .tgz packages
- Loading: onLoad() is called when Jan starts or the extension is enabled
- Active: Extension responds to events and provides functionality
- Unloading: onUnload() is called when Jan closes or the extension is disabled
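The lifecycle and settings-management ideas above can be sketched together. Everything here is illustrative: SettingsRegistry, the Setting shape, and the setting key are invented for the example and are not the actual @janhq/core API:

```typescript
// Illustrative sketch: an extension registers its settings during onLoad.
// SettingsRegistry and Setting are hypothetical, not the @janhq/core API.
type Setting = { key: string; title: string; defaultValue: string | number | boolean };

class SettingsRegistry {
  private store = new Map<string, Setting>();
  register(settings: Setting[]): void {
    for (const s of settings) this.store.set(s.key, s);
  }
  get(key: string): Setting | undefined {
    return this.store.get(key);
  }
}

class ExampleExtension {
  constructor(private registry: SettingsRegistry) {}
  // Loading: runs when Jan starts or the extension is enabled.
  onLoad(): void {
    this.registry.register([
      { key: "example-enabled", title: "Enable example feature", defaultValue: true },
    ]);
  }
  // Unloading: runs when Jan closes or the extension is disabled.
  onUnload(): void {}
}

const registry = new SettingsRegistry();
new ExampleExtension(registry).onLoad();
console.log(registry.get("example-enabled")?.defaultValue); // true
```

Registering settings inside onLoad mirrors the lifecycle above: the settings only exist while the extension is active.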
Available Events
Message Events
MessageEvent.OnMessageSent - Fired when a user sends a message
MessageEvent.OnMessageResponse - Fired when an assistant responds
MessageEvent.OnMessageUpdate - Fired when a message is updated
Inference Events
InferenceEvent.OnInferenceStopped - Fired when inference is stopped
Download Events
DownloadEvent.onFileDownloadUpdate - Progress updates during downloads
DownloadEvent.onFileDownloadError - Download errors
DownloadEvent.onModelValidationStarted - Model validation begins
App Events
AppEvent.onModelImported - Fired when a model is imported
Built-in Extensions
Jan includes several built-in extensions:
- @janhq/conversational-extension - File system-based conversation storage
- @janhq/assistant-extension - Default AI assistant implementation
- @janhq/llamacpp-extension - llama.cpp inference engine
- @janhq/download-extension - File download management
- @janhq/rag-extension - Retrieval-augmented generation
- @janhq/vector-db-extension - Vector database backend
Next Steps
Getting Started
Create your first extension
Core API Reference
Explore the @janhq/core API
Building Extensions
Package and distribute your extension