
Installation

LlamaIndex.TS is designed to work across multiple JavaScript runtime environments. This guide covers installation for all supported runtimes and package managers.

Quick Install

For most Node.js projects, start with:
npm install llamaindex
The llamaindex package provides core functionality. You’ll also need to install provider packages for LLMs, embeddings, and vector stores.
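For example, a typical setup that combines the core package with an LLM provider and a vector store (both covered below) looks like:

```shell
# Core package plus an LLM provider and a vector store integration
npm install llamaindex @llamaindex/openai @llamaindex/pinecone
```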

Runtime Requirements

LlamaIndex.TS supports multiple JavaScript runtimes with different requirements:
| Runtime | Minimum Version | Status |
| --- | --- | --- |
| Node.js | >= 20.0.0 | ✅ Full Support |
| Deno | Latest | ✅ Full Support |
| Bun | Latest | ✅ Full Support |
| Nitro | Latest | ✅ Full Support |
| Vercel Edge | Latest | ✅ Limited Support |
| Cloudflare Workers | Latest | ✅ Limited Support |
| Browser | N/A | ⚠️ Limited (no AsyncLocalStorage) |
Browser support is currently limited due to the lack of AsyncLocalStorage-like APIs. Use edge runtimes for serverless deployments.

Provider Packages

LlamaIndex.TS uses a modular architecture. Install only the providers you need to keep your bundle size small.

LLM Providers

Choose the LLM provider(s) you want to use:
npm install @llamaindex/openai
The @llamaindex/openai package supports GPT-4, GPT-3.5, and OpenAI embeddings.
import { openai } from "@llamaindex/openai";

const llm = openai({ model: "gpt-4o" });

Vector Store Providers

For production use, integrate with a vector database:
npm install @llamaindex/pinecone
import { PineconeVectorStore } from "@llamaindex/pinecone";
import { Pinecone } from "@pinecone-database/pinecone";

const pinecone = new Pinecone({ apiKey: process.env.PINECONE_API_KEY });
const pineconeIndex = pinecone.Index("my-index");

const vectorStore = new PineconeVectorStore({ pineconeIndex });

Runtime-Specific Setup

Node.js

Node.js >= 20 is required for full AsyncLocalStorage support.
node --version  # Should be >= 20.0.0
Standard installation works out of the box:
npm install llamaindex @llamaindex/openai
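LlamaIndex.TS relies on AsyncLocalStorage to carry settings and context across async boundaries, which is why Node.js >= 20 is required. As a minimal sketch of the underlying Node API (plain Node code, not LlamaIndex-specific):

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// Stores a value that stays visible to all async work started inside run().
const context = new AsyncLocalStorage<{ requestId: string }>();

async function whoAmI(): Promise<string | undefined> {
  await Promise.resolve(); // cross an async boundary
  return context.getStore()?.requestId; // still resolves to the run() value
}

context.run({ requestId: "req-1" }, async () => {
  console.log(await whoAmI()); // prints "req-1"
});
```

Environments without this API (such as browsers) cannot track context this way, which is the source of the limited browser support noted above.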

Deno

LlamaIndex.TS works with Deno’s npm compatibility:
import { VectorStoreIndex, Document } from "npm:llamaindex";
import { openai } from "npm:@llamaindex/openai";

// Your code here

Bun

Bun has full support with its fast package manager:
bun add llamaindex @llamaindex/openai
Then use normally:
import { VectorStoreIndex } from "llamaindex";

Vercel Edge Runtime

For Vercel Edge Functions, use the edge-compatible entry point:
// Automatically resolved to the edge-compatible build when running in Vercel Edge
import { VectorStoreIndex } from "llamaindex";

export const config = {
  runtime: "edge",
};

export default async function handler(request: Request): Promise<Response> {
  // Your LlamaIndex code here
  return new Response("OK");
}
Some features that require file system access may be limited in edge environments. Use remote vector stores for data persistence.

Cloudflare Workers

Cloudflare Workers automatically use the workerd-compatible entry point:
import { VectorStoreIndex } from "llamaindex";

export default {
  async fetch(request: Request): Promise<Response> {
    // Your LlamaIndex code here
    return new Response("OK");
  },
};

Environment Variables

Set up your API keys and configuration:
.env
# LLM Providers
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GEMINI_API_KEY=...

# Vector Stores
PINECONE_API_KEY=...
QDRANT_URL=https://...
QDRANT_API_KEY=...

# Optional: LlamaCloud
LLAMA_CLOUD_API_KEY=...
Use a .env file for local development and configure environment variables in your deployment platform for production.
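To fail fast when a key is missing, you can add a small startup check before constructing any clients. The helper below is just a sketch; the variable list and function name are illustrative, not part of LlamaIndex.TS:

```typescript
// Illustrative startup check (the helper name and variable list are
// examples). Returns the names of any variables that are unset or empty.
export function missingEnvVars(required: string[]): string[] {
  return required.filter((name) => !process.env[name]);
}

// Check only the keys your chosen providers actually need:
const missing = missingEnvVars(["OPENAI_API_KEY", "PINECONE_API_KEY"]);
if (missing.length > 0) {
  console.error(`Missing environment variables: ${missing.join(", ")}`);
}
```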

Additional Packages

Workflow and Agents

For building agentic applications:
npm install @llamaindex/workflow

File Readers

For reading different file formats:
# PDF support
npm install @llamaindex/pdf-reader

# DOCX support  
npm install @llamaindex/docx-reader

# Notion integration
npm install @llamaindex/notion

# Discord integration
npm install @llamaindex/discord

LlamaCloud

For managed RAG with LlamaCloud:
npm install @llamaindex/cloud

Verifying Installation

Create a simple test file to verify everything works:
test.ts
import { Document } from "llamaindex";

const doc = new Document({ text: "Hello, LlamaIndex!" });
console.log("Installation successful!", doc.getText());
Run it:
npx tsx test.ts
If you see “Installation successful!”, you’re ready to go!

Troubleshooting

Module Resolution Errors

If you encounter module resolution errors, ensure you’re using Node.js >= 20:
nvm install 20
nvm use 20

TypeScript Configuration

Add these settings to your tsconfig.json:
tsconfig.json
{
  "compilerOptions": {
    "module": "ESNext",
    "moduleResolution": "bundler",
    "target": "ES2022",
    "lib": ["ES2022"]
  }
}

API Key Issues

If your API key isn’t being recognized:
  1. Ensure .env is in your project root
  2. Install dotenv and load it:
    import "dotenv/config";
    
  3. Verify the key is set:
    console.log(process.env.OPENAI_API_KEY);
    

Next Steps

Quickstart

Build your first RAG application

Core Concepts

Learn about Documents, Nodes, and Indices
