Overview

DocSearch offers free crawler hosting for open source projects and technical documentation. Algolia maintains and runs the crawler infrastructure, so you only need to implement the search UI on your site.
DocSearch is completely free. In exchange, we ask that you keep the “Search by Algolia” logo displayed next to search results.

Eligibility Requirements

DocSearch is available for:
  • Open source project documentation
  • Technical blogs with programming content
  • Public documentation sites (not behind authentication)
  • Production-ready websites (not staging or development sites)
We typically decline applications for:
  • Marketing or commercial content
  • Non-technical documentation
  • Private/internal documentation
  • Sites under development

What’s Included

When accepted to the DocSearch program, you receive:

Free Algolia Application

A dedicated Algolia app with your documentation index

Crawler Hosting

Automated weekly crawls of your documentation

Dashboard Access

Monitor crawls, view analytics, and manage your index

API Keys

Search-only API keys for your frontend integration

Application Process

Applying to DocSearch is streamlined with automated validation:
1. Submit Your Domain

Go to the DocSearch application page and sign up. Submit your documentation domain for automated validation against our requirements.
2. Automated Validation

Our system checks your site against eligibility criteria:
  • Is the content technical?
  • Is it publicly accessible?
  • Is it production-ready?
Fast track: If you meet all criteria, you’ll be approved immediately.
Manual review: If automated validation is inconclusive, we’ll manually review within 1-2 business days.
3. Create Your Crawler

Once approved, continue the onboarding process in the Algolia dashboard:
  • Configure your crawler settings
  • Define URL patterns
  • Set up content selectors
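As an illustration, the three pieces above can be combined in a single configuration. The sketch below is hypothetical — the URLs, index name, and CSS selectors are placeholders, and the `helpers.docsearch` record extractor follows the general shape of the documented DocSearch crawler template; verify the exact fields in the dashboard’s live editor:

```javascript
// Illustrative DocSearch crawler configuration (placeholder values throughout).
const config = {
  // Where the crawler begins discovering pages.
  startUrls: ['https://example.com/docs/'],

  actions: [
    {
      indexName: 'YOUR_INDEX_NAME',
      // URL patterns: only pages matching these globs are indexed.
      pathsToMatch: ['https://example.com/docs/**'],
      // Content selectors: map page headings/body text to record levels.
      recordExtractor: ({ helpers }) =>
        helpers.docsearch({
          recordProps: {
            lvl0: { selectors: 'h1' },
            lvl1: ['h2'],
            lvl2: ['h3'],
            content: ['p, li'],
          },
        }),
    },
  ],
};
```

Tuning the `lvl0`–`lvl2` selectors to match your site’s actual heading elements is what gives search results a meaningful hierarchy.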
4. Verify Domain Ownership

You must verify domain ownership within 7 days of approval to continue using the crawler. Follow the verification instructions provided in the dashboard.
5. Data Ingestion

The crawler runs and indexes your documentation. This may take from a few minutes to several hours depending on your site’s size.
6. Implement Search UI

Once data is ingested, implement DocSearch on your site using one of the official integrations (see “Using Official Integrations” below).

Timeline

Automated Approval

If your site meets all criteria: Immediate approval → proceed to crawler setup

Manual Review

If automated validation is inconclusive: 1-2 business days for manual review

First Crawl

After approval and configuration: Minutes to hours depending on site size

Ongoing Crawls

Default schedule: Once per week (can trigger manually anytime)

Before You Apply

Make sure your documentation site is ready:
The crawler must be able to access your documentation without authentication. If your docs require login, DocSearch won’t work.
Your content should be present in the HTML source (not loaded entirely via JavaScript). View your page source to confirm. If you must use client-side rendering, the crawler can use a headless browser, but this significantly slows down crawling.
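One quick way to sanity-check this is to strip the tags from your page’s raw HTML and look for a phrase from your docs. The helper below is a hypothetical illustration, not part of DocSearch (in practice you would pass it the response of a `fetch` against your page):

```javascript
// Does a phrase from your docs appear in the raw HTML source,
// i.e. before any client-side rendering runs?
function contentInSource(html, phrase) {
  // Strip tags so matches aren't broken by markup like "<h2>Install</h2>".
  const text = html.replace(/<[^>]*>/g, ' ');
  return text.toLowerCase().includes(phrase.toLowerCase());
}

// A server-rendered page contains the content; a pure JS shell does not.
const serverRendered = '<main><h2>Installation</h2><p>Run npm install.</p></main>';
const clientOnly = '<div id="root"></div><script src="/bundle.js"></script>';

console.log(contentInSource(serverRendered, 'npm install')); // true
console.log(contentInSource(clientOnly, 'npm install'));     // false
```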
Use proper heading tags (h1, h2, h3) to structure your content. This helps the crawler build a meaningful hierarchy.
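To preview the hierarchy your headings produce, you can extract an outline from the page source. This is an illustrative sketch using a simple regex (a real HTML parser would be more robust), not something the crawler itself exposes:

```javascript
// Extract an indented h1-h3 outline from raw HTML.
function headingOutline(html) {
  const matches = [...html.matchAll(/<h([1-3])[^>]*>(.*?)<\/h\1>/gis)];
  // Indent each heading two spaces per level below h1.
  return matches.map(([, level, text]) => `${'  '.repeat(level - 1)}${text.trim()}`);
}

const page = '<h1>Guide</h1><h2>Install</h2><h3>With npm</h3><h2>Usage</h2>';
console.log(headingOutline(page));
// [ 'Guide', '  Install', '    With npm', '  Usage' ]
```

If the outline looks flat or jumbled, the crawler’s record hierarchy will too.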
All documentation pages should be reachable by following links from your main docs page.
Your site should be stable and ready for users. We don’t index staging or development environments.

Using Official Integrations

We recommend using one of our official integrations for easier implementation:

Docusaurus

Built-in DocSearch integration

VitePress

First-class DocSearch support

React

@docsearch/react component

JavaScript

Vanilla JS implementation

After Acceptance

What You’ll Receive

You’ll get an email with:
  1. Application ID: Your Algolia app identifier
  2. API Key: A search-only key (safe to commit to your repo)
  3. Index Name: The name of your documentation index
  4. Dashboard Access: Credentials to access your Algolia dashboard

Implementation

Add DocSearch to your site:
import docsearch from '@docsearch/js';
import '@docsearch/css';

docsearch({
  container: '#docsearch',       // selector for the element that hosts the search button
  appId: 'YOUR_APP_ID',          // from your acceptance email
  indexName: 'YOUR_INDEX_NAME',  // from your acceptance email
  apiKey: 'YOUR_SEARCH_API_KEY', // search-only key; safe to expose
});
The API key provided is search-only and can be safely committed to version control.

Managing Your Crawler

Access the Crawler Dashboard to:
  • Trigger crawls manually when you update docs
  • Edit your configuration in the live editor
  • Monitor crawl statistics and errors
  • View indexed content and record structure
  • Adjust crawl frequency (within limits)
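Manual crawls can also be triggered programmatically. The sketch below assumes the Crawler REST API’s reindex endpoint and Basic-auth scheme; the crawler ID, user ID, and API key placeholders come from the Crawler dashboard (not your search key), and you should verify the endpoint against the current Crawler API reference before relying on it:

```javascript
// Build a request for the (assumed) Crawler API reindex endpoint.
function buildReindexRequest(crawlerId, userId, apiKey) {
  const auth = Buffer.from(`${userId}:${apiKey}`).toString('base64');
  return {
    url: `https://crawler.algolia.com/api/1/crawlers/${crawlerId}/reindex`,
    options: {
      method: 'POST',
      headers: { Authorization: `Basic ${auth}` },
    },
  };
}

// Usage (requires network access and real credentials):
// const { url, options } = buildReindexRequest('my-crawler-id', USER_ID, API_KEY);
// await fetch(url, options);
```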

Multiple Projects

We recommend one Algolia application per project.
If you have multiple documentation sites, please submit separate applications. This ensures:
  • Proper API key scoping
  • Clear application naming
  • Correct index generation
  • Proper domain restrictions
  • Easier support and troubleshooting
Apply for additional projects using the same process.

Cost and Limitations

Free Forever

DocSearch is completely free for eligible sites, including:
  • Crawler hosting and maintenance
  • Algolia search infrastructure
  • Weekly automated crawls
  • Dashboard access
  • Analytics and monitoring

Requirements

In exchange for free hosting:
  • Keep the “Search by Algolia” logo visible
  • Use it only for technical documentation or blogs
  • Keep your documentation publicly accessible
If you cannot display the Algolia logo, you have two options:
  1. Self-host the crawler: Use the open source docsearch-scraper with your own Algolia account
  2. Paid Algolia plan: Contact Algolia sales for a commercial plan

Alternative Options

Run Your Own Crawler

If you’re not eligible or need more control:

Self-Hosted Crawler

Run the open source DocSearch crawler yourself
You’ll need:
  • Your own Algolia account (free tier supports up to 10k records)
  • Docker to run the crawler
  • A way to schedule regular crawls

Use Algolia Directly

For non-documentation use cases:

Algolia API

Use Algolia’s API clients directly

Frequently Asked Questions

Can I use DocSearch for non-technical content?
The free DocSearch program only covers technical documentation and blogs. For other content, you’ll need to run your own crawler or use Algolia’s commercial plans.

Is it safe to commit the API key to a public repository?
Yes! The API key provided is search-only and restricted to your specific index. It’s safe to commit to public repositories.

How often does the crawler run?
By default, once per week. You can trigger manual crawls anytime from the Crawler Dashboard.

Can the crawler index a local development site?
No. The crawler can only access publicly available URLs. For local development, run the crawler yourself.

Can DocSearch index private documentation?
The free DocSearch program requires public documentation. For private docs, you’ll need to self-host the crawler with authentication credentials.

How do I change my crawler configuration?
Edit your configuration in the Crawler Dashboard live editor. Changes take effect on the next crawl.

Should I index code samples?
While possible, we don’t recommend it. Code samples often contain repetitive boilerplate that adds noise to search results. Focus on prose documentation.

Are there limits on index size?
DocSearch applications have generous limits for documentation use cases. If you need more, contact Algolia support.

Terms and Conditions

Please review the DocSearch Plan Terms and Conditions before applying.

Get Help

Before Applying

Join Discord

Ask questions and get help from the community

Read the Docs

Learn about DocSearch features and integration

Ready to Apply?

Apply to DocSearch

Submit your documentation site for free DocSearch hosting
Make sure you’ve reviewed the eligibility requirements and prepared your site before applying!
