Overview
DocSearch offers free crawler hosting for open source projects and technical documentation. Algolia maintains and runs the crawler infrastructure, so you only need to implement the search UI on your site.

DocSearch is completely free. In exchange, we ask that you keep the “Search by Algolia” logo displayed next to search results.
Eligibility Requirements
DocSearch is available for:

✅ Open source project documentation
✅ Technical blogs with programming content
✅ Public documentation sites (not behind authentication)
✅ Production-ready websites (not staging or development sites)

What’s Included
When accepted to the DocSearch program, you receive:

Free Algolia Application
A dedicated Algolia app with your documentation index
Crawler Hosting
Automated weekly crawls of your documentation
Dashboard Access
Monitor crawls, view analytics, and manage your index
API Keys
Search-only API keys for your frontend integration
Application Process
Applying to DocSearch is streamlined with automated validation:

Submit Your Domain
Go to the DocSearch application page and sign up. Submit your documentation domain for automated validation against our requirements.
Automated Validation
Our system checks your site against eligibility criteria:
- Is the content technical?
- Is it publicly accessible?
- Is it production-ready?
Create Your Crawler
Once approved, continue the onboarding process in the Algolia dashboard:
- Configure your crawler settings
- Define URL patterns
- Set up content selectors
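As an illustration of these steps, the sketch below shows roughly what a crawler configuration object looks like in the dashboard’s live editor. The domain, URL patterns, CSS selectors, and index name are all placeholders you would replace with your own, and the exact options available depend on your dashboard.

```javascript
new Crawler({
  appId: 'YOUR_APP_ID',
  apiKey: 'YOUR_CRAWLER_API_KEY',
  rateLimit: 8,
  startUrls: ['https://example.com/docs/'], // where crawling begins
  actions: [
    {
      indexName: 'your_index_name',
      // URL patterns: only pages matching these globs are indexed
      pathsToMatch: ['https://example.com/docs/**'],
      recordExtractor: ({ helpers }) =>
        // Content selectors: map page elements to the DocSearch hierarchy
        helpers.docsearch({
          recordProps: {
            lvl0: { selectors: 'header h1', defaultValue: 'Documentation' },
            lvl1: 'article h2',
            lvl2: 'article h3',
            content: 'article p, article li',
          },
        }),
    },
  ],
});
```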
Verify Domain Ownership
You must verify domain ownership within 7 days of approval to continue using the crawler. Follow the verification instructions provided in the dashboard.
Data Ingestion
The crawler runs and indexes your documentation. This may take from a few minutes to several hours depending on your site’s size.
Implement Search UI
Once data is ingested, implement DocSearch on your site using:
- The provided code snippet, or
- One of our framework integrations
Timeline
Automated Approval
If your site meets all criteria: Immediate approval → proceed to crawler setup

Manual Review
If automated validation is inconclusive: 1–2 business days for manual review

First Crawl
After approval and configuration: Minutes to hours depending on site size

Ongoing Crawls
Default schedule: Once per week (you can trigger a crawl manually anytime)

Before You Apply
Make sure your documentation site is ready:

✅ Content is public and accessible
The crawler must be able to access your documentation without authentication. If your docs require login, DocSearch won’t work.
✅ Server-side rendering is enabled
Your content should be present in the HTML source (not loaded entirely via JavaScript). View your page source to confirm.

If you must use client-side rendering, the crawler can use a headless browser, but this significantly slows down crawling.
✅ Semantic HTML structure
Use proper heading tags (h1, h2, h3) to structure your content. This helps the crawler build a meaningful hierarchy.
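To see why heading structure matters, here is an illustrative sketch (not the actual crawler code) of how nested headings become the record hierarchy that DocSearch displays. The `buildHierarchy` helper and its input shape are hypothetical:

```javascript
// Sketch: each heading produces a record carrying the headings above it,
// mirroring the lvl1/lvl2/lvl3 hierarchy DocSearch shows in results.
function buildHierarchy(headings) {
  const current = { lvl1: null, lvl2: null, lvl3: null };
  return headings.map(({ tag, text }) => {
    if (tag === 'h1') {
      current.lvl1 = text;
      current.lvl2 = null;
      current.lvl3 = null;
    } else if (tag === 'h2') {
      current.lvl2 = text;
      current.lvl3 = null;
    } else if (tag === 'h3') {
      current.lvl3 = text;
    }
    return { ...current }; // snapshot the hierarchy at this point
  });
}

const records = buildHierarchy([
  { tag: 'h1', text: 'Getting Started' },
  { tag: 'h2', text: 'Installation' },
  { tag: 'h3', text: 'With npm' },
  { tag: 'h2', text: 'Configuration' },
]);
// records[2] → { lvl1: 'Getting Started', lvl2: 'Installation', lvl3: 'With npm' }
```

If headings are skipped or replaced by styled `div`s, the hierarchy collapses and search results lose their context.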
✅ Internal linking
All documentation pages should be reachable by following links from your main docs page.
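Crawling is essentially a graph traversal from your entry page, so any page with no inbound link path is invisible to the crawler. A small self-contained sketch (the `siteLinks` map is hypothetical):

```javascript
// Sketch: breadth-first traversal from the entry page, the same way a
// crawler discovers pages; returns pages it could never reach.
function findUnreachable(siteLinks, entry) {
  const seen = new Set([entry]);
  const queue = [entry];
  while (queue.length > 0) {
    const page = queue.shift();
    for (const link of siteLinks[page] ?? []) {
      if (!seen.has(link)) {
        seen.add(link);
        queue.push(link);
      }
    }
  }
  return Object.keys(siteLinks).filter((page) => !seen.has(page));
}

const siteLinks = {
  '/docs/': ['/docs/install', '/docs/config'],
  '/docs/install': ['/docs/'],
  '/docs/config': [],
  '/docs/orphan': [], // no inbound links: the crawler will miss it
};
findUnreachable(siteLinks, '/docs/'); // → ['/docs/orphan']
```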
✅ Production ready
Your site should be stable and ready for users. We don’t index staging or development environments.
Using Official Integrations
We recommend using one of our official integrations for easier implementation:

Docusaurus
Built-in DocSearch integration
VitePress
First-class DocSearch support
React
@docsearch/react component
JavaScript
Vanilla JS implementation
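For example, the React package exposes a `DocSearch` component; a minimal usage sketch with placeholder credentials looks roughly like this:

```jsx
import { DocSearch } from '@docsearch/react';
import '@docsearch/css';

function SearchBar() {
  return (
    <DocSearch
      appId="YOUR_APP_ID"
      apiKey="YOUR_SEARCH_ONLY_API_KEY"
      indexName="YOUR_INDEX_NAME"
    />
  );
}
```

The framework integrations (Docusaurus, VitePress) accept the same three values in their site configuration, so you rarely need to write this component yourself.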
After Acceptance
What You’ll Receive
You’ll get an email with:

- Application ID: Your Algolia app identifier
- API Key: A search-only key (safe to commit to your repo)
- Index Name: The name of your documentation index
- Dashboard Access: Credentials to access your Algolia dashboard
Implementation
Add DocSearch to your site using the provided code snippet or one of the framework integrations. The API key provided is search-only and can be safely committed to version control.
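With the vanilla JavaScript package, wiring up the values from your acceptance email looks roughly like this (placeholder credentials; runs in the browser against an element you provide):

```javascript
import docsearch from '@docsearch/js';
import '@docsearch/css';

docsearch({
  container: '#docsearch',            // an element on your page
  appId: 'YOUR_APP_ID',               // from the acceptance email
  apiKey: 'YOUR_SEARCH_ONLY_API_KEY', // search-only: safe to commit
  indexName: 'YOUR_INDEX_NAME',
});
```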
Managing Your Crawler
Access the Crawler Dashboard to:

- Trigger crawls manually when you update docs
- Edit your configuration in the live editor
- Monitor crawl statistics and errors
- View indexed content and record structure
- Adjust crawl frequency (within limits)
Multiple Projects
If you have multiple documentation sites, please submit separate applications. This ensures:

- Proper API key scoping
- Clear application naming
- Correct index generation
- Proper domain restrictions
- Easier support and troubleshooting
Cost and Limitations
Free Forever
DocSearch is completely free for eligible sites, including:

- Crawler hosting and maintenance
- Algolia search infrastructure
- Weekly automated crawls
- Dashboard access
- Analytics and monitoring
Requirements
In exchange for free hosting:

✅ Keep the “Search by Algolia” logo visible
✅ Use it only for technical documentation or blogs
✅ Keep your documentation publicly accessible

Removing the Logo
If you cannot display the Algolia logo, you have two options:

- Self-host the crawler: Use the open source docsearch-scraper with your own Algolia account
- Paid Algolia plan: Contact Algolia sales for a commercial plan
Alternative Options
Run Your Own Crawler
If you’re not eligible or need more control:

Self-Hosted Crawler
Run the open source DocSearch crawler yourself. You’ll need:
- Your own Algolia account (free tier supports up to 10k records)
- Docker to run the crawler
- A way to schedule regular crawls
Use Algolia Directly
For non-documentation use cases:

Algolia API
Use Algolia’s API clients directly
Frequently Asked Questions
Can I use DocSearch for non-documentation pages?
The free DocSearch program only covers technical documentation and blogs. For other content, you’ll need to run your own crawler or use Algolia’s commercial plans.
Can I share my API key publicly?
Yes. The API key you receive is search-only, so it is safe to expose in your frontend code and commit to version control.
How often does the crawler run?
By default, once per week. You can trigger manual crawls anytime from the Crawler Dashboard.
Can I crawl localhost or staging sites?
No. The crawler can only access publicly available URLs. For local development, run the crawler yourself.
What if my documentation has authentication?
The free DocSearch program requires public documentation. For private docs, you’ll need to self-host the crawler with authentication credentials.
How do I update my crawler configuration?
Edit your configuration in the Crawler Dashboard live editor. Changes take effect on the next crawl.
Can I index code examples?
While possible, we don’t recommend it. Code samples often contain repetitive boilerplate that adds noise to search results. Focus on prose documentation.
What happens if I exceed free tier limits?
DocSearch applications have generous limits for documentation use cases. If you need more, contact Algolia support.
Terms and Conditions
Please review the DocSearch Plan Terms and Conditions before applying.
Get Help
Before Applying
Join Discord
Ask questions and get help from the community
Read the Docs
Learn about DocSearch features and integration
After Acceptance
- Crawler issues: Algolia Support
- UI library issues: GitHub Issues
- General questions: Discord community
Ready to Apply?
Apply to DocSearch
Submit your documentation site for free DocSearch hosting
Make sure you’ve reviewed the eligibility requirements and prepared your site before applying!
