Introduction
Welcome to the CheckThat AI API - an advanced platform for claim normalization, fact-checking, and LLM output evaluation. Our API provides OpenAI-compatible endpoints with enhanced features for improving the accuracy and reliability of AI-generated claims.

Base URL
API Endpoints
CheckThat AI provides several categories of endpoints:

Chat Endpoints
- POST /chat - Real-time chat interface for claim normalization
- POST /v1/chat/completions - OpenAI-compatible chat completions with CheckThat AI enhancements
Model Endpoints
- GET /v1/models - List all available LLM models across providers
Health Endpoints
- GET / - Root endpoint with API information
- GET /health - Health check endpoint
Supported LLM Providers
CheckThat AI supports multiple LLM providers:

OpenAI
GPT-5, GPT-5 nano, o3, o4-mini
Anthropic
Claude Sonnet 4, Claude Opus 4.1
Google
Gemini 2.5 Pro, Gemini 2.5 Flash
xAI
Grok 3, Grok 4, Grok 3 Mini
Together AI
Llama 3.3 70B, DeepSeek R1
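Since models are listed through the OpenAI-compatible GET /v1/models endpoint, a client can group them by provider. The sketch below assumes the response follows OpenAI's model-list shape (an object with a "data" array whose entries carry an "owned_by" field); the sample data is illustrative, not actual API output.

```python
from collections import defaultdict

# Illustrative /v1/models response in OpenAI's model-list format.
sample_response = {
    "object": "list",
    "data": [
        {"id": "gpt-5", "object": "model", "owned_by": "openai"},
        {"id": "claude-sonnet-4", "object": "model", "owned_by": "anthropic"},
        {"id": "grok-4", "object": "model", "owned_by": "xai"},
    ],
}

def models_by_provider(listing: dict) -> dict[str, list[str]]:
    """Group model ids by their provider ("owned_by") field."""
    grouped: dict[str, list[str]] = defaultdict(list)
    for model in listing["data"]:
        grouped[model["owned_by"]].append(model["id"])
    return dict(grouped)
```

Calling `models_by_provider(sample_response)` returns one list of model ids per provider, which is convenient for building a provider-selection UI.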
Rate Limiting
To ensure fair usage and service availability, CheckThat AI implements rate limiting:

Rate Limit Headers
All responses include rate limit information:

Rate Limit Exceeded Response
When the rate limit is exceeded, you’ll receive a 429 status code:
Response Format
All API responses follow standard formats:

Success Response
Successful requests return appropriate data based on the endpoint, typically following OpenAI’s response structure for compatibility.

Error Response
Error responses include detailed information:

HTTP Status Codes
- 200 - Request successful
- 400 - Invalid request parameters or malformed JSON
- 401 - Missing or invalid API key/authentication token
- 403 - Authentication valid but insufficient permissions
- 422 - Request validation failed
- 429 - Rate limit exceeded
- 500 - Unexpected server error occurred
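The status codes above fall into a few coarse handling categories on the client side. This is a minimal classification sketch (the category names are illustrative, not part of the API):

```python
# Statuses worth retrying: rate limits and transient server errors.
RETRYABLE = {429, 500}

def classify_status(status: int) -> str:
    """Return a coarse client-side handling category for an HTTP status code."""
    if status == 200:
        return "success"
    if status in (400, 422):
        return "fix-request"      # malformed JSON or failed validation
    if status in (401, 403):
        return "fix-credentials"  # authentication or permission problem
    if status in RETRYABLE:
        return "retry"            # rate limit or transient server error
    return "unknown"
```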
Getting Started
1. Choose Your Authentication Method
CheckThat AI supports two authentication approaches:

- API Key Authentication - Use your OpenAI/Anthropic/etc. API key directly
- Bearer Token Authentication - For the /v1/chat/completions endpoint
2. Make Your First Request
Simple chat request:

3. Explore Advanced Features
CheckThat AI offers enhanced features beyond standard LLM APIs:

- Claim Refinement - Automatically improve claim quality through iterative evaluation
- Post-Normalization Evaluation - Assess output quality with custom metrics
- Multi-Provider Support - Seamlessly switch between LLM providers
- Streaming Support - Real-time response streaming
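A simple first request (step 2 above) sends an OpenAI-compatible chat-completions body to POST /v1/chat/completions. The sketch below only builds the request body; the model id and prompt wording are illustrative, and streaming is enabled via the standard `stream` flag:

```python
import json

def build_chat_request(claim: str, model: str = "gpt-5", stream: bool = False) -> dict:
    """Build an OpenAI-compatible body for POST /v1/chat/completions."""
    return {
        "model": model,
        "messages": [
            {"role": "user", "content": f"Normalize this claim: {claim}"},
        ],
        "stream": stream,
    }

# Serialize the body for sending with any HTTP client.
body = json.dumps(build_chat_request("The earth is flat", stream=True))
```

Send `body` with your preferred HTTP client, with your API key or bearer token in the Authorization header as described under Authentication.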
CORS Configuration
The API implements endpoint-specific CORS policies:

Public API Endpoints (/v1/*)
- Accepts requests from all origins (*)
- Suitable for client-side applications
Restricted Endpoints (/chat)
- Limited to specific domains:
  - https://www.checkthat-ai.com
  - https://checkthat-ai.com
  - https://nikhil-kadapala.github.io
API Versioning
The current API version is v1.0.0. Version information is included in all responses:

SDKs and Libraries
CheckThat AI is compatible with OpenAI SDKs:

Next Steps
Authentication
Learn how to authenticate your API requests
Chat Completions
Create chat completions with CheckThat AI features
Health Checks
Monitor API health and availability
Batch Processing
Process multiple claims efficiently