Overview
Adist integrates with Anthropic’s Claude API to provide powerful AI-driven code analysis using state-of-the-art language models. Claude offers excellent code understanding and generation capabilities.
Available Models
You can choose from three Claude 3 models:
- Claude 3 Opus: Most capable model for complex tasks
- Claude 3 Sonnet: Balanced performance and cost (default)
- Claude 3 Haiku: Fastest model for simple queries
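These tiers correspond to Anthropic API model IDs. A minimal sketch of the mapping (the IDs are those Anthropic published at the Claude 3 launch; the tier labels are this sketch's own names, not necessarily what adist uses internally):

```typescript
// Claude 3 model IDs as published by Anthropic.
// The opus/sonnet/haiku labels are this sketch's assumptions.
const CLAUDE_MODELS = {
  opus: "claude-3-opus-20240229",     // most capable, for complex tasks
  sonnet: "claude-3-sonnet-20240229", // balanced performance and cost (default)
  haiku: "claude-3-haiku-20240307",   // fastest, for simple queries
} as const;

type ClaudeTier = keyof typeof CLAUDE_MODELS; // "opus" | "sonnet" | "haiku"
```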
Setup
Get an API Key
Sign up for an Anthropic API key at console.anthropic.com
Set Environment Variable
Add your API key to your environment. On Linux/macOS, export it in your shell; on Windows, set it in PowerShell. To make it permanent, add the line to your ~/.bashrc, ~/.zshrc, or ~/.profile.
Configure Adist
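Before running the configuration command, confirm the key is set in your current shell. A bash-style sketch (the value is a placeholder, not a real key):

```shell
# Set the key for the current session (replace the placeholder with your real key)
export ANTHROPIC_API_KEY="your-api-key-here"

# Confirm the variable is visible to child processes such as adist
echo "$ANTHROPIC_API_KEY"
```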
Run the LLM configuration command, then select:
- Anthropic as your provider
- Your preferred Claude model (Opus, Sonnet, or Haiku)
Features
Context Caching
The Anthropic service implementation includes intelligent context caching:
- Topic Identification: Automatically identifies query topics using AI
- Cache Duration: Contexts are cached for 30 minutes
- Related Context Merging: Similar topics are merged for better responses
- Cache Cleanup: Old entries are automatically removed
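The caching behavior above can be sketched as a topic-keyed map with a 30-minute TTL. This is an illustration of the documented policy, not the actual implementation in anthropic.ts:

```typescript
// Sketch of a topic-keyed context cache with a 30-minute TTL.
// Names and shape are assumptions; only the timeout value comes from the docs.
const CACHE_TIMEOUT_MS = 30 * 60 * 1000;

interface CachedContext {
  context: string;
  createdAt: number; // epoch milliseconds
}

const contextCache = new Map<string, CachedContext>();

function getCachedContext(topic: string, now = Date.now()): string | undefined {
  const entry = contextCache.get(topic);
  if (!entry) return undefined;
  if (now - entry.createdAt > CACHE_TIMEOUT_MS) {
    contextCache.delete(topic); // expired entries are cleaned up lazily
    return undefined;
  }
  return entry.context;
}

function setCachedContext(topic: string, context: string, now = Date.now()): void {
  contextCache.set(topic, { context, createdAt: now });
}
```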
Query Complexity Estimation
Queries are analyzed and categorized as:
- Low Complexity: Simple questions (< 8 words, no technical terms)
- Medium Complexity: Standard questions (8-15 words or basic technical terms)
- High Complexity: Complex questions (> 15 words, code snippets, comparisons)
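The word-count thresholds above can be sketched as a simple classifier. The technical-term list and code-snippet pattern here are illustrative stand-ins, not the signals adist actually checks:

```typescript
// Sketch of the documented complexity heuristic.
type Complexity = "low" | "medium" | "high";

const TECHNICAL_TERMS = /\b(function|class|interface|async|api|regex)\b/i; // illustrative list
const CODE_SNIPPET = /```|=>|[{};]/; // crude signal that the query contains code

function estimateComplexity(query: string): Complexity {
  const words = query.trim().split(/\s+/).length;
  if (words > 15 || CODE_SNIPPET.test(query)) return "high";
  if (words >= 8 || TECHNICAL_TERMS.test(query)) return "medium";
  return "low";
}
```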
Document Relevance Scoring
The service scores documents based on:
- Code blocks and syntax
- Comments and documentation
- Function definitions (function, =>)
- Class definitions (class, interface)
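A sketch of that scoring idea, counting the listed signals with simple regexes (the weights and patterns are this sketch's choices, not adist's):

```typescript
// Illustrative relevance scorer: counts the documented signals.
// Weights are arbitrary choices for this sketch.
function scoreDocument(content: string): number {
  let score = 0;
  score += (content.match(/```/g)?.length ?? 0) * 3;                      // code blocks
  score += (content.match(/\/\/|\/\*|#\s/g)?.length ?? 0);                // comments / docs
  score += (content.match(/\bfunction\b|=>/g)?.length ?? 0) * 2;          // function definitions
  score += (content.match(/\bclass\b|\binterface\b/g)?.length ?? 0) * 2;  // class definitions
  return score;
}
```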
Conversation Analysis
In chat mode, the service analyzes conversation patterns to detect:
- Follow-up Questions: Short queries or questions building on previous context
- Deep Dives: Extended conversations on related topics
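Follow-up detection can be sketched like this (the length threshold and cue words are assumptions; the real analysis in anthropic.ts may use different signals):

```typescript
// Illustrative follow-up detector: short queries, or queries that lean on
// the previous exchange, count as follow-ups when history exists.
const FOLLOW_UP_CUES = /\b(that|it|this|why|what about|and then)\b/i; // illustrative cue words

function isFollowUp(query: string, hasHistory: boolean): boolean {
  if (!hasHistory) return false;
  const words = query.trim().split(/\s+/).length;
  return words <= 5 || FOLLOW_UP_CUES.test(query);
}
```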
Code Reference
The Anthropic service is implemented in src/utils/anthropic.ts:20.
Key Methods
summarizeFile
Generates comprehensive summaries of individual files.
generateOverallSummary
Creates a high-level project overview from file summaries.
queryProject
Answers questions about your project with context optimization.
chatWithProject
Enables conversational interactions with full history support.
Pricing
Claude 3 Sonnet (default) pricing:
- $3 per million input tokens and $15 per million output tokens
Token usage is optimized through context caching and intelligent document selection.
Configuration Options
Context Limits
- Maximum Context Length: 60,000 characters
- Cache Timeout: 30 minutes
- Dynamic Adjustment: Context size varies based on query complexity
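The documented limits can be expressed as constants, with a sketch of the dynamic adjustment (the per-complexity scaling factors are this sketch's assumptions, not adist's actual values):

```typescript
// Limits from the docs; scaling factors below are illustrative assumptions.
const MAX_CONTEXT_LENGTH = 60_000; // characters
const CACHE_TIMEOUT_MINUTES = 30;

function contextBudget(complexity: "low" | "medium" | "high"): number {
  const scale = { low: 0.25, medium: 0.5, high: 1.0 }[complexity];
  return Math.floor(MAX_CONTEXT_LENGTH * scale);
}
```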
Optimization Strategies
The service employs several strategies to optimize API usage:- Context Reuse: Related queries share cached context
- Relevance Filtering: Only the most relevant documents are included
- Smart Truncation: Documents are truncated based on relevance scores
- Project Summaries: High-level overviews supplement missing context
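The relevance-filtering and smart-truncation steps can be sketched together: rank documents by score, then spend a character budget down the ranking (document shape and the greedy strategy are assumptions of this sketch):

```typescript
// Illustrative: keep the highest-scoring documents and truncate each to fit
// the remaining character budget (header lines are not counted, for brevity).
interface ScoredDoc {
  path: string;
  content: string;
  score: number;
}

function buildContext(docs: ScoredDoc[], budget: number): string {
  const ranked = [...docs].sort((a, b) => b.score - a.score);
  const parts: string[] = [];
  let remaining = budget;
  for (const doc of ranked) {
    if (remaining <= 0) break;
    const slice = doc.content.slice(0, remaining); // smart truncation, crudely
    parts.push(`// ${doc.path}\n${slice}`);
    remaining -= slice.length;
  }
  return parts.join("\n\n");
}
```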
Best Practices
Efficient queries, cost optimization, and quality results all come from the same habits:
- Ask specific, focused questions
- Use streaming mode for long responses
- Leverage chat mode for related follow-up questions
Troubleshooting
API Key Not Found
If you see “ANTHROPIC_API_KEY environment variable is required”:
- Verify the environment variable is set: echo $ANTHROPIC_API_KEY
- Restart your terminal after setting the variable
- Check for typos in the variable name
Rate Limits
If you encounter rate limiting:
- Wait a few moments before retrying
- Consider reducing query frequency
- Check your API usage at console.anthropic.com
Poor Response Quality
- Ensure your project is fully indexed: adist reindex
- Generate file summaries: adist reindex --summarize
- Try asking more specific questions
- Use chat mode for context-aware follow-ups
Next Steps
- Start Querying: Ask questions about your codebase
- Start Chatting: Have conversations about your project