Syntax
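The synopsis line did not survive extraction. A plausible shape, assuming the question is passed as a single quoted argument (see Arguments below) — verify against `adist ask --help`:

```
adist ask [options] "<question>"
```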
Arguments
The natural language question to ask about your project.
Options
Enable streaming responses for real-time output. Note: code highlighting may not work properly in streaming mode.
Behavior
- Searches for relevant code blocks using semantic search
- Retrieves project summaries if available and relevant
- Sends context to configured LLM provider
- Returns AI-generated answer with code examples
- Displays cost estimation and context information
Examples
Ask about functionality
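A representative invocation (the question text is illustrative):

```shell
adist ask "How does authentication work in this project?"
```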
Request code examples
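Asking for examples works the same way; the AI-generated answer includes code blocks where relevant (question text illustrative):

```shell
adist ask "Show me an example of how to call the database layer"
```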
Get project overview
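A broad question pulls in project summaries when they are available (question text illustrative):

```shell
adist ask "Give me a high-level overview of this project's architecture"
```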
Use streaming mode
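A sketch of a streaming invocation. The flag name `--stream` is an assumption (the option name was lost above); check `adist ask --help` for the actual flag:

```shell
adist ask --stream "Explain the indexing pipeline step by step"
```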
Output Information
The command displays:
- Debug Info: Number of documents and blocks found
- Document Tree: Hierarchical view of relevant files and blocks
- Answer: AI-generated response with syntax highlighting
- Cost: Estimated API cost for the query
- Context Status: Whether cached context was used
- Query Complexity: Complexity rating (low/medium/high)
Context Caching
The command uses context caching to reduce costs:
- Project context is cached per project ID
- Repeated queries reuse cached context
- Cache indicator shown in output
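In practice this means a second question against the same project is cheaper than the first (question text illustrative):

```shell
adist ask "What does the indexer do?"        # builds and caches project context
adist ask "How are summaries generated?"     # reuses the cached context; lower cost
```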
Requirements
- Active project must be selected
- LLM provider must be configured (see adist llm-config)
- Project should be indexed with block-based indexing
Environment Variables
Required if using Anthropic Claude provider.
Required if using OpenAI provider.
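The variable names were lost from this section. The names below are the conventional ones these providers' SDKs read, not confirmed by this document; verify against the adist documentation:

```shell
# Assumed conventional variable names (not confirmed by this doc);
# the key values shown are placeholders.
export ANTHROPIC_API_KEY="your-anthropic-key"   # Anthropic Claude provider
export OPENAI_API_KEY="your-openai-key"         # OpenAI provider
```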
Related Commands
- adist chat - Interactive multi-turn conversations
- adist get - Search without AI interpretation
- adist llm-config - Configure LLM provider
- adist summary - View project summaries