Provider Configuration
Groq (Default)
Groq is the default provider, offering fast inference for open-source models (agent/agent_factory.py:106-114).
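Using the CUSTOM_PROVIDER and CUSTOM_MODEL environment variables described below, selecting the Groq default can be sketched as:

```shell
# Use Groq (the default provider) with one of its hosted open-source models
export CUSTOM_PROVIDER="groq"
export CUSTOM_MODEL="openai/gpt-oss-120b"
```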
Custom OpenAI-Compatible Provider
Use any OpenAI-compatible API endpoint (agent/agent_factory.py:116-124).
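A sketch of pointing CUSTOM_PROVIDER at a base URL instead of the groq identifier (the model name is illustrative; the custom provider's API key must also be supplied, but its variable name is not shown in this section):

```shell
# Any OpenAI-compatible base URL works in place of "groq"
export CUSTOM_PROVIDER="https://api.openai.com/v1"
export CUSTOM_MODEL="gpt-4o"
```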
Environment Variables
- Provider identifier or base URL (CUSTOM_PROVIDER):
  - groq - use Groq's API
  - any OpenAI-compatible base URL (e.g., https://api.openai.com/v1)
- Model identifier for the main team leader agent (CUSTOM_MODEL):
  - Groq models: openai/gpt-oss-120b, llama-3.1-70b-versatile, mixtral-8x7b-32768
  - OpenAI models: gpt-4-turbo, gpt-5, gpt-4o
- API key for the custom OpenAI-compatible provider.
- Groq API key. Required even when using custom providers, as some agents always use Groq.
- Sampling temperature (0.0-2.0). Lower values are more deterministic.
- Nucleus sampling threshold (0.0-1.0).
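The sampling settings map to standard OpenAI-style request parameters. Their exact environment variable names are not given above, so TEMPERATURE and TOP_P below are hypothetical placeholders:

```shell
# Hypothetical variable names -- check the project's .env template for the real ones
export TEMPERATURE="0.2"  # 0.0-2.0; lower is more deterministic
export TOP_P="0.9"        # nucleus sampling threshold, 0.0-1.0
```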
Agent-Specific Models
Junkie uses different models optimized for different agent roles.
Team Leader
Uses the main model configured via CUSTOM_PROVIDER and CUSTOM_MODEL.
agent/agent_factory.py:310-327
Code Agent
Hardcoded to use gpt-5 for advanced code generation:
agent/agent_factory.py:196-204
Perplexity Sonar Agent
Uses Perplexity’s sonar-pro for real-time web research:
agent/agent_factory.py:234-245
Groq Compound Agent
Always uses Groq’s compound model for fast execution:
agent/agent_factory.py:248-260
Context Q&A Agent
Uses a long-context model configured separately:
agent/agent_factory.py:264-290
Memory Manager Model
Always uses Groq for fast memory processing:
agent/agent_factory.py:153-161
System Prompt Management
System prompts can be managed through Phoenix:
agent/agent_factory.py:126-147
Example Configurations
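Putting the variables above together, a minimal sketch of two common setups (model names illustrative):

```shell
# Default: Groq
export CUSTOM_PROVIDER="groq"
export CUSTOM_MODEL="llama-3.1-70b-versatile"

# Alternative: a custom OpenAI-compatible endpoint
# export CUSTOM_PROVIDER="https://api.openai.com/v1"
# export CUSTOM_MODEL="gpt-4-turbo"
# A Groq API key is still required either way, since some agents always use Groq.
```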
API Key Requirements
Supported Providers
Any provider compatible with OpenAI’s API format will work:
- Groq - Fast open-source models
- OpenAI - GPT-4, GPT-5 models
- Together AI - Various open models
- Perplexity - Sonar models with web search
- Anthropic (via proxy) - Claude models
- Local providers - Ollama, LM Studio, etc.
Make sure your provider supports the OpenAI API format with a /v1/chat/completions endpoint.
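As a quick sanity check, the chat completions route resolves relative to the configured base URL, which can be sketched as:

```shell
# Compose the chat completions endpoint from a base URL (trailing slash stripped)
BASE_URL="https://api.openai.com/v1"
ENDPOINT="${BASE_URL%/}/chat/completions"
echo "$ENDPOINT"  # https://api.openai.com/v1/chat/completions
```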