Core AI Features
Glass offers three primary AI-powered features:
Agent Panel
Interactive AI agents for code generation, refactoring, and complex tasks
Inline Assistant
Context-aware AI assistance directly in your editor
Code Completion
Real-time AI-powered code suggestions as you type
Supported AI Providers
Glass supports multiple language model providers, allowing you to choose the best option for your workflow:
- Anthropic Claude - Claude 4.6 Sonnet, Opus, and other Claude models
- OpenAI - GPT-4, GPT-5, o1, o3, and other OpenAI models
- OpenAI-compatible APIs - Any OpenAI-compatible endpoint
- Ollama - Local models for privacy and offline usage
- Google Gemini - Gemini models via Agent Servers
- Custom providers - Extend with your own model providers
Most AI features require authentication with at least one provider. See the Provider Configuration guide for setup instructions.
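As an illustration, provider endpoints are typically declared in your settings file. The keys below (`language_models`, `api_url`) follow Zed's settings schema and are assumptions; your Glass build may differ:

```json
{
  // Hypothetical sketch: exact setting names may vary by Glass version.
  "language_models": {
    "openai": {
      // Point at the official API or any OpenAI-compatible endpoint.
      "api_url": "https://api.openai.com/v1"
    },
    "ollama": {
      // Local Ollama server; no API key required.
      "api_url": "http://localhost:11434"
    }
  }
}
```

API keys themselves are usually entered through the provider's settings UI and kept in the system credential store (see Privacy and Data below), not written into this file.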
Agent Types
Glass supports multiple agent implementations:
Native Agent (Zed Agent)
The built-in agent with full access to Glass features:
- Tool support - Can edit files, run terminal commands, and search code
- Context awareness - Accesses project structure, rules, and prompts
- Multi-model support - Works with any configured provider
- Streaming responses - Real-time response generation
Text Thread Agent
A lightweight agent for conversational interactions:
- Simple conversations - No tool execution
- Fast responses - Optimized for quick interactions
- Context attachment - Manual file and context inclusion
External Agents
Support for third-party agent implementations:
- Claude Code (via Agent Server) - Official Anthropic agent
- Gemini CLI - Google’s agent implementation
- Codex - OpenAI’s code-focused agent
- Custom agents - Via Agent Client Protocol (ACP)
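As a sketch of how a custom ACP agent might be registered, assuming Glass follows Zed's `agent_servers` settings shape (the agent name, command path, and arguments below are hypothetical):

```json
{
  "agent_servers": {
    // Hypothetical custom agent speaking the Agent Client Protocol over stdio.
    "my-acp-agent": {
      "command": "/usr/local/bin/my-acp-agent",
      "args": ["--stdio"]
    }
  }
}
```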
Key Capabilities
Context Management
Project Context
Agents automatically access project structure, worktree information, and rules files (.zed.md, .rules, etc.)
Tool Execution
Native agents can execute tools with configurable permissions:
- File Operations - Read, write, and edit files with diff preview
- Terminal Commands - Execute shell commands with approval
- Code Search - Search across project with semantic and text search
- Git Operations - View diffs, branch management
- LSP Integration - Go to definition, find references
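Tool permissions are commonly toggled in settings. A minimal sketch, assuming Glass mirrors Zed's `agent` settings (the key name `always_allow_tool_actions` is an assumption):

```json
{
  "agent": {
    // false = prompt for approval before file edits and terminal commands.
    "always_allow_tool_actions": false
  }
}
```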
Model Selection
Choose different models for specific tasks.
AI Configuration
Provider Setup
Configure API keys and endpoints for each provider
Settings
Customize AI behavior, tool permissions, and model parameters
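A hedged sketch of choosing a default model, assuming a Zed-style `default_model` setting (the provider and model identifiers are examples, not guaranteed names):

```json
{
  "agent": {
    // Default model used by the Agent Panel; pick any configured provider.
    "default_model": {
      "provider": "anthropic",
      "model": "claude-sonnet-4"
    }
  }
}
```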
Privacy and Data
What data is sent to AI providers?
When using AI features, Glass sends:
- Your prompts and messages
- Relevant file contents (based on context selection)
- Project structure information
- Code snippets from active files
Using local models
For complete privacy, use Ollama or other local providers:
- No data leaves your machine
- Works offline
- Full control over models
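For example, a local-only setup with Ollama might look like this, assuming an Ollama server running on its default port and Zed-style `ollama` settings (the `available_models` entries are illustrative):

```json
{
  "language_models": {
    "ollama": {
      // Default local Ollama endpoint; no API key, no data leaves the machine.
      "api_url": "http://localhost:11434",
      "available_models": [
        { "name": "qwen2.5-coder:7b", "max_tokens": 32768 }
      ]
    }
  }
}
```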
API key storage
API keys are stored securely:
- macOS: Stored in Keychain
- Linux: Stored in Secret Service (gnome-keyring, KWallet)
- Windows: Stored in Credential Manager
Getting Started
Configure a Provider
Set up at least one AI provider in Settings