Native installation guide for Asta on Linux, macOS, and Windows. No Docker required.

Requirements

Backend

Python 3.12 or 3.13 (3.14 not supported yet)

Desktop App

Node.js 18+
Rust toolchain

Optional

Ollama (for local AI and RAG)
Python 3.14 is not yet supported due to pydantic and ChromaDB compatibility issues. Use Python 3.12 or 3.13.
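
To verify your installed Python before going further, a quick shell check works (a sketch; the supported range follows the note above):

```shell
# is_supported VERSION: succeed only for Python 3.12.x or 3.13.x
is_supported() {
  case "$1" in
    3.12.*|3.13.*) return 0 ;;
    *) return 1 ;;
  esac
}

ver="$(python3 --version 2>&1 | awk '{print $2}')"
if is_supported "$ver"; then
  echo "Python $ver is supported"
else
  echo "Python $ver is not supported; install 3.12 or 3.13"
fi
```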

Quick Install (Linux / macOS)

The fastest way to get started using the control script:
git clone https://github.com/helloworldxdwastaken/asta.git
cd asta
cp .env.example backend/.env
# Edit backend/.env with your API keys (optional for first run)

./asta.sh start
Open http://localhost:8010/docs for API documentation. The desktop app connects to http://localhost:8010.

Platform-Specific Installation

Prerequisites (macOS)

Install Python 3.12 or 3.13 using Homebrew:
brew install python@3.12

Backend Setup

cd backend
python3.12 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

Start Backend

uvicorn app.main:app --reload --host 0.0.0.0 --port 8010
Or use the control script:
./asta.sh start

Desktop App

Install dependencies and run in development mode:
cd MACWinApp/asta-app
npm install
npx tauri dev

Build Release (DMG)

cd MACWinApp/asta-app
npx tauri build
The DMG will be created in src-tauri/target/release/bundle/dmg/.
Apple Notes Integration: When running the backend, launch it from Terminal (not as a background service). The first time you ask Asta to check notes, macOS will prompt for permission. Approve it so the backend process can access Notes.

Environment Configuration

Copy .env.example to backend/.env and configure your settings:
cp .env.example backend/.env

Essential Variables

GROQ_API_KEY
string
Groq API key (recommended for speed)
GEMINI_API_KEY
string
Google Gemini API key
ANTHROPIC_API_KEY
string
Claude API key from Anthropic
OPENAI_API_KEY
string
OpenAI API key
OPENROUTER_API_KEY
string
OpenRouter API key (access to 300+ models)
OLLAMA_BASE_URL
string
default:"http://localhost:11434"
Ollama server URL for local AI

Security Settings

ASTA_ALLOWED_PATHS
string
Comma-separated directories for file access (e.g., /Users/you/Documents,/Users/you/Notes)
ASTA_EXEC_ALLOWED_BINS
string
Comma-separated list of executable binaries Asta can run (e.g., memo,things)
ASTA_EXEC_SECURITY
string
default:"allowlist"
Execution security mode:
  • allowlist - Only allowed binaries (default, recommended)
  • full - Allow any command (dangerous)
  • deny - Disable exec completely
ASTA_JWT_SECRET
string
JWT signing secret (auto-generated on first login if not set)
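
The allowlist behaviour described above can be pictured with a small sketch (illustrative only, not Asta's actual enforcement code; the variable values are the examples from the descriptions):

```shell
# Example values, matching the descriptions above
ASTA_EXEC_ALLOWED_BINS="memo,things"
ASTA_ALLOWED_PATHS="/Users/you/Documents,/Users/you/Notes"

# exec_allowed BIN: succeed only if BIN appears in the comma-separated allowlist
exec_allowed() {
  echo "$ASTA_EXEC_ALLOWED_BINS" | tr ',' '\n' | grep -qx "$1"
}

# path_allowed PATH: succeed only if PATH is inside one of the allowed directories
path_allowed() {
  echo "$ASTA_ALLOWED_PATHS" | tr ',' '\n' | while read -r dir; do
    case "$1" in
      "$dir"|"$dir"/*) echo yes; break ;;
    esac
  done | grep -q yes
}

exec_allowed memo && echo "memo: allowed"
exec_allowed rm || echo "rm: blocked"
path_allowed /Users/you/Documents/todo.txt && echo "todo.txt: readable"
path_allowed /etc/passwd || echo "/etc/passwd: blocked"
```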

Optional Integrations

TELEGRAM_BOT_TOKEN
string
Bot token from @BotFather
SPOTIFY_CLIENT_ID
string
Spotify app client ID from developer dashboard
SPOTIFY_CLIENT_SECRET
string
Spotify app client secret
HUGGINGFACE_API_KEY
string
Hugging Face token for FLUX.1-dev image generation fallback
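
Putting the variables together, a minimal backend/.env might look like this (every value below is a placeholder; replace them with your own):

```shell
# backend/.env — placeholder values only
GROQ_API_KEY=gsk_your_key_here
OLLAMA_BASE_URL=http://localhost:11434
ASTA_ALLOWED_PATHS=/Users/you/Documents,/Users/you/Notes
ASTA_EXEC_ALLOWED_BINS=memo,things
ASTA_EXEC_SECURITY=allowlist
TELEGRAM_BOT_TOKEN=123456:ABC_your_token
```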

Multi-User Authentication

On first run with no users in the database, Asta operates in single-user mode (open access, no login required).

Enable Multi-User Mode

Create the first admin user via the API (while in single-user mode):
curl -X POST http://localhost:8010/api/auth/users \
  -H "Content-Type: application/json" \
  -d '{"username": "admin", "password": "your-password", "role": "admin"}'
After creating the first user:
  1. The desktop app will show a login screen
  2. Sign in with the admin credentials
  3. Admin can create additional users from Settings → Users
  4. Users can self-register via the login page

User Roles

Admin

Full Access
  • All settings and configuration
  • User management
  • Skills, tools, and agents
  • Exec, files, and reminders

User

Limited Access
  • Chat with safe tools only
  • Web search, weather, math, GIFs, PDF, time
  • No settings or admin features
  • No exec, files, or reminders
JWT tokens are auto-generated on first login and stored in backend/.env as ASTA_JWT_SECRET. Tokens expire after 30 days.
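
If you'd rather set the secret yourself than rely on auto-generation, any long random string works; for example, assuming openssl is available:

```shell
# Generate a 64-character hex secret suitable for ASTA_JWT_SECRET
secret="$(openssl rand -hex 32)"
echo "ASTA_JWT_SECRET=$secret"
```

Paste the printed line into backend/.env and restart the backend.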

Optional: Ollama Setup

Ollama provides local AI inference and embeddings for RAG (learning/knowledge).

Install Ollama

brew install ollama

Setup for RAG

Use the provided setup script:
./scripts/setup_ollama_rag.sh
Or manually:
ollama pull nomic-embed-text
ollama serve
Set the base URL in backend/.env:
OLLAMA_BASE_URL=http://localhost:11434
If Ollama is unavailable, learning/RAG endpoints will be disabled until it’s running.

Control Script Reference

The asta.sh script provides commands for managing the backend:
./asta.sh start
# Creates venv, installs deps, starts backend

What asta.sh start Does

  1. Python Detection: searches for python3.12, python3.13, or a compatible python3 (skips 3.14+)
  2. Virtual Environment: creates .venv in the backend directory if it doesn't exist
  3. Dependencies: installs all packages from requirements.txt
  4. Port Management: frees port 8010 if it is occupied by a previous instance
  5. Database Cleanup: releases any stale SQLite locks
  6. Tailscale Integration: auto-starts Tailscale, if installed, for remote access
  7. Background Process: starts uvicorn as a background daemon
  8. Health Check: waits up to 25 seconds for the API to respond
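
The final health-check step can be approximated by a small polling loop (a sketch of the behaviour, not asta.sh's actual code):

```shell
# wait_for_api URL [TIMEOUT]: poll URL once per second until it responds
# or TIMEOUT seconds (default 25) have passed
wait_for_api() {
  url="$1"; timeout="${2:-25}"
  i=0
  while [ "$i" -lt "$timeout" ]; do
    if curl -sf -o /dev/null "$url"; then
      echo "API is up"
      return 0
    fi
    sleep 1
    i=$((i + 1))
  done
  echo "API did not respond within ${timeout}s"
  return 1
}

# Example: wait_for_api http://localhost:8010/docs
```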

Pre-Built Releases

For end users who don’t want to build from source, download the latest release from GitHub Releases:

macOS (Apple Silicon)

Asta_<version>_aarch64.dmg

macOS (Intel)

Asta_<version>_x64.dmg

Windows

Asta_<version>_x64-setup.msi
Install the DMG/MSI, then start the backend separately:
./asta.sh start

Troubleshooting

Port already in use

The port is occupied by a previous instance:
./asta.sh restart
Or manually kill the process:
lsof -ti:8010 | xargs kill -9

Desktop app can’t connect to the backend

The backend is not running. Start it:
./asta.sh start
Or manually:
cd backend
source .venv/bin/activate
uvicorn app.main:app --host 0.0.0.0 --port 8010

No AI responses

Add at least one API key to backend/.env:
GROQ_API_KEY=gsk_...
# or GEMINI_API_KEY, OPENAI_API_KEY, etc.
Then restart:
./asta.sh restart
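
To confirm a key is actually present in the file, a quick check (a sketch; the key names match the table above):

```shell
# has_api_key FILE: succeed if FILE defines at least one provider API key
has_api_key() {
  grep -qE '^(GROQ|GEMINI|ANTHROPIC|OPENAI|OPENROUTER)_API_KEY=.+' "$1"
}

# Example: has_api_key backend/.env && echo "key found"
```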

Settings in the root .env are ignored

The backend reads backend/.env only (not the root .env):
cp .env.example backend/.env

Unsupported Python version

Install Python 3.12 or 3.13:
# macOS
brew install python@3.12

# Ubuntu/Debian
sudo apt install python3.12 python3.12-venv

Reminders not firing

Reminders are loaded on backend startup. Restart once:
./asta.sh restart

Voice transcription not working

Reinstall dependencies:
cd backend
source .venv/bin/activate
pip install -r requirements.txt
First run may download the Whisper model (~140 MB).

Desktop app build fails

Ensure you have the Rust toolchain:
# Install Rust
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

# Update Rust
rustup update
Then try building again:
cd MACWinApp/asta-app
npx tauri build

Next Steps

Configuration

Configure API keys, skills, and advanced settings

Desktop App

Learn about the Tauri-based desktop application

Telegram Bot

Set up Telegram integration for mobile access

Skills

Enable and configure built-in skills
