Logicore works with local models via Ollama and all major cloud providers. Install the core package, add your provider extras, and set your credentials.
Logicore requires Python 3.10 or later (3.10, 3.11, 3.12, and 3.13 are all tested and supported).

Prerequisites

Before installing, you need:
  • Python 3.10+ — Check with python --version
  • An LLM provider — Either Ollama installed locally, or API credentials for a cloud provider (Gemini, Groq, Azure OpenAI, or Anthropic)
1. Create a virtual environment

Isolate your dependencies using a virtual environment:
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
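If you want to confirm activation worked, Python can tell you directly: inside a venv, `sys.prefix` differs from `sys.base_prefix`. A minimal check:

```python
import sys

def in_virtualenv() -> bool:
    """True when running inside a venv or virtualenv."""
    return sys.prefix != sys.base_prefix

print("virtual environment active:", in_virtualenv())
```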
2. Install the core package

Install the base Logicore framework:
pip install logicore
The core package includes pydantic, requests, and python-dotenv. Provider clients and optional tool dependencies are installed separately as extras.
3. Install provider extras

Install only the extras you need. Each extra pulls in the official client library for that provider.
pip install "logicore[ollama]"
Then pull a model before running agents:
ollama pull qwen3.5:0.8b
Start Ollama in the background with ollama serve if it isn’t already running as a system service.
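A quick way to confirm the server is up before running agents is to hit its default address; Ollama answers plain HTTP requests at the root URL. This sketch uses only the stdlib and assumes the default port:

```python
import urllib.request
import urllib.error

def ollama_reachable(base_url: str = "http://localhost:11434",
                     timeout: float = 2.0) -> bool:
    """Return True if something answers HTTP requests at the Ollama address."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        return False

print("Ollama running:", ollama_reachable())
```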
4. Set environment variables

Cloud providers require API credentials. Set these before running any agent.
export GEMINI_API_KEY="your-api-key"
Store credentials in a .env file at your project root. Logicore loads it automatically via python-dotenv.
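It can help to fail fast with a clear message when a credential is missing, rather than hitting an opaque provider error later. A minimal sketch (the helper is ours; it reads whatever python-dotenv or your shell has placed in the environment):

```python
import os

def require_api_key(name: str = "GEMINI_API_KEY") -> str:
    """Return the named credential, failing fast if it is unset."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"{name} is not set; export it or add it to your project's .env file."
        )
    return value
```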
5. Verify the installation

Create a verify.py script to confirm everything is wired up correctly:
import asyncio
from logicore.providers.ollama_provider import OllamaProvider
from logicore.agents.agent import Agent

async def main():
    provider = OllamaProvider(model_name="qwen3.5:0.8b")
    agent = Agent(llm=provider, role="Greeter")
    response = await agent.chat("Say hello!")
    print(response)

if __name__ == "__main__":
    asyncio.run(main())
Run it:
python verify.py
You should see a greeting printed to the terminal. If you see an error, check the troubleshooting notes below.

Optional extras reference

Extra      Installs                                               Use case
ollama     ollama>=0.1                                            Local models (no API key required)
gemini     google-genai>=1.0                                      Google Gemini models
groq       groq>=0.4                                              Groq inference API
azure      openai>=1.0, httpx>=0.24                               Azure OpenAI deployments
anthropic  anthropic>=0.20                                        Anthropic Claude models
mcp        mcp>=0.9                                               Model Context Protocol servers
tools      pypdf, python-docx, python-pptx, openpyxl, playwright  Built-in document and browser tools
all        Everything above + watchdog                            Full install
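To see at a glance which provider clients are importable in your environment, you can probe for their modules without importing them. The extra-to-module mapping below is our assumption from the client packages in the table (e.g., google-genai installs `google.genai`); adjust it if a package's import name differs:

```python
from importlib import util

# Assumed import names for each extra's client library.
EXTRA_MODULES = {
    "ollama": "ollama",
    "gemini": "google.genai",
    "groq": "groq",
    "azure": "openai",
    "anthropic": "anthropic",
    "mcp": "mcp",
}

def module_available(name: str) -> bool:
    """Return True if the module can be found without importing it."""
    try:
        return util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Parent package of a dotted name is missing entirely.
        return False

def available_extras() -> dict[str, bool]:
    """Report which provider client libraries are present."""
    return {extra: module_available(mod) for extra, mod in EXTRA_MODULES.items()}
```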

Common installation issues

If requests to a local model fail with a connection error, Logicore can’t reach the Ollama server. Start it with:
ollama serve
Ollama listens on http://localhost:11434 by default.
If you see an ImportError for a provider (e.g., No module named 'groq'), you installed the base package without the extras. Fix it:
pip install "logicore[groq]"
Replace groq with whichever provider you’re using.
If Logicore fails to import with a SyntaxError, your Python is likely too old: Logicore uses structural pattern matching and union type syntax, which require Python 3.10+. Upgrade Python or create a new virtual environment with a supported version.
