Quickstart Guide
Get the ADK Utils Example chat application up and running in less than 5 minutes. This guide walks you through cloning the repository, installing dependencies, and starting your first AI conversation.

This quickstart uses an Ollama model for local AI inference. You can switch to Gemini by following the configuration steps in the Installation guide.
Prerequisites
Before you begin, ensure you have:

- Node.js 18+ installed on your system
- npm, yarn, or pnpm package manager
- (Optional) Ollama installed for local model support
Step-by-Step Setup
Clone the Repository
First, clone the ADK Utils Example repository to your local machine. This downloads the complete project with all source code, components, and configuration files.
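The repository URL is not reproduced in this guide, so the command below uses a placeholder; substitute the actual URL from the project's repository page:

```shell
# Replace <repository-url> with the ADK Utils Example repository URL
git clone <repository-url>
cd adk-utils-example   # the folder name may differ depending on the repository
```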
Install Dependencies
Install all required packages using your preferred package manager. This installs all dependencies, including:
- Next.js 16 and React 19
- Google ADK and ADK Utils
- AI SDK and related packages
- UI components and styling libraries
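Any of the following installs the dependency tree listed above:

```shell
npm install
# or
yarn install
# or
pnpm install
```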
Configure the Agent Model
The project comes pre-configured with an Ollama model. Open app/agents/agent1.ts to see the agent configuration. The default configuration uses Ollama’s cloud endpoint; for local models, change the endpoint to http://localhost:11434.
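The actual contents of app/agents/agent1.ts are not reproduced in this guide. The sketch below only illustrates the general shape of such a configuration; apart from rootAgent, @yagolopez/adk-utils, qwen3:0.6b, and the endpoint URL, every identifier and option name here is an assumption, not the library's real API:

```typescript
// Hypothetical sketch only: "LlmAgent", "baseUrl", and the option names are
// assumptions; consult the real app/agents/agent1.ts for the actual API.
import { LlmAgent } from "@yagolopez/adk-utils";

export const rootAgent = new LlmAgent({
  model: "qwen3:0.6b",               // pulled with `ollama pull qwen3:0.6b`
  baseUrl: "http://localhost:11434", // local Ollama; the default uses the cloud endpoint
  tools: [],                         // the real agent registers three function tools
  instruction: "You are a helpful assistant.",
});
```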
Make sure Ollama is running locally if you use http://localhost:11434: install Ollama from ollama.com and run ollama pull qwen3:0.6b.

Start the Development Server
Launch the Next.js development server. You should see terminal output showing the local URL.
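With a standard Next.js setup, the server is started through the dev script:

```shell
npm run dev
```

Next.js then prints the local URL (typically http://localhost:3000) to the terminal.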
Open the Application
Navigate to http://localhost:3000 in your web browser. You should see the chat interface with:

- A clean, modern UI
- Three suggested prompts:
  - “What agent tools do you have?”
  - “Give me a simple code example of javascript closure”
  - “Create an example of pie diagram”
- A chat input ready for your messages
Test the Chat Interface
Try one of the suggested prompts or type your own message. For example, try this: “What agent tools do you have?” The agent will respond with information about its three built-in tools:

- get_current_time - Time retrieval for any city
- create_mermaid_diagram - Visual diagram generation
- view_source_code - Source code visualization
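As a rough illustration of what a time tool can do, here is a stand-alone sketch built on the JavaScript Intl API; it is not the project's actual get_current_time implementation, which may differ:

```typescript
// Stand-alone sketch of a get_current_time-style helper using only the
// built-in Intl API; the project's real tool may be implemented differently.
function getCurrentTime(timeZone: string): string {
  return new Intl.DateTimeFormat("en-US", {
    timeZone,                // e.g. "Asia/Tokyo"; invalid zones throw a RangeError
    dateStyle: "medium",
    timeStyle: "medium",
  }).format(new Date());
}

console.log(getCurrentTime("Asia/Tokyo"));
```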
Try this: “Create an example of pie diagram”. The agent will use the create_mermaid_diagram tool to generate an interactive Mermaid.js diagram. Try this: “What time is it in Tokyo?” The agent will call the get_current_time tool to retrieve the time.

What’s Running?
Once your development server is running, here’s what’s happening:

Frontend (Next.js App)
The React application running at http://localhost:3000 includes:

- Chat UI components with real-time streaming
- Markdown and Mermaid rendering
- Rate limiting (20 messages per hour by default)
- Auto-scrolling message list
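The rate limit above can be pictured with a small sliding-window counter. This is an illustrative sketch, not the project's actual rate-limiting code:

```typescript
// Illustrative sliding-window rate limiter (not the project's actual code):
// allows `limit` messages per `windowMs` milliseconds.
class RateLimiter {
  private timestamps: number[] = [];
  constructor(private limit: number, private windowMs: number) {}

  allow(now: number = Date.now()): boolean {
    // Drop timestamps that have aged out of the window.
    this.timestamps = this.timestamps.filter((t) => now - t < this.windowMs);
    if (this.timestamps.length >= this.limit) return false;
    this.timestamps.push(now);
    return true;
  }
}

// 20 messages per hour, matching the default mentioned above.
const limiter = new RateLimiter(20, 60 * 60 * 1000);
```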
API Routes
The Next.js API route at /api/genai-agent handles:

- Message processing from the frontend
- Agent initialization with GenAIAgentService
- Streaming responses back to the client
- Error handling and validation
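The overall shape of such a handler can be sketched with the standard web Request/Response APIs. This is only an illustration: the real route uses GenAIAgentService, whose API is not shown in this guide, so a tiny stub stands in for the agent here:

```typescript
// Sketch of a streaming route handler in the style of app/api/genai-agent/route.ts.
// `fakeAgent` is a stand-in for the real GenAIAgentService (an assumption, not its API).
const fakeAgent = {
  async *stream(prompt: string) {
    for (const word of ["Hello", " ", "world"]) yield word;
  },
};

export async function POST(req: Request): Promise<Response> {
  const body = await req.json();
  if (!Array.isArray(body.messages)) {
    // Basic validation before touching the agent.
    return new Response("Invalid request body", { status: 400 });
  }
  const last = body.messages[body.messages.length - 1].content;
  const encoder = new TextEncoder();
  // Stream agent output back to the client chunk by chunk.
  const stream = new ReadableStream<Uint8Array>({
    async start(controller) {
      for await (const chunk of fakeAgent.stream(last)) {
        controller.enqueue(encoder.encode(chunk));
      }
      controller.close();
    },
  });
  return new Response(stream, { headers: { "Content-Type": "text/plain" } });
}
```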
See app/api/genai-agent/route.ts for the full implementation.
AI Agent
The rootAgent defined in app/agents/agent1.ts includes:

- Ollama model integration via @yagolopez/adk-utils
- Three custom function tools
- Zod schemas for parameter validation
- Custom instructions for behavior
Expected Output
When you send your first message, you should see:

- Typing Indicator - Shows the agent is processing
- Streaming Response - Text appears word by word in real-time
- Formatted Output - Markdown rendering with:
  - Code blocks with syntax highlighting
  - Mermaid diagrams (if requested)
  - Proper text formatting (bold, italic, lists)
- Tool Calls - Visual feedback when tools are executed
Explore the ADK Web Tool
The project includes the ADK Web Tool for debugging and inspecting agents:

- Inspect agent configurations
- View tool definitions
- Test agent interactions
- Debug tool calls and responses
Verify Installation
To ensure everything is working correctly:

Check the Console
Open your browser’s developer console (F12). You should see no errors, only informational logs.
Test Tool Execution
Ask for a diagram: “Create a simple pie chart”. Verify that the Mermaid diagram renders correctly.
Common Issues
Port 3000 Already in Use
If port 3000 is already occupied, Next.js will automatically use port 3001. Check the terminal output for the correct URL. Alternatively, specify a custom port:
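With npm, extra flags for the Next.js CLI are passed through after `--`:

```shell
npm run dev -- -p 3001
# or with yarn / pnpm:
# yarn dev -p 3001
# pnpm dev -p 3001
```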
Ollama Connection Error
If you see “Failed to connect to Ollama”, ensure:
- Ollama is installed and running
- The model is downloaded: ollama pull qwen3:0.6b
- The endpoint in agent1.ts matches your Ollama server
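You can check the first two conditions from the terminal; `ollama list` and the /api/tags endpoint are part of Ollama's standard CLI and HTTP API:

```shell
# Show locally downloaded models (qwen3:0.6b should appear)
ollama list

# Confirm the server is reachable on the default port
curl http://localhost:11434/api/tags
```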
Module Not Found Error
If you see module errors, delete node_modules and reinstall dependencies:
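A typical clean reinstall looks like this (adjust the lockfile name if you use yarn or pnpm):

```shell
rm -rf node_modules package-lock.json
npm install
```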
TypeScript Errors
If TypeScript errors appear, ensure you’re using Node.js 18+ and TypeScript 5:
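Both versions can be checked from the terminal:

```shell
node --version     # should report v18 or later
npx tsc --version  # should report Version 5.x
```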
Next Steps
Now that you have the application running:

Installation Guide
Learn about detailed configuration options and environment setup
Architecture
Understand how the components work together
Agent Tools
Learn about built-in agent tools
Chat UI Features
Explore the chat interface capabilities
What You’ve Accomplished
Congratulations! You’ve successfully:

- ✅ Cloned and set up the ADK Utils Example project
- ✅ Installed all dependencies and development tools
- ✅ Configured an AI agent with Ollama model support
- ✅ Started the development server
- ✅ Tested the chat interface with streaming responses
- ✅ Verified tool execution (time, diagrams, code)