NAVAI allows users to navigate your application using natural voice commands. The AI agent understands route names, paths, and synonyms to provide intuitive voice-first navigation.

How voice navigation works

Voice navigation combines route definitions, natural language understanding, and a built-in navigation tool.
1. Define routes: create a list of NavaiRoute objects describing all navigable destinations in your app.
2. User speaks a command: the user says something like “go to settings” or “open my profile”.
3. AI agent resolves the route: the AI calls the navigate_to tool with the user’s target, and NAVAI resolves it to a valid path.
4. Navigate callback fires: your app’s navigate function is called with the resolved path, triggering navigation.
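
These four steps can be sketched end to end. The resolver and agent loop below are simplified stand-ins for illustration, not NAVAI internals; only the NavaiRoute shape matches the library's actual type:

```typescript
// Simplified sketch of the voice-navigation flow (not the real NAVAI internals).
type NavaiRoute = {
  name: string;
  path: string;
  description: string;
  synonyms?: string[];
};

// 1. Define routes.
const routes: NavaiRoute[] = [
  { name: "settings", path: "/settings", description: "App settings", synonyms: ["preferences"] }
];

// Stand-in resolver: exact name/synonym match only (the real one does more).
function resolve(target: string, routes: NavaiRoute[]): string | null {
  const t = target.trim().toLowerCase();
  for (const route of routes) {
    if (route.name === t || route.synonyms?.includes(t)) return route.path;
  }
  return null;
}

// 4. Your navigate callback, called with the resolved path.
let currentPath = "/";
const navigate = (path: string) => { currentPath = path; };

// 2-3. The user says "go to settings"; the agent extracts the target
// "settings", and the tool resolves it and triggers navigation.
const resolved = resolve("settings", routes);
if (resolved) navigate(resolved);
// currentPath is now "/settings"
```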

Route definitions

Routes are defined using the NavaiRoute type, which maps user-friendly names to application paths.

Route structure

// From packages/voice-frontend/src/routes.ts:1-6
export type NavaiRoute = {
  name: string;
  path: string;
  description: string;
  synonyms?: string[];
};
name (string, required)
The primary name for this route. Used in voice commands and AI instructions.

path (string, required)
The actual route path in your application (e.g., /settings, /profile).

description (string, required)
Human-readable description of what this route shows. Helps the AI understand when to navigate here.

synonyms (string[], optional)
Alternative names or phrases that should resolve to this route. Makes voice commands more flexible.

Example route definitions

const routes: NavaiRoute[] = [
  {
    name: "home",
    path: "/",
    description: "Main dashboard and overview",
    synonyms: ["dashboard", "main", "start"]
  },
  {
    name: "profile",
    path: "/profile",
    description: "User profile and account settings",
    synonyms: ["account", "my profile", "user info"]
  },
  {
    name: "settings",
    path: "/settings",
    description: "Application settings and preferences",
    synonyms: ["preferences", "config", "configuration"]
  },
  {
    name: "messages",
    path: "/messages",
    description: "View and send messages",
    synonyms: ["inbox", "chat", "conversations"]
  }
];
Add synonyms for common variations of route names. Users might say “go to my account” instead of “go to profile”, and synonyms ensure both work.

Route resolution

When the AI agent receives a navigation command, NAVAI resolves the user’s input to a valid application path using a multi-stage matching algorithm.

Resolution algorithm

The resolveNavaiRoute function tries multiple matching strategies in order:
// From packages/voice-frontend/src/routes.ts:16-33
export function resolveNavaiRoute(input: string, routes: NavaiRoute[] = []): string | null {
  const normalized = normalize(input);

  // 1. Direct path match
  const direct = routes.find((route) => normalize(route.path) === normalized);
  if (direct) return direct.path;

  // 2. Exact name or synonym match
  for (const route of routes) {
    if (normalize(route.name) === normalized) return route.path;
    if (route.synonyms?.some((synonym) => normalize(synonym) === normalized)) return route.path;
  }

  // 3. Partial name or synonym match
  for (const route of routes) {
    if (normalized.includes(normalize(route.name))) return route.path;
    if (route.synonyms?.some((synonym) => normalized.includes(normalize(synonym)))) return route.path;
  }

  return null;
}
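
Each stage can be exercised directly. The two functions below are copied verbatim from the snippets on this page so the example is self-contained:

```typescript
type NavaiRoute = { name: string; path: string; description: string; synonyms?: string[] };

// Copied from packages/voice-frontend/src/routes.ts (shown on this page).
function normalize(value: string): string {
  return value.normalize("NFD").replace(/[\u0300-\u036f]/g, "").trim().toLowerCase();
}

function resolveNavaiRoute(input: string, routes: NavaiRoute[] = []): string | null {
  const normalized = normalize(input);
  // 1. Direct path match
  const direct = routes.find((route) => normalize(route.path) === normalized);
  if (direct) return direct.path;
  // 2. Exact name or synonym match
  for (const route of routes) {
    if (normalize(route.name) === normalized) return route.path;
    if (route.synonyms?.some((synonym) => normalize(synonym) === normalized)) return route.path;
  }
  // 3. Partial name or synonym match
  for (const route of routes) {
    if (normalized.includes(normalize(route.name))) return route.path;
    if (route.synonyms?.some((synonym) => normalized.includes(normalize(synonym)))) return route.path;
  }
  return null;
}

const routes: NavaiRoute[] = [
  { name: "settings", path: "/settings", description: "Settings", synonyms: ["preferences"] }
];

resolveNavaiRoute("/settings", routes);        // stage 1: direct path match -> "/settings"
resolveNavaiRoute("Preferences", routes);      // stage 2: exact synonym match -> "/settings"
resolveNavaiRoute("show me settings", routes); // stage 3: partial name match -> "/settings"
resolveNavaiRoute("billing", routes);          // no match -> null
```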

Normalization

Before matching, all strings are normalized to handle accents, case, and whitespace:
// From packages/voice-frontend/src/routes.ts:8-14
function normalize(value: string): string {
  return value
    .normalize("NFD")
    .replace(/[\u0300-\u036f]/g, "")
    .trim()
    .toLowerCase();
}
This ensures commands like “Perfil”, “perfil”, and “PERFIL” all match the same route.
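
A quick check of that behavior, reusing the normalize function copied from the snippet above:

```typescript
// Copied from the normalize snippet above.
function normalize(value: string): string {
  return value
    .normalize("NFD")                 // decompose accented characters
    .replace(/[\u0300-\u036f]/g, "")  // strip combining diacritical marks
    .trim()
    .toLowerCase();
}

normalize("  Perfíl "); // -> "perfil": accent stripped, whitespace trimmed, lowercased
normalize("PERFIL");    // -> "perfil"
```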

Matching strategies

The resolver applies the three strategies from the code above, from most to least specific:

1. Direct path match: the input exactly matches a route path.
   User says: "/settings" → matches { path: "/settings", ... }. Most specific; useful when users might speak actual paths.
2. Exact name or synonym match: the normalized input equals a route’s name or one of its synonyms.
   User says: "settings" or "preferences" → matches the settings route.
3. Partial match: the input contains a route’s name or synonym anywhere in the phrase.
   User says: "show me settings" → matches because the phrase contains "settings".

The navigate_to tool

The navigate_to tool is automatically registered when you build a NAVAI agent. It’s the bridge between AI understanding and your app’s navigation.

Tool definition

// From packages/voice-frontend/src/agent.ts:165-183
const navigateTool = tool({
  name: "navigate_to",
  description: "Navigate to an allowed route in the current app.",
  parameters: z.object({
    target: z
      .string()
      .min(1)
      .describe("Route name or route path. Example: perfil, ajustes, /profile, /settings")
  }),
  execute: async ({ target }) => {
    const path = resolveNavaiRoute(target, options.routes);
    if (!path) {
      return { ok: false, error: "Unknown or disallowed route." };
    }

    options.navigate(path);
    return { ok: true, path };
  }
});

Tool parameters

target (string, required)
The user’s navigation intent. Can be a route name, synonym, or path.

Tool response

ok (boolean)
Whether navigation succeeded.

path (string)
The resolved path that was navigated to (only present when ok: true).

error (string)
Error message explaining why navigation failed (only present when ok: false).

AI instructions for navigation

When building the agent, NAVAI generates instructions that teach the AI about available routes:
// From packages/voice-frontend/src/agent.ts:217-243
const routeLines = getNavaiRoutePromptLines(options.routes);

const instructions = [
  options.baseInstructions ?? "You are a voice assistant embedded in a web app.",
  "Allowed routes:",
  ...routeLines,
  "Allowed app functions:",
  ...functionLines,
  "Rules:",
  "- If user asks to go/open a section, always call navigate_to.",
  "- If user asks to run an internal action, call execute_app_function or the matching direct function tool.",
  "- Always include payload in execute_app_function. Use null when no arguments are needed.",
  "- For execute_app_function, pass arguments using payload.args (array).",
  "- For class methods, pass payload.constructorArgs and payload.methodArgs.",
  "- Never invent routes or function names that are not listed.",
  "- If destination/action is unclear, ask a brief clarifying question."
].join("\n");

Generating route prompt lines

The getNavaiRoutePromptLines function formats routes for AI instructions:
// From packages/voice-frontend/src/routes.ts:35-40
export function getNavaiRoutePromptLines(routes: NavaiRoute[] = []): string[] {
  return routes.map((route) => {
    const synonyms = route.synonyms?.length ? `, aliases: ${route.synonyms.join(", ")}` : "";
    return `- ${route.name} (${route.path})${synonyms}`;
  });
}
Example output:
- home (/), aliases: dashboard, main, start
- profile (/profile), aliases: account, my profile, user info
- settings (/settings), aliases: preferences, config, configuration
- messages (/messages), aliases: inbox, chat, conversations
The AI uses these instructions to understand which routes are available and what they’re called. Clear, descriptive route names help the AI make better navigation decisions.

Supported command patterns

Users can navigate using various natural language patterns:

Direct commands

"Go to settings"
"Open my profile"
"Navigate to messages"
"Show me the dashboard"
"Take me home"

Conversational commands

"I want to change my preferences"
"Can you open my account?"
"Let's go to the home page"
"I need to check my messages"

Contextual commands

"What's on my profile?"
"Show my settings"
"Let me see the messages"
The AI can only navigate to routes you’ve explicitly defined. If users ask for undefined routes, the AI will respond that the section isn’t available.

Implementing the navigate callback

You provide the actual navigation implementation through the navigate callback.

React Router example

import { useNavigate } from 'react-router-dom';
import { useWebVoiceAgent } from '@navai/voice-frontend';

function App() {
  const navigate = useNavigate();
  
  const voiceAgent = useWebVoiceAgent({
    navigate: (path) => navigate(path),
    routes: myRoutes,
    // ... other options
  });
  
  return <YourApp />;
}

Next.js example

import { useRouter } from 'next/router';
import { useWebVoiceAgent } from '@navai/voice-frontend';

function MyApp() {
  const router = useRouter();
  
  const voiceAgent = useWebVoiceAgent({
    navigate: (path) => router.push(path),
    routes: myRoutes,
    // ... other options
  });
  
  return <YourApp />;
}

Custom navigation example

const voiceAgent = useWebVoiceAgent({
  navigate: (path) => {
    // Log navigation for analytics
    analytics.track('Voice Navigation', { path });
    
    // Custom navigation logic
    window.history.pushState({}, '', path);
    
    // Update UI state
    setCurrentPath(path);
  },
  routes: myRoutes,
  // ... other options
});

Dynamic routes and parameters

For routes with dynamic parameters (e.g., /users/:id), you have two options:

Option 1: Function-based navigation

Instead of defining parametrized routes, create functions that navigate with parameters:
// Define a function instead of a route
export function openUserProfile(userId: string, context: NavaiFunctionContext) {
  context.navigate(`/users/${userId}`);
  return { ok: true, message: `Opening profile for user ${userId}` };
}

// Register this function with NAVAI
// User can say: "Open user profile 12345"
// AI calls: openUserProfile("12345")
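
A function like this is easy to unit-test by passing a minimal context object. The NavaiFunctionContext shape below is a local stand-in assumed for the example; only the navigate field is used:

```typescript
// Local stand-in for the library's context type; only `navigate` is needed here.
type NavaiFunctionContext = { navigate: (path: string) => void };

function openUserProfile(userId: string, context: NavaiFunctionContext) {
  context.navigate(`/users/${userId}`);
  return { ok: true, message: `Opening profile for user ${userId}` };
}

// Capture what the function navigates to.
let navigatedPath: string | null = null;
const context: NavaiFunctionContext = { navigate: (path) => { navigatedPath = path; } };

const result = openUserProfile("12345", context);
// navigatedPath is now "/users/12345"; result.ok is true
```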

Option 2: Contextual data

Use the current application context to fill in parameters:
const currentUserId = useCurrentUserId();

const routes: NavaiRoute[] = [
  {
    name: "my profile",
    path: `/users/${currentUserId}`,
    description: "View your profile page"
  }
];
Option 1 is more flexible when the AI needs to extract parameters from user speech. Option 2 works well for current-user-specific routes.

Loading routes from files

For larger applications, store routes in a separate JSON file:
routes.json
[
  {
    "name": "home",
    "path": "/",
    "description": "Main dashboard",
    "synonyms": ["dashboard", "main"]
  },
  {
    "name": "profile",
    "path": "/profile",
    "description": "User profile",
    "synonyms": ["account", "my profile"]
  }
]
Then load them using the runtime config:
import { useWebVoiceAgent, type NavaiRoute } from '@navai/voice-frontend';
import defaultRoutes from './routes.json';

const voiceAgent = useWebVoiceAgent({
  navigate,
  moduleLoaders,
  defaultRoutes: defaultRoutes as NavaiRoute[],
  routesFile: process.env.NAVAI_ROUTES_FILE,
  // ... other options
});

Best practices

Use natural route names

Choose route names that match how users naturally speak:

✅ Good: “messages”, “inbox”, “chat”
❌ Bad: “msg-list”, “comms-panel”

Add generous synonyms

Think about different ways users might refer to the same destination:
{
  name: "settings",
  synonyms: ["preferences", "config", "options", "configurations"]
}
Write clear descriptions

Descriptions help the AI understand when to navigate to a route:
{
  name: "billing",
  description: "View invoices, payment methods, and subscription details",
  // Better than: "Billing page"
}
Keep the route list focused

For large apps, only expose the most commonly needed routes to voice navigation. Not every page needs to be voice-accessible.

Test route resolution

Test your routes with natural voice commands to ensure they resolve correctly:
import { resolveNavaiRoute } from '@navai/voice-frontend';

console.log(resolveNavaiRoute("show me settings", routes)); // -> "/settings" with the example routes above (partial name match)
console.log(resolveNavaiRoute("go to my account", routes)); // -> "/profile" (partial match on synonym "account")

Multilingual navigation

For multilingual apps, define routes in each supported language:
const routes: NavaiRoute[] = [
  {
    name: "profile",
    path: "/profile",
    description: "User profile",
    synonyms: ["account", "perfil", "cuenta", "mon profil", "mon compte"]
  },
  {
    name: "settings",
    path: "/settings",
    description: "Settings",
    synonyms: ["preferences", "ajustes", "configuración", "paramètres"]
  }
];
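
Because normalization strips accents, these synonyms match accented speech transcripts too. The snippet below inlines the resolver from earlier on this page so it runs standalone:

```typescript
type NavaiRoute = { name: string; path: string; description: string; synonyms?: string[] };

// Copied from the routes.ts snippets shown earlier on this page.
function normalize(value: string): string {
  return value.normalize("NFD").replace(/[\u0300-\u036f]/g, "").trim().toLowerCase();
}

function resolveNavaiRoute(input: string, routes: NavaiRoute[] = []): string | null {
  const normalized = normalize(input);
  const direct = routes.find((route) => normalize(route.path) === normalized);
  if (direct) return direct.path;
  for (const route of routes) {
    if (normalize(route.name) === normalized) return route.path;
    if (route.synonyms?.some((s) => normalize(s) === normalized)) return route.path;
  }
  for (const route of routes) {
    if (normalized.includes(normalize(route.name))) return route.path;
    if (route.synonyms?.some((s) => normalized.includes(normalize(s)))) return route.path;
  }
  return null;
}

const routes: NavaiRoute[] = [
  {
    name: "profile",
    path: "/profile",
    description: "User profile",
    synonyms: ["account", "perfil", "cuenta", "mon profil", "mon compte"]
  },
  {
    name: "settings",
    path: "/settings",
    description: "Settings",
    synonyms: ["preferences", "ajustes", "configuración", "paramètres"]
  }
];

resolveNavaiRoute("perfíl", routes);       // -> "/profile" (accent normalized, exact synonym)
resolveNavaiRoute("ir a ajustes", routes); // -> "/settings" (partial synonym match)
resolveNavaiRoute("paramètres", routes);   // -> "/settings" (French synonym)
```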
Combine with language configuration in the backend:
{
  defaultLanguage: "Spanish",
  defaultVoiceAccent: "Latin American Spanish"
}

Next steps

Function execution

Learn how to define and execute functions via voice

Frontend setup

Set up voice navigation in your React app

Routes configuration

Advanced route configuration options

Voice runtime

Deep dive into the voice runtime system
