Expo Setup
Expo support requires a development build because react-native-webrtc uses native modules that are not available in Expo Go. This guide walks you through the complete setup process.
react-native-webrtc is not compatible with Expo Go. You must create a development build or use bare React Native.
Prerequisites
Expo SDK 49 or higher
Node.js 18 or higher
iOS Simulator or Android Emulator (or physical devices)
Expo CLI (bundled with the expo package; invoke it as npx expo)
EAS CLI: npm install -g eas-cli
Installation
Install dependencies
Add the required packages to your Expo project: npx expo install @navai/voice-mobile react-native-webrtc
npx expo install ensures version compatibility with your Expo SDK version.
Configure app.json
Add plugin configuration and permissions to your app.json:
{
  "expo": {
    "name": "YourApp",
    "slug": "your-app",
    "plugins": [
      [
        "react-native-webrtc",
        {
          "cameraPermission": false,
          "microphonePermission": "Allow $(PRODUCT_NAME) to access your microphone for voice navigation"
        }
      ]
    ],
    "ios": {
      "infoPlist": {
        "NSMicrophoneUsageDescription": "This app needs microphone access for voice navigation",
        "UIBackgroundModes": ["audio"]
      }
    },
    "android": {
      "permissions": [
        "RECORD_AUDIO",
        "INTERNET",
        "MODIFY_AUDIO_SETTINGS"
      ]
    }
  }
}
Set cameraPermission to false since voice navigation only requires microphone access.
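Before kicking off a build, it can be worth sanity-checking the config shape above in a plain Node script. The `hasVoicePermissions` helper below is purely illustrative (it is not part of @navai/voice-mobile); it only verifies the fields this guide asks for.

```typescript
// Illustrative sanity check for the app.json shape described above.
// `hasVoicePermissions` is a hypothetical helper, not a library API.
type ExpoConfig = {
  expo: {
    ios?: { infoPlist?: Record<string, unknown> };
    android?: { permissions?: string[] };
  };
};

const REQUIRED_ANDROID = ['RECORD_AUDIO', 'INTERNET', 'MODIFY_AUDIO_SETTINGS'];

export function hasVoicePermissions(config: ExpoConfig): boolean {
  const androidPerms = config.expo.android?.permissions ?? [];
  const infoPlist = config.expo.ios?.infoPlist ?? {};
  // Every required Android permission must be listed, and iOS needs a
  // microphone usage description string.
  return (
    REQUIRED_ANDROID.every((p) => androidPerms.includes(p)) &&
    typeof infoPlist['NSMicrophoneUsageDescription'] === 'string'
  );
}
```

Running such a check in CI catches a missing permission before a 10-20 minute build does.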
Create development build
Build the development client for your target platform:
iOS Simulator: npx expo run:ios
Android Emulator: npx expo run:android
EAS Build (iOS): eas build --profile development --platform ios
EAS Build (Android): eas build --profile development --platform android
The first build may take 10-20 minutes. Subsequent builds with expo run are much faster.
Configure EAS (Optional)
If using EAS Build, create or update eas.json:
{
  "build": {
    "development": {
      "developmentClient": true,
      "distribution": "internal"
    },
    "preview": {
      "distribution": "internal"
    },
    "production": {}
  }
}
WebRTC Configuration
Configure the WebRTC transport for your Expo app:
import { createReactNativeWebRtcTransport } from '@navai/voice-mobile';

// Import react-native-webrtc
const webrtc = require('react-native-webrtc');

const transport = createReactNativeWebRtcTransport({
  globals: {
    mediaDevices: webrtc.mediaDevices,
    RTCPeerConnection: webrtc.RTCPeerConnection,
  },
  // Optional: adjust remote audio volume (0-10, default 10)
  remoteAudioTrackVolume: 8,
  // Optional: specify a custom realtime endpoint
  realtimeUrl: 'https://api.openai.com/v1/realtime/calls',
  // Optional: audio constraints
  audioConstraints: {
    audio: {
      echoCancellation: true,
      noiseSuppression: true,
      autoGainControl: true,
    },
    video: false,
  },
});
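Since remoteAudioTrackVolume is documented as a 0-10 range, it can be worth clamping user-supplied values before passing them to the transport. `clampRemoteVolume` below is a hypothetical convenience, not part of the library:

```typescript
// Hypothetical helper: keep a requested volume inside the documented
// 0-10 range for remoteAudioTrackVolume (10 is the documented default).
export function clampRemoteVolume(volume: number): number {
  if (Number.isNaN(volume)) return 10; // fall back to the default
  return Math.min(10, Math.max(0, volume));
}
```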
Transport Options
globals
NavaiReactNativeWebRtcGlobals
required
WebRTC runtime from react-native-webrtc:
{
  mediaDevices: webrtc.mediaDevices,
  RTCPeerConnection: webrtc.RTCPeerConnection,
}
remoteAudioTrackVolume
number
default: 10
Volume for remote audio tracks (AI voice). Range: 0-10.
audioConstraints
Media constraints for audio capture:
{
  audio: {
    echoCancellation: true,
    noiseSuppression: true,
    autoGainControl: true,
  },
  video: false,
}
realtimeUrl
string
default: "https://api.openai.com/v1/realtime/calls"
OpenAI Realtime API endpoint.
RTCPeerConnection configuration (STUN/TURN servers, etc.).
model
string
default: "gpt-realtime"
Default model for realtime sessions.
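The defaults listed above can be sketched as a plain options-merging function. The option names mirror the table, but `TransportOptionsSketch` and `withDefaults` are illustrations, not the library's implementation:

```typescript
// Sketch of default resolution for the transport options documented above.
// TransportOptionsSketch is illustrative; the real types ship with
// @navai/voice-mobile.
interface TransportOptionsSketch {
  remoteAudioTrackVolume?: number;
  realtimeUrl?: string;
  model?: string;
}

export function withDefaults(
  opts: TransportOptionsSketch
): Required<TransportOptionsSketch> {
  return {
    remoteAudioTrackVolume: opts.remoteAudioTrackVolume ?? 10,
    realtimeUrl: opts.realtimeUrl ?? 'https://api.openai.com/v1/realtime/calls',
    model: opts.model ?? 'gpt-realtime',
  };
}
```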
Complete Expo Example
Here’s a full working Expo app with voice navigation:
Project Structure
your-expo-app/
├── app.json
├── App.tsx
├── src/
│ ├── ai/
│ │ ├── routes.ts
│ │ ├── generated-loaders.ts
│ │ └── functions-modules/
│ │ └── greeting.ts
│ ├── hooks/
│ │ └── useNavaiRuntime.ts
│ └── components/
│ └── VoiceControl.tsx
└── package.json
App.tsx
import React from 'react';
import { NavigationContainer } from '@react-navigation/native';
import { createNativeStackNavigator } from '@react-navigation/native-stack';
import { View, Text, StyleSheet } from 'react-native';
import { VoiceControl } from './src/components/VoiceControl';

const Stack = createNativeStackNavigator();

function HomeScreen() {
  return (
    <View style={styles.screen}>
      <Text style={styles.title}>Home Screen</Text>
      <Text style={styles.subtitle}>Try saying: "Go to profile"</Text>
    </View>
  );
}

function ProfileScreen() {
  return (
    <View style={styles.screen}>
      <Text style={styles.title}>Profile Screen</Text>
      <Text style={styles.subtitle}>Try saying: "Go to notifications"</Text>
    </View>
  );
}

function NotificationsScreen() {
  return (
    <View style={styles.screen}>
      <Text style={styles.title}>Notifications</Text>
      <Text style={styles.subtitle}>Try saying: "Go home"</Text>
    </View>
  );
}

export default function App() {
  return (
    <NavigationContainer>
      <Stack.Navigator
        screenOptions={{
          headerRight: () => <VoiceControl />,
        }}
      >
        <Stack.Screen name="Home" component={HomeScreen} />
        <Stack.Screen name="Profile" component={ProfileScreen} />
        <Stack.Screen name="Notifications" component={NotificationsScreen} />
      </Stack.Navigator>
    </NavigationContainer>
  );
}

const styles = StyleSheet.create({
  screen: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    padding: 20,
  },
  title: {
    fontSize: 24,
    fontWeight: 'bold',
    marginBottom: 12,
  },
  subtitle: {
    fontSize: 16,
    color: '#666',
    textAlign: 'center',
  },
});
src/ai/routes.ts
import type { NavaiRoute } from '@navai/voice-mobile';

export const APP_ROUTES: NavaiRoute[] = [
  {
    name: 'home',
    path: 'Home',
    description: 'Main home screen',
    synonyms: ['inicio', 'main'],
  },
  {
    name: 'profile',
    path: 'Profile',
    description: 'User profile',
    synonyms: ['perfil', 'account'],
  },
  {
    name: 'notifications',
    path: 'Notifications',
    description: 'View notifications',
    synonyms: ['notificaciones', 'alerts'],
  },
];
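To see why the name and synonyms fields matter, here is a hypothetical matcher in the spirit of what the agent does with these routes. @navai/voice-mobile performs its own (likely fuzzier) resolution; this is only an illustration:

```typescript
// Illustrative matcher: resolve a spoken phrase to a route by exact
// name or synonym. Not the library's resolution logic; it just shows
// why declaring synonyms (including other languages) is useful.
type RouteSketch = { name: string; path: string; synonyms?: string[] };

export function matchRoute(
  phrase: string,
  routes: RouteSketch[]
): RouteSketch | undefined {
  const needle = phrase.trim().toLowerCase();
  return routes.find(
    (route) =>
      route.name.toLowerCase() === needle ||
      (route.synonyms ?? []).some((s) => s.toLowerCase() === needle)
  );
}
```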
src/ai/functions-modules/greeting.ts
import type { NavaiFunctionDefinition } from '@navai/voice-mobile';
import { Alert } from 'react-native';

export const greeting: NavaiFunctionDefinition = {
  name: 'greeting',
  description: 'Display a friendly greeting to the user',
  run: async (payload) => {
    Alert.alert('Hello!', 'Voice navigation is working!');
    return { ok: true, message: 'Greeting displayed' };
  },
};
src/ai/generated-loaders.ts
import type { NavaiFunctionModuleLoaders } from '@navai/voice-mobile';

export const MODULE_LOADERS: NavaiFunctionModuleLoaders = {
  'src/ai/functions-modules/greeting.ts': () =>
    import('./functions-modules/greeting'),
};
src/hooks/useNavaiRuntime.ts
import { useState, useEffect } from 'react';
import {
  resolveNavaiMobileApplicationRuntimeConfig,
  type ResolveNavaiMobileApplicationRuntimeConfigResult,
} from '@navai/voice-mobile';
import { MODULE_LOADERS } from '../ai/generated-loaders';
import { APP_ROUTES } from '../ai/routes';

export function useNavaiRuntime() {
  const [runtime, setRuntime] =
    useState<ResolveNavaiMobileApplicationRuntimeConfigResult | null>(null);
  const [loading, setLoading] = useState(true);
  const [error, setError] = useState<string | null>(null);

  useEffect(() => {
    let cancelled = false;
    resolveNavaiMobileApplicationRuntimeConfig({
      moduleLoaders: MODULE_LOADERS,
      defaultRoutes: APP_ROUTES,
      env: {
        NAVAI_API_URL: 'http://localhost:3000',
      },
    })
      .then((config) => {
        if (!cancelled) {
          setRuntime(config);
          setLoading(false);
        }
      })
      .catch((err) => {
        if (!cancelled) {
          setError(err instanceof Error ? err.message : String(err));
          setLoading(false);
        }
      });
    return () => {
      cancelled = true;
    };
  }, []);

  return { runtime, loading, error };
}
src/components/VoiceControl.tsx
import React from 'react';
import { TouchableOpacity, Text, ActivityIndicator, StyleSheet } from 'react-native';
import { useMobileVoiceAgent } from '@navai/voice-mobile';
import { useNavigation } from '@react-navigation/native';
import { useNavaiRuntime } from '../hooks/useNavaiRuntime';

export function VoiceControl() {
  const navigation = useNavigation();
  const { runtime, loading: runtimeLoading, error: runtimeError } = useNavaiRuntime();
  const { isConnecting, isConnected, start, stop } = useMobileVoiceAgent({
    runtime,
    runtimeLoading,
    runtimeError,
    navigate: (path) => navigation.navigate(path as never),
  });

  const handlePress = async () => {
    if (isConnected) {
      await stop();
    } else {
      await start();
    }
  };

  if (runtimeLoading) {
    return <ActivityIndicator style={styles.loader} />;
  }

  return (
    <TouchableOpacity
      style={[styles.button, isConnected && styles.buttonActive]}
      onPress={handlePress}
      disabled={isConnecting}
    >
      {isConnecting ? (
        <ActivityIndicator size="small" color="#fff" />
      ) : (
        <Text style={styles.buttonText}>{isConnected ? '🎤' : '🎙️'}</Text>
      )}
    </TouchableOpacity>
  );
}

const styles = StyleSheet.create({
  loader: {
    marginRight: 16,
  },
  button: {
    marginRight: 16,
    backgroundColor: '#007AFF',
    paddingHorizontal: 16,
    paddingVertical: 8,
    borderRadius: 8,
  },
  buttonActive: {
    backgroundColor: '#34C759',
  },
  buttonText: {
    fontSize: 20,
  },
});
Running the Development Build
Start Metro
npx expo start --dev-client
Launch development build
Press i for iOS Simulator
Press a for Android Emulator
Scan QR code on physical device with the development build installed
Test voice navigation
Tap the microphone button in the header
Grant microphone permission when prompted
Say “Go to profile” or “Open notifications”
Troubleshooting
WebRTC Module Not Found
Error: Unable to resolve module react-native-webrtc
Solution: Create a development build. WebRTC is not available in Expo Go.
npx expo run:ios
# or
npx expo run:android
Microphone Permission Denied
Solution: Check that permissions are configured in app.json and rebuild:
rm -rf ios android
npx expo prebuild
npx expo run:ios
Connection Failed
Solution: Ensure your backend is accessible from the device:
iOS Simulator: Use http://localhost:3000
Android Emulator: Use http://10.0.2.2:3000
Physical Device: Use your computer’s network IP (e.g., http://192.168.1.100:3000)
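The mapping above can be encoded as a small helper. The platform and emulator flags here are illustrative inputs; in a real app you would derive them from react-native's Platform module and your own device detection, and the LAN IP is your dev machine's:

```typescript
// Hypothetical helper mirroring the troubleshooting list above:
// pick the dev backend URL reachable from each environment.
export function devBackendUrl(
  platform: 'ios' | 'android',
  isEmulator: boolean,
  lanIp: string,
  port = 3000
): string {
  if (!isEmulator) return `http://${lanIp}:${port}`; // physical device -> LAN IP
  return platform === 'android'
    ? `http://10.0.2.2:${port}` // Android emulator's alias for host localhost
    : `http://localhost:${port}`; // iOS Simulator shares the host network
}
```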
Audio Not Working
Add background audio mode to iOS configuration in app.json:
{
  "expo": {
    "ios": {
      "infoPlist": {
        "UIBackgroundModes": ["audio"]
      }
    }
  }
}
Production Builds
For production builds with EAS:
# Preview build
eas build --profile preview --platform all
# Production build
eas build --profile production --platform all
Update eas.json to include proper configuration:
{
  "build": {
    "production": {
      "env": {
        "NAVAI_API_URL": "https://api.yourapp.com"
      }
    }
  }
}
Remember to update the backend URL for production environments.
Next Steps
React Native Guide Full React Native integration guide
WebRTC Transport Deep dive into WebRTC configuration
Functions Create custom voice functions
Backend Setup Configure backend integration