Overview
The Predictions API allows you to execute chatflows and receive AI-generated responses. This is the primary endpoint for integrating Flowise chatflows into your applications.
Prediction endpoints require API key authentication when the chatflow has an apikeyid configured. Public chatflows can be accessed without authentication.
Create Prediction
Send a message to a chatflow and receive a prediction.
POST /api/v1/prediction/:id
curl -X POST http://localhost:3000/api/v1/prediction/CHATFLOW_ID \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "question": "What is the capital of France?"
  }'
Path Parameters
- `id` (string, required) - The unique identifier of the chatflow to execute

Request Body
- `question` (string, required) - The input message or question to send to the chatflow
- `chatId` (string, optional) - Session ID to maintain conversation context. If not provided, a new session is created.
- `streaming` (boolean, optional) - Enable Server-Sent Events (SSE) streaming for real-time responses
- `history` (array, optional) - Array of previous messages to provide conversation context:

  [
    { "role": "user", "content": "Hello" },
    { "role": "assistant", "content": "Hi! How can I help you?" }
  ]

- `overrideConfig` (object, optional) - Override chatflow configuration for this request:

  {
    "sessionId": "custom-session-id",
    "temperature": 0.7,
    "maxTokens": 500
  }

- `uploads` (array, optional) - Array of file uploads (when the chatflow supports file inputs)
Response (Non-Streaming)
{
  "chatId": "session-uuid",
  "question": "What is the capital of France?",
  "text": "The capital of France is Paris.",
  "sourceDocuments": [],
  "usedTools": [],
  "followUpPrompts": [],
  "metadata": {
    "model": "gpt-4",
    "temperature": 0.7
  }
}
- `chatId` - The session ID for this conversation
- `question` - The input question that was sent
- `text` - The AI-generated response
- `sourceDocuments` - Documents retrieved from vector stores (if applicable)
- `usedTools` - Tools that were invoked during execution
- `followUpPrompts` - Suggested follow-up questions
Streaming Responses
Enable real-time streaming to receive responses as they’re generated.
POST /api/v1/prediction/:id (Streaming)
curl -X POST http://localhost:3000/api/v1/prediction/CHATFLOW_ID \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "question": "Tell me a story",
    "streaming": true
  }'
When streaming is enabled, the response uses Server-Sent Events (SSE):
event: token
data: {"token": "The"}

event: token
data: {"token": " capital"}

event: token
data: {"token": " of"}

event: metadata
data: {"chatId": "session-uuid", "question": "..."}

event: end
data: {}
The chatflow must have streaming enabled in its configuration for this to work. Check the isStreaming property of your chatflow.
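As a sketch, the SSE payload can be decoded with a few lines of plain JavaScript. The `parseSSE` helper below is an illustration written for this guide (it is not part of any Flowise SDK) and assumes the event format shown above: `event:`/`data:` lines with events separated by blank lines.

```javascript
// Minimal SSE decoder: split a raw event-stream string into
// { event, data } records. Events are separated by blank lines.
function parseSSE(raw) {
  return raw
    .split('\n\n')
    .filter(block => block.trim().length > 0)
    .map(block => {
      const record = { event: 'message', data: '' };
      for (const line of block.split('\n')) {
        if (line.startsWith('event:')) record.event = line.slice(6).trim();
        else if (line.startsWith('data:')) record.data += line.slice(5).trim();
      }
      return record;
    });
}

// Example: reassemble the answer text from a captured stream.
const sample =
  'event: token\ndata: {"token": "The"}\n\n' +
  'event: token\ndata: {"token": " capital"}\n\n' +
  'event: end\ndata: {}\n\n';

const answer = parseSSE(sample)
  .filter(e => e.event === 'token')
  .map(e => JSON.parse(e.data).token)
  .join('');
console.log(answer); // "The capital"
```

In a real client you would feed chunks from `response.body` (a ReadableStream) into this parser as they arrive, buffering until each blank-line delimiter.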
File Uploads
Some chatflows support file uploads. Use multipart/form-data to send files:
POST /api/v1/prediction/:id (with files)
curl -X POST http://localhost:3000/api/v1/prediction/CHATFLOW_ID \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -F "question=Analyze this document" \
  -F "files=@/path/to/document.pdf"
File Upload Parameters
- `question` (string) - The input message, sent as an ordinary form field
- `files` (file array) - Array of files to upload (sent as multipart form data)
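The same multipart request can be built in JavaScript with the standard FormData API (global in browsers and Node 18+). This is a sketch mirroring the curl example above; the Blob stands in for file contents, where a browser would append a `File` from an `<input>` element instead.

```javascript
// Build the multipart body. The field names ("question", "files")
// mirror the curl example above.
const form = new FormData();
form.append('question', 'Analyze this document');
form.append(
  'files',
  new Blob(['%PDF-1.4 ...'], { type: 'application/pdf' }),
  'document.pdf'
);

// Sending it: do NOT set Content-Type yourself; fetch inserts the
// correct multipart boundary automatically when the body is FormData.
async function sendFiles() {
  const res = await fetch(
    'http://localhost:3000/api/v1/prediction/CHATFLOW_ID',
    {
      method: 'POST',
      headers: { 'Authorization': 'Bearer YOUR_API_KEY' },
      body: form
    }
  );
  return res.json();
}
```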
Domain Restrictions
Chatflows can be configured with allowed origins to restrict which domains can make prediction requests. If your request is rejected:
{
  "message": "This site is not allowed to access this chatbot"
}
Contact the chatflow owner to add your domain to the allowed origins list.
Rate Limiting
Prediction requests may be rate-limited based on the chatflow configuration. If you exceed the rate limit:
{
  "message": "Too many requests"
}
Implement exponential backoff when receiving rate limit errors.
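A minimal backoff sketch is shown below. The retry parameters (500 ms base, 8 s cap, 4 retries) are illustrative choices for this guide, not Flowise defaults; tune them to your chatflow's configured limits.

```javascript
// Exponential backoff delay: base * 2^attempt, capped.
function backoffDelay(attempt, baseMs = 500, capMs = 8000) {
  return Math.min(baseMs * 2 ** attempt, capMs);
}

// Retry a prediction request when the server answers HTTP 429.
async function predictWithRetry(url, body, maxRetries = 4) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const res = await fetch(url, {
      method: 'POST',
      headers: {
        'Authorization': 'Bearer YOUR_API_KEY',
        'Content-Type': 'application/json'
      },
      body: JSON.stringify(body)
    });
    if (res.status !== 429) return res.json();
    // Wait before the next attempt: 500 ms, 1 s, 2 s, 4 s, 8 s, ...
    await new Promise(resolve => setTimeout(resolve, backoffDelay(attempt)));
  }
  throw new Error('Rate limited: retries exhausted');
}
```

Adding random jitter to each delay further reduces the chance that many clients retry in lockstep.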
Session Management
Maintain conversation context by using the same chatId across multiple requests:
// First message - creates a new session
const firstResponse = await fetch(
  'http://localhost:3000/api/v1/prediction/CHATFLOW_ID',
  {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_API_KEY',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      question: 'Hello, my name is Alice'
    })
  }
).then(res => res.json());

const chatId = firstResponse.chatId;

// Follow-up message - uses the same session
const secondResponse = await fetch(
  'http://localhost:3000/api/v1/prediction/CHATFLOW_ID',
  {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_API_KEY',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      question: 'What is my name?',
      chatId: chatId
    })
  }
).then(res => res.json());

console.log(secondResponse.text); // Should mention "Alice"
Advanced Configuration
Override Config
Temporarily override chatflow settings for a specific request:
const response = await fetch(
  'http://localhost:3000/api/v1/prediction/CHATFLOW_ID',
  {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_API_KEY',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      question: 'Be creative!',
      overrideConfig: {
        temperature: 1.2,
        maxTokens: 1000,
        systemMessage: 'You are a creative storyteller'
      }
    })
  }
).then(res => res.json());
Conversation History
Provide explicit conversation history instead of relying on session storage:
const response = await fetch(
  'http://localhost:3000/api/v1/prediction/CHATFLOW_ID',
  {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_API_KEY',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      question: 'What did I ask about?',
      history: [
        { role: 'user', content: 'Tell me about Paris' },
        { role: 'assistant', content: 'Paris is the capital of France...' }
      ]
    })
  }
).then(res => res.json());
Error Handling
- Unauthorized: The API key is missing, invalid, or doesn't have permission to access this chatflow. Solution: verify your API key and ensure the chatflow's apikeyid matches your key.
- Origin not allowed: Your domain is not in the chatflow's allowed origins list. Solution: add your domain to the chatflow's allowedOrigins configuration.
- Not found: The chatflow ID doesn't exist. Solution: verify the chatflow ID is correct and the chatflow exists.
- Bad request: Required parameters are missing from the request. Solution: ensure all required fields (like question) are included.
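The failure modes above can be folded into one request helper. The status-code-to-hint mapping below is an illustrative assumption for this sketch; in practice, the `message` field of the error response body reports the exact condition.

```javascript
// Map a failed prediction response to an actionable error string.
// The status-to-hint table is illustrative, not a Flowise contract.
function describePredictionError(status, message) {
  const hints = {
    400: 'Ensure required fields such as "question" are present.',
    401: 'Check the API key and the chatflow apikeyid.',
    403: 'Ask the owner to add your domain to allowedOrigins.',
    404: 'Verify the chatflow ID exists.',
    429: 'Back off and retry later.'
  };
  const hint = hints[status] || 'Unexpected error.';
  return `Prediction failed (${status}): ${message}. ${hint}`;
}

async function predict(url, body) {
  const res = await fetch(url, {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_API_KEY',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify(body)
  });
  if (!res.ok) {
    // Fall back to the HTTP status text if the body is not JSON.
    const err = await res.json().catch(() => ({ message: res.statusText }));
    throw new Error(describePredictionError(res.status, err.message));
  }
  return res.json();
}
```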
Webhook Integration
Combine predictions with webhooks to trigger actions based on responses:
const response = await fetch(
  'http://localhost:3000/api/v1/prediction/CHATFLOW_ID',
  {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_API_KEY',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      question: 'Process this order',
      overrideConfig: {
        webhookUrl: 'https://your-domain.com/webhook',
        webhookMethod: 'POST'
      }
    })
  }
).then(res => res.json());