This guide walks you through setting up Azure resources and deploying the VisionaryAI backend infrastructure using Azure Functions.

Prerequisites

Before you begin, ensure you have:
  • An Azure account with an active subscription
  • The Azure CLI installed and signed in (az login)
  • Azure Functions Core Tools v4 (the func command)
  • Node.js 18 and npm, matching the Function App runtime below
You can also use the Azure extension in VS Code to log in and deploy functions directly from the editor.
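Before creating any resources, it's worth a quick sanity check that the tooling is installed and you are signed in to the right subscription. This is a minimal sketch; the exact version output varies by install:

```shell
# Confirm the CLI tooling is on your PATH
az version        # Azure CLI
func --version    # Azure Functions Core Tools (should report 4.x)
node --version    # Node.js (should report v18.x to match the runtime below)

# Sign in and confirm which subscription is active
az login
az account show --query name -o tsv
```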

Create Azure resources

1. Create a resource group

Create a new resource group to organize your VisionaryAI resources:
az group create --name visionary-ai-rg --location eastus
2. Create a storage account

Create an Azure Storage account for blob storage and function app storage:
az storage account create \
  --name visionaryaistorage \
  --resource-group visionary-ai-rg \
  --location eastus \
  --sku Standard_LRS
Storage account names must be between 3-24 characters, lowercase letters and numbers only, and globally unique across Azure.
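Because the name must be globally unique, `visionaryaistorage` may already be taken by another Azure customer. You can check availability before creating the account:

```shell
# Prints "true" if the name is free, "false" if it is already taken
az storage account check-name \
  --name visionaryaistorage \
  --query nameAvailable -o tsv
```

If the name is unavailable, pick another one and use it consistently in every command below.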
3. Create a blob container

Create a container named images to store generated DALL-E images:
az storage container create \
  --name images \
  --account-name visionaryaistorage \
  --public-access off
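To confirm the container was created, you can query for it directly (this uses the credentials from your az login session):

```shell
# Prints "true" once the container exists
az storage container exists \
  --name images \
  --account-name visionaryaistorage \
  --query exists -o tsv
```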
4. Create a Function App

Create an Azure Function App to host your serverless functions:
az functionapp create \
  --resource-group visionary-ai-rg \
  --consumption-plan-location eastus \
  --runtime node \
  --runtime-version 18 \
  --functions-version 4 \
  --name visionary-ai-functions \
  --storage-account visionaryaistorage
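Provisioning can take a minute or two. Before moving on, you can confirm the app is up by querying its state:

```shell
# Typically prints "Running" once provisioning is complete
az functionapp show \
  --name visionary-ai-functions \
  --resource-group visionary-ai-rg \
  --query state -o tsv
```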
5. Retrieve storage credentials

Get your storage account name and key for configuration:
# Get account name (should be visionaryaistorage)
az storage account show \
  --name visionaryaistorage \
  --resource-group visionary-ai-rg \
  --query name -o tsv

# Get account key
az storage account keys list \
  --account-name visionaryaistorage \
  --resource-group visionary-ai-rg \
  --query '[0].value' -o tsv
Save these values for the next step.
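Rather than copying the values by hand, you can capture both into shell variables in one go, a sketch of the same two commands wrapped in command substitution:

```shell
# Capture the account name and key for use in the next step
ACCOUNT_NAME=$(az storage account show \
  --name visionaryaistorage \
  --resource-group visionary-ai-rg \
  --query name -o tsv)

ACCOUNT_KEY=$(az storage account keys list \
  --account-name visionaryaistorage \
  --resource-group visionary-ai-rg \
  --query '[0].value' -o tsv)

echo "$ACCOUNT_NAME"   # should print visionaryaistorage
```

Note that the key is a secret; avoid echoing it in shared terminals or committing it anywhere.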

Configure environment variables

Set up the required environment variables in your Function App:
az functionapp config appsettings set \
  --name visionary-ai-functions \
  --resource-group visionary-ai-rg \
  --settings \
    accountName="your-storage-account-name" \
    accountKey="your-storage-account-key" \
    OPEN_AI_KEY="your-openai-api-key" \
    OPEN_AI_ORGANIZATION="your-openai-org-id"
See environment variables for detailed information about each variable.
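To confirm the settings were applied, you can list them back. The JMESPath filter below prints only the names (not the secret values) of two of the settings you just set:

```shell
# Verify the app settings landed on the Function App
az functionapp config appsettings list \
  --name visionary-ai-functions \
  --resource-group visionary-ai-rg \
  --query "[?name=='accountName' || name=='OPEN_AI_KEY'].name" -o tsv
```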

Deploy functions

1. Navigate to the Azure directory

cd azure
2. Install dependencies

npm install
3. Deploy to Azure

Deploy your functions to the Function App:
func azure functionapp publish visionary-ai-functions
The deployment process will bundle all functions and dependencies, then upload them to your Azure Function App.
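Optionally, you can smoke-test the functions on your machine before publishing. This sketch assumes an `azure/local.settings.json` that mirrors the app settings configured above:

```shell
# From the azure directory, after npm install
func start
# Core Tools then serves the functions at http://localhost:7071/api/<functionName>
```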
4. Verify deployment

List your deployed functions:
az functionapp function list \
  --name visionary-ai-functions \
  --resource-group visionary-ai-rg \
  --query '[].name' -o table
You should see:
  • generateImage
  • generateSASToken
  • getChatGPTSuggestion
  • getImages

Test your deployment

Once deployed, test your functions using the Azure Portal or CLI:
# Get the function URL
az functionapp function show \
  --name visionary-ai-functions \
  --resource-group visionary-ai-rg \
  --function-name getChatGPTSuggestion \
  --query invokeUrlTemplate -o tsv
Test the suggestion endpoint:
curl https://visionary-ai-functions.azurewebsites.net/api/getChatGPTSuggestion
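If the functions are deployed with function-level authorization rather than anonymous access, the bare URL returns 401. In that case, fetch a function key and pass it as the `code` query parameter:

```shell
# Retrieve a key for the function (only needed when its authLevel is "function")
KEY=$(az functionapp function keys list \
  --resource-group visionary-ai-rg \
  --name visionary-ai-functions \
  --function-name getChatGPTSuggestion \
  --query default -o tsv)

curl "https://visionary-ai-functions.azurewebsites.net/api/getChatGPTSuggestion?code=$KEY"
```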

Update your Next.js configuration

Update your Next.js application to use the deployed Azure Functions:
// In your Next.js app
const AZURE_FUNCTION_URL = 'https://visionary-ai-functions.azurewebsites.net/api'

// Use in your API calls
const response = await fetch(`${AZURE_FUNCTION_URL}/generateImage`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ prompt: 'your prompt here' })
})

Monitor your functions

View logs and monitor function execution in the Azure Portal:
  1. Navigate to your Function App
  2. Select Monitor > Logs
  3. Enable Application Insights for detailed telemetry
Application Insights is configured in host.json with sampling enabled to reduce costs while maintaining visibility.
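For a quick look at live output without opening the portal, Core Tools can stream logs from the deployed app directly in your terminal:

```shell
# Stream live logs from the deployed Function App (Ctrl+C to stop)
func azure functionapp logstream visionary-ai-functions
```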

Next steps
