JavaScript SDK Overview
The JavaScript/TypeScript SDKs provide comprehensive access to Azure AI services for both Node.js and browser environments. This guide covers installation, authentication, and usage examples.
Installation
Azure AI Foundry SDK
npm install @azure/ai-projects@beta @azure/identity dotenv
Foundry Local SDK
npm install foundry-local-sdk
The same package serves both Node.js and the browser; in browser code, import from foundry-local-sdk/browser instead of the package root.
Azure AI Search
npm install @azure/search-documents @azure/identity
Azure AI Services
npm install microsoft-cognitiveservices-speech-sdk
npm install @azure-rest/ai-content-safety @azure/identity
Authentication
All SDKs support Azure Active Directory authentication:
import { DefaultAzureCredential } from "@azure/identity";
const credential = new DefaultAzureCredential();
Ensure you’re authenticated with Azure CLI:
az login
Azure AI Foundry
Project Client
Connect to your Azure AI Foundry project:
import { DefaultAzureCredential } from "@azure/identity";
import { AIProjectClient } from "@azure/ai-projects";
import "dotenv/config";
const projectEndpoint = "https://<resource-name>.services.ai.azure.com/api/projects/<project-name>";
const deploymentName = "gpt-5.2";
const project = new AIProjectClient(projectEndpoint, new DefaultAzureCredential());
Chat Completions
Use the OpenAI-compatible client:
const openAIClient = await project.getOpenAIClient();
const response = await openAIClient.responses.create({
  model: deploymentName,
  input: "What is the size of France in square miles?",
});
console.log(`Response output: ${response.output_text}`);
Streaming Responses
const stream = await openAIClient.responses.create({
  model: deploymentName,
  input: "Explain quantum computing",
  stream: true,
});
for await (const event of stream) {
  if (event.type === "response.output_text.delta") {
    process.stdout.write(event.delta);
  }
}
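The delta-accumulation pattern above can be factored into a reusable helper. The sketch below runs against a mocked stream so the assumed event shape (objects with a type field, where response.output_text.delta events carry a delta string, as in the OpenAI Responses API) is explicit rather than hidden:

```typescript
// Minimal event shape assumed for Responses API streaming.
interface StreamEvent {
  type: string;
  delta?: string;
}

// Accumulate all output-text deltas from a stream into one string.
async function collectText(stream: AsyncIterable<StreamEvent>): Promise<string> {
  let text = "";
  for await (const event of stream) {
    if (event.type === "response.output_text.delta" && event.delta) {
      text += event.delta;
    }
  }
  return text;
}

// Mocked stream standing in for a real `responses.create({ stream: true })` call.
async function* mockStream(): AsyncGenerator<StreamEvent> {
  yield { type: "response.output_text.delta", delta: "Hello, " };
  yield { type: "response.output_text.delta", delta: "world!" };
  yield { type: "response.completed" };
}

const full = await collectText(mockStream());
console.log(full); // Hello, world!
```

With a real client, pass the stream returned by openAIClient.responses.create({ ..., stream: true }) instead of the mock.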
List Connections
const connections = project.connections.list();
for await (const connection of connections) {
  console.log(`Connection: ${connection.name}`);
  console.log(`Type: ${connection.connectionType}`);
}
Foundry Local
Initialize Manager (Node.js)
import { FoundryLocalManager } from "foundry-local-sdk";
const alias = "qwen2.5-0.5b";
// Initialize and load model
const manager = new FoundryLocalManager();
const modelInfo = await manager.init(alias);
console.log("Model Info:", modelInfo);
console.log(`Service URL: ${manager.serviceUrl}`);
console.log(`Endpoint: ${manager.endpoint}`);
List and Download Models
// Check service status
const isRunning = await manager.isServiceRunning();
console.log(`Service running: ${isRunning}`);
// List available models in catalog
const catalog = await manager.listCatalogModels();
console.log(`Available models: ${catalog.length}`);
// Get specific model info
const model = await manager.getModelInfo("qwen2.5-0.5b");
if (model) {
  console.log(`Model: ${model.displayName}`);
  console.log(`Size: ${model.fileSizeMb} MB`);
  console.log(`Task: ${model.task}`);
  console.log(`Supports tools: ${model.supportsToolCalling}`);
}
// Download model
await manager.downloadModel(alias);
// Load model into memory
await manager.loadModel(alias, 600); // TTL in seconds
Model Management
// Get cache location
const cacheLocation = await manager.getCacheLocation();
console.log(`Cache: ${cacheLocation}`);
// List cached models
const cachedModels = await manager.listCachedModels();
cachedModels.forEach(model => {
  console.log(`Cached: ${model.alias} (${model.fileSizeMb} MB)`);
});
// List loaded models
const loadedModels = await manager.listLoadedModels();
loadedModels.forEach(model => {
  console.log(`Loaded: ${model.displayName}`);
});
// Unload model
await manager.unloadModel(alias);
OpenAI Integration
Use Foundry Local with the OpenAI SDK:
import { OpenAI } from "openai";
import { FoundryLocalManager } from "foundry-local-sdk";
const alias = "qwen2.5-0.5b";
// Initialize Foundry Local
const foundryLocalManager = new FoundryLocalManager();
const modelInfo = await foundryLocalManager.init(alias);
// Configure OpenAI client
const openai = new OpenAI({
  baseURL: foundryLocalManager.endpoint,
  apiKey: foundryLocalManager.apiKey,
});
// Stream completion
async function streamCompletion() {
  const stream = await openai.chat.completions.create({
    model: modelInfo.id,
    messages: [{ role: "user", content: "What is the golden ratio?" }],
    stream: true,
  });
  for await (const chunk of stream) {
    if (chunk.choices[0]?.delta?.content) {
      process.stdout.write(chunk.choices[0].delta.content);
    }
  }
}
await streamCompletion();
Browser Usage
For browser environments, manually specify the host:
import { FoundryLocalManager } from "foundry-local-sdk/browser";
// Start service first with CLI: foundry service start
const host = "http://localhost:5272"; // From CLI output
const manager = new FoundryLocalManager({ host });
// Get models
const catalog = await manager.listCatalogModels();
console.log("Available models:", catalog);
// Download and load
const alias = "qwen2.5-0.5b";
await manager.downloadModel(alias);
await manager.loadModel(alias);
// Use with the OpenAI client (also import { OpenAI } from "openai";)
const openai = new OpenAI({
  baseURL: manager.endpoint,
  apiKey: manager.apiKey,
  dangerouslyAllowBrowser: true,
});
In the browser version, init(), isServiceRunning(), and startService() are not available. Start the service with the Foundry Local CLI instead.
Azure AI Search
Search Client
import { SearchClient } from "@azure/search-documents";
import { DefaultAzureCredential } from "@azure/identity";
const endpoint = "https://<search-service>.search.windows.net";
const indexName = "your-index";
const searchClient = new SearchClient(
  endpoint,
  indexName,
  new DefaultAzureCredential()
);
Full-Text Search
const results = await searchClient.search("Phoenix urban development", {
  select: ["id", "page_chunk", "page_number"],
  top: 5,
});
for await (const result of results.results) {
  console.log(`Score: ${result.score}`);
  console.log(`Content: ${result.document.page_chunk}`);
  console.log(`Page: ${result.document.page_number}\n`);
}
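Full-text queries can also be narrowed with OData filter and orderBy options. String literals inside a filter expression must have embedded single quotes doubled; the helper below (buildEqFilter is an illustrative name, not part of the SDK) keeps that escaping explicit:

```typescript
// Double embedded single quotes, per OData string-literal rules.
function escapeOData(value: string): string {
  return value.replace(/'/g, "''");
}

// Build a simple equality filter for a string field (illustrative helper).
function buildEqFilter(field: string, value: string): string {
  return `${field} eq '${escapeOData(value)}'`;
}

console.log(buildEqFilter("author", "O'Brien")); // author eq 'O''Brien'

// Usage sketch (assumes the index defines a filterable `author` field):
// const results = await searchClient.search("Phoenix", {
//   filter: buildEqFilter("author", "O'Brien"),
//   orderBy: ["page_number asc"],
//   top: 5,
// });
```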
Vector Search
import { VectorizedQuery } from "@azure/search-documents";
// Generate embedding (using your embedding function)
const queryVector = await generateEmbedding("Phoenix metropolitan area");
const vectorQuery: VectorizedQuery = {
  kind: "vector",
  vector: queryVector,
  kNearestNeighborsCount: 5,
  fields: ["page_embedding_text_3_large"],
};
// Vector-only query: no search text, so pass undefined
const results = await searchClient.search(undefined, {
  vectorQueries: [vectorQuery],
  select: ["id", "page_chunk", "page_number"],
});
for await (const result of results.results) {
  console.log(`Content: ${result.document.page_chunk}`);
}
Hybrid Search
const results = await searchClient.search("Phoenix urban grid", {
  vectorQueries: [vectorQuery],
  select: ["id", "page_chunk", "page_number"],
  top: 5,
});
for await (const result of results.results) {
  console.log(`Score: ${result.score}`);
  console.log(`Content: ${result.document.page_chunk}\n`);
}
Upload Documents
const documents = [
  {
    id: "doc1",
    page_chunk: "Phoenix is a major city in Arizona.",
    page_number: 104,
  },
  {
    id: "doc2",
    page_chunk: "The Phoenix metropolitan area includes Glendale.",
    page_number: 105,
  },
];
const result = await searchClient.uploadDocuments(documents);
console.log(`Uploaded ${result.results.length} documents`);
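A single indexing request accepts a limited number of documents (the service caps one batch at 1,000), so larger loads should be chunked first. A minimal sketch of the chunking step:

```typescript
// Split an array into batches of at most `size` items.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

console.log(chunk([1, 2, 3, 4, 5], 2)); // → [[1, 2], [3, 4], [5]]

// Usage sketch: upload in batches of at most 1000 documents.
// for (const batch of chunk(documents, 1000)) {
//   await searchClient.uploadDocuments(batch);
// }
```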
Azure AI Services
Speech Recognition
import * as sdk from "microsoft-cognitiveservices-speech-sdk";
import { readFileSync } from "fs";
const speechConfig = sdk.SpeechConfig.fromSubscription("your-key", "your-region");
const audioConfig = sdk.AudioConfig.fromWavFileInput(readFileSync("audio.wav"));
const recognizer = new sdk.SpeechRecognizer(speechConfig, audioConfig);
recognizer.recognizeOnceAsync(
  result => {
    if (result.reason === sdk.ResultReason.RecognizedSpeech) {
      console.log(`Recognized: ${result.text}`);
    }
    recognizer.close();
  },
  error => {
    console.error(`Error: ${error}`);
    recognizer.close();
  }
);
Speech Synthesis
const speechConfig = sdk.SpeechConfig.fromSubscription("your-key", "your-region");
speechConfig.speechSynthesisVoiceName = "en-US-AriaNeural";
const synthesizer = new sdk.SpeechSynthesizer(speechConfig);
synthesizer.speakTextAsync(
  "Hello, world!",
  result => {
    if (result.reason === sdk.ResultReason.SynthesizingAudioCompleted) {
      console.log("Speech synthesized successfully");
    }
    synthesizer.close();
  },
  error => {
    console.error(`Error: ${error}`);
    synthesizer.close();
  }
);
Content Safety
import ContentSafetyClient from "@azure-rest/ai-content-safety";
import { DefaultAzureCredential } from "@azure/identity";
const endpoint = "https://<resource>.cognitiveservices.azure.com";
const client = ContentSafetyClient(
  endpoint,
  new DefaultAzureCredential()
);
const request = {
  body: {
    text: "Sample text to analyze",
  },
};
const response = await client.path("/text:analyze").post(request);
if (response.status === "200") {
  const result = response.body;
  console.log(`Hate: ${result.hateResult?.severity}`);
  console.log(`Violence: ${result.violenceResult?.severity}`);
}
Error Handling
import { RestError } from "@azure/core-rest-pipeline";
try {
  const results = await searchClient.search("query");
} catch (error) {
  if (error instanceof RestError) {
    console.error(`HTTP ${error.statusCode}: ${error.message}`);
    if (error.statusCode === 404) {
      console.error("Index not found");
    }
  } else {
    console.error(`Unexpected error: ${error}`);
  }
}
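Transient failures (throttling, timeouts) are often worth retrying with exponential backoff. A minimal, SDK-agnostic sketch; the demo fakes a flaky operation so the behavior is visible without a live service:

```typescript
// Retry an async operation with exponential backoff.
// `retries` is the number of additional attempts after the first failure.
async function withRetry<T>(
  fn: () => Promise<T>,
  retries = 3,
  baseDelayMs = 100
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (error) {
      if (attempt >= retries) throw error;
      // Delay doubles on each attempt: base, 2x base, 4x base, ...
      await new Promise(resolve => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
}

// Demo: an operation that fails twice, then succeeds.
let calls = 0;
const result = await withRetry(async () => {
  calls++;
  if (calls < 3) throw new Error("transient");
  return "ok";
}, 5, 1);
console.log(result, calls); // ok 3
```

In practice you would wrap a call such as searchClient.search(...) and only retry on retryable status codes (for example 429 or 503).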
TypeScript Types
Leverage TypeScript for type safety:
interface SearchDocument {
  id: string;
  page_chunk: string;
  page_number: number;
  page_embedding_text_3_large?: number[];
}
const searchClient = new SearchClient<SearchDocument>(
  endpoint,
  indexName,
  credential
);
const results = await searchClient.search("query");
for await (const result of results.results) {
  // TypeScript knows the document shape
  const doc: SearchDocument = result.document;
  console.log(doc.page_chunk);
}
Best Practices
Environment Variables
Use environment variables for configuration:
import "dotenv/config";
const endpoint = process.env.AZURE_SEARCH_ENDPOINT!;
const indexName = process.env.AZURE_SEARCH_INDEX!;
Create a .env file:
AZURE_SEARCH_ENDPOINT=https://your-service.search.windows.net
AZURE_SEARCH_INDEX=your-index
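Missing configuration fails more clearly when validated up front rather than surfacing later as an opaque SDK error. A small sketch (requireEnv is an illustrative helper, not part of any Azure package):

```typescript
// Read a required environment variable, throwing a descriptive error if unset.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage sketch:
// const endpoint = requireEnv("AZURE_SEARCH_ENDPOINT");
// const indexName = requireEnv("AZURE_SEARCH_INDEX");

process.env.DEMO_VAR = "demo";
console.log(requireEnv("DEMO_VAR")); // demo
```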
Async/Await
Always use async/await with proper error handling:
async function searchDocuments() {
  try {
    const results = await searchClient.search("query");
    return results;
  } catch (error) {
    console.error("Search failed:", error);
    throw error;
  }
}
Client Reuse
Reuse client instances for better performance:
// Good: Create once, use multiple times
const client = new SearchClient(...);
for (const query of queries) {
  await client.search(query);
}
// Bad: Create a new client each time
for (const query of queries) {
  const client = new SearchClient(...);
  await client.search(query);
}
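One way to make reuse automatic is a small cache keyed by whatever identifies the client (endpoint plus index name, say). The sketch below uses a stand-in factory so it runs without a live service; with the real SDK the factory would construct a SearchClient:

```typescript
// Return a lookup function that creates each client once and caches it by key.
function createClientCache<T>(factory: (key: string) => T): (key: string) => T {
  const cache = new Map<string, T>();
  return (key: string): T => {
    let client = cache.get(key);
    if (client === undefined) {
      client = factory(key);
      cache.set(key, client);
    }
    return client;
  };
}

// Demo with a stand-in factory; in practice:
// createClientCache((index) => new SearchClient(endpoint, index, credential))
const getClient = createClientCache((key) => ({ key }));
const a = getClient("index-a");
const b = getClient("index-a");
console.log(a === b); // true
```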
Related Resources
REST API
Foundry REST API documentation
Python SDK
Python SDK reference
.NET SDK
C# and .NET SDK documentation
TypeScript Guide
TypeScript documentation